DTN (Data Transmission Network)
**Job Description:** We are looking for an Architect at Large to join our Engineering team. This role will work closely with the Chief Architect, Engineering leadership, and other technical staff and product owners to craft best-in-class solutions for innovating against DTN's extraordinary data assets and scaling its operations into the future using next-generation tools and approaches.

**What you will bring to the role:**
- 8+ years of data modeling and data engineering experience, with a focus on petabyte-scale data warehousing and processing
- Commensurate experience with API design, cloud data pipelines, ETL systems, and related topics
- Experience with Semantic Stack architectures a significant plus
- Ability to lead and influence
- Strong experience building applications, ETL systems, and data management systems using technologies within the AWS ecosystem
- Experience with data cataloging and master data management
- Demonstrated ability to design and document system-level architectures, assess solution viability and tradeoffs, and present impactful analyses for decision-making on technical direction
- Experience integrating with CI/CD tools such as Bitbucket Pipelines, Bamboo, GitLab, Terraform, etc.
- Strong practical experience with both Agile and Scrum
- Strong facility with the AWS ecosystem, specifically with native-service components; significant experience building, deploying, and maintaining large-scale automated solutions on AWS
- Data Science, Data Engineering, and/or ML experience

**Confidence-Driven:** We help customers move with clarity and conviction. We bring the data and operational knowledge leaders need to act.