Intellectt Inc

Senior Data Engineer (Washington)

Intellectt Inc, Washington, District Of Columbia, United States, 20022


Job Description

Job Title: Senior Data Engineer
Location: Northern Virginia / Metro Washington, DC (locals only)
Eligibility: Only USC/GC candidates are considered
Employment Type: Contract

Overview

The Senior Data Engineer will join a newly established Data Services team within a mission-driven nonprofit organization that delivers critical services to government customers. As the organization expands its use of data to drive decision-making across multiple business units, the Data Services team is building a modern data lakehouse platform to provide clean, timely, and accurate data. This role will play a foundational part in designing and operationalizing enterprise data pipelines and infrastructure that empower stakeholders with reliable insights.

Summary

The Senior Data Engineer supports the development and maintenance of a system-wide analytics platform that enables secure, scalable access to enterprise data. This role owns end-to-end engineering efforts across ingestion, transformation, orchestration, and delivery using Azure and Microsoft Fabric technologies. The engineer will develop and optimize ETL/data pipelines, implement Medallion architecture patterns, and ensure enterprise data assets are structured, integrated, and governed to meet the needs of diverse business units and external parties.

Key Responsibilities

Data Pipeline Design & Development
- Design, develop, and implement end-to-end data ingestion and processing pipelines using Azure and Microsoft Fabric tools.
- Transform raw bronze data into silver (cleaned) and gold (curated, analytics-ready) layers following the Medallion architecture.
- Develop code and tooling to process and transform data into enterprise data models.
- Implement existing ETL frameworks, patterns, and standards used across the enterprise.

Orchestration, Automation & Operations
- Schedule, orchestrate, and monitor automated and semi-automated data pipelines to ensure reliability and quality.
- Build automated workflows supporting ingestion and transformation observability.
- Ensure technical correctness, timeliness, and high-quality delivery of data services.

Data Modeling, Integration & Governance
- Serve as a subject matter expert on enterprise data sources, structures, and definitions.
- Build and maintain data relationships, mappings, and linkages within the Enterprise Data Warehouse (EDW).
- Integrate data assets to support analytics and operational needs across multiple mission-driven departments.
- Create and manage large-scale data warehouse and lakehouse components to ensure efficient data access and retrieval.

Collaboration & Communication
- Partner with analysts, business units, and data consumers to support exploration and decision-making.
- Communicate clearly and effectively with technical and non-technical stakeholders, including senior leadership and customers.
- Champion continuous improvement, accountability, and evidence-based decision-making within the Data Services team.
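For candidates unfamiliar with the term, the bronze-to-silver-to-gold flow named above can be sketched in a few lines. This is a minimal, hypothetical illustration using pandas; in this role the equivalent work would run on Azure / Microsoft Fabric (e.g. Spark notebooks over Delta tables), and the column names and cleaning rules here are invented for the example.

```python
import pandas as pd

# Bronze layer (hypothetical): raw ingested records, untyped and unvalidated.
bronze = pd.DataFrame({
    "order_id": ["1001", "1002", "1002", "1003"],
    "amount":   ["250.00", "99.50", "99.50", "n/a"],
    "region":   ["east", "West", "West", "east"],
})

# Silver layer: deduplicate, enforce types, normalize values, drop bad rows.
silver = (
    bronze.drop_duplicates(subset="order_id")
          .assign(
              amount=lambda df: pd.to_numeric(df["amount"], errors="coerce"),
              region=lambda df: df["region"].str.lower(),
          )
          .dropna(subset=["amount"])
)

# Gold layer: curated, analytics-ready aggregate for business consumers.
gold = silver.groupby("region", as_index=False)["amount"].sum()

print(gold)
```

The same layering applies regardless of engine: bronze preserves the raw feed for reprocessing, silver applies quality rules once, and gold serves each business unit a ready-to-query view.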

Qualifications
- Minimum five years of experience with cloud data platforms such as Azure, Snowflake, AWS Redshift, or Databricks.
- Minimum six years of experience in SQL-based data processing.
- Minimum five years of application development experience using Python.
- At least two years of experience developing ETL pipelines within Microsoft Fabric.
- Strong working knowledge of data warehousing and data lake concepts, including Medallion or similar architectural patterns.
- Demonstrated ability to deliver high-quality work on schedule and uphold team accountability standards.
- Proven track record of clear, continuous communication across technical and business audiences.
- Commitment to process improvement and to leveraging information to enhance organizational performance.

Technical Skills
- Python: intermediate to advanced proficiency for data processing, automation, and pipeline development.
- SQL: intermediate to advanced proficiency for transformations, modeling, and performance optimization.
- Experience with Azure Data Factory, Microsoft Fabric Data Engineering, Delta Lake, or similar technologies.
- Familiarity with orchestration frameworks, metadata management, and modern ETL/ELT patterns.