Metronome LLC
Job Title: Data Engineer
Location: Elkridge, MD or Annapolis Junction, MD (On-Site)
Clearance: Active TS/SCI with CI Polygraph
Employment Type: Full-Time
Education: Bachelor's degree with 5+ years of experience, or Master's degree with 3+ years of experience, or 10+ years of relevant professional experience in lieu of a degree
Salary: $110,000
Benefits: Health, dental, vision, 401(k), PTO, and more
Application: Apply here or on our Careers Page @ Careers - Metronome, or email your resume to r.derring@wearemetronome.com
Overview
As a Data Engineer/Integrator, you'll work closely with a team of developers to fulfill data integration requirements. Your role involves writing and maintaining code on an Extract-Transform-Load (ETL) platform to ensure data is transformed into the formats defined by IC ITE initiatives. You will interface with external teams and systems, employing protocols such as HTTP and SFTP to collect data efficiently, and you will enhance the ETL platform with features that shorten timelines for future data integration efforts. Beyond coding, you'll develop and maintain software to ensure seamless integration into a fully functional system. You will collaborate with external teams to validate data ingestion processes, and you'll be responsible for providing comprehensive documentation covering system architecture, development, and any enhancements made throughout the process.
Key Responsibilities
Design, develop, and maintain ETL pipelines and data integration processes
Interface with external systems using protocols like HTTP and SFTP for data collection
Enhance ETL platforms by adding features to accelerate integration timelines
Ensure software and system integration meet operational and performance requirements
Collaborate with external teams to validate data ingestion processes
Maintain comprehensive system architecture and development documentation
Qualifications
Active TS/SCI with CI Polygraph
Specialization in the Databricks platform for building, deploying, and managing data and AI solutions
Experience building ETL pipelines, managing data pipelines, and working with large datasets using Spark, Python, and SQL
Experience with technologies such as Delta Lake, Delta Live Tables, and Databricks Workflows
Bachelor's degree with 5+ years of experience, or Master's degree with 3+ years of experience, or 10+ years of relevant professional experience in lieu of a degree
Experience collaborating with data scientists
Familiarity with Advana
Strong Python programming skills
Solid SQL knowledge for querying and data manipulation
Cloud platform experience (AWS, Azure, etc.)