Gravity IT Resources

Senior Data Engineer

Gravity IT Resources, Charlotte, North Carolina, United States, 28245


Job Title: Senior Data Engineer

Req ID: CPGJP00001801

Location: Remote

Job-Type: Contract to Hire (must be a US Citizen or Green Card Holder)

Job Summary

We are looking for a hands-on Senior Data Engineer with expertise in developing data ingestion pipelines. This role is crucial in designing, building, and maintaining our data infrastructure, focusing on creating scalable pipelines, ensuring data integrity, and optimizing performance.

Key skills include strong Snowflake expertise, advanced SQL proficiency, data extraction from APIs using Python and AWS Lambda, and experience with ETL/ELT processes. Workflow automation using AWS Airflow is essential, and experience with Fivetran and dbt is a plus.

Job Responsibilities

Design, build, test, and implement scalable data pipelines using Python and SQL.

Maintain and optimize our Snowflake data warehouse, including data ingestion performance and query tuning.

Extract data from APIs using Python and AWS Lambda, and automate workflows with AWS Airflow (see the sketch following this list).

Perform analysis and critical thinking to troubleshoot data-related issues and implement checks/scripts to enhance data quality.

Collaborate with other data engineers and architects to develop new pipelines and optimize existing ones.

Maintain code via CI/CD processes as defined in our Azure DevOps platform.
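
For illustration only, a minimal sketch of the API-extraction pattern referenced in the responsibilities above might look like the following. The API URL, bucket name, and key layout are placeholders rather than details of the actual stack, and the `requests` dependency would need to be packaged with the Lambda (for example, as a layer).

```python
import json
import os
from datetime import datetime, timezone

import boto3
import requests

# Placeholder source and destination; real values would come from the
# specific source system and environment configuration.
API_URL = os.environ.get("SOURCE_API_URL", "https://api.example.com/v1/orders")
RAW_BUCKET = os.environ.get("RAW_BUCKET", "example-raw-landing")

s3 = boto3.client("s3")


def lambda_handler(event, context):
    """Pull one batch of records from a REST API and land them as raw JSON in S3.

    A downstream step (e.g., an Airflow DAG running COPY INTO, or Snowpipe
    reacting to the S3 event) would then load the file into Snowflake.
    """
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    records = response.json()

    # Partition raw landings by load timestamp so downstream loads can pick up
    # only the new files.
    ts = datetime.now(timezone.utc).strftime("%Y/%m/%d/%H%M%S")
    key = f"orders/raw/{ts}.json"
    s3.put_object(Bucket=RAW_BUCKET, Key=key, Body=json.dumps(records))

    return {"status": "ok", "records": len(records), "s3_key": key}
```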

Job Qualifications

7+ years of experience in Data Engineering roles, with a focus on building and implementing scalable data pipelines for data ingestion.

Expertise in Snowflake, including data ingestion and performance optimization.

Strong SQL skills for writing efficient queries and optimizing existing ones.

Proficiency in Python for data extraction from APIs using AWS Lambda, Glue, etc.

Experience with AWS services such as Lambda, Airflow, Glue, S3, SNS, etc.

Highly self-motivated and detail-oriented with strong communication skills.

Familiarity with ETL/ELT processes.

Experience with Fivetran and dbt is a plus.

Screening Questions

Snowflake optimization: What factors affect query performance in Snowflake? Walk through your approach to investigating and resolving a slow-performing Snowflake query. What metrics do you analyze?
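
As a point of reference for this question, one way to pull the metrics it alludes to (elapsed vs. compilation time, bytes scanned, partition pruning, spilling, queueing) is to query SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY. The sketch below uses the Snowflake Python connector; the connection details and warehouse name are placeholders.

```python
import os

import snowflake.connector

# Connection details are placeholders; in practice they would come from a
# secrets manager rather than environment variables.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",  # placeholder warehouse name
)

# Key metrics for a slow query: elapsed vs. compilation time, bytes scanned,
# partition pruning (partitions_scanned vs. partitions_total), spilling to
# local/remote storage, and time spent queued on the warehouse.
SLOW_QUERY_SQL = """
select query_id,
       total_elapsed_time / 1000        as elapsed_s,
       compilation_time / 1000          as compile_s,
       bytes_scanned,
       partitions_scanned,
       partitions_total,
       bytes_spilled_to_local_storage,
       bytes_spilled_to_remote_storage,
       queued_overload_time / 1000      as queued_s
from snowflake.account_usage.query_history
where start_time >= dateadd(day, -7, current_timestamp())
  and total_elapsed_time > 60000  -- slower than 60 seconds
order by total_elapsed_time desc
limit 20
"""

cur = conn.cursor()
try:
    cur.execute(SLOW_QUERY_SQL)
    for row in cur.fetchall():
        print(row)
finally:
    cur.close()
    conn.close()
```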

Batch processing/CDC implementation: Have you replicated large-scale data from a source system to Snowflake? Walk through the overall architecture, the replication approach, and the tools used. Compare log-based CDC vs. batch processing. What are the trade-offs? How do you handle batch reconciliation?
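
For context on the reconciliation part of the question, a minimal sketch of comparing per-batch row counts and checksums between a source extract and the Snowflake target might look like the following. The batch key and the sample values are illustrative; the checksum would be whatever aggregate both engines can compute consistently.

```python
from dataclasses import dataclass


@dataclass
class BatchStats:
    """Per-batch aggregates, produced by a count/checksum query on each side."""
    batch_date: str
    row_count: int
    checksum: int


def reconcile(source: list[BatchStats], target: list[BatchStats]) -> list[str]:
    """Flag batches whose row counts or checksums differ between source and Snowflake."""
    issues = []
    target_by_date = {b.batch_date: b for b in target}
    for s in source:
        t = target_by_date.get(s.batch_date)
        if t is None:
            issues.append(f"{s.batch_date}: batch missing in Snowflake")
        elif (s.row_count, s.checksum) != (t.row_count, t.checksum):
            issues.append(
                f"{s.batch_date}: source {s.row_count}/{s.checksum} "
                f"vs target {t.row_count}/{t.checksum}"
            )
    return issues


if __name__ == "__main__":
    # Illustrative values only; in practice these come from aggregate queries
    # against the source system and against the Snowflake landing table.
    src = [BatchStats("2024-01-01", 1000, 987654)]
    snow = [BatchStats("2024-01-01", 998, 912345)]
    print(reconcile(src, snow))  # flags the mismatched batch for re-extraction
```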

Data Validation/Quality: Explain your approach to data quality across different pipeline stages (e.g., when replicating data from a source system into Snowflake). Give tangible examples from your previous projects. Explain your data monitoring strategy.
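
As one concrete illustration of the kind of post-load checks a candidate might describe, the sketch below runs a few simple validations (non-null key, uniqueness, minimum row count) and fails loudly when any check does not pass; in an Airflow setting this would run as a task immediately after the load and drive alerting. The column name and thresholds are assumptions.

```python
def check_not_null(rows: list[dict], column: str) -> tuple[bool, str]:
    """Fail if any row has a null in the given column."""
    nulls = sum(1 for r in rows if r.get(column) is None)
    return nulls == 0, f"{column}: {nulls} null value(s)"


def check_unique(rows: list[dict], column: str) -> tuple[bool, str]:
    """Fail if the given column contains duplicate values."""
    values = [r[column] for r in rows]
    dupes = len(values) - len(set(values))
    return dupes == 0, f"{column}: {dupes} duplicate value(s)"


def check_row_count(rows: list[dict], minimum: int) -> tuple[bool, str]:
    """Fail if the load produced fewer rows than expected."""
    return len(rows) >= minimum, f"row count {len(rows)} (expected >= {minimum})"


def run_checks(rows: list[dict]) -> None:
    """Raise if any data quality check fails, so the pipeline run is marked failed."""
    checks = [
        check_not_null(rows, "order_id"),  # placeholder key column
        check_unique(rows, "order_id"),
        check_row_count(rows, minimum=1),
    ]
    failures = [message for ok, message in checks if not ok]
    if failures:
        raise ValueError("Data quality checks failed: " + "; ".join(failures))


if __name__ == "__main__":
    sample = [{"order_id": 1}, {"order_id": 2}, {"order_id": 2}]
    run_checks(sample)  # raises: order_id has 1 duplicate value
```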

Equal Employment Opportunity Statement

Gravity IT Resources is an Equal Opportunity Employer. We are committed to creating an inclusive environment for all employees and applicants. We do not discriminate on the basis of race, color, religion, sex (including pregnancy, sexual orientation, or gender identity), national origin, age, disability, genetic information, veteran status, or any other legally protected characteristic. All employment decisions are based on qualifications, merit, and business needs.
