Nearshore Business Solutions
Data Engineer (Snowflake & dbt Specialist)
Job Title: Data Engineer (Snowflake & dbt Specialist)
Location: Remote – Latin America Preferred
Type of Contract: Contractor, Full-Time
Salary Range: USD 4,000 to 6,000 per month
Language Requirements: English – Advanced (Written & Spoken)
We are seeking an experienced Data Engineer with strong expertise in Snowflake and dbt (Data Build Tool) to join our dynamic data team. You will play a key role in building and optimizing a next-generation cloud data warehouse that powers analytics, reporting, and data science initiatives. Your work will directly enhance data quality, scalability, and accessibility across the organization.
Key Responsibilities
Design, develop, and maintain robust ELT/ETL data pipelines using modern methodologies within the Snowflake ecosystem.
Manage and optimize data transformation workflows using dbt, including modular SQL modeling, data testing, and documentation.
Implement efficient data ingestion solutions using Snowpipe, Kafka, or third-party tools to process structured and semi-structured data.
Optimize Snowflake performance through warehouse tuning, query optimization, clustering, and materialized views.
Design and implement dimensional data models (star and snowflake schemas) to support reporting and analytical needs.
Configure and manage Snowflake security features, including RBAC, data masking, and compliance with governance standards.
Collaborate with Data Analysts, Data Scientists, and stakeholders to deliver scalable, data-driven solutions.
Implement and maintain CI/CD pipelines and Infrastructure as Code (IaC) for dbt and Snowflake using Git, Airflow, or Azure DevOps/AWS Glue.
Monitor pipeline health, ensuring data integrity, reliability, and performance through proactive alerting and logging (a minimal sketch follows this list).
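As a rough illustration of the monitoring item above, the sketch below runs a freshness check against a Snowflake landing table and logs an alert if it is stale. It is a minimal sketch only: it assumes the snowflake-connector-python package, and the table name (raw.events), warehouse name, and staleness threshold are all hypothetical, not taken from this posting.

```python
# Freshness check for a Snowflake landing table: a minimal sketch only.
# Assumes the snowflake-connector-python package; the account settings,
# warehouse, table name, and staleness threshold are all hypothetical.
import logging
import os

import snowflake.connector

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline_health")

FRESHNESS_SQL = """
    SELECT DATEDIFF('minute', MAX(loaded_at), CURRENT_TIMESTAMP())
    FROM raw.events  -- hypothetical landing table
"""

def check_freshness(max_lag_minutes: int = 240) -> bool:
    """Log an error (stand-in for a real alert) if the table is stale."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="MONITOR_WH",  # hypothetical warehouse
    )
    try:
        lag = conn.cursor().execute(FRESHNESS_SQL).fetchone()[0]
    finally:
        conn.close()
    # MAX() over an empty table yields NULL, so treat None as stale too.
    if lag is None or lag > max_lag_minutes:
        log.error("raw.events is stale (lag: %s minutes)", lag)
        return False
    log.info("raw.events is fresh (lag: %s minutes)", lag)
    return True

if __name__ == "__main__":
    check_freshness()
```

In practice a check like this would run on a schedule (e.g., from Airflow) and feed a real alerting channel rather than a logger.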
Must-Have Qualifications
4 years of professional experience in Data Engineering, Business Intelligence, or related roles.
Proven, hands-on expertise with Snowflake data warehouse architecture, design, and optimization.
Advanced proficiency with dbt, including modeling, testing, macros, and package management.
Strong SQL skills for query design, optimization, and stored procedures (Snowflake Scripting or PL/SQL).
Proficiency in Python for automation, orchestration, and data transformation tasks (see the dbt automation sketch after this list).
Experience working with cloud platforms such as AWS, Azure, or GCP, and related services (e.g., S3, ADLS).
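To show how the Python, SQL, and dbt skills above combine in day-to-day automation, here is a sketch that drives a dbt build programmatically. It assumes dbt-core 1.5 or later, which exposes the CLI through dbtRunner; the "tag:marts" selector is hypothetical and not taken from this posting.

```python
# Driving a dbt build from Python: a sketch assuming dbt-core 1.5+,
# which exposes the CLI programmatically via dbtRunner. The "tag:marts"
# selector is hypothetical and not taken from this posting.
from dbt.cli.main import dbtRunner, dbtRunnerResult

def build_marts() -> bool:
    """Run the tagged models and their tests; report overall success."""
    runner = dbtRunner()
    result: dbtRunnerResult = runner.invoke(["build", "--select", "tag:marts"])
    return result.success

if __name__ == "__main__":
    raise SystemExit(0 if build_marts() else 1)
```

Invoking dbt this way (rather than shelling out to the CLI) is one common pattern for wiring dbt into orchestrators and CI/CD pipelines, since the exit status can be propagated directly.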
Preferred Qualifications
Experience with workflow orchestration tools such as Apache Airflow, Prefect, or dbt Cloud.
Familiarity with Data Vault 2.0 or advanced data modeling techniques.
Experience with real-time data streaming technologies like Kafka or Kinesis (a minimal consumer sketch follows this list).
Bachelor’s degree in Computer Science, Engineering, or a related field.
Excellent communication and collaboration skills with a focus on data quality and operational excellence.
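For the streaming item above, here is a minimal Kafka consumer sketch. It assumes the confluent-kafka package; the broker address, topic, and group id are hypothetical, and production ingestion into Snowflake would more likely use the Snowflake Kafka connector or Snowpipe Streaming than hand-rolled consumer code.

```python
# Minimal Kafka consumer: a sketch assuming the confluent-kafka package.
# Broker, topic, and group id are hypothetical; production ingestion into
# Snowflake would more likely use the Snowflake Kafka connector or
# Snowpipe Streaming than hand-rolled consumer code.
import json

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # hypothetical broker
    "group.id": "demo-ingest",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events"])  # hypothetical topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print("consumer error:", msg.error())
            continue
        record = json.loads(msg.value())
        print("received:", record)  # stand-in for a real sink
finally:
    consumer.close()
```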