San R&D Business Solutions LLC | Full time
Alpharetta, United States | Posted on 01/08/2026
Location: Alpharetta, GA (Onsite)
Interview: In‑Person (Mandatory)
Preference: Local to Georgia only
Employment Type: Contract (C2C)
Visa Requirement: All except OPT/CPT
About the Role
We are seeking an experienced Data Engineer to join our team. The ideal candidate will have strong expertise in Python, SQL, and Google Cloud Platform (GCP), with hands‑on experience building and maintaining scalable data pipelines. This role involves working with large datasets, optimizing data workflows, and collaborating closely with data scientists, analysts, and engineering teams to deliver reliable, high‑quality data solutions.
Key Responsibilities
- Design, build, and maintain scalable, reliable data pipelines to process large volumes of data
- Develop, optimize, and maintain ETL/ELT workflows using Python, SQL, and cloud‑native tools
- Write and optimize complex SQL queries for data transformation and analysis
- Work extensively with BigQuery for data storage, processing, and performance optimization
- Develop and maintain data pipelines using Apache Spark, Apache Beam, and GCP Dataflow
- Implement data validation, quality checks, and monitoring to ensure data accuracy and integrity
- Troubleshoot and resolve pipeline failures, performance issues, and data inconsistencies
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements
- Apply best practices in data engineering, data modeling, and cloud architecture
- Stay current with emerging data engineering tools and technologies
Required Skills & Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience)
- 9+ years of experience as a Data Engineer
- Strong proficiency in Python for data processing and automation
- Advanced SQL skills, including complex query writing and performance optimization
- Hands‑on experience with Google Cloud Platform (GCP)
- Strong experience with BigQuery, Apache Spark, Apache Beam, and GCP Dataflow
- Working knowledge of Java for data processing or pipeline development
- Solid understanding of data warehousing concepts and data modeling
- Strong problem‑solving skills and attention to detail
- Excellent communication skills and ability to work in a team‑oriented environment
Preferred Skills
- Experience with data governance, data quality, and data security best practices
- Familiarity with CI/CD pipelines and version control systems (e.g., Git)
- Experience working in cloud‑native data architectures
- Exposure to additional cloud platforms (AWS or Azure)
- Experience supporting large‑scale, enterprise data environments