Cyber Sphere
Data Architect, Hybrid (2 days on-site) @ Chicago, IL or Battle Creek, MI
Cyber Sphere, Chicago, Illinois, United States, 60290
Data Architect
Location:
Hybrid, 2 days on-site @ Chicago, IL or Battle Creek, MI (local candidates only)
Duration:
Contract
Experience:
Working experience in Data Architecture, ideally with an FMCG (fast-moving consumer goods) background
We are seeking a highly skilled Architect / Senior Data Engineer to design, build, and optimize our modern data ecosystem. The ideal candidate will have deep experience with AWS cloud services, Snowflake, and dbt, along with strong understanding of scalable data architecture, ETL/ELT development, and data modeling best practices.
Key Responsibilities
Architect, design, and implement scalable, reliable, and secure data solutions using AWS, Snowflake, and dbt.
Develop end-to-end data pipelines (batch and streaming) to support analytics, machine learning, and business intelligence needs.
Lead the modernization and migration of legacy data systems to cloud-native architectures.
Define and enforce data engineering best practices, including coding standards, CI/CD, testing, and monitoring.
Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and translate them into technical solutions.
Optimize Snowflake performance through query tuning, warehouse sizing, and cost management.
Establish and maintain data governance, security, and compliance standards across the data platform.
Mentor and guide junior data engineers, providing technical leadership and direction.
Required Skills & Qualifications
8+ years of experience in Data Engineering, including at least 3 years in a cloud-native data environment.
Hands-on expertise in AWS services such as S3, Glue, Lambda, Step Functions, Redshift, and IAM.
Strong experience with Snowflake: data modeling, warehouse design, performance optimization, and cost governance.
Proven experience with dbt (data build tool): model development, documentation, and deployment automation.
Proficient in SQL, Python, and ETL/ELT pipeline development.
Experience with CI/CD pipelines, version control (Git), and workflow orchestration tools (Airflow, Dagster, Prefect, etc.).
Familiarity with data governance and security best practices, including role-based access control and data masking.
Strong understanding of data modeling techniques (Kimball, Data Vault, etc.) and data architecture principles.
Preferred Qualifications
AWS Certification (e.g., AWS Certified Data Analytics – Specialty, AWS Certified Solutions Architect).
Strong communication and collaboration skills, with a track record of working in agile environments.
Regards,
Sai Srikar
Phone: 7704565690
Email: sai@cysphere.net
Seniority Level Mid‑Senior level
Employment Type Contract
Job Functions Engineering and Information Technology
Industries IT Services and IT Consulting