Principal GCP Data Engineer
Employment: W2 only
Visa: US Citizen (USC), Green Card (GC), H4-EAD
Required Skills:
- Experience with Airflow or Cloud Composer orchestration, including developing new DAGs from scratch
- Building data ingestion and ETL pipelines from scratch
- Proficiency in SnapLogic for data pipelines and integrations
- Python, SQL, Dataflow, Spark
- Data warehousing experience, especially Google BigQuery
- Understanding of data movement from source to warehouse to reporting layers
- Hands-on experience building data pipelines and orchestration
Nice to Have Skills: Kafka, Java, Apache Beam, Alteryx, etc.
- Experience with SnapLogic is strongly preferred, as it lends itself well to upskilling.
- Proficiency in developing data ingestion and ETL pipelines from scratch using SnapLogic, Python, SQL, Dataflow, Spark.
- Experience in data warehousing, especially Google BigQuery.
- Support data modeling; not responsible for building visualizations.
- Understanding of analytics, data flow from source to warehouse to reporting, with a focus on pipeline building and orchestration.
- Proactive communication, inquisitiveness, problem-solving skills, and willingness to contribute suggestions and ask questions.
- Additional technologies like Kafka, Java, Apache Beam, Alteryx are considered nice to have.
Seniority level
- Mid-Senior level
Employment type
- Contract
Job function
- Information Technology
Industries
- IT Services and IT Consulting
Referrals increase your chances of interviewing at Chelsoft Solutions Co. by 2x.
Get notified about new Data Engineer jobs in Minneapolis, MN.
Locations and salary ranges vary; see the original post for details.