Dev-Pro.net
Overview
Are you in Brazil, Argentina, or Colombia? Join us as we actively recruit in these locations, offering a comfortable remote environment. Submit your CV in English, and we'll get back to you!

We invite a Junior Data Engineer to join our dynamic team supporting a major enterprise client in modernizing their data platform. In this role, you'll assist in migrating and transforming legacy data pipelines to a modern cloud environment. You'll work closely with senior engineers, architects, DevOps, QA, and product stakeholders, gaining hands-on data engineering experience and contributing to reliable, scalable data solutions.

What's in it for you
- Join a supportive delivery team built on collaboration, transparency, and mutual respect
- Get hands-on exposure to a high-impact, real-world data platform transformation project
- Grow your skills with modern technologies like GCP, Snowflake, Apache Iceberg, dbt, Airflow, Dataflow, and BigQuery

Is that you?
- 0.5-1 years of experience in data engineering, data analytics, or software development
- Basic understanding of data warehouse concepts and ETL pipelines
- Good knowledge of SQL and willingness to learn Snowflake or similar data storage technologies
- Basic experience with Python for scripting or simple ETL tasks
- Experience with GCP services (BigQuery, GCS, Airflow, Dataflow, Dataproc, Pub/Sub)
- Understanding of version control (Git) and eagerness to learn CI/CD and IaC tools
- Degree in Computer Science, Data Engineering, or a related field, or equivalent practical experience
- Strong communication and collaboration skills
- Upper-Intermediate English level

Desirable
- Basic exposure to streaming data pipelines and event-driven architectures
- Familiarity with basic scripting and containerization tools (Bash, Docker)
- Basic understanding of data lakehouse concepts (Iceberg tables)
- Awareness of data transformation tools like dbt
- Familiarity with AI-assisted tools like GitHub Copilot

Key responsibilities and your contribution
In this role, you'll assist with key data engineering activities while supporting the team in delivering high-quality data solutions.

- Assist in reviewing and analyzing existing ETL solutions for migration to the new architecture
- Support the migration of batch and streaming data pipelines to the GCP Landing Zone
- Help build and maintain data transformations with dbt, supporting ELT pipelines in Snowflake
- Help with refactoring and mapping data jobs
- Assist in setting up and maintaining monitoring and alerting for data pipelines
- Contribute to migrating historical data to Iceberg tables with guidance from senior engineers
- Collaborate with senior engineers and stakeholders to understand requirements and implement solutions
- Participate in code reviews, team discussions, and technical planning to develop your skills

What's working at Dev.Pro like?
Dev.Pro is a global company that's been building great software since 2011. Our team values fairness, high standards, openness, and inclusivity for everyone, no matter your background.

- We are 99.9% remote: you can work from anywhere in the world.
- We offer 30 paid days off per year, 5 paid sick days, up to 60 days of medical leave, and up to 6 paid days off per year for major family events.
- Health insurance is partially covered after probation, plus a wellness bonus after 6 months.
- We pay in U.S. dollars and cover all approved overtime.
- You can join English lessons and Dev.Pro University programs, and take part in online activities and team-building events.

Our next steps
1. Submit a CV in English
2. Intro call with a Recruiter
3. Internal interview
4. Client interview
5. Offer

Interested? Find out more
- How we work
- LinkedIn Page
- Our website
- IG Page