Data Engineer

HeadHR, Snowflake, Arizona, United States, 85937

Requirements

Solid Python programming skills for building and maintaining data pipelines

Advanced SQL skills, including query optimization and performance tuning

Experience with ETL/ELT tools and data orchestration frameworks such as Apache Airflow and dbt

Strong understanding of data modeling principles (dimensional modeling, star/snowflake schemas, normalization/denormalization)

Hands‑on experience in data warehousing (Snowflake, Redshift)

Experience with AWS (S3, Lambda, Kinesis, Batch, DynamoDB, Athena, Glue, etc.)

Experience with data processing tools: Spark (Databricks), Kafka

Understanding of data governance, lineage, and security best practices

Familiarity with Agile methodologies and working in cross‑functional teams

Strong analytical thinking, problem‑solving skills, and attention to detail

Excellent communication and teamwork skills

Strong verbal and written English communication skills

Nice to have:

Data visualization: Power BI or similar tools

If you love solving complex data challenges, designing clean data architectures, and ensuring data is ready for analytics and machine learning — we’d love to hear from you! As a Data Engineer, you’ll work in a collaborative, agile environment. You’ll take ownership of designing and implementing data models, data warehouses, and scalable pipelines that enable data‑driven decision‑making.

You’ll work closely with Data Architects, Data Scientists, and Software Engineers to ensure that data is well‑structured, accurate, secure, and accessible for business and analytical needs.

We expect candidates to be located in one of the following cities: Warsaw, Wroclaw, Lodz, Gdansk or Bialystok.

Responsibilities

Building and optimizing data pipelines and ETL/ELT processes using modern orchestration tools

Working with structured and semi‑structured data from multiple sources

Designing and developing scalable data models, data marts, and data warehouses that support analytics and reporting

Ensuring data quality, lineage, and governance are implemented and maintained

Managing and optimizing data storage solutions (data lakes, data warehouses)

Continuously improving performance, reliability, and scalability of data solutions

Godel Technologies is a leading, next‑generation technology partner, distinguished by its unique agile delivery approach and an unparalleled pool of software engineering talent across Europe. At Godel, we actively address the software engineering skills challenges faced by UK‑based companies. Renowned as one of the UK's most respected technology companies, Godel has seen its commitment to excellence recognized through a number of awards in recent years, including:

Winner, in collaboration with comparethemarket.com, of the Real IT Awards 2019 for "Partnership of the Year" and "Agile Project of the Year"

Winner, partnering with Virgin Holidays, at the Computing.co.uk DevOps Excellence Awards 2018 for "Best Use of Agile"

Winner, alongside OEConnection, of the 2017 European Software Testing Awards for "Best Retail Testing Project"

Consistent recognition in the Sunday Times Tech Track 100 in 2017, 2018, and 2019
