Jaxel

Senior Data Engineer (Worldwide)

Jaxel, Snowflake, Arizona, United States, 85937

Jaxel is seeking a Senior Data Engineer to join our cutting-edge team.

What You’ll Do

Design and build scalable, efficient ETL/ELT data pipelines using modern tools (e.g., dbt, Airflow, Fivetran, custom Python jobs).

Develop and maintain data models, schemas, and transformations in Snowflake and other cloud data warehouses (e.g., BigQuery, Redshift).

Integrate data from various sources (internal systems, APIs, third-party data providers).

Work with stakeholders to understand data needs and deliver reliable, clean, and well-documented datasets.

Build and maintain data quality and validation checks to ensure high trust in our data.

Optimize query performance and storage costs in Snowflake and other platforms.

Collaborate with analytics, product, and engineering teams to support data-driven features and reporting needs.

Implement and maintain infrastructure-as-code, CI/CD workflows, and version control for data pipelines.

Ensure data security, access control, and compliance with relevant policies (e.g., GDPR, HIPAA if applicable).

What You’ll Need

3+ years of experience as a Data Engineer or in a similar role.

Strong SQL skills and deep understanding of data warehousing principles.

Hands-on experience with Snowflake and at least one other data warehouse (BigQuery, Redshift, etc.).

Experience with modern data pipeline tools such as Airflow, dbt, Fivetran, Dagster, or custom Python/Scala jobs.

Proficiency in Python (or another scripting language) for data manipulation and orchestration.

Experience building and maintaining production-grade ETL/ELT pipelines.

Familiarity with cloud platforms like AWS, GCP, or Azure (e.g., S3, Lambda, Cloud Functions).

Strong attention to data quality, testing, and documentation.

Additional requirements (optional)

Experience with real-time/streaming data (Kafka, Spark Streaming, etc.).

Exposure to data governance, lineage, and metadata tools (e.g., Amundsen, DataHub).

Understanding of data privacy, compliance, and security best practices.

Familiarity with infrastructure-as-code (e.g., Terraform) and CI/CD pipelines (GitHub Actions, GitLab CI, etc.).

Experience collaborating in agile teams and using tools like Jira, Confluence, etc.

What We Offer

Remote work opportunity.
