Motion Recruitment Partners LLC

Senior Azure Data Engineer

Motion Recruitment Partners LLC, Scottsdale, Arizona, US, 85261


A fast-growing and innovative workforce travel technology company based in Scottsdale, Arizona, is seeking a highly skilled Senior Data Engineer to join our expanding Data Platform & Engineering team. This hybrid role includes regular in-office collaboration with up to 20% flexibility to work from home. We’re looking for an experienced data engineer with a passion for building scalable, real-time data solutions that support product innovation and data-driven decision-making across the organization.

Qualifications:

- Hold a bachelor's degree in Computer Science, Information Systems, or a related field.
- Bring 5+ years of hands-on experience in data engineering or a closely related role.
- Experienced in designing and maintaining real-time and batch data pipelines using modern ETL/ELT frameworks.
- Deep knowledge of SQL, NoSQL, and hybrid data storage solutions, including PostgreSQL, Cosmos DB, and Data Lakes (e.g., Azure Data Lake, Delta Lake, Iceberg).
- Strong proficiency in Python, Java, and/or Go for data pipeline and API development.
- Skilled in working with event-driven architectures, including Azure Event Hub, Service Bus, and Kafka (see the sketch after this list).
- Experience with API development (REST, GraphQL, gRPC) to support Data-as-a-Product initiatives.
- Comfortable working with Azure and Apache data platforms (e.g., Databricks, Azure Fabric, Snowflake, Apache Hudi).
- Understanding of data governance, lineage, and compliance using tools like Microsoft Purview, OpenLineage, or Apache Ranger.
- Familiarity with Infrastructure as Code (IaC) practices using Bicep, Terraform, or CloudFormation.
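As a rough illustration of the event-driven work mentioned above, the following is a minimal sketch of an Azure Event Hub consumer using the azure-eventhub Python SDK. The connection string, hub name, and handler logic are placeholders, not details from this role.

```python
# Minimal sketch of an event-driven consumer with the azure-eventhub SDK.
# The connection string, hub name, and consumer group below are placeholders.
from azure.eventhub import EventHubConsumerClient

CONN_STR = "<EVENT_HUB_NAMESPACE_CONNECTION_STRING>"  # placeholder
EVENTHUB_NAME = "telemetry"                           # placeholder


def on_event(partition_context, event):
    # Deserialize the payload and hand it to downstream transformation logic.
    payload = event.body_as_str()
    print(f"partition {partition_context.partition_id}: {payload}")
    # In production, checkpointing would go through a checkpoint store (e.g., Blob Storage).


client = EventHubConsumerClient.from_connection_string(
    CONN_STR,
    consumer_group="$Default",
    eventhub_name=EVENTHUB_NAME,
)

if __name__ == "__main__":
    with client:
        # Blocks and invokes on_event for each event, starting from the earliest available.
        client.receive(on_event=on_event, starting_position="-1")
```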

Nice to Have:

- Experience supporting machine learning workflows with Azure ML, Databricks ML, or MLflow (see the sketch after this list).
- Hands-on experience with real-time data streaming and notebooks (e.g., Jupyter, Synapse).
- Knowledge of data monetization and self-serve data platforms.
- Exposure to federated data governance models.
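For the machine learning workflow item above, here is a minimal sketch of experiment tracking with MLflow. The experiment name, parameters, and metric values are purely illustrative.

```python
# Minimal sketch of logging a training run with MLflow.
# The experiment name, parameters, and metric values are illustrative only.
import mlflow

mlflow.set_experiment("demand-forecast-demo")  # hypothetical experiment name

with mlflow.start_run():
    mlflow.log_param("model_type", "gradient_boosting")
    mlflow.log_param("training_rows", 1_000_000)
    mlflow.log_metric("rmse", 12.4)
    mlflow.log_metric("mape", 0.08)
```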

Daily Duties:

- Design and build scalable, cloud-native data infrastructure that integrates with microservices.
- Develop and optimize real-time and batch data pipelines for ingestion, transformation, and delivery.
- Implement data storage strategies across SQL, NoSQL, and Data Lake technologies.
- Build and manage secure, documented data APIs that enable self-service access for internal and external users (see the sketch after this list).
- Collaborate with product and business teams to define and deliver reliable data products.
- Implement event-driven architectures using Kafka or Azure messaging services.
- Ensure data quality, security, lineage, and observability across all pipelines.
- Work with DevSecOps teams to integrate security and compliance into CI/CD workflows.
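To illustrate the data API duty above, here is a minimal sketch of a read-only "data product" REST endpoint. FastAPI is not named in the posting and stands in for any REST framework; the dataset and field names are hypothetical.

```python
# Minimal sketch of a read-only "data product" REST endpoint using FastAPI.
# FastAPI is not named in the posting; it stands in for any REST framework.
# The dataset and field names are hypothetical.
from datetime import date

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Bookings Data Product", version="0.1.0")


class DailyBookingCount(BaseModel):
    booking_date: date
    bookings: int


@app.get("/v1/bookings/daily", response_model=list[DailyBookingCount])
def daily_bookings(start: date, end: date) -> list[DailyBookingCount]:
    # In a real service this would query the serving store (e.g., PostgreSQL).
    # Hard-coded rows keep the sketch self-contained and runnable.
    return [
        DailyBookingCount(booking_date=start, bookings=42),
        DailyBookingCount(booking_date=end, bookings=57),
    ]

# Run locally with: uvicorn data_api:app --reload  (assuming this file is data_api.py)
```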
