Glocomms
Data Engineer (Hybrid – Scottsdale, AZ or Relocation Required)
We are seeking a highly skilled Data Engineer to join a fast‑paced and collaborative team. This role is ideal for someone with strong experience in modern data engineering practices and a passion for building scalable, efficient data solutions.
Please note: Visa sponsorship is not available for this role. Candidates must be located in Scottsdale, AZ or willing to relocate. This is a hybrid position.
Base pay range: $100,000 – $130,000 annually
Key Responsibilities
Design, develop, and maintain robust data pipelines using Azure Data Services (Azure Data Factory, Synapse, Azure SQL, etc.)
Implement ETL processes with Pentaho PDI and Airbyte for data ingestion and transformation
Optimize SQL queries for performance and efficient data retrieval
Build data solutions to support BI, analytics, and AI/ML initiatives
Administer and maintain Azure infrastructure, including security, monitoring, and performance tuning
Develop automation and transformation scripts using Python and integrate with APIs
Orchestrate workflows using Dagster or Apache Airflow
Create advanced reports and dashboards in Power BI, including paginated reports and complex DAX calculations
Collaborate with analysts, data scientists, and business stakeholders to deliver impactful solutions
Ensure data quality, governance, and security across platforms
Required Skills & Experience
3‑5 years of experience as a Data Engineer in Azure Cloud environments
Strong expertise in Azure Data Services (ADF, Synapse, Databricks, Azure SQL)
Proficiency with ETL tools, especially Pentaho PDI and Airbyte
Advanced SQL skills with a focus on query optimization
Experience with Power BI and Power Automate
Hands‑on Azure administration experience
Proficiency in Python for scripting and automation
Experience with dbt for data transformations
Familiarity with Dagster or Apache Airflow for orchestration
Strong understanding of data modeling (dimensional, star, snowflake schemas)
Ability to work effectively in an agile, collaborative environment
Preferred Qualifications
Experience with DevOps practices for CI/CD in data engineering
Exposure to AI/ML technologies on Azure
Knowledge of finance and operational reporting
Familiarity with data governance and compliance (e.g., GDPR, HIPAA)
Background in manufactured housing or property management is a plus