Canyon Associates
The Data Warehouse and Business Intelligence Architect will play a key role in designing and implementing the Central Data Office (CDO) enterprise data platform on Microsoft Fabric (Azure). This individual will collaborate closely with business, analytics, and IT teams to architect and deliver modern data solutions that enable actionable insights across Finance, Operations, Supply Chain, Business Development, and Marketing. The Architect will be responsible for building the foundational data warehouse, data engineering processes, pipelines, and data models that power our analytics ecosystem.
The ideal candidate will have a strong technical background in Microsoft Fabric, Azure Data Services, SQL, Python, and data modeling, with hands-on experience integrating data from multiple source systems (flat files, APIs, on-premises systems, and SaaS platforms).
ROLE AND RESPONSIBILITIES
- Design end-to-end data architecture leveraging Microsoft Fabric capabilities (OneLake, Lakehouse, Warehouse, Pipelines, Dataflows Gen2); see the pipeline sketch after this list.
- Design robust data models and schema definitions, and ensure the data warehouse can support complex analytical workloads.
- Develop and maintain data integrations that extract and transform data from multiple systems, including flat files (CSV, Excel, JSON), APIs, partner data feeds, and third-party SaaS platforms.
- Ensure data quality, security, and compliance across all datasets, implementing Purview for lineage, classification, and access control.
- Establish and document data architecture standards, including folder structure, naming conventions, version control, CI/CD via Azure DevOps, and promotion workflows.
- Drive continuous improvement in data platform reliability, cost efficiency, and governance through automation and monitoring.
- Participate in data strategy reviews, architecture design sessions, and cross-departmental planning.
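To give a concrete sense of the Lakehouse work described above, the following is a minimal sketch of a bronze-to-silver transformation in a Fabric notebook. It assumes PySpark with Delta tables; the table paths (Tables/bronze_sales, Tables/silver_sales) and column names are hypothetical examples, not part of the actual platform.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # A Fabric notebook normally provides `spark` pre-configured with Delta
    # support; creating it explicitly keeps the sketch self-contained.
    spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

    # Hypothetical bronze table: raw sales records landed as-is.
    bronze = spark.read.format("delta").load("Tables/bronze_sales")

    # Silver layer: deduplicate, type-cast, and drop unusable rows.
    silver = (
        bronze
        .dropDuplicates(["order_id"])
        .withColumn("order_date", F.to_date("order_date"))
        .filter(F.col("amount").isNotNull())
    )

    # Partition by date to support the analytical workloads noted above.
    (silver.write.format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .save("Tables/silver_sales"))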
QUALIFICATIONS AND EDUCATION REQUIREMENTS
- Bachelor's degree in Computer Science, Information Systems, or a related field; Master's preferred.
- 7+ years of experience in data engineering, data architecture, or BI development, including 2+ years with Microsoft Fabric or Azure Synapse.
- Proven experience designing data warehouses, dimensional models, and ETL/ELT pipelines.
- Proficiency in SQL (T-SQL) and Python/PySpark for data processing and automation.
- Experience with Power BI, Azure Data Factory or Fabric Pipelines, OneLake, and Lakehouse architecture.
- Experience integrating REST APIs, webhooks, and partner data feeds.
- Experience with Delta Lake / Parquet optimization (Z-ordering, partitioning, compaction); see the sketch following this list.
- Experience with dimensional modeling (Kimball) and medallion architecture (bronze/silver/gold).
- Strong understanding of data governance, metadata management, and data security.
- Hands-on experience implementing CI/CD for data solutions using Git / Azure DevOps.
- Strong analytical, problem-solving, and communication skills.
- Self-starter with the ability to work independently and manage multiple priorities.
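As an illustration of the Delta Lake maintenance skills listed above, here is a minimal sketch of table compaction and Z-ordering using the delta-spark Python API (2.x or later). It assumes a Delta-enabled Spark session such as a Fabric notebook provides; the table path (Tables/silver_sales) and the customer_id column are hypothetical, and Fabric's built-in V-Order optimization may reduce the need for this kind of manual tuning.

    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    # Assumes the session already has the Delta Lake extensions enabled,
    # as it does in a Fabric notebook.
    spark = SparkSession.builder.appName("delta_maintenance").getOrCreate()

    # Hypothetical silver table from the earlier sketch.
    table = DeltaTable.forPath(spark, "Tables/silver_sales")

    # Compaction: rewrite many small files into fewer large ones.
    table.optimize().executeCompaction()

    # Z-ordering: co-locate rows with similar customer_id values so
    # selective queries scan less data.
    table.optimize().executeZOrderBy("customer_id")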