NDL International Inc.
Sr. Data & Analytics Engineer
US-MA-Boston
Overview
At Shawmut Design and Construction, we take pride in the culture we've built as a 100% employee‑owned company—one that's been recognized with more than 85 Best Place to Work awards. We've been honored as a National Fortune Best Workplace, a Fortune Best Workplace for Women, Millennials, and Parents, and one of America's Best Employers by Forbes—along with numerous regional recognitions across our 11 offices nationwide.
Here's a glimpse into what we offer:
Health, Dental, and Vision Insurance.
Employee Stock Ownership Plan (ESOP): Be an employee‑owner!
401(k) with Company Match: Receive a company match of up to 4% of your eligible pay.
Generous Paid Time Off: Vacation and sick time, 12 holidays, summer Fridays, and a yearly volunteer day.
The Extras: Cell phone, laptop, tuition reimbursement, pet insurance, financial planning services, and more.
Responsibilities
The Senior Data & Analytics Engineer focuses on ingesting and transforming project, financial, and operational data for construction projects. This role designs and maintains pipelines and models within the Microsoft Fabric ecosystem (Data Factory, Notebooks, Lakehouse, Power BI) to enable real‑time insights for project teams and executives.
Delivers well‑defined, transformed, tested, documented, and code‑reviewed datasets for analysis
Designs the structure and layout of data systems, including databases, warehouses, and lakes
Builds and optimizes pipelines using Azure Data Factory, Fabric Pipelines, and Notebooks
Creates robust data models and architectures to support analytics initiatives
Collaborates with business stakeholders to understand their analytics needs and deliver comprehensive models and datasets
Defines and manages standards, guidelines, and processes to ensure data quality
Identifies and implements optimizations to continually enhance query performance, reduce processing time, and increase overall productivity
Designs, develops, and maintains data pipelines to ensure efficient and reliable ETL processes
Collaborates with project teams to standardize KPIs for cost, schedule, and risk
Ensures data quality and accuracy by implementing data validation, monitoring, and error-handling processes
Qualifications
Experience: A minimum of 5 years of experience in data analytics, data engineering, software engineering, or a similar role.
Expertise in data modeling, ETL development, and data analysis
Experience with construction industry data and project-based analytics is desired
Experience with Apache Spark or Databricks
Education: Bachelor's degree in computer science, data science, engineering, or a related field is required.
Additional Role-Specific Skills:
Strong SQL skills for data extraction and manipulation, and proficiency in data warehousing concepts and tools, specifically Microsoft Fabric components (Data Factory, Lakehouse, OneLake)
Familiarity with Azure-based data platforms for data storage and processing
Substantial programming ability in languages such as Python, Scala, and SQL for data manipulation and scripting
Solid understanding of relevant data governance, data quality, and data security best practices
Familiarity with integrations for construction ERP (CMiC), HCM (Workday), and CRM (Dynamics) systems
Strong problem‑solving skills and the ability to think critically and analytically
Knowledge of ETL processes, data integration, and data warehousing concepts
Familiarity with data visualization tools such as Power BI
Excellent communication skills to effectively collaborate with cross‑functional teams and present insights to business stakeholders