S M Software Solutions Inc
775865 - Architect (Azure Data Warehouse Developer)
S M Software Solutions Inc, Philadelphia, Pennsylvania, United States
Overview
Project Overview: This position supports a Data Modernization Initiative whose vision is that all public health policies and interventions are driven by data, and whose mission is to provide all internal and external public health decision makers with accessible, timely, reliable, and meaningful data to drive policies and interventions. The Enterprise Data Warehouse (EDW) is responding to DOH's need for centralized data and state-of-the-art data analysis services by modernizing its data portfolio, architecture, and statistical analysis capabilities, with the aim of improving public health surveillance, interventions, future outbreak prevention, outcomes, and research. The Architect / Azure DW Developer will support both the existing business and reporting requirements of individual DOH / DDAP systems and program areas and the construction of a modern data warehouse that serves DOH / DDAP from an enterprise perspective. The primary objective of this engagement is for the selected candidate to serve as the data warehouse developer supporting the analysis and reporting needs of DOH / DDAP and the design and construction of a modern EDW in Azure. The position's scope includes modernizing DOH operations; planning, coordinating, and responding to data reporting needs; setting standards and defining frameworks; assisting with large-volume data processing and statistical analysis of large datasets; migrating the EDW to Microsoft's Azure cloud using Azure Databricks, Delta Lake, and Synapse, including compute, storage, and application fabric, as well as infrastructure as a service (IaaS), platform as a service (PaaS), software as a service (SaaS), and serverless technologies; creating a centralized data model; and supporting DOH projects such as ELC Enhanced Detection Expansion, the Data Modernization Initiative, PA NEDSS NextGen, PA LIMS Replacement, Reporting Hub, Verato UMPI, the COVID-19 response, and the onboarding of additional DOH systems into the EDW.
Estimated Start Date: 09/22/2025 Estimated End Date: 06/30/2026
Responsibilities
Manage assignments and track progress against agreed-upon timelines. Plan, organize, prioritize, and manage work efforts, coordinating with the EDW and other teams. Participate in status reviews, process reviews, deliverable reviews, and software quality assurance work product reviews with the appropriate stakeholders. Participate in business and technical requirements gathering. Research potential solutions and provide recommendations to the EDW and DOH. Develop and implement solutions that meet business and technical requirements. Participate in testing of implemented solutions. Build and maintain relationships with key stakeholders and customer representatives. Give presentations for the EDW, other DOH offices, and agencies involved with this project. Develop and maintain process and procedural documentation. Ensure project compliance with relevant federal and commonwealth standards and procedures. Conduct training and knowledge transfer sessions for system and code maintenance. Complete weekly timesheet reporting in PeopleFluent/VectorVMS by COB each Friday. Complete weekly project status updates in Daptiv when a project has been entered there. Provide weekly personal status reports, submitted on SharePoint by COB Friday. Use a SharePoint site for project and operational documentation, and review existing documentation.
Qualifications
Senior-level resource with advanced, specialized knowledge of and experience with data warehousing, database, and programming concepts and technology. Proven experience in the development, maintenance, testing, and deployment of Azure production systems and projects. Experience with the design, development, testing, and implementation of data lakes, databases, ETL/ELT programs, applications, and reports. Experience working with business analysts, application developers, DBAs, and network and system staff to achieve project objectives. Significant hands-on technical experience with Azure, Azure Delta Lake, Azure Databricks, Azure Data Factory, Pipelines, Apache Spark, and Python. Experience with SQL Server and Azure Synapse, including ETL/ELT using SQL Server Integration Services and other tools, and T-SQL scripts and queries. Azure DevOps CI/CD pipeline release management experience with robust, scalable pipelines and monorepo-based CI/CD. Knowledge of data formatting, capture, search, retrieval, extraction, classification, quality control, cleansing, and information filtering techniques. Experience with data mining architecture, modeling standards, and reporting and data analysis methodologies. Experience with data engineering, database file system optimization, APIs, and analytics as a service, including translating business requirements into optimized designs. Advanced knowledge of relational and dimensional databases, star schema concepts, and data warehousing terminology. Ability to create and maintain technical documentation, diagrams, flowcharts, instructions, manuals, test plans, and test cases; follow SDLC best practices; and participate in peer code reviews. Strong organizational skills, the ability to balance multiple projects with minimal supervision, and effective communication skills. Ability to present complex technical concepts and data to varied audiences. More than 5 years of relevant experience. A 4-year college degree in computer science or a related field, with advanced study preferred.
Preferred Experience
Experience in the public health or healthcare industry with various health datasets.
How To Apply
If you are interested in this opportunity, please submit the following to hrteam@thethinkbeyond.com: an updated resume in Word format (mandatory) and your expected hourly rate (mandatory). Note: Applications without the mandatory documents cannot be processed. If this role is not suitable for you, please forward this message to anyone who may be interested. Thank you for considering this opportunity. If you have any questions, call or text (512) 800-8781.