JPS Tech Solutions
Job Category:
Data Engineer
Job Type:
Hybrid
Job Location:
Atlanta, Georgia
Compensation:
Depends on Experience
W2:
W2 contract only; applications submitted on a C2C basis will not be considered for this role.
JPS-4131 | Posted On: 07/29/2025 | Closes On: 08/06/2025
Job Description:
Short Description:
Under general supervision, combines strong technical skills with knowledge of database administration. Works on one or more projects of high complexity.
Job Description:
The Dept. of Early Care & Development (DECAL) is seeking a highly skilled and proactive Data Engineer to join our dynamic team and support the modernization of our data estate. This role is integral to the migration from legacy systems and the development of scalable, secure, and efficient data solutions using modern technologies, particularly Microsoft Fabric and Azure-based platforms. The successful candidate will contribute to data infrastructure design, data modeling, pipeline development, and visualization delivery to enable data-driven decision-making across the enterprise.
Work Location & Attendance Requirements:
Must be physically located in metro Atlanta.
On-site Tuesday through Thursday, per manager's discretion.
Mandatory in-person meetings: All Hands, Enterprise Applications on-site meetings, and DECAL All Staff.
Work arrangements are subject to management's discretion.
Key Responsibilities:
- Design, build, and maintain scalable ETL/ELT data pipelines using Microsoft Fabric and Azure Databricks.
- Implement medallion architecture (Bronze, Silver, Gold) to support data lifecycle and data quality.
- Support the sunsetting of legacy SQL-based infrastructure and SSRS, ensuring data continuity and stakeholder readiness.
- Create and manage notebooks (e.g., Fabric Notebooks, Databricks) for data transformation using Python, SQL, and Spark (an illustrative sketch follows this list).
- Build and deliver curated datasets and analytics models to support Power BI dashboards and reports.
- Develop dimensional and real-time data models for analytics use cases.
- Collaborate with data analysts, stewards, and business stakeholders to deliver fit-for-purpose data assets.
- Apply data governance policies including row-level security, data masking, and classification in line with Microsoft Purview or Unity Catalog.
- Ensure monitoring, logging, and CI/CD automation using Azure DevOps for data workflows.
- Provide support during data migration and cutover events, ensuring minimal disruption.
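For illustration only (not part of the requirements): a minimal sketch of the kind of Bronze-to-Silver medallion transformation a Fabric or Databricks notebook for this role might contain, written in PySpark. The table and column names (bronze.provider_visits, silver.provider_visits, provider_id, visit_date) are hypothetical placeholders, not actual DECAL schema.

```python
# Minimal Bronze -> Silver medallion sketch (hypothetical table/column names).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw records landed in the Bronze layer.
bronze_df = spark.read.table("bronze.provider_visits")

# Cleanse and standardize for the Silver layer:
# trim identifiers, cast dates, drop duplicates and null keys.
silver_df = (
    bronze_df
    .withColumn("provider_id", F.trim(F.col("provider_id")))
    .withColumn("visit_date", F.to_date(F.col("visit_date"), "yyyy-MM-dd"))
    .dropDuplicates(["provider_id", "visit_date"])
    .filter(F.col("provider_id").isNotNull())
)

# Write the curated result as a managed Delta table in the Silver layer.
silver_df.write.mode("overwrite").format("delta").saveAsTable("silver.provider_visits")
```

In a medallion layout, Bronze retains raw ingested data while Silver holds cleansed, deduplicated records that downstream Gold models and Power BI datasets can build on.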
Technical Stack:
- Microsoft Fabric
- Azure Databricks
- SQL Server / SQL Managed Instances
- Power BI (including semantic models and datasets)
- SSRS (for legacy support and decommissioning)
Qualifications:
- Bachelor’s degree in Computer Science, Information Systems, or a related field
- 5+ years of experience in data engineering roles, preferably in government or regulated environments
- Proficiency in SQL, Python, and Spark
- Hands-on experience with Microsoft Fabric (Dataflows, Pipelines, Notebooks, OneLake)
- Experience with Power BI data modeling and dashboard development
- Familiarity with data governance tools (Microsoft Purview, Unity Catalog)
- Solid understanding of ETL/ELT pipelines, data warehousing concepts, and schema design
- Strong communication and collaboration skills
Preferred Qualifications:
- Certifications such as Microsoft Certified: Fabric Analytics Engineer or Azure Data Engineer Associate
- Knowledge of CI/CD automation with Azure DevOps
- Familiarity with data security and compliance (e.g., FIPS 199, NIST)
- Experience managing the sunset and modernization of legacy reporting systems such as SSRS
Soft Skills:
- Strong analytical thinking and problem-solving abilities
- Ability to collaborate across multidisciplinary teams
- Comfort in fast-paced and evolving technology environments
This role is critical to our shift toward a modern data platform and offers the opportunity to influence our architectural decisions and technical roadmap.
Skill Matrix:
Skill | Required / Desired | Amount of Experience
Experience in data engineering roles, preferably in government or regulated environments | Required | Years
Hands-on experience with Microsoft Fabric (Dataflows, Pipelines, Notebooks, OneLake) | Required | Years
Experience with Power BI data modeling and dashboard development | Required | Years
Familiarity with data governance tools (Microsoft Purview, Unity Catalog) | Required | Years
Solid understanding of ETL/ELT pipelines, data warehousing concepts, and schema design | Required | Years
Bachelor’s degree in Computer Science, Information Systems, or related field | Required |
Apply Online
Your Name *
Your Phone Number *
Your Email Address *
Job Id *
What is your current visa status? *
If other, enter valid visa type
W2 or C2C *
Where are you currently located? *
How many years of relevant experience do you have? *
Do you require H-1B sponsorship? *
Upload Resume *