DRC Systems
Job Title: Data Architect
Job Location: Tallahassee, FL (on-site)
Job Duration: Long Term Contract
Job Tasks And Activities
The Data Architects, under the working job title of Extract, Transform, Load (ETL) Architects, will serve as the principal line of communication for the project team. The ETL Architects will drive the development of data integration pipelines, enabling efficient, reliable access to critical data within the Correction Information Management System (CIMS) Data Warehouse/Data Lake on Azure. They will work with Azure Data Factory (ADF), Azure Databricks, Azure Synapse, Power BI, and Azure Purview. The ETL Architects will be at the forefront of transforming complex data into actionable insights. They will be responsible for ensuring data integrity, security, and performance, while meeting mission-critical needs. The specific duties and responsibilities of this position are as follows:
ETL Pipeline Design And Development
Lead the design and development of high-performing ETL processes to integrate and transform data across disparate sources.
Deliver efficient, reliable pipelines that meet business needs and maintain the highest standards of security.
Utilize ADF to automate and streamline data workflows, ensuring smooth transitions from source to target.
Data Integration And Transformation
Build and manage complex ETL workflows that extract, transform, and load data for downstream analytics and reporting, ensuring data is accurate, timely, and secure.
Take ownership of data quality and validation, creating resilient ETL processes that ensure only trusted data reaches its destination.
Cloud Platform Expertise
Leverage the full power of the Azure ecosystem (ADF, Databricks, Synapse, and Purview) to manage and process high volumes of structured and unstructured data, delivering solutions that are scalable and performance-optimized.
Integrate large datasets into Azure Synapse Analytics, enabling analytics teams to deliver data-driven insights that support the Department's mission.
Performance Optimization
Continuously optimize ETL jobs to minimize latency and maximize throughput.
Ensure the architecture supports fast, reliable data access for end-users and systems, meeting stringent performance metrics.
Security And Compliance
Embed security and compliance best practices in every step of the ETL process.
Protect sensitive data by adhering to industry standards and ensuring compliance with the Department's data governance policies.
Use Azure Purview to enforce data governance, track data lineage, and ensure that data handling meets the highest standards of integrity.
Collaboration And Stakeholder Engagement
Partner with cross-functional teams (e.g., data engineers, analysts, business stakeholders, and security experts) to design and implement ETL solutions that meet the Department's evolving needs.
Act as a technical leader and mentor, helping guide junior team members and providing expert guidance on data processing and transformation best practices.
Documentation And Best Practices
Develop and maintain clear, detailed documentation for ETL processes, ensuring the team can consistently deliver high-quality, reliable solutions.
Establish and enforce best practices for data handling, ETL development, and security, driving a culture of excellence and accountability.
Required Experience
Seven (7) or more years of experience in ETL development and data engineering.
Three (3) or more years of hands-on experience working with ADF, Azure Cloud, Azure Databricks, Azure Synapse Analytics, and Azure Purview.
Proven track record of building and optimizing large-scale ETL pipelines for high-performance, high-availability environments.
Extensive expertise in Spark, Python, and/or Scala for large-scale data transformations.
Strong Structured Query Language (SQL) proficiency and experience working with complex data structures.
In-depth knowledge of data governance, security protocols, and role-based access control (RBAC) within the Azure ecosystem.
Ability to design ETL processes that are resilient, efficient, and fully compliant with regulatory standards.
Preferred Experience
Possession of a Microsoft certification as an Azure Data Engineer Associate, Azure Solutions Architect Expert, or Azure Fundamentals.
Databricks certification as a Data Engineer Associate.
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Engineering and Information Technology
Industries: IT Services and IT Consulting