GTN Technical Staffing
Job Title: Digital Data Architect
Role Overview
A leading organization is seeking a hands‑on Digital Data Architect to design and implement modern data architectures that support analytics, AI/ML initiatives, and digital transformation. This individual contributor role emphasizes technical execution, delivering scalable, secure, and high‑performing data solutions across cloud and hybrid environments.
Key Responsibilities
Design conceptual, logical, and physical data models for both enterprise‑wide and domain‑specific solutions.
Architect data platforms using technologies such as Synapse, Databricks, Data Lake, and Cosmos DB.
Develop robust ETL/ELT pipelines utilizing Data Factory, Event Hubs, and streaming frameworks.
Enable both real‑time and batch data processing for analytics and operational systems.
Implement data cataloging, lineage tracking, and quality frameworks with tools like Purview.
Ensure adherence to security and regulatory standards including GDPR and CCPA.
Optimize data storage strategies, query performance, and cost efficiency across cloud platforms.
Collaborate with solution architects, data engineers, and analytics teams to deliver comprehensive solutions.
Provide technical leadership and promote best practices for data platform adoption.
Required Skills & Qualifications
Bachelor's degree in Computer Science, Information Systems, or a related discipline.
8+ years of experience in data engineering or architecture roles with strong hands‑on capabilities.
Expertise in cloud data platforms (preference for Azure; experience with AWS or GCP is a plus).
Proficiency in data modeling across relational, NoSQL, and big data environments.
Deep understanding of ETL/ELT and streaming data frameworks.
Strong command of SQL, Python, and Spark.
Hands‑on experience with Synapse, Databricks, Data Lake, Cosmos DB, Data Factory, Event Hubs, and Kafka.
Familiarity with business intelligence tools such as Power BI or Tableau.
Preferred Qualifications
Experience implementing Data Mesh or Data Fabric architectures.
Exposure to machine learning pipeline integration.
Relevant certifications such as Data Engineer Associate or Solutions Architect Expert (Azure).
KPIs and Success Metrics
Timely delivery of scalable data architectures.
Measurable reduction in data processing time and improved cost efficiency.
Enhanced data quality and governance compliance, as reflected in audit outcomes.
Increased adoption of modern data platforms and tools across internal teams.
Consistent achievement of service‑level agreements for batch and real‑time data pipelines.