GB Corp

Data Architecture

GB Corp, Snowflake, Arizona, United States, 85937


Job Description - Data Architecture (25000232)

Overview

- Develop and maintain a scalable, secure, and high-performance data architecture aligned with business needs.
- Ensure seamless data flow between systems, integrating structured and unstructured data sources.
- Define and implement best practices for data storage, processing, and security.
- Select and optimize data platforms, tools, and technologies for efficiency and cost-effectiveness.

Data Modeling

- Design and implement data models (dimensional, relational, Data Vault, NoSQL) to support operational and analytical use cases.
- Ensure consistency, accuracy, and integrity of data models across the organization.
- Continuously refine models to improve performance and adaptability to evolving business requirements.
- Establish and enforce data modeling standards and governance.

Data Engineering and Pipeline Development

- Build, optimize, and maintain automated ETL/ELT pipelines to process large-scale data.
- Troubleshoot, debug, and upgrade existing ETL solutions.
- Enhance data infrastructure to support efficient storage, retrieval, and transformation.
- Implement automation to reduce manual processes and increase workflow efficiency.
- Ensure data pipelines are robust, scalable, and support both real-time and batch processing.
- Identify data discrepancies and develop data metrics.

Innovation & Technology Adoption

- Stay up to date with the latest trends in data architecture, engineering, and analytics to identify and implement innovative solutions.
- Research and integrate cutting-edge tools and platforms to enhance data management capabilities and streamline processes.
- Encourage a team culture focused on continuous improvement, supporting experimentation with new technologies and methodologies.
- Develop and execute a technology roadmap to ensure the team is using the best tools available to meet both current and future data needs.

Optimization & Performance Management

- Define and enforce data governance policies, ensuring compliance with GDPR and other regulations.
- Monitor and optimize database performance, query execution, and storage efficiency.
- Manage data quality, metadata, and lineage to ensure reliability and transparency.
- Implement proactive monitoring and alerting for data infrastructure health.
- Monitor performance and quality control plans to identify improvements.

Collaboration with Cross-Functional Teams

- Work closely with data scientists, business analysts, and BI teams to understand data requirements and design solutions that enable advanced analytics, reporting, and decision-making.
- Act as a bridge between business units and IT, ensuring that data architecture and engineering solutions meet both technical and business needs.
- Lead efforts to optimize data workflows and processes across departments, driving efficiency and alignment in data operations.

Qualifications

Educational Requirements:

- Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field. Advanced degrees provide deeper technical and strategic insight, enabling the professional to drive complex data solutions and innovation at the organizational level.
- Certified Data Management Professional (CDMP) certification preferred, demonstrating expertise in data architecture and governance.

Required Industry Experience:

- 8+ years of experience in Data Architecture, Data Engineering, and Data Modeling, with a proven track record of leading large-scale enterprise data initiatives.
- Expertise in designing, implementing, and optimizing complex data ecosystems in hybrid environments.
- Strong background in data governance, security, compliance, and risk management across enterprise systems.
- Hands-on experience managing data integration, migration, and transformation projects in a large-scale business setting.

Technological Requirements

- Extensive knowledge of modern data architecture approaches, including Data Mesh, Data Fabric, and Data Lakehouse.
- Strong expertise in data modeling techniques (conceptual, logical, physical, dimensional, and semantic models).
- Deep understanding of data integration patterns, streaming data platforms, and event-driven architectures.
- Proficiency with cloud data platforms (AWS, Azure, GCP) and modern ecosystem tools (Databricks, Snowflake, BigQuery, Synapse).
- Strong command of ETL/ELT frameworks, APIs, and microservices for data delivery.
- Hands-on experience with metadata management, master data management (MDM), and data catalog solutions.
- Expertise in data governance, lineage, and quality frameworks.
- Strong knowledge of compliance, security, and privacy requirements (e.g., GDPR).
- Deep expertise in architecting, optimizing, and managing both relational (e.g., Oracle, SQL Server, PostgreSQL) and NoSQL (e.g., MongoDB, Cassandra) databases.
