Equifax

Global Platform Big Data Architect

Equifax, Alpharetta, Georgia, United States, 30239


To adhere to corporate location policies, this role requires you to be local to the greater Atlanta, GA area and to comply with our Return To Office (RTO) weekly onsite requirements (Tuesday, Wednesday, and Thursday).

What you will do

Data Fabric Vision & Strategy: Define and champion the architectural vision and strategy for our enterprise-wide Data Fabric platform, enabling seamless data discovery, access, integration, and governance across disparate data sources.

Architectural Leadership: Lead the design and architecture of highly scalable, resilient, and cost-effective data solutions leveraging a diverse set of big data and cloud-native services in GCP and AWS.

Technical Guidance & Mentorship: Provide expert architectural guidance, technical leadership, and mentorship to multiple engineering teams, ensuring adherence to architectural principles, best practices, and design patterns.

Platform Development & Evolution: Drive the selection, implementation, and continuous improvement of core data platform components, tools, and frameworks.

Cloud‑Native Expertise: Leverage deep understanding of GCP and AWS data services (e.g., BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, S3, EMR, Kinesis, Redshift, Glue, Athena) to design optimal solutions.

Data Governance & Security: Architect and implement robust data governance, security, privacy, and compliance measures within the data platform, ensuring data integrity and regulatory adherence.

Performance & Optimization: Identify and address performance bottlenecks, optimize data pipelines, and ensure efficient resource utilization across cloud environments.

Innovation & Research: Stay abreast of emerging big data and cloud technologies, evaluate their potential impact, and recommend their adoption where appropriate.

Cross‑Functional Collaboration: Collaborate closely with data scientists, data engineers, analytics teams, product managers, and other stakeholders to understand data requirements and translate them into architectural designs.

Documentation & Standards: Develop and maintain comprehensive architectural documentation, standards, and guidelines for data platform development.

Proof‑of‑Concepts (POCs): Lead and execute proof‑of‑concepts for new technologies and architectural patterns to validate their feasibility and value.

What experience you will need

Bachelor's or Master's degree in Computer Science, Software Engineering, or a related quantitative field.

10+ years of progressive experience in data architecture, big data engineering, or cloud platform engineering.

5+ years of hands‑on experience specifically designing and building large‑scale data platforms in a cloud environment.

Expertise in designing and implementing data lakes, data warehouses, and data marts in cloud environments.

Proficiency in at least one major programming language for data processing (e.g., Python, Scala, Java).

Deep understanding of distributed data processing frameworks (e.g., Apache Spark, Flink).

Experience with various data modeling techniques (dimensional, relational, NoSQL).

Solid understanding of DevOps principles, CI/CD pipelines, and infrastructure as code (e.g., Terraform, CloudFormation).

Experience with real‑time data streaming technologies (e.g., Kafka, Kinesis, Pub/Sub).

Strong understanding of data governance, data quality, and metadata management concepts.

Excellent communication, presentation, and interpersonal skills with the ability to articulate complex technical concepts to both technical and non‑technical audiences.

Proven ability to lead and influence technical teams without direct authority.

What could set you apart

Strong, demonstrable experience with Google Cloud Platform (GCP) big data services (e.g., BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Composer, Cloud Functions). GCP certifications (e.g., Professional Data Engineer, Professional Cloud Architect).

Strong, demonstrable experience with Amazon Web Services (AWS) big data services (e.g., S3, EMR, Kinesis, Redshift, Glue, Athena, Lambda). AWS certifications (e.g., Solutions Architect Professional, Big Data Specialty).

Experience with data mesh principles and implementing domain‑oriented data architectures.

Familiarity with other cloud platforms (e.g., Azure) or on‑premise data technologies.

Experience with containerization technologies (e.g., Docker, Kubernetes).

Knowledge of machine learning operationalization (MLOps) principles and platforms.

Contributions to open‑source big data projects.
