Acxiom
Overview
As a Databricks Platform & Solutions Architect at Acxiom, you will lead the design, implementation, and modernization of enterprise data ecosystems on the Databricks platform. You will work closely with business stakeholders, engineers, and partner teams to architect scalable, secure, and high-performance data solutions. This role requires deep expertise in Databricks (Delta Lake, Unity Catalog, MLflow), cloud-native data modernization strategies, and experience migrating legacy data systems to modern cloud data platforms. You will translate strategic concepts into detailed, actionable technical designs and guide the cloud delivery team through the full implementation lifecycle. This individual contributor role combines visionary thinking with hands-on execution to deliver measurable business outcomes and client success. This role can be located almost anywhere in the U.S.
What You Will Do
Architect and design scalable Databricks solutions for enterprise data processing, analytics, machine learning, and business intelligence workloads.
Lead complex data modernization initiatives, including assessments, roadmap creation, and execution of legacy-to-cloud migrations (on-prem Hadoop, EDWs, etc.).
Define end-to-end architecture for ETL/ELT pipelines, data lakes, lakehouses, and real-time streaming platforms using Delta Lake and Databricks-native tools (a minimal ingest sketch follows this list).
Partner with client and internal teams to define data architecture patterns, security models (Unity Catalog, row/column-level access), and data governance standards.
Drive presales efforts by shaping solution strategies, crafting client proposals, contributing to RFP responses, authoring statements of work (SOWs), and developing tailored demos that showcase technical capabilities and business value.
Implement CI/CD practices for Databricks notebooks/jobs using DevOps principles and infrastructure-as-code (e.g., Terraform, GitHub Actions, Azure DevOps).
Develop and publish reference architectures, reusable frameworks, and accelerators to enable consistent platform adoption across teams.
Mentor engineering teams in best practices for data engineering, lakehouse design, performance tuning, and workload optimization in Databricks.
Stay current with evolving Databricks platform capabilities and contribute to internal Centers of Excellence and capability building.
Design and implement data governance, access control, secure data sharing strategies, and data clean rooms using Unity Catalog and Delta Sharing to enable compliant, cross-platform collaboration across partners.
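For illustration only (not part of the original posting): a minimal sketch of the kind of bronze-layer streaming ingest this role would architect, assuming a Databricks notebook where spark is the ambient SparkSession, Auto Loader as the source, and a Unity Catalog target table. All paths and table names below are hypothetical.

    # Minimal bronze-layer ingest sketch (Databricks notebook context assumed:
    # `spark` is the ambient SparkSession). Paths and table names are hypothetical.
    from pyspark.sql import functions as F

    raw_path = "/Volumes/main/raw/events"              # hypothetical landing zone
    bronze_table = "main.lakehouse.events_bronze"      # hypothetical Unity Catalog table

    events = (
        spark.readStream.format("cloudFiles")          # Auto Loader source
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "/Volumes/main/raw/_schemas/events")
        .load(raw_path)
        .withColumn("ingested_at", F.current_timestamp())  # ingestion audit column
    )

    (
        events.writeStream
        .option("checkpointLocation", "/Volumes/main/raw/_checkpoints/events_bronze")
        .trigger(availableNow=True)                    # process available files, then stop
        .toTable(bronze_table)                         # Delta table governed by Unity Catalog
    )

In practice, the target tables, checkpoint locations, and access grants would be provisioned through the same infrastructure-as-code and CI/CD patterns described above.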
Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
15+ years of data architecture experience, including at least 5 years on the Databricks platform.
Strong expertise in Apache Spark, Delta Lake, Databricks SQL, Unity Catalog, and MLflow.
Demonstrated experience in data migration projects (e.g., Teradata, Hadoop, or Oracle to Databricks).
Proficiency in Python, SQL, and Spark (PySpark), with hands-on experience in REST APIs and automation (see the automation sketch after this list).
Solid understanding of modern data architectures (data lakehouse, streaming-first pipelines, real-time analytics).
Experience integrating Databricks with AWS/GCP/Azure, Snowflake, and enterprise tools (Informatica, dbt, Airflow, etc.).
Experience with security and compliance controls on Databricks: encryption, auditing, and access controls.
Familiarity with cost optimization and performance tuning best practices in Databricks environments.
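As an aside on the REST APIs and automation point above (again, illustrative rather than part of the posting): a minimal sketch of listing Databricks jobs over the Jobs 2.1 API from Python, assuming the workspace URL and a personal access token are supplied via environment variables.

    # Minimal Databricks REST API sketch: list jobs via the Jobs 2.1 API.
    # DATABRICKS_HOST / DATABRICKS_TOKEN are assumed environment variables.
    import os
    import requests

    host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
    token = os.environ["DATABRICKS_TOKEN"]  # personal access token

    resp = requests.get(
        f"{host}/api/2.1/jobs/list",
        headers={"Authorization": f"Bearer {token}"},
        params={"limit": 25},
        timeout=30,
    )
    resp.raise_for_status()

    for job in resp.json().get("jobs", []):
        print(job["job_id"], job["settings"]["name"])

The official databricks-sdk Python package wraps the same endpoints; raw requests is shown here only to keep the sketch dependency-light.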
What Will Set You Apart
Databricks or cloud certifications (e.g., Databricks Certified Data Engineer, AWS/GCP/Azure Architect).
Experience with industry use cases in marketing analytics, clean rooms, customer 360, or real-time segmentation.
Demonstrated success in solutioning roles within client-facing teams and agile delivery models.
Location
Remote, Continental US (preferably in Texas or Arkansas).
About Acxiom
Acxiom is an equal opportunity employer, including disability and protected veteran status (EOE/Vet/Disabled), and does not discriminate in recruiting, hiring, training, promotion, or other employment decisions.
Note
Only essential notices are included here; other location-specific or marketing content has been omitted to focus on the role and requirements.