Acxiom

Senior Director, Cloud Solutions Architect - Databricks

Acxiom, Conway, Arkansas, US 72035


As a Senior Director, Cloud Solutions Architect at Acxiom, you will play a pivotal role in designing, implementing, and modernizing enterprise data ecosystems on the Databricks platform. You will collaborate closely with business stakeholders, engineers, and partner teams to craft scalable, secure, and high-performance data solutions. Your expertise in Databricks (Delta Lake, Unity Catalog, MLflow) and cloud-native data modernization strategies will be essential as you guide the migration of legacy data systems to contemporary cloud data platforms. This influential position requires a unique blend of visionary thinking and technical proficiency, alongside hands-on execution skills to deliver measurable business outcomes and ensure long-term client success. This role allows for flexibility, as you can work from nearly anywhere in the U.S.

Key Responsibilities:

- Architect and design scalable Databricks solutions for enterprise data processing, analytics, machine learning, and business intelligence.
- Lead complex data modernization initiatives, including assessments, roadmap creation, and the execution of legacy-to-cloud migrations.
- Define comprehensive architectures for ETL/ELT pipelines, data lakes, lakehouses, and real-time streaming platforms using Delta Lake and Databricks-native tools.
- Collaborate with clients and internal teams to establish data architecture patterns, security models, and data governance standards.
- Drive presales efforts by devising solution strategies, crafting compelling client proposals, contributing to RFP responses, and creating tailored demos.
- Implement CI/CD practices for Databricks notebooks and jobs using DevOps principles and infrastructure-as-code.
- Develop and share reference architectures and reusable frameworks to support consistent platform adoption across teams.
- Mentor engineering teams on best practices for data engineering, lakehouse design, performance tuning, and workload optimization.
- Keep abreast of evolving Databricks platform capabilities and contribute to internal Centers of Excellence.
- Design and implement data governance, access control, and secure data sharing strategies using Unity Catalog and Delta Sharing.

Qualifications:

- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
- 15+ years of data architecture experience, with at least 5 years on the Databricks platform.
- Strong expertise in Apache Spark, Delta Lake, Databricks SQL, Unity Catalog, and MLflow.
- Demonstrated experience in data migration projects (e.g., Teradata, Hadoop, or Oracle to Databricks).
- Proficiency in Python, SQL, and Spark (PySpark), with hands-on experience in REST APIs and automation.
- Solid understanding of modern data architectures, including data lakehouses and streaming-first pipelines.
- Experience integrating Databricks with AWS/GCP/Azure, Snowflake, and enterprise tools.
- Knowledge of security and compliance controls in Databricks.
- Familiarity with cost optimization and performance tuning best practices in Databricks.

What Sets You Apart:

- Databricks or cloud certifications (e.g., Databricks Certified Data Engineer, AWS/GCP/Azure Architect).
- Experience with industry use cases in marketing analytics, clean rooms, customer 360, or real-time segmentation.
- Proven success in solutioning roles within client-facing teams and agile delivery models.

This position offers the opportunity to work remotely within the continental U.S. (preferably in Texas or Arkansas). Acxiom is an equal opportunity employer and provides an inclusive work environment.