Confidential Jobs
Overview
Location: Santa Clara, CA (Fully Onsite; local candidates only)
Duration: 6+ Months
Experience: 14+ years
We’re seeking a visionary Data Architect with deep expertise in Databricks to lead the design, implementation, and optimization of our enterprise data architecture. You’ll be instrumental in shaping scalable data solutions that empower analytics, AI, and business intelligence across the organization.
If you thrive in a fast-paced environment, love solving complex data challenges, and have a passion for cloud-native platforms like AWS Databricks, we want to hear from you.
Responsibilities
Design and implement robust, scalable, and secure data architectures using Databricks, Spark, Delta Lake, and cloud-native tools.
Collaborate with data engineers, analysts, and business stakeholders to define data models, pipelines, and governance strategies.
Develop and maintain data lakehouses, ensuring optimal performance and cost-efficiency.
Define best practices for data ingestion, transformation, and storage using Databricks notebooks, jobs, and workflows.
Architect solutions for real-time and batch data processing.
Ensure data quality, lineage, and compliance with internal and external standards.
Lead migration efforts from legacy systems to modern cloud-based data platforms.
Mentor junior team members and evangelize data architecture principles across the organization.
Qualifications
12+ years of experience in data architecture, with 5+ years hands-on in Databricks.
Strong experience with Snowflake.
Experience with AWS cloud platforms, especially AWS Databricks.
Strong proficiency in Apache Spark, Delta Lake, and PySpark.
Experience with data modeling, ETL/ELT pipelines, and data warehousing.
Familiarity with CI/CD, DevOps, and Infrastructure as Code (Terraform, ARM templates).
Knowledge of data governance, security, and compliance frameworks.
Excellent communication and stakeholder management skills.
Preferred Qualifications
Databricks Certified Data Engineer or Architect.
Experience with MLflow, Unity Catalog, and Lakehouse architecture.
Background in machine learning, AI, or advanced analytics.
Experience with tools like Apache Airflow, dbt, or Power BI/Tableau.
Seniority level: Mid-Senior level
Employment type: Contract
Job function: Information Technology
Industries: Manufacturing