Veridian Tech Solutions, Inc.
Data Architect with Databricks & MDM
Veridian Tech Solutions, Inc., Woodbridge, New Jersey, United States
Data Architect with Databricks & MDM (Contract-W2)
Location: Iselin, NJ (Hybrid)
Job Summary
We are seeking a highly skilled Databricks Architect with strong expertise in MDM, SQL, Python, data warehousing, and cloud ETL tools to join our data team. The ideal candidate will design, implement, and optimize large‑scale data pipelines, ensuring scalability, reliability, and performance. This role involves working closely with multiple teams and business stakeholders to deliver cutting‑edge data solutions.
Key Responsibilities
Build and maintain scalable ETL/ELT pipelines using Databricks.
Leverage PySpark/Spark and SQL to transform and process large datasets.
Integrate data from multiple sources including Azure Blob Storage, ADLS, and other relational/non‑relational systems.
Work closely with multiple teams to prepare data for dashboards and BI tools.
Collaborate with cross‑functional teams to understand business requirements and deliver tailored data solutions.
Optimize Databricks workloads for cost efficiency and performance.
Monitor and troubleshoot data pipelines to ensure reliability and accuracy.
Implement and manage data security, access controls, and governance standards using Unity Catalog.
Ensure compliance with organizational and regulatory data policies.
Leverage Databricks Asset Bundles for seamless deployment of jobs, notebooks, and configurations across environments.
Manage version control for Databricks artifacts and collaborate with the team to maintain development best practices.
Technical Skills
Strong expertise in Databricks (Delta Lake, Unity Catalog, Lakehouse architecture, table triggers, Delta Live Tables pipelines, Databricks Runtime, etc.)
Proficiency in Azure Cloud Services.
Solid understanding of Spark and PySpark for big data processing.
Experience with relational databases.
Knowledge of Databricks Asset Bundles and GitLab.
Preferred Experience
Familiarity with Databricks runtimes and advanced configurations.
Knowledge of streaming frameworks such as Spark Streaming.
Seniority Level: Mid-Senior level
Employment Type: Contract
Job Function: Consulting, Business Development, and Information Technology
Industries: IT Services and IT Consulting, Software Development, and Business Consulting and Services