Optomi
This range is provided by Optomi. Your actual pay will be based on your skills and experience — talk with your recruiter to learn more.
Base pay range
$60.00/hr - $80.00/hr

Optomi, in partnership with a Big 4 leader, is seeking both a Databricks Architect and a Databricks Engineer with experience building, optimizing, and managing modern data pipelines and analytics platforms. The ideal candidate will be hands-on with Databricks SQL (DBSQL), Delta Lake, Unity Catalog, PySpark, and Python, and will have strong expertise in performance tuning and large-scale data engineering best practices. This role requires close collaboration with client stakeholders, architects, and business teams.

Key Responsibilities
- Design, build, and optimize ETL/ELT pipelines on the Databricks Lakehouse Platform using PySpark, Delta Lake, and Databricks SQL.
- Implement data modeling and manage large-scale data sets for analytics, reporting, and machine learning workloads.
- Configure and manage Unity Catalog for centralized data governance, security, and lineage.
- Write high-performance PySpark and SQL code that ensures scalability and cost efficiency.
- Apply performance tuning techniques to queries, jobs, and pipelines to optimize compute utilization.
- Collaborate with data architects, analysts, and business teams to understand requirements and deliver reliable data solutions.
- Establish best practices for data quality, lineage, and metadata management.
- Work on data migration, ingestion frameworks, and streaming/batch data pipelines.
- Ensure compliance with data security, governance, and privacy standards.

Core Databricks Expertise
- Databricks SQL (DBSQL) – advanced query development and performance optimization.
- Delta Lake – ACID transactions, schema evolution, time travel, and optimization.
- Unity Catalog – access controls, lineage, and catalog/schema/table management.
- Strong coding skills in PySpark and Python.
- Experience with data pipelines, transformations, and orchestration.

Job Details
- 100% Remote - client is based out of NY
- W2 only - must be able to work on a W2 basis; no C2C or sponsorship
- Type: Contract - 6 months with high likelihood of extension
- Seniority level: Mid-Senior level
- Employment type: Contract
- Job function: Information Technology