Danta Technologies
Databricks Architect
Role
Databricks Architect at Danta Technologies. Remote work. Rate: $50/hr on W2 and $63/hr on C2C.

Skills Needed
8+ years of experience in data engineering, with at least 3 years on Databricks.
Strong proficiency in PySpark, SQL, and Delta Lake.
Hands-on experience with GCP Dataproc.

Responsibilities
Administration:
Lead the installation and configuration of Databricks on the GCP cloud platform.
Monitor platform health, performance, and cost optimization.
Implement governance, logging, and auditing mechanisms.
Development / Enhancements:
Design and develop scalable ETL/ELT pipelines using PySpark, SQL, and Delta Lake.
Collaborate with data engineers and analysts to enhance data workflows and models.
Optimize existing notebooks and jobs for performance and reliability.
Operations, Support & Troubleshooting:
Provide L2/L3 support for Databricks-related issues and incidents.
Troubleshoot cluster failures, job errors, and performance bottlenecks.
Maintain technical documentation for platform setup, operations, and development standards.
Contact
Rahul Thakur
Phone: 760-349-0078
Email: rahul@dantatechnologies.net

EEO Statement
All qualified applicants will receive consideration for employment without regard to race, color, religion, religious creed, sex, national origin, ancestry, age, physical or mental disability, medical condition, genetic information, military and veteran status, marital status, pregnancy, gender, gender expression, gender identity, sexual orientation, or any other characteristic protected by local law, regulation, or ordinance.

Benefits
Danta offers a competitive compensation package to all W2 employees, including healthcare insurance options (Dental, Medical, Vision), major holidays, and paid sick leave as per state law.

Seniority Level
Entry level

Employment Type
Contract

Job Function
Information Technology

Industries
IT Services and IT Consulting