Programmers.io
Role:
Databricks Architect
Job type:
Contract
Job Description:
Primary Skills: Databricks, PySpark
Proficiency in PySpark and Databricks (Delta Lake, clusters, jobs); a short illustrative sketch follows this list.
Experience architecting designs for integrating Databricks with Delta Lake.
Hands-on with Apache Airflow (DAG design, monitoring).
Strong knowledge of AWS services: S3, EC2, Lambda, IAM.
Strong SQL and Python for transformations and orchestration.
Knowledge of Lakehouse architecture (Delta Lake) and data modeling.
Experience in ETL/ELT and data warehousing best practices.
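For illustration only, a minimal sketch of the kind of PySpark and Delta Lake work implied by the skills above is shown here. It assumes a Databricks runtime (or a local Spark session with the Delta Lake package installed); the S3 path, column names, and target table are hypothetical.

# Minimal PySpark + Delta Lake sketch (illustrative; paths and names are placeholders)
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks this session already exists as `spark`
spark = SparkSession.builder.getOrCreate()

# Read raw CSV data from a hypothetical S3 location
raw = (
    spark.read
    .option("header", "true")
    .csv("s3://example-bucket/raw/orders/")
)

# Simple transformations: cast types and normalize columns
orders = (
    raw
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
)

# Persist as a Delta table, the storage layer behind the Lakehouse pattern
(
    orders.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("analytics.orders")
)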
Responsibilities
Design and implement Databricks-based data processing architectures.
Collaborate with data engineers and data scientists to deliver scalable ETL/ELT pipelines.
Develop and optimize data models and lakehouse patterns using Delta Lake.
Oversee orchestration with Apache Airflow and ensure reliability of data workflows (see the DAG sketch after this list).
Work with AWS services to implement secure and scalable data solutions.
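As an illustration of the Airflow orchestration responsibility above, a minimal DAG that triggers an existing Databricks job might look like the sketch below. It assumes a recent Airflow 2.x installation with the Databricks provider package; the connection ID, job ID, and schedule are placeholders.

# Minimal Airflow DAG sketch (illustrative; connection and job IDs are placeholders)
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    tags=["databricks", "delta-lake"],
) as dag:
    # Trigger an existing Databricks job (e.g., a Delta Lake ETL notebook or workflow)
    run_etl = DatabricksRunNowOperator(
        task_id="run_databricks_etl",
        databricks_conn_id="databricks_default",
        job_id=12345,
    )

Triggering an existing Databricks job from Airflow keeps cluster and notebook configuration inside Databricks, while Airflow handles scheduling, retries, and monitoring of the workflow.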
Qualifications
Proficiency in PySpark and Databricks (Delta Lake, clusters, jobs).
Experience architecting designs for integrating Databricks with Delta Lake.
Hands-on with Apache Airflow (DAG design, monitoring).
Strong knowledge of AWS services: S3, EC2, Lambda, IAM.
Strong SQL and Python for transformations and orchestration.
Knowledge of Lakehouse architecture (Delta Lake) and data modeling.
Experience in ETL/ELT and data warehousing best practices.
Best Regards,
Technical Recruiter