Varite

Database Developer - III

Varite, Richmond, Virginia, United States, 23214


Pay Rate Range: $75 - $80/hr

Must be a US Citizen or Green Card holder.

Data Modeler Qualifications:

- Design and implement scalable and efficient data models within the data mesh architecture, considering factors such as domain-driven design, data as a product, and federated data governance
- Work closely with data architects, data engineers, and business users to translate business needs into technical solutions, and communicate data model designs effectively
- Leverage Databricks for data engineering tasks such as data processing, data validation, and data orchestration
- Optimize data pipelines and ensure reliable and efficient data processing, high performance, and scalability
- Implement data validation rules and data quality checks to ensure data integrity and consistency
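To illustrate the kind of data validation rules and quality checks the role describes, here is a minimal sketch in plain Python. In a Databricks environment these checks would typically be expressed over Spark DataFrames (for example as pipeline expectations), but the logic is the same; the record schema (`order_id`, `amount`) is hypothetical.

```python
def validate_orders(rows):
    """Apply null, uniqueness, and range checks to order records.

    Returns (valid_rows, errors), where errors pairs each rejected
    row with the list of rules it violated.
    """
    valid, errors = [], []
    seen_ids = set()
    for row in rows:
        problems = []
        # Null check: every record must carry a primary key.
        if row.get("order_id") is None:
            problems.append("order_id is null")
        # Uniqueness check: reject duplicate keys.
        elif row["order_id"] in seen_ids:
            problems.append(f"duplicate order_id {row['order_id']}")
        # Range check: amounts must be present and non-negative.
        if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
            problems.append("amount missing or negative")
        if problems:
            errors.append((row, problems))
        else:
            seen_ids.add(row["order_id"])
            valid.append(row)
    return valid, errors
```

Routing rejected rows to a quarantine table rather than failing the whole pipeline is a common design choice, since it preserves throughput while keeping bad records auditable.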

Data Mesh Data Modeler with Databricks Expertise:

- Skilled Data Mesh Data Modeler with data engineering expertise in Databricks
- Lead the design and implementation of data models and data products within the data mesh architecture
- Design, implement, and optimize data pipelines
- Design, implement, and manage the lifecycle of data products

General Requirements:

- Previous experience in data product modeling within a data mesh architecture
- Strong hands-on expertise in Databricks and Spark
- Proficiency in SQL and Python
- Problem-solving and troubleshooting skills
- Strong communication skills
