We are seeking a talented data professional to design, develop, and maintain scalable data solutions using Snowflake. The ideal candidate brings expertise in data modeling (particularly Data Vault architecture) and database administration (DBA), with additional skills in AWS, Python, Spark, and Apache Airflow to support cloud-based data pipelines and analytics. Experience with SQL Server database administration is a strong plus.

Key Responsibilities
- Design and implement scalable data solutions using Snowflake.
- Perform data modeling, including Data Vault, to optimize data organization and retrieval.
- Optimize Snowflake environments for performance, scalability, and security.
- Build ETL/ELT pipelines using Python and Spark for data integration and transformation.
- Leverage AWS services (e.g., S3, Glue, Lambda) for cloud-based data storage and processing.
- Create task orchestration workflows with Apache Airflow.
- Collaborate with cross-functional teams to gather requirements and deliver tailored solutions.
- Ensure data integrity, security (e.g., RBAC, encryption), and compliance with industry standards.

Required Skills
Primary
- Proficiency in Snowflake development (e.g., data pipelines, Snowpipe, stages, stored procedures).
- Strong experience in data modeling, including Data Vault architecture.
- Solid knowledge of database administration principles.

Secondary
- Expertise in AWS services (e.g., S3, Glue, Lambda).
- Proficiency in Python for scripting and automation.
- Experience with Spark for distributed data processing.
- Familiarity with Apache Airflow for orchestration.

Desirable
- Experience with SQL Server database administration.