Kasmo Global

Snowflake Data Engineer

Kasmo Global, Texas City, Texas, US 77592


Job Description:

The ideal candidate will have expertise in the following areas:

1. Snowflake, Snowpark: The candidate should have a deep understanding of the Snowflake data warehousing platform and be proficient in using Snowpark for data processing and analytics.
2. AWS services (Airflow): The candidate should have hands-on experience with AWS services, particularly Apache Airflow, for orchestrating complex data workflows and pipelines.
3. AWS services (Lambda): Proficiency in AWS Lambda for serverless computing and event-driven architectures is essential for this role.
4. AWS services (Glue): The candidate should be well-versed in AWS Glue for ETL (Extract, Transform, Load) processes and data integration.
5. Python: Strong programming skills in Python are required for developing data pipelines, data transformations, and automation tasks.
6. DBT: Experience with DBT (Data Build Tool) for data modeling and creating data transformation pipelines is a plus.

Responsibilities:

- Design, develop, and maintain data pipelines and ETL processes using Snowflake, AWS services, Python, and DBT.
- Collaborate with data scientists and analysts to understand data requirements and implement solutions.
- Optimize data workflows for performance, scalability, and reliability.
- Troubleshoot and resolve data-related issues in a timely manner.
- Stay current with the latest technologies and best practices in data engineering.

Qualifications:

- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience in data engineering roles with a focus on Snowflake, AWS services, Python, and DBT.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
- AWS certifications (e.g., AWS Certified Data Analytics - Specialty) are a plus.

If you meet the above requirements and are passionate about data engineering and analytics, we encourage you to apply for this exciting opportunity.