Fults & Associates LLC
Data Engineer - US Citizen or Green Card Holder ONLY
Fults & Associates LLC, Granite Heights, Wisconsin, United States
We are not able to sponsor, provide, or transfer visa sponsorship at any time for this position. You must be a U.S. Citizen or Green Card holder to qualify for this position.
Excellent communication skills are REQUIRED.
Seeking an experienced Data Engineer for a 6-month contract-to-hire.
Qualifications
Education
Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience).
Experience
3+ years of experience in Data Engineering or a related role.
2+ years of hands-on experience with Snowflake, including data modeling, query optimization, and managing Snowflake environments.
Proven experience building and maintaining ETL/ELT pipelines using tools like Apache Airflow, Informatica, Talend, Spring Batch, or similar.
Experience with cloud platforms (e.g., AWS, Azure, or GCP) and their integration with Snowflake.
Experience in the financial industry or with credit union data environments is a plus; a background specifically in credit cards, financial services, or credit unions is a huge plus.
Needs someone who can think through and solve problems independently and doesn't require a lot of hand-holding.
Technical Skills:
Proficiency in SQL and experience with Python, Java, Scala, or Kotlin for data processing. Python is a MUST HAVE, along with JDM (Java Development Methodology); Scala or Kotlin is a big plus.
Strong understanding of data warehousing concepts and best practices.
Experience with version control systems (e.g., Git) and CI/CD pipelines for data workflows.
Plus: knowledge of data governance, security, and compliance standards (e.g., GDPR, CCPA). The team does not have dedicated CIS coverage (the hiring manager handles it) and would like someone experienced with labeling data, handling PII, and taking the extra care needed to prevent data leakage. The role does not deal with HIPAA, but someone well versed in it would be nice.
Familiarity with BI tools (e.g., Looker Studio, Tableau, Power BI) is a plus.
Key Responsibilities:
Data Pipeline Development: Design, build, and optimize scalable data pipelines using Snowflake and related technologies to ingest, transform, and store data from various sources.
Data Integration: Collaborate with internal teams and external partners to integrate data from diverse systems, ensuring data quality, consistency, and accessibility.
Data Warehousing: Leverage Snowflake to implement data warehousing solutions, including data modeling, schema design, and performance optimization.
ETL/ELT Processes: Develop and maintain ETL/ELT workflows to support analytics, reporting, and machine learning initiatives.
Data Quality & Governance: Implement data quality checks, monitoring, and governance practices to ensure accuracy, security, and compliance with regulatory standards.
Performance Optimization: Monitor and optimize the performance of data pipelines and Snowflake queries to ensure efficient processing and cost-effectiveness.
Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand requirements and deliver tailored data solutions.
Documentation: Create and maintain comprehensive documentation for data pipelines, processes, and architecture.
Innovation: Stay current with industry trends and emerging technologies to continuously improve data infrastructure.
Soft Skills:
Excellent problem-solving and analytical skills. Strong communication and collaboration abilities to work with cross-functional teams. Ability to manage multiple priorities in a fast-paced environment. Detail-oriented with a commitment to delivering high-quality solutions.