RIT Solutions, Inc.
Mid-Level Data Engineer
Domain: Payments & Financial Services
3 days hybrid in St. Louis, MO
LOCALS ONLY | 6 months
Our client, a top-tier management consulting firm, is seeking a Senior Data Engineer to design, develop, and optimize large-scale data solutions. You will play a key role in building and enhancing data pipelines, enabling efficient data movement, and supporting advanced analytics and reporting initiatives across business domains.
Responsibilities:
• Lead the end-to-end design and development of data warehousing solutions to support enterprise-wide analytics.
• Develop and optimize ETL/ELT pipelines leveraging SQL, Hadoop, Spark, and Python.
• Ensure scalability, performance, and reliability of data systems.
• Mentor and guide junior engineers, providing technical leadership and code reviews.
• Collaborate with business stakeholders, architects, and analysts to translate requirements into robust data solutions.
• Work with Apache NiFi for data ingestion pipelines and explore automation opportunities.
• Contribute to cloud migration and adoption strategies, leveraging cloud-native data platforms (AWS, Azure, or GCP).
• Ensure data quality, governance, and security standards are maintained across pipelines.
Requirements and Qualifications:
• 5-7 years of experience in data engineering and data warehousing projects.
• Strong expertise in SQL and programming (Python or Java/Scala).
• Solid hands-on experience with the Hadoop ecosystem and Apache Spark.
• Exposure to Apache NiFi for data flow management is a strong plus.
• Good understanding of cloud concepts and modern data architectures.
• Strong problem-solving skills and the ability to optimize performance on large datasets.
• Excellent communication and stakeholder management skills.
and modern data architectures. • Strong problem-solving skills and ability to optimize performance in large datasets. • Excellent communication and stakeholder management skills.