AgileEngine

Data Engineer ID43228

AgileEngine, Tampa, Florida, us, 33646


Overview

Data Engineer (ID43228) at AgileEngine – an Inc. 5000 company creating award‑winning software for Fortune 500 brands and startups across 17+ industries. AgileEngine ranks among the leaders in application development and AI/ML, and its people‑first culture has earned multiple Best Place to Work awards.

Why Join Us

If you’re looking for a place to grow, make an impact, and work with people who care, we’d love to meet you!

About the Role

As a Middle Data Engineer, you’ll build and optimize ETL pipelines and cloud data solutions that provide reliable insights for data scientists and analysts. You’ll tackle complex data challenges, collaborate with cross‑functional teams, and grow your expertise in Python, Airflow, Spark, and AWS in a dynamic, innovative environment.

What You Will Do

- Build and support ETL pipelines
- Monitor data pipelines, identify bottlenecks, and optimize data processing and storage for performance and cost‑effectiveness
- Collaborate effectively with cross‑functional teams, including data scientists, analysts, software engineers, and business stakeholders
- Work with Terraform to build AWS infrastructure
- Analyze sources and build Cloud Data Warehouse and Data Lake solutions

Must Haves

- You must be authorized to work for ANY employer in the US (e.g., Green Card holders, TN visa holders, GC EAD, H4 EAD, U4U with EAD)
- 3+ years of professional experience with Python
- 3+ years in a Data Engineering role
- Proficiency in Python, SQL, and optionally Scala for working with data processing frameworks like Spark and libraries like Pandas
- Proficiency in designing, deploying, and managing data pipelines using Apache Airflow
- Ability to design, develop, and optimize ETL processes that move and transform data from various sources into the data warehouse, ensuring data quality, reliability, and efficiency
- Knowledge of big data technologies and frameworks such as Apache Spark
- Extensive hands‑on experience with AWS services relevant to data engineering, including Amazon MWAA, S3, RDS, EMR, Lambda, Glue, Redshift, Data Pipeline, and DynamoDB
- Deep understanding and practical experience in building and optimizing cloud data warehousing solutions
- Excellent communication skills to collaborate effectively with cross‑functional teams
- Bachelor’s degree in Computer Science/Engineering or equivalent experience
- Upper‑intermediate English level

Nice to Haves

- Familiarity with the fintech industry, including an understanding of financial data, regulatory requirements, and business processes specific to the domain
- Documentation skills to document data pipelines, architecture designs, and best practices for knowledge sharing and future reference
- GCP services relevant to data engineering
- Snowflake; OpenSearch, Elasticsearch; Jupyter for data analysis; Bitbucket, Bamboo; Terraform

Perks and Benefits

- Professional growth: Accelerate your journey with mentorship, TechTalks, and personalized growth roadmaps
- Competitive compensation: USD‑based compensation and budgets for education, fitness, and team activities
- A selection of exciting projects: Join projects featuring modern solution development and top‑tier clients, including Fortune 500 enterprises and leading product brands
- Flextime: Tailor your schedule for an optimal work‑life balance, with options to work from home or go to the office – whatever makes you the happiest and most productive
