Jobs via Dice
Dice is the leading career destination for tech experts at every stage of their careers. Our client, INNOVIT USA INC, is seeking the following. Apply via Dice today!
Hiring W2 Candidates Only
Visa: Open to any visa type with valid work authorization in the USA
Job Summary
We are seeking a Data Engineer to design, develop, and optimize data pipelines and architectures that support data analytics, reporting, and machine learning workloads. The ideal candidate will have strong experience with big data tools, cloud data platforms, and modern data engineering practices.
Key Responsibilities
Design, build, and maintain scalable data pipelines and ETL/ELT processes for structured and unstructured data.
Develop and optimize data models, ensuring high performance and data integrity across systems.
Integrate data from diverse sources such as APIs, databases, flat files, and third-party systems.
Collaborate with data scientists, analysts, and business teams to understand data needs and deliver quality data solutions.
Implement data quality, validation, and governance frameworks to ensure accurate and consistent data.
Work with cloud data services (AWS, Azure, Google Cloud Platform) and big data frameworks (Spark, Hadoop, Databricks, etc.).
Automate and monitor data workflows for fault tolerance, scalability, and security.
Participate in code reviews, documentation, and process improvement initiatives.
Ensure compliance with data privacy and security standards (e.g., GDPR, HIPAA, SOC 2).
Required Skills and Qualifications
Bachelor's or Master's degree in Computer Science, Information Systems, or related field.
3+ years of experience in data engineering or data integration roles.
Strong programming skills in Python, SQL, or Scala.
Hands-on experience with ETL tools (e.g., Apache Airflow, AWS Glue, Informatica, Talend).
Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery, Synapse).
Knowledge of AWS, Azure, or Google Cloud Platform data services (e.g., S3, Data Lake, Dataproc, Data Factory).
Familiarity with data modeling (star schema, snowflake schema) and database systems (PostgreSQL, MySQL, NoSQL).
Proficient with version control (Git) and CI/CD pipelines for data workflows.
Strong problem‑solving and collaboration skills.
Preferred / Nice-to-Have
Experience with streaming data (Kafka, Kinesis, Pub/Sub).
Exposure to containerization and orchestration (Docker, Kubernetes).
Understanding of machine learning data pipelines and feature engineering.
Knowledge of data governance tools (e.g., Collibra, Alation) and metadata management.
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Information Technology
Industries: Software Development