Amtex Systems

Data Integration Specialist

Amtex Systems, Denver, Colorado, United States, 80285

Overview

Title: Data Integration Specialist
Location: Remote
Duration: Full Time

Job Summary: We are looking for a detail-oriented and technically skilled Data Integration Specialist to design, develop, and manage robust data integration solutions. The ideal candidate will have hands-on experience integrating data across disparate systems, building ETL/ELT pipelines, and ensuring the accuracy, quality, and consistency of enterprise data. You will play a key role in enabling seamless data flow between systems to support business intelligence, analytics, and operational needs.

Responsibilities

- Design and implement data integration workflows between internal and external systems, including APIs, databases, SaaS applications, and cloud platforms.
- Develop and maintain scalable ETL/ELT pipelines for structured and unstructured data using tools such as Informatica, Talend, SSIS, Apache NiFi, or custom Python/SQL scripts.
- Build and manage real-time and batch data pipelines leveraging technologies such as Kafka, Spark Streaming, or AWS Glue.
- Ensure high data quality, accuracy, and consistency during data ingestion and transformation.
- Implement data validation, cleansing, deduplication, and monitoring mechanisms (a minimal example of this kind of step is sketched after this list).
- Contribute to metadata management, data lineage, and data catalog initiatives.
- Collaborate with data engineers, business analysts, data scientists, and application teams to understand integration needs and deliver effective solutions.
- Troubleshoot and resolve data integration and pipeline issues in a timely manner.
- Provide documentation and knowledge transfer for developed solutions.
- Support data movement across hybrid environments (on-premises, cloud, and third-party systems).
- Work with DevOps or platform teams to ensure the scalability, security, and performance of the data integration infrastructure.
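For context only, here is a minimal sketch of the kind of batch validation and deduplication step described above, using just the Python standard library. The file names, field names, and dedup key are hypothetical and are not details from this posting.

```python
# Minimal batch cleanse sketch (hypothetical files and schema).
import csv
from pathlib import Path

REQUIRED_FIELDS = ("customer_id", "email", "updated_at")  # assumed schema

def extract(path: Path):
    """Read raw rows from a CSV extract."""
    with path.open(newline="", encoding="utf-8") as f:
        yield from csv.DictReader(f)

def validate(rows):
    """Drop rows missing any required field (basic data-quality gate)."""
    for row in rows:
        if all(row.get(field) for field in REQUIRED_FIELDS):
            yield row

def deduplicate(rows, key="customer_id"):
    """Keep the last record seen for each key (simple dedup strategy)."""
    latest = {}
    for row in rows:
        latest[row[key]] = row
    return latest.values()

def load(rows, path: Path):
    """Write cleaned rows to a staging CSV for downstream loading."""
    rows = list(rows)
    if not rows:
        return
    with path.open("w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    raw = extract(Path("customers_raw.csv"))       # hypothetical source file
    clean = deduplicate(validate(raw))
    load(clean, Path("customers_staged.csv"))      # hypothetical staging file
```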

Qualifications

- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- 4-8 years of experience in data integration, data engineering, or ETL development roles.
- Strong experience with integration tools such as Informatica, Talend, MuleSoft, SSIS, or Boomi.
- Proficiency in SQL, Python, and scripting for data manipulation and automation.
- Experience with cloud data platforms (AWS, Azure, GCP) and services such as AWS Glue, Azure Data Factory, or Google Cloud Dataflow.
- Familiarity with REST/SOAP APIs, JSON, XML, and flat file integrations (a simple REST/JSON pull is sketched after this list).
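As an illustration of the REST/JSON integration work referenced above, a minimal sketch using the Python standard library; the endpoint URL, payload shape, and column names are assumptions, not details from this posting.

```python
# Hypothetical REST pull flattened into warehouse-ready rows.
import json
import urllib.request

API_URL = "https://api.example.com/v1/orders"  # hypothetical endpoint

def fetch_orders(url: str) -> list[dict]:
    """Call a REST endpoint and parse its JSON payload."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        payload = json.load(resp)
    # Assume the payload wraps records under a "data" key.
    return payload.get("data", [])

def flatten(order: dict) -> dict:
    """Map a nested JSON record onto flat columns for a staging table."""
    return {
        "order_id": order.get("id"),
        "customer_id": order.get("customer", {}).get("id"),
        "total": order.get("total"),
        "created_at": order.get("created_at"),
    }

if __name__ == "__main__":
    rows = [flatten(o) for o in fetch_orders(API_URL)]
    print(f"Fetched {len(rows)} flattened rows")
```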

Preferred Skills

- Experience with message queues or data streaming platforms (Kafka, RabbitMQ, Kinesis).
- Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery).
- Knowledge of data security, privacy, and compliance best practices (HIPAA, GDPR, etc.).
- Prior experience in industries such as healthcare, fintech, or e-commerce is a plus.

Soft Skills

- Strong problem-solving and debugging skills.
- Excellent communication and collaboration abilities.
- Ability to manage multiple priorities and deliver in a fast-paced environment.
- Attention to detail and a commitment to delivering high-quality work.
