Cyber 1 Armor
Ab Initio ETL Developer
Location: Dallas, TX (Onsite)
Duration: 12 Months
Job Type: W2 Contract
Required Skills
Design, develop, and deploy ETL processes using Ab Initio GDE.
Build high-performance data integration and transformation pipelines.
Work with Ab Initio Co-Operating System, EME (Enterprise Meta Environment), and metadata-driven development.
Develop and optimize graphs for batch and real-time processing.
Integrate with RDBMS (Oracle, SQL Server, Teradata, DB2, etc.) and external data sources.
Implement continuous flows, web services, and message-based integration with Ab Initio.
Continuous Flows (Co-Operating System and GDE).
Graphs, Plans, and Psets (Parameter Sets).
Conduct-It for job scheduling and orchestration.
Only candidates authorized to work in the United States will be considered.
Only applicants currently local to Dallas, TX, or willing to relocate will be considered.
Nice-to-Have Skills
Exposure to AWS, Azure, or Google Cloud Platform for cloud-based data solutions.
Experience with big data ecosystems (Hadoop, Spark, Hive, Kafka) is a strong plus.
Knowledge of containerization (Docker, Kubernetes) is desirable.
Monitoring & Security
Job monitoring and scheduling experience (Control-M, Autosys, or similar).
Familiarity with security standards, encryption, and access management.