Macpower Digital Assets Edge
Informatica Developer
Macpower Digital Assets Edge, Owings Mills, Maryland, United States, 21117
Requirements:
- 7+ years of experience with Informatica PowerCenter.
- 10+ years of experience in big data and distributed computing.
- Strong hands-on experience with PySpark, Apache Spark, and Python.
- Strong hands-on experience with SQL and NoSQL databases (DB2, PostgreSQL, Snowflake, etc.).
- Proficiency in data modeling and ETL workflows.
- Proficiency with workflow schedulers such as Airflow.
- Hands-on experience with AWS cloud-based data platforms.
- Experience with DevOps, CI/CD pipelines, and containerization (Docker, Kubernetes) is a plus.
- Strong problem-solving skills and the ability to lead a team.
Responsibilities:
- Design, develop, modify, configure, and debug Informatica workflows using PowerCenter and PowerExchange CDC tools.
- Lead the design, development, and maintenance of data integration solutions using Informatica, ensuring data quality.
- Troubleshoot and resolve technical issues; debug, tune, and optimize code for performance.
- Manage new requirements, review existing jobs, perform gap analysis, and fix performance issues.
- Document all ETL mappings, sessions, and workflows.
- Handle tickets and perform problem ticket analysis in an Agile/POD approach.
MUST HAVE:
- 10+ years of experience in big data and distributed computing.
- 7+ years of experience with Informatica PowerCenter.
- Experience with PySpark, Apache Spark, and Python.
- Experience with SQL and NoSQL databases (DB2, PostgreSQL, Snowflake, etc.).
- Experience with AWS cloud-based data platforms.
- Experience with DevOps, CI/CD pipelines, and containerization (Docker, Kubernetes) is a plus.