Noblesoft Technologies
Overview
Job Role: Informatica Developer
Location: Denver, CO (Remote allowed)

Responsibilities
- Project Leadership: Take overall ownership of two major projects: the first focused on building and integrating a suite of new data pipelines and expanding the data model of an existing core application, and the second centered on similar pipeline development as well as architecting and deploying near real-time data feeds.
- Solution Architecture: Design and implement scalable, high-performance data solutions using Informatica platform tools, ensuring reliability, maintainability, and efficiency.
- Stakeholder Collaboration: Engage with business analysts, data stewards, application owners, and cross-functional technical teams to gather requirements, align deliverables, and communicate project status and risks.
- Technical Leadership: Provide expert guidance and mentorship to junior developers and team members, establishing coding standards, best practices, and robust documentation protocols.
- Pipeline Development: Lead the hands-on creation of new ETL pipelines, including extraction from diverse sources, transformation logic, and loading into target data stores.
- Schema and Field Expansion: Analyze existing data models, recommend new field additions, and oversee schema evolution with minimal disruption to application performance and integrity.
- Near Real-Time Data Feeds: Architect and build data feeds capable of processing and delivering information to business applications with minimal latency, leveraging best-of-breed Informatica tools and methodologies.

Required Qualifications
- Education: Bachelor's degree or higher in Computer Science, Information Technology, Engineering, or a related discipline.
- Experience: Minimum 5 years of hands-on experience as an Informatica Developer, with a proven track record of delivering complex data integration projects and leading technical teams.
- Technical Proficiency: Expert-level proficiency with Informatica PowerCenter, Informatica Intelligent Cloud Services (IICS), or similar ETL/data integration platforms.
- Data Pipeline Mastery: Strong experience designing, developing, and optimizing ETL pipelines for large-scale, complex environments.
- Real-Time Data Experience: Familiarity with architecting and delivering near real-time data feeds, streaming technologies, or event-driven architectures.
- Database Expertise: Advanced knowledge of relational and non-relational databases (e.g., SQL Server, Oracle, PostgreSQL, NoSQL), data modeling, and schema design.
- Programming Skills: Proficiency in SQL, PL/SQL, and at least one scripting language (Python, Shell, etc.).
- Project Management: Experience using project management tools and methodologies (Agile, Scrum, Kanban) to deliver projects on time and within scope.
- Analytical Skills: Exceptional problem-solving abilities, attention to detail, and a track record of delivering high-quality solutions.
- Communication: Excellent verbal and written communication skills; able to translate complex technical concepts for non-technical stakeholders.
- Leadership: Demonstrated ability to lead and motivate a technical team, set expectations, and drive results.

Preferred Qualifications
- Master's degree in a relevant field.
- Prior experience on projects involving the evolution of existing applications and the integration of new fields/data sources.
- Experience implementing streaming data solutions (Kafka, Spark Streaming, etc.) in an enterprise context.
- Exposure to cloud platforms (AWS, Azure, Google Cloud), especially cloud-based data integration tools.
- Industry certifications in Informatica, data engineering, or cloud technologies.
- Experience working in regulated industries (finance, healthcare, etc.) with strict data governance requirements.

Key Skills
Anti Money Laundering, Entry Level Sales, Asset, Application Engineering, Electrical & Instrumentation

Employment Type: Full Time
Experience: years
Vacancy: 1