Donato Technologies, Inc

Data Engineer

Donato Technologies, Inc, Austin, Texas, US, 78716


Data Engineer, Austin, TX. Key Skills: ETL, Informatica on Cloud, Oracle, Snowflake, AI tools.

The position is ONSITE at the location listed above (NO REMOTE WORK). The program will only accept LOCAL candidates for this position (within a 50-mile radius). Subject to change per the hiring team.

The Client requires the services of 2 Systems Analyst 3, hereafter referred to as Candidate(s), who meet the general qualifications of Systems Analyst 3, Applications/Software Development, and the specifications outlined in this document for the Client.

Understands business objectives and problems, identifies alternative solutions, and performs studies and cost/benefit analyses of alternatives. Analyzes user requirements, procedures, and problems to automate processing or to improve existing computer systems: confers with personnel of the organizational units involved to analyze current operational procedures, identify problems, and learn specific input and output requirements, such as forms of data input, how data is to be summarized, and formats for reports. Writes detailed descriptions of user needs, program functions, and steps required to develop or modify computer programs. Reviews computer system capabilities, specifications, and scheduling limitations to determine if a requested program or program change is possible within the existing system.

The Client requires the services of a Data Engineer, hereafter referred to as Worker, who meets the general qualification of Systems Analyst 3, Emerging Technologies, and the specifications outlined in this document for Client Information Technology.

The Client is continuing to develop a Client data integration hub with a goal to accomplish the following:

Develop the DAP/PMAS Report on Medicaid Personal Care Services

Implementation and configuration of the infrastructure for the data integration hub

Design, development, and implementation (DD&I) of the data integration hub using an agile methodology for all standard SDLC phases, including but not limited to:

Validation of performance metric requirements

Creation of Epics/User Stories/Tasks

Automation of data acquisition from a variety of data sources

Development of complex SQL scripts

Testing (integration, load, and stress)

Deployment / publication internally and externally

Operations support and enhancement of the data integration hub

This development effort will utilize an agile methodology based upon the approach currently in use at the Client for the Performance Management & Analytics System (PMAS).

As a member of the agile development team, the Worker's responsibilities may include:

Filling the role of a technical leader, guiding an agile development team through a project.

Data acquisition from a variety of data sources for multiple uses.

Developing complex SQL scripts to transform the source data to fit into a dimensional model, then creating views and materialized views in Oracle (an illustrative SQL sketch follows this list).

Developing automation with Informatica Power Center/IICS to pull data from external data sources and transform it to fit into a dimensional model.

Collaborating with other members of the Data Engineering Team on the design and implementation of an optimal data design.

Verification and validation of SQL scripts, Informatica automation, and database views.

Developing automated means of performing verification and validation.

Participating in all sprint ceremonies.

Working closely with the Architects and Data Engineering Team on implementation designs and data acquisition strategies.

Developing mockups and working with customers for validation.

Working closely with other members of the team to address technical problems.

Assisting with the implementation and configuration of development tools.

Producing and maintaining technical specifications, diagrams, or other documentation as needed to support the DD&I efforts.

Participating in requirements and design sessions.

Interpreting new and changing business requirements to determine the impact and proposing enhancements and changes to meet these new requirements.

All other duties as assigned.
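To give a concrete sense of the dimensional-model work described above, the following is a minimal, illustrative Oracle SQL sketch: a staging-to-fact load followed by a materialized view for reporting. All table and column names (stg_claims, dim_member, dim_provider, dim_date, fact_personal_care_service, mv_monthly_pcs_utilization) are hypothetical and are not taken from this posting.

-- Hypothetical staging-to-fact load: conform source rows to the dimensional model.
INSERT INTO fact_personal_care_service
       (member_key, provider_key, service_date_key, units_billed, amount_paid)
SELECT dm.member_key,
       dp.provider_key,
       dd.date_key,
       s.units_billed,
       s.amount_paid
  FROM stg_claims s
  JOIN dim_member   dm ON dm.member_id     = s.member_id
  JOIN dim_provider dp ON dp.provider_id   = s.provider_id
  JOIN dim_date     dd ON dd.calendar_date = s.service_date;

-- Hypothetical materialized view that pre-aggregates monthly utilization for the report layer.
CREATE MATERIALIZED VIEW mv_monthly_pcs_utilization
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND
AS
SELECT dd.fiscal_year,
       dd.fiscal_month,
       f.provider_key,
       SUM(f.units_billed) AS total_units,
       SUM(f.amount_paid)  AS total_paid
  FROM fact_personal_care_service f
  JOIN dim_date dd ON dd.date_key = f.service_date_key
 GROUP BY dd.fiscal_year, dd.fiscal_month, f.provider_key;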

II. CANDIDATE SKILLS AND QUALIFICATIONS

Minimum Requirements: Candidates who do not meet or exceed the minimum stated requirements (skills/experience) will be displayed to customers but may not be chosen for this opportunity.

8 years - Required - Experience developing mappings and workflows to automate ETL processes using Informatica Power Center or IICS.

8 years - Required - Experience acquiring and integrating data from multiple data sources/technologies using Informatica Power Center or IICS for use by a Tableau data visualization object. Data source technologies should include Oracle, SQL Server, Excel, Access, and Adobe PDF.

8 years - Required - Experience designing and developing complex Oracle and/or Snowflake SQL scripts that are fast and efficient.

8 years - Required - Strong analytical and problem-solving skills with experience as a system analyst for a data analytics performance management system or data warehousing project.

8 years - Required - Technical writing and diagramming skills, including proficiency with modeling and mapping tools (e.g., Visio, Erwin), the Microsoft Office Suite (Word, Excel, and PowerPoint), and MS Project.

8 years - Required - Experience in planning and delivering software platforms used across multiple products and organizational units.

8 years - Required - Proven ability to write well-designed, testable, efficient code using software development best practices.

6 years - Preferred - Excellent oral and written communication skills.

6 years - Preferred - Ability to effectively manage multiple responsibilities, prioritize conflicting assignments, and switch quickly between assignments as required.

4 years - Preferred - Experience on an agile sprint team.

4 years - Preferred - Understanding of security principles and how they apply to healthcare data.

4 years - Preferred - Experience with state-of-the-art software components for a performance metrics, data visualization, or business intelligence environment.

4 years - Preferred - Bachelor's degree in Computer Science, Information Systems, or Business, or equivalent experience.

4 years - Preferred - Prior experience in the Healthcare Industry.

4 years - Preferred - Prior experience with an agency.

4 years - Preferred - Prior experience working with PII or PHI data.

4 years - Preferred - Experience designing and developing scripts using Python.

4 years - Preferred - Experience with JIRA software.

2 years - Preferred - Functional knowledge or hands-on design experience with Web Services (REST, SOAP, etc.).

2 years - Preferred - Experience designing and developing code using Java and JavaScript.

2 years - Preferred - Experience developing CI/CD pipelines with GitHub and GitHub Actions.

2 years - Preferred - Experience as a Mulesoft developer.

2 years - Preferred - Experience developing code in C#.

Key Skills

Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala

Employment Type: Full Time

Experience: years

Vacancy: 1
