Trigyn Technologies
Data Modeler (Snowflake / Python / Insurance industry)
Trigyn Technologies, New York, New York, US 10261
Job Description:
Trigyn's direct government client has an immediate need for a Data Modeler in New York, NY.
Description:
Provide recommendations, guidance, and hands-on implementation for new data marts, as well as for improving and consolidating existing on-premises data marts.
Participate in and contribute to data warehouse projects throughout the project lifecycle.
Establish written best practices, procedures, and provide guidance on data modeling design.
Create and maintain up-to-date documentation for all supported Data Warehouse / Data Mart (DW/DM) and ETL processes.
Plan, create, modify, maintain, test, and implement code supporting:
Source System extraction processes for the Data Warehouse.
ETL processes.
BI Dashboards and reports using R and Python.
Develop and maintain Spotfire systems, including prediction models written in R/Python, TERR, and automation services.
Automate source system data extraction processes and data loading into the Data Warehouse.
Operationalize prediction models.
The consultant will adhere to application development standards, including project management methodology, the SDLC, Enterprise Architecture standards, and IT governance, and will support the BI Data Warehouse using TIBCO Spotfire and IBI Data Migrator.
Build knowledge graphs using Neo4j, Cypher, and the Graph Data Science (GDS) library.
Mandatory Qualifications:
Make high-level design choices and set technical standards, including software coding standards, tools, and platforms. Design multi-leveled architecture or component interactions of large-scale software systems.
Provide guidance to large teams, or possess extensive industry experience and be considered a top expert in the field.
84 months of Data Modeling experience designing and creating Data Marts and Data Warehouses using Star and Snowflake schemas in the Insurance industry.
84 months of experience with designing and implementing ML, Predictive, and AI systems.
84 months of experience with Spotfire Analyst and Web Player using IronPython, Python, JavaScript, and R.
84 months of experience developing, documenting, and maintaining end-to-end ETL data pipelines using IBI Data Migrator or similar tools.
24 months of experience developing data warehouses in the Workers Compensation Insurance industry (Medical Billing, Claims, DBL, PFL, Underwriting, Premium, Payroll).
36 months of experience with graph databases such as Neo4j, including Cypher and the Graph Data Science (GDS) library.
84 months of experience in Oracle PL/SQL development, including triggers, stored procedures, packages, query optimization, database design, and implementation.
48 months of experience implementing data governance programs, including data management, data quality, and data lineage.
Bachelor's Degree in Computer Science.
Master's Degree in Computer Science or Data Science.
#J-18808-Ljbffr