Cynet Systems

Senior Data Engineer

Cynet Systems, Santa Clarita, California, United States, 91382


Job Description:

Pay Range: $58.02/hr - $63.02/hr

Responsibilities

- Provides technical guidance on data architecture, data models, and metadata management to senior IT and business leaders.
- Defines and implements data flows through and around digital products.
- Participates in data modeling and testing.
- Extracts relevant data to solve analytical problems and ensures development teams have the data they need.
- Interacts with business divisions to understand data requirements for CRM business insights and translates them into data structure and data model requirements for IT.
- Works closely with database teams on data requirements, cleanliness, accuracy, and related topics.
- Tracks the business impact of analytics and monitors market developments.
- Develops sustainable, data-driven solutions with current and next-generation data technologies to meet the needs of the organization and business customers.
- Builds data pipeline frameworks to automate high-volume and real-time data delivery for Hadoop and the streaming data hub (see the streaming sketch below).
- Builds robust systems with an eye on long-term maintenance and support of the application.
- Builds data APIs and data delivery services that support critical operational and analytical applications for internal business operations, customers, and partners.
- Transforms complex analytical models into scalable, production-ready solutions.
- Continuously integrates and ships code into on-premises and cloud production environments.
- Develops applications from the ground up using a modern technology stack such as Scala, Spark, Postgres, AngularJS, and NoSQL.
- Works directly with Product Managers and customers to deliver data products in a collaborative, agile environment.
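For context on the streaming responsibilities above, the following is a minimal illustrative sketch (not part of the job requirements) of a Spark Structured Streaming job in Python that reads events from a Kafka topic and lands them as Parquet files. The broker address, topic name, event schema, and storage paths are placeholders invented for the example.

    # Illustrative sketch only: a minimal Spark Structured Streaming job that
    # reads events from a Kafka topic and writes them out as Parquet. The broker,
    # topic, schema, and S3 paths are placeholders, not details from this posting.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StringType, StructField, StructType, TimestampType

    spark = SparkSession.builder.appName("device-events-stream").getOrCreate()

    # Hypothetical schema for the JSON messages arriving on the topic.
    event_schema = StructType([
        StructField("device_id", StringType()),
        StructField("event_type", StringType()),
        StructField("event_time", TimestampType()),
        StructField("payload", StringType()),
    ])

    raw = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
        .option("subscribe", "device-events")               # placeholder topic
        .load()
    )

    # Kafka delivers the value as bytes; parse the JSON payload into typed columns.
    events = (
        raw.selectExpr("CAST(value AS STRING) AS json")
        .select(from_json(col("json"), event_schema).alias("e"))
        .select("e.*")
    )

    # Append micro-batches to Parquet, with checkpointing so the file sink can
    # recover cleanly after restarts.
    query = (
        events.writeStream.format("parquet")
        .option("path", "s3a://example-bucket/device-events/")
        .option("checkpointLocation", "s3a://example-bucket/checkpoints/device-events/")
        .outputMode("append")
        .start()
    )

    query.awaitTermination()

The checkpoint location is what allows the streaming file sink to track progress and avoid duplicated output when the job is restarted.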

Education and Experience

- Bachelor's or Master's degree in Computer Science, Engineering, Bioinformatics, or a related field.
- 5+ years of experience in data engineering.
- Prior experience with regulated healthcare data (e.g., HIPAA, FDA 21 CFR Part 11) is preferred.

Core Technical Skills

- ETL Development: Proficiency in building robust pipelines.
- Data Modeling: Strong knowledge of relational (PostgreSQL, MySQL) and NoSQL (MongoDB, DynamoDB) data models, especially clinical or device data schemas (FHIR, HL7, OMOP).
- Cloud Platforms: Experience with HIPAA-compliant cloud services.
- Programming: Strong Python and SQL skills; Spark or Scala is a plus.
- APIs & Integration: RESTful APIs, HL7/FHIR data ingestion, and integration with EHRs and medical devices (see the FHIR sketch below).
- Experience with Python, Flink, Kafka, Spark, AWS Glue, AWS EMR, AWS S3, AWS Lambda, AWS ECS, and Terraform.
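As a rough illustration of the HL7/FHIR data ingestion skill listed above, the sketch below pages through a FHIR R4 Patient search and flattens each resource into rows suitable for a relational staging table. The server URL, headers, and column choices are assumptions made for the example, not details from this posting.

    # Illustrative sketch only: pulling Patient resources from a FHIR R4 REST
    # endpoint and flattening them into rows for a relational staging table.
    # The base URL and column choices are placeholders for the example.
    import requests

    FHIR_BASE = "https://fhir.example.com/r4"   # placeholder FHIR server
    HEADERS = {"Accept": "application/fhir+json"}


    def iter_patients(base_url: str):
        """Yield Patient resources, following FHIR Bundle 'next' links for paging."""
        url = f"{base_url}/Patient?_count=100"
        while url:
            bundle = requests.get(url, headers=HEADERS, timeout=30).json()
            for entry in bundle.get("entry", []):
                yield entry["resource"]
            # The 'next' link, when present, points at the following page of results.
            url = next(
                (link["url"] for link in bundle.get("link", []) if link["relation"] == "next"),
                None,
            )


    def to_row(patient: dict) -> dict:
        """Flatten a Patient resource into a simple row for staging."""
        name = (patient.get("name") or [{}])[0]
        return {
            "patient_id": patient.get("id"),
            "family_name": name.get("family"),
            "given_name": " ".join(name.get("given", [])),
            "birth_date": patient.get("birthDate"),
            "gender": patient.get("gender"),
        }


    if __name__ == "__main__":
        rows = [to_row(p) for p in iter_patients(FHIR_BASE)]
        print(f"Staged {len(rows)} patient rows")

A production pipeline would additionally handle authentication, retries, and incremental pulls (for example via the _lastUpdated search parameter), which are omitted here for brevity.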
