DaVita
Your Role
The Data Services and Solutions team is responsible for designing, developing, testing, deploying, and operating large enterprise data warehouse/BI data solutions using both on-premises and cloud technologies. The Full Stack Data Engineer, Senior, will report to the Manager of Data Services and Solutions.

In this role, you will be deeply involved in designing, developing, and deploying secure, high-quality software solutions. Your focus will be on integrating security and automation throughout the software development lifecycle (SDLC), emphasizing writing clean, maintainable code and building infrastructure to support CI/CD pipelines, automated testing, and cloud-native delivery. You will implement and enforce DevSecOps best practices tailored for Azure, contribute to infrastructure as code, and collaborate closely with developers, testers, and cloud engineers to ensure code is secure, scalable, and production-ready from day one. This role requires a hands-on engineer who thrives in a collaborative environment and is passionate about code quality, automation, and secure cloud development.

Our leadership model emphasizes developing great leaders at all levels and creating opportunities for personal, professional, and financial growth. We seek leaders energized by creative and critical thinking, building high-performing teams, achieving results ethically, and fostering continuous learning.
Your Knowledge and Experience
Requires a bachelor's degree in Computer Science, Information Technology, Management Information Systems, or a related field (or equivalent experience), with a minimum of 5 years of relevant experience in enterprise application support and cloud-based solution delivery.
Experience with cloud platforms, preferably Azure (or AWS or GCP), and related technical stacks including ADLS, Synapse, Azure Data Factory, etc.
Experience with Snowflake and/or Databricks.
Solid experience with JavaScript and CSS, including responsive design practices.
Strong understanding of data modeling (Data Vault 2.0), data mining, master data management, data integration, data architecture, data virtualization, data warehousing, and data quality techniques.
Hands-on experience with data management technologies such as Informatica PowerCenter/IICS, Collibra, Reltio MDM, dbt Cloud, dbt Core, Denodo, and Oracle GoldenGate or Striim replication.
Working knowledge of testing tools and scheduling software (e.g., Tidal, Control-M).
Basic experience with data governance, data security, and working with information stewards, privacy, and security officers to ensure data pipelines meet quality, governance, and security standards and certifications.
Proficiency in Unix command-line operations, shell scripting, Python, and utilities like awk, sed, etc.
Hands-on experience with CI/CD pipelines (e.g., Bitbucket Pipelines, GitHub Actions) and Infrastructure as Code tools like Ansible for automating cloud deployments.
Excellent ability to influence and collaborate with stakeholders, vendors, and cross-functional teams, with strong verbal and written communication skills to translate requirements into technical deliverables and execute on them.
Preferred experience in the healthcare industry.
Strong process orientation with the ability to follow and improve procedures for critical maintenance and operational tasks.