PeopleServe
Job Description:
Work in a variety of settings to build systems that collect, manage, and convert raw data into usable information for data scientists and business analysts to interpret. The ultimate goal is to make data accessible so that organizations can use it to evaluate and optimize their performance.
Responsibilities:
- Utilize approved tools, adopt key performance indicators (KPIs), increase technology component reuse, and consolidate platforms, environments, and products with the goal of reducing overall IT costs
- Convert and deploy all Data Platform Services into code-based artifacts using ARM templates or Bicep, with security and compliance built in
- Apply proven solution design skills, scripting and automating solutions end-to-end (infrastructure and application layers)
- Build ETL pipelines in Azure Data Factory and Databricks
- Provide support and advice to operational infrastructure teams to enable efficient operation and problem resolution of the team's infrastructure
- Support the planning and implementation of data design services, providing sizing and configuration assistance and performing needs assessments
- Monitor progress, manage risk, and keep key stakeholders informed about progress and expected outcomes
- Apply working experience with Visual Studio, PowerShell scripting, and ARM templates
- Conduct data profiling, cataloging, and mapping for the technical design and construction of data flows
- Apply security best practices (containers, Linux hardening) and provision resources according to security standards
- Demonstrate a good understanding of Azure networking, mainly VNets, subnets, service endpoints, private endpoints, NSGs, and Azure Firewall
- Help project teams ensure that all Azure resources communicate using managed identities, service principals, and RBAC roles
- Understand connectivity to all data sources as part of data ingestion from multiple sources
Qualifications:
- Proven track record with at least 4 years of experience in DevOps data platform development
- Strong understanding of DevOps concepts (Azure DevOps framework and tools preferred)
- Solid scripting skills in languages such as Python, Bash, JavaScript, or similar
- Solid understanding of monitoring/observability concepts and tooling
- Extensive experience with and strong understanding of cloud and infrastructure components
- Strong problem-solving and analytical skills, with the ability to troubleshoot complex DevOps platform issues and provide effective solutions
- Knowledge of developer tooling across the software development life cycle (task management, source code, building, deployment, operations, real-time communication)
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders
- 4+ years of professional infrastructure and/or software development experience
- 3+ years of experience with AWS, GCP, Azure, or another cloud service (Azure preferred)