ArborTekSystem
Job Title: Splunk Developer
Location: Charlotte, NC (Hybrid)
Job Type: Long-Term Contract
Experience Level: 12+ years

We are looking for an Enterprise Observability Engineer with 12+ years of experience who will leverage insightful data to inform our systems and solutions. We are seeking an experienced, pipeline-centric data engineer to put that data to good use in building out our ETL and Data Operations framework (data preparation/normalization and ontological processes); a minimal sketch of such a normalization step follows the skills list below.

Technical Skills:
- Five or more years of experience with Python, SQL, and data visualization/exploration tools
- Full-stack observability lead experience with Splunk (preferred) or Datadog, including infrastructure monitoring, application onboarding, and APM
- Proficiency in observability tools for logging, metrics, and tracing, such as the ELK Stack, Splunk, Prometheus, Grafana, and distributed tracing systems
- Familiarity with creating out-of-the-box (OOB) dashboards and templates
- Experience integrating ITSI to correlate event data for analytics
- Strong understanding of distributed systems and the complexities of modern architectures, including microservices, cloud-native environments, and hybrid infrastructure
- Familiarity with the AWS ecosystem, specifically Redshift and RDS
- Experience with cloud platforms such as AWS, Azure, and GCP
- Data analysis and visualization skills: able to analyze telemetry data to identify trends and patterns and to create visualizations that communicate insights
- Scripting and automation: able to automate tasks and write scripts to manage observability infrastructure
- Experience building or maintaining ETL processes
- Communication skills, especially for explaining technical concepts to nontechnical business leaders
- Ability to work on a dynamic, research-oriented team with concurrent projects
- Experience in the insurance domain
- Professional certification
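
For context, the sketch below illustrates the kind of data preparation/normalization step described in the ETL and Data Operations responsibilities above. It is a minimal example only; the field names (host, ts, duration, service) and the record shape are illustrative assumptions, not taken from any existing pipeline or product.

```python
from datetime import datetime, timezone

# Illustrative only: normalize raw event records (assumed field names)
# into a consistent shape before loading them downstream.
def normalize_event(raw: dict) -> dict:
    """Return an event with a lowercase host, a UTC ISO-8601 timestamp,
    and a numeric duration in milliseconds."""
    return {
        "host": str(raw.get("host", "unknown")).strip().lower(),
        "timestamp": datetime.fromtimestamp(
            float(raw["ts"]), tz=timezone.utc
        ).isoformat(),
        "duration_ms": float(raw.get("duration", 0)) * 1000.0,
        "service": raw.get("service", "unspecified"),
    }

if __name__ == "__main__":
    # Hypothetical raw events as they might arrive from a log source.
    raw_events = [
        {"host": " Web-01 ", "ts": 1700000000, "duration": 0.25, "service": "checkout"},
        {"host": "db-02", "ts": 1700000060, "duration": 1.4},
    ]
    for event in map(normalize_event, raw_events):
        print(event)
```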