micro1
Overview
We are seeking a highly skilled Backend Engineer to join our team and architect robust data pipelines and APIs for high-velocity telemetry and log data. In this hybrid role based in Snoqualmie, you will collaborate closely with engineers, analysts, and DevOps to build the backbone of our observability and monitoring infrastructure. If you are passionate about backend systems, API security, and solving complex data ingestion challenges, we want to hear from you!

Key Responsibilities
- Design, implement, and maintain scalable data ingestion pipelines integrating with Splunk, InfluxDB, and OpenSearch
- Transform, normalize, and store observability data in backend databases (PostgreSQL, NoSQL) for downstream applications
- Develop performant and secure RESTful APIs for dashboards, alerts, and metrics visualization
- Enforce data retention and archival policies ensuring compliance and performance
- Collaborate with DevOps to integrate data pipelines into containerized (Docker/K8s), CI/CD-enabled environments
- Implement observability for APIs and pipelines, including logging, monitoring, and alerting
- Work cross-functionally with frontend engineers and business analysts to shape API data contracts and endpoint specifications

Required Skills and Qualifications
- 7+ years of backend development experience with Python, Node.js, or Go
- Strong expertise in API development using frameworks like FastAPI, Flask, Express, or Gin
- Hands-on experience integrating with Splunk, InfluxDB, and/or OpenSearch
- Proficiency in Splunk SPL, InfluxQL/Flux, and Elasticsearch/OpenSearch DSL
- Deep knowledge of SQL, data modeling, and NoSQL data stores
- Extensive experience with RESTful API design, security (OAuth, JWT), and JSON
- Solid understanding of asynchronous data processing (Celery, Kafka, or equivalents)
- Excellent written and verbal communication skills

Preferred Qualifications
- Experience with TimescaleDB, DynamoDB, and MongoDB
- Prior work on high-throughput telemetry or observability platforms
- Familiarity with implementing data retention and archival strategies in regulated environments

Join our team and play a critical role in empowering our observability ecosystem, collaborating with passionate teammates in a hybrid work environment. We value clear communication and a drive for technical excellence. Bring your backend expertise and help us build industry-leading solutions!