Intento Analytics
Overview
Client is looking for 8 years of experience. Title: Python Backend Developer. Location: Snoqualmie, WA (Onsite). Local candidates only. When sharing the resume, please mention the candidate's location and work authorization.

Responsibilities
- Design and implement data ingestion pipelines to pull data from Splunk, InfluxDB, and OpenSearch.
- Normalize, transform, and insert collected data into backend systems such as PostgreSQL, MySQL, MongoDB, DynamoDB, or TimescaleDB (optional based on use case).
- Build RESTful APIs to expose processed data to the frontend for dashboards, alerts/health indicators, and metrics visualizations.
- Implement data retention and archival logic as needed for compliance or performance.
- Work with DevOps to integrate pipelines into CI/CD and containerized environments (Docker/Kubernetes).
- Implement basic observability (logs, metrics, alerts) for the APIs and pipelines.
- Collaborate with frontend developers and business analysts to shape data contracts and endpoint requirements.

Required Skills & Experience
- 7 years of backend development experience with Python, Node.js, or Go.
- Hands-on experience with API development frameworks (e.g., FastAPI, Flask, Express, or Gin).
- Experience integrating with Splunk, InfluxDB, and/or OpenSearch.
- Strong grasp of query languages such as SPL (Splunk), Flux or InfluxQL (InfluxDB), and Elasticsearch DSL (OpenSearch).
- Proficiency in SQL and data modeling.
- Experience with JSON, REST, OAuth, JWT, and API security best practices.
- Experience building services that process high-velocity telemetry or monitoring data.
- Solid understanding of asynchronous processing (Celery, Kafka, etc.).

Nice to have
- Flux
- Asynchronous processing
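To give candidates a concrete sense of the normalize-and-insert responsibility above, here is a minimal Python sketch. The payload shapes and field names (`measurement`, `metric_name`, `tag_*`, etc.) are illustrative assumptions, not the client's actual schemas:

```python
from datetime import datetime, timezone
from typing import Any


def normalize_influx_point(point: dict[str, Any]) -> dict[str, Any]:
    """Map an InfluxDB-style point (illustrative shape) to a common row."""
    return {
        "ts": datetime.fromtimestamp(point["time"] / 1e9, tz=timezone.utc),
        "source": "influxdb",
        "metric": point["measurement"],
        "value": float(point["value"]),
        "tags": dict(point.get("tags", {})),
    }


def normalize_opensearch_hit(hit: dict[str, Any]) -> dict[str, Any]:
    """Map an OpenSearch-style hit (illustrative shape) to the same row."""
    src = hit["_source"]
    return {
        "ts": datetime.fromisoformat(src["@timestamp"]),
        "source": "opensearch",
        "metric": src["metric_name"],
        "value": float(src["metric_value"]),
        # Illustrative convention: tag fields are prefixed "tag_".
        "tags": {k: v for k, v in src.items() if k.startswith("tag_")},
    }
```

Because rows from every source share one schema, a single parameterized INSERT (PostgreSQL, TimescaleDB, etc.) can handle all of them.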
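The asynchronous-processing requirement typically means fanning queries out to several telemetry backends concurrently rather than serially. A minimal stdlib-only sketch using asyncio; `fetch_metrics` is a hypothetical stand-in for real Splunk/InfluxDB/OpenSearch client calls (in production this role would more likely use Celery workers or Kafka consumers):

```python
import asyncio


async def fetch_metrics(source: str, delay: float) -> dict:
    """Stand-in for an async client call to one telemetry backend."""
    await asyncio.sleep(delay)  # simulates network latency
    return {"source": source, "points": 100}


async def collect_all() -> list[dict]:
    """Query all backends concurrently; wall time is roughly the slowest call."""
    tasks = [
        fetch_metrics("splunk", 0.1),
        fetch_metrics("influxdb", 0.1),
        fetch_metrics("opensearch", 0.1),
    ]
    # gather() runs the coroutines concurrently and preserves input order.
    return await asyncio.gather(*tasks)


if __name__ == "__main__":
    print(asyncio.run(collect_all()))
```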