Job Details
Python Backend Engineer
Location: Snoqualmie, WA
Job Description:
We are looking for a strong Backend Engineer who can design and implement robust data pipelines that ingest logs, metrics, and telemetry data from various observability tools such as Splunk, InfluxDB, and OpenSearch. This data must be processed, normalized, and persisted in appropriate backend data stores (SQL/NoSQL) for downstream usage. The engineer will also be responsible for developing performant and secure RESTful APIs to support frontend visualizations and dashboards.
________________________________________
Key Responsibilities
Design and implement data ingestion pipelines to pull data from the following sources (see the ingestion sketch after this list):
o Splunk (via REST API or SDKs)
o InfluxDB (using Flux/InfluxQL)
o OpenSearch (via query DSL or API)
Normalize, transform, and insert collected data into backend systems such as:
o PostgreSQL / MySQL
o MongoDB / DynamoDB / TimescaleDB (optional, depending on the use case)
Build RESTful APIs (see the API sketch after this list) to expose processed data to the frontend for:
o Dashboards
o Alerts/Health indicators
o Metrics visualizations
Implement data retention and archival logic as needed for compliance or performance
Work with DevOps to integrate pipelines into CI/CD and containerized environments (Docker/K8s)
Implement basic observability (logs, metrics, alerts) for the APIs and pipelines
Collaborate closely with frontend developers and business analysts to shape data contracts and endpoint requirements
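
To make the ingestion and persistence responsibilities above concrete, here is a minimal sketch of the kind of pipeline described: it runs a Flux query against InfluxDB, normalizes each record into a flat row, and bulk-inserts the rows into PostgreSQL. The bucket name, the metrics_raw table, and the INFLUX_URL / INFLUX_TOKEN / INFLUX_ORG / POSTGRES_DSN settings are illustrative placeholders rather than part of this role's actual stack; Splunk (SPL over its REST API) and OpenSearch (query DSL) sources would follow the same fetch-normalize-persist pattern.

"""Minimal ingestion sketch: pull recent metrics from InfluxDB with a Flux
query, normalize each record, and persist the rows to PostgreSQL.
Bucket, table, and connection settings are illustrative placeholders."""
import os

import psycopg2
from influxdb_client import InfluxDBClient

FLUX_QUERY = '''
from(bucket: "telemetry")
  |> range(start: -15m)
  |> filter(fn: (r) => r._measurement == "cpu")
'''

def fetch_records():
    # Query InfluxDB 2.x with Flux; Splunk (SPL via the REST API) and
    # OpenSearch (query DSL) sources would plug in behind the same interface.
    with InfluxDBClient(url=os.environ["INFLUX_URL"],
                        token=os.environ["INFLUX_TOKEN"],
                        org=os.environ["INFLUX_ORG"]) as client:
        tables = client.query_api().query(FLUX_QUERY)
    for table in tables:
        for record in table.records:
            # Normalize every source into the same flat row shape.
            yield (record.get_time(), record.get_measurement(),
                   record.get_field(), float(record.get_value()))

def persist(rows):
    # Bulk insert into a hypothetical metrics_raw table.
    with psycopg2.connect(os.environ["POSTGRES_DSN"]) as conn:
        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO metrics_raw (ts, measurement, field, value) "
                "VALUES (%s, %s, %s, %s)",
                rows,
            )

if __name__ == "__main__":
    persist(list(fetch_records()))
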
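On the API side, a minimal FastAPI sketch of an endpoint the dashboard and health-indicator views could call. It reads from the same hypothetical metrics_raw table as the pipeline sketch above; authentication (OAuth/JWT), pagination, and connection pooling are deliberately omitted.

"""Minimal FastAPI sketch exposing normalized metrics to the frontend.
Table, column, and DSN values reuse the hypothetical names from the
ingestion sketch; auth and pagination are omitted for brevity."""
from datetime import datetime

import psycopg2
from fastapi import FastAPI, Query
from pydantic import BaseModel

app = FastAPI(title="Telemetry API")

class MetricPoint(BaseModel):
    ts: datetime
    measurement: str
    field: str
    value: float

@app.get("/metrics/{measurement}", response_model=list[MetricPoint])
def read_metrics(measurement: str, minutes: int = Query(15, ge=1, le=1440)):
    # Return the last `minutes` of points for one measurement, for use by
    # dashboard and health-indicator views.
    sql = (
        "SELECT ts, measurement, field, value FROM metrics_raw "
        "WHERE measurement = %s AND ts > now() - (%s * interval '1 minute') "
        "ORDER BY ts"
    )
    # Placeholder DSN; a real service would use pooled, configured connections.
    with psycopg2.connect("dbname=telemetry") as conn, conn.cursor() as cur:
        cur.execute(sql, (measurement, minutes))
        rows = cur.fetchall()
    return [MetricPoint(ts=r[0], measurement=r[1], field=r[2], value=r[3])
            for r in rows]
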
________________________________________
Required Skills & Experience
7+ years of backend development experience with Python, Node.js, or Go
Hands-on experience with API development frameworks (e.g., FastAPI, Flask, Express, or Gin)
Experience integrating with Splunk, InfluxDB, and/or OpenSearch
Strong grasp of query languages like:
o SPL (Splunk)
o Flux or InfluxQL (InfluxDB)
o Elasticsearch DSL (OpenSearch)
Proficiency in SQL and data modeling
Experience with JSON, REST, OAuth, JWT, and API security best practices
Experience building services that process high-velocity telemetry or monitoring data
Solid understanding of asynchronous processing (Celery, Kafka, etc.)
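
For the asynchronous-processing item above, a minimal Celery sketch showing how the periodic telemetry pull could run off the request path. The Redis broker URL and the pipeline module with its fetch_records / persist helpers are assumed placeholders, not a prescribed setup.

"""Minimal Celery sketch: run the telemetry pull as a periodic background
task instead of inline. Broker URL and pipeline module names are
illustrative placeholders."""
from celery import Celery

app = Celery("telemetry", broker="redis://localhost:6379/0")

# Poll every 5 minutes; Celery beat drives the schedule.
app.conf.beat_schedule = {
    "pull-telemetry": {"task": "tasks.pull_telemetry", "schedule": 300.0},
}

@app.task(name="tasks.pull_telemetry", bind=True, max_retries=3)
def pull_telemetry(self):
    # Reuse the (hypothetical) ingestion helpers from the pipeline sketch.
    from pipeline import fetch_records, persist
    try:
        persist(list(fetch_records()))
    except Exception as exc:
        # Back off and retry on transient source or database failures.
        raise self.retry(exc=exc, countdown=60)
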
________________________________________
Mandatory Areas
Must Have Skills
Skill 1: Python / Go (7+ years)
Skill 2: API Frameworks (5+ years)
Skill 3: Integration with Splunk (5+ years)
Skill 4: SQL (7+ years)
Skill 5: Data Ingestion (7+ years)
Good To Have Skills
Skill 1: Flux
Skill 2: Asynchronous Processing