Backend Developer

  • Washington, DC
  • Posted 4 days ago | Updated 2 days ago

Overview

On Site
Depends on Experience
Contract - W2
Contract - Independent
Contract - 12 Month(s)

Skills

Analytics
Apache Airflow
Apache Kafka
Backend Development
Cloud Computing
Cloud Storage
Collaboration
Continuous Delivery
Continuous Integration
Cross-functional Team
Data Analysis
Data Flow
Google Cloud
Google Cloud Platform
JavaScript
Kubernetes
Data Modeling
Machine Learning (ML)
Docker
NoSQL
Extract, Transform, Load (ETL)
GitHub
PyTorch
TensorFlow
RESTful
Mentorship
Microservices
NumPy
Operational Excellence
Pandas
Python
Orchestration
Terraform
React.js
Relational Databases
SQL
Scheduling
Statistics
Streaming
Workflow
scikit-learn

Job Details

Position: Backend Developer

Location: Washington, DC (5 days/week onsite)

Job Type: 6+ month contract-to-hire (FTE)

About The Role: We're building a next-generation data analytics platform on Google Cloud Platform to power in-app workflows and analytics for our users. Our stack includes Python microservices, Airflow for pipeline orchestration, and a React/Next.js frontend. You'll join a small, cross-functional team responsible for end-to-end service development, deployment, and operational excellence.

Top Skills:

  • Culture First: Mission-driven, progressive, and entrepreneurial environment.
  • Python (core backend development).
  • Google Cloud Platform (Pub/Sub, BigQuery, Cloud Functions).
  • Data pipeline development (Airflow or equivalent).
  • Infrastructure as Code (Terraform).
  • Containerization (Docker).
  • GitHub / CI/CD.

Required Skills & Qualifications:

  • 4+ years of professional Python development experience.
  • Hands-on experience with Apache Airflow (authoring DAGs, operators, scheduling); strongly preferred.
  • Strong working knowledge of Google Cloud Platform services (Compute Engine, Cloud Functions, BigQuery, Pub/Sub, IAM).
  • Experience containerizing applications (Docker) and deploying with CI/CD (GitHub Actions, Cloud Build, etc.).
  • Solid understanding of SQL and relational databases; bonus for NoSQL (Firestore/Datastore).
  • Familiarity with RESTful API design.
  • Commitment to code quality: automated tests, linting, type checking.
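To give a concrete sense of the SQL and relational-database work this role involves, here is a minimal stdlib-only sketch using an in-memory SQLite database; the table and column names are hypothetical, not from the actual platform:

```python
import sqlite3

# Hypothetical schema: users and the events they generate.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE events (id INTEGER PRIMARY KEY, "
    "user_id INTEGER REFERENCES users(id), kind TEXT)"
)
conn.executemany("INSERT INTO users (id, name) VALUES (?, ?)", [(1, "ada"), (2, "lin")])
conn.executemany(
    "INSERT INTO events (user_id, kind) VALUES (?, ?)",
    [(1, "login"), (1, "export"), (2, "login")],
)

# Count events per user with a JOIN + GROUP BY.
rows = conn.execute(
    """
    SELECT u.name, COUNT(e.id) AS n_events
    FROM users u JOIN events e ON e.user_id = u.id
    GROUP BY u.name ORDER BY u.name
    """
).fetchall()
print(rows)  # [('ada', 2), ('lin', 1)]
```

Day-to-day work would use BigQuery or Cloud SQL rather than SQLite, but the join/aggregation patterns carry over directly.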

Nice-to-Haves:

  • Experience with Terraform or other IaC tools.
  • Knowledge of Kubernetes and serverless architectures.
  • Background in event-driven or streaming data systems (Dataflow, Kafka).
  • Exposure to security best practices in cloud environments.
  • Experience performing statistical analysis and data modeling (e.g., using NumPy, pandas, SciPy).
  • Familiarity with machine learning frameworks and workflows (e.g., scikit-learn, TensorFlow, PyTorch).

Responsibilities:

  • Design, implement, and maintain backend services and APIs in Python.
  • Build and optimize data pipelines using Apache Airflow.
  • Collaborate with product and frontend teams to define clear service contracts.
  • Develop infrastructure-as-code for Google Cloud Platform resources (Pub/Sub, Cloud Functions, BigQuery, Cloud Storage).
  • Ensure reliability: write tests, set up monitoring/alerting, troubleshoot production issues.
  • Participate in code reviews, mentor junior engineers, and help evolve our best practices.
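The "define clear service contracts" responsibility above can be sketched with a typed response model; the endpoint shape and field names here are hypothetical, chosen only to illustrate the pattern:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical response contract for a pipeline-status REST endpoint.
@dataclass(frozen=True)
class PipelineStatus:
    pipeline_id: str
    state: str               # e.g. "running", "succeeded", "failed"
    records_processed: int

def to_response(status: PipelineStatus) -> str:
    """Serialize the contract as the JSON body the endpoint would return."""
    return json.dumps(asdict(status), sort_keys=True)

body = to_response(PipelineStatus("daily-ingest", "succeeded", 10_000))
print(body)
```

Freezing the dataclass and serializing through one function keeps the contract explicit, so frontend and backend teams agree on field names and types before any endpoint ships; a production service might formalize the same idea with Pydantic models or an OpenAPI schema.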