Backend Developer

  • Washington, DC
  • Posted 21 hours ago | Updated 5 hours ago

Overview

On Site
Depends on Experience
Contract - Independent
Contract - W2
Contract - 12 Month(s)

Skills

Analytics
Apache Airflow
Apache Kafka
Backend Development
Continuous Integration
Cross-functional Team
Cloud Computing
Data Modeling
Docker
Extract, Transform, Load (ETL)
Machine Learning (ML)
Google Cloud
Microservices
Operational Excellence
Google Cloud Platform
PyTorch
Pandas
Kubernetes
Relational Databases
JavaScript
NoSQL
Scheduling
Streaming
TensorFlow
Mentorship
Cloud Storage
Collaboration
React.js
RESTful
Python
Statistics
Continuous Delivery
Data Analysis
scikit-learn
GitHub
Data Flow
Orchestration
NumPy
SQL
Terraform
Workflow

Job Details

Position: Backend Developer

Location: Washington, DC (onsite)

Job Type: Long-term contract (12 months)

Job Description: We're building a next-generation data analytics platform on Google Cloud Platform to power in-app workflows and analytics for our users. Our stack includes Python microservices, Airflow for pipeline orchestration, and a React/Next.js frontend. You'll join a small, cross-functional team responsible for end-to-end service development, deployment, and operational excellence.
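To give candidates a feel for the pipeline-orchestration work described above, here is a toy sketch in plain Python (not Airflow itself, and the task names are hypothetical examples, not this platform's actual pipeline) showing the core idea behind an Airflow DAG: tasks run in dependency order.

```python
from graphlib import TopologicalSorter

# Toy stand-in for an Airflow DAG. Each key is a task; the set holds its
# upstream dependencies. Task names are illustrative only.
dag = {
    "extract_pubsub": set(),                  # pull events (e.g., from Pub/Sub)
    "transform": {"extract_pubsub"},          # clean/reshape the data
    "load_bigquery": {"transform"},           # write results to the warehouse
    "refresh_dashboard": {"load_bigquery"},   # downstream analytics step
}

def run(dag):
    """Execute tasks in topological (dependency) order and return that order."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")
    return order

if __name__ == "__main__":
    run(dag)
```

In real Airflow, the same dependency structure is declared with operators and the `>>` operator inside a `DAG` object, and the scheduler (rather than a loop like this) decides when each task runs.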

Top Skills:

  • Python (core backend development).
  • Google Cloud Platform (Pub/Sub, BigQuery, Cloud Functions).
  • Data pipeline development (Airflow or equivalent).
  • Infrastructure as Code (Terraform).
  • Containerization (Docker).
  • GitHub / CI/CD.

Required Qualifications:

  • 4+ years of professional Python development experience.
  • Hands-on experience with Apache Airflow (authoring DAGs, operators, scheduling) is a strong plus.
  • Strong working knowledge of Google Cloud Platform services (Compute Engine, Cloud Functions, BigQuery, Pub/Sub, IAM).
  • Experience containerizing applications (Docker) and deploying with CI/CD (GitHub Actions, Cloud Build, etc.).
  • Solid understanding of SQL and relational databases; bonus for NoSQL (Firestore/Datastore).
  • Familiarity with RESTful API design.
  • Commitment to code quality: automated tests, linting, type checking.

Nice-to-Haves:

  • Experience with Terraform or other IaC tools.
  • Knowledge of Kubernetes and serverless architectures.
  • Background in event-driven or streaming data systems (Dataflow, Kafka).
  • Exposure to security best practices in cloud environments.
  • Experience performing statistical analysis and data modeling (e.g., using NumPy, pandas, SciPy).
  • Familiarity with machine learning frameworks and workflows (e.g., scikit-learn, TensorFlow, PyTorch).

What You'll Do:

  • Design, implement, and maintain backend services and APIs in Python.
  • Build and optimize data pipelines using Apache Airflow.
  • Collaborate with product and frontend teams to define clear service contracts.
  • Develop infrastructure-as-code for Google Cloud Platform resources (Pub/Sub, Cloud Functions, BigQuery, Cloud Storage).
  • Ensure reliability: write tests, set up monitoring/alerting, troubleshoot production issues.
  • Participate in code reviews, mentor junior engineers, and help evolve our best practices.

Culture & Perks:

  • Mission-driven, progressive, and entrepreneurial environment.
  • Collaborative, feedback-friendly culture.
  • 33 total PTO days (includes holidays), plus 2 additional days per year of tenure.
  • 3 weeks/year WFH flexibility.
  • Office closed between Christmas and New Year.
  • Flexible hours: core hours are 10 a.m. to 4 p.m.; must work 8 hours each day.