Job Details
Job Description:
They are NOT looking for someone with 25 years of experience or a background in large enterprise environments.
They are more interested in people coming from startup environments.
Client: Grassroots Analytics
Location: Washington, DC (5 days on site)
Position: Backend Developer
Duration: Contract to Hire
Rate: $100/hr (if you have a candidate above the rate, please let me know)
Conversion Salary: $130K to $200K (if you have a candidate above this range, please let me know)
Interview: 1st round virtual; 2nd round onsite (half day), typically in the same week
Citizenship: /
About The Role: We're building a next-generation data analytics platform on Google Cloud Platform to power in-app workflows and analytics for our users. Our stack includes Python microservices, Airflow for pipeline orchestration, and a React/Next.js frontend. You'll join a small, cross-functional team responsible for end-to-end service development, deployment, and operational excellence.
Culture First: Mission-driven, progressive, and entrepreneurial environment
Top Skills:
Python (core backend development)
Google Cloud Platform (Pub/Sub, BigQuery, Cloud Functions)
Data pipeline development (Airflow or equivalent)
Infrastructure as Code (Terraform)
Containerization (Docker)
GitHub / CI/CD
Job Description
What You'll Do
Design, implement, and maintain backend services and APIs in Python
Build and optimize data pipelines using Apache Airflow
Collaborate with product and frontend teams to define clear service contracts
Develop infrastructure-as-code for Google Cloud Platform resources (Pub/Sub, Cloud Functions, BigQuery, Cloud Storage)
Ensure reliability: write tests, set up monitoring/alerting, troubleshoot production issues
Participate in code reviews, mentor junior engineers, and help evolve our best practices
What We're Looking For
4+ years of professional Python development experience
Hands-on experience with Apache Airflow (authoring DAGs, operators, scheduling) (a BIG nice-to-have!)
Strong working knowledge of Google Cloud Platform services (Compute Engine, Cloud Functions, BigQuery, Pub/Sub, IAM)
Experience containerizing applications (Docker) and deploying with CI/CD (GitHub Actions, Cloud Build, etc.)
Solid understanding of SQL and relational databases; bonus for NoSQL (Firestore/Datastore)
Familiarity with RESTful API design
Commitment to code quality: automated tests, linting, type checking
Nice-to-Haves
Experience with Terraform or other IaC tools
Knowledge of Kubernetes and serverless architectures
Background in event-driven or streaming data systems (Dataflow, Kafka)
Exposure to security best practices in cloud environments
Experience performing statistical analysis and data modeling (e.g., using NumPy, pandas, SciPy)
Familiarity with machine learning frameworks and workflows (e.g., scikit-learn, TensorFlow, PyTorch)
Culture & Perks:
Collaborative, feedback-friendly culture
33 total PTO days (includes holidays) + 2 additional days per year of tenure
3 weeks/year WFH flexibility
Office closed between Christmas and New Year
Flexible hours: core hours 10 AM to 4 PM; must work 8 hours each day.