Overview
Skills
Job Details
Job Description: Backend Developer
Washington, DC
Client: Grassroots Analytics
6+ months Contract-to-Hire (C2H)
Must have a solid LinkedIn profile with a photo, an established (not recently created) account, and a strong connection network
Local candidates only, or candidates within commuting distance
ONSITE INTERVIEW
About The Role
We're building a next-generation data analytics platform on Google Cloud Platform to power in-app workflows and analytics for our users. Our stack includes Python microservices, Airflow for pipeline orchestration, and a React/Next.js frontend. You'll join a small, cross-functional team responsible for end-to-end service development, deployment, and operational excellence.
Top Skills:
- Culture First: Mission-driven, progressive, and entrepreneurial environment
- Python (core backend development)
- Google Cloud Platform (Pub/Sub, BigQuery, Cloud Functions)
- Data pipeline development (Airflow or equivalent)
- Infrastructure as Code (Terraform)
- Containerization (Docker)
- GitHub / CI/CD
Job Description
What You'll Do
- Design, implement, and maintain backend services and APIs in Python
- Build and optimize data pipelines using Apache Airflow
- Collaborate with product and frontend teams to define clear service contracts
- Develop infrastructure-as-code for Google Cloud Platform resources (Pub/Sub, Cloud Functions, BigQuery, Cloud Storage)
- Ensure reliability: write tests, set up monitoring/alerting, troubleshoot production issues
- Participate in code reviews, mentor junior engineers, and help evolve our best practices
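To give a flavor of the service-contract work listed above, an API contract between backend and frontend can be pinned down with typed request/response models. This is a minimal, purely illustrative sketch; the `CreateReportRequest` endpoint, its fields, and the handler are hypothetical, not this team's actual API:

```python
# Hypothetical service contract for one backend endpoint,
# expressed as typed request/response models (illustrative only).
import json
from dataclasses import dataclass, asdict


@dataclass(frozen=True)
class CreateReportRequest:
    dataset_id: str
    metric: str

    @classmethod
    def from_json(cls, payload: str) -> "CreateReportRequest":
        data = json.loads(payload)
        # Validate the contract explicitly so frontend and backend
        # agree on what a malformed request looks like.
        missing = {"dataset_id", "metric"} - data.keys()
        if missing:
            raise ValueError(f"missing fields: {sorted(missing)}")
        return cls(dataset_id=str(data["dataset_id"]), metric=str(data["metric"]))


@dataclass(frozen=True)
class CreateReportResponse:
    report_id: str
    status: str = "queued"

    def to_json(self) -> str:
        return json.dumps(asdict(self))


def create_report(req: CreateReportRequest) -> CreateReportResponse:
    # Stub handler: a real service would enqueue work here
    # (for example, by publishing to a Pub/Sub topic).
    return CreateReportResponse(report_id=f"{req.dataset_id}:{req.metric}")


resp = create_report(CreateReportRequest.from_json('{"dataset_id": "d1", "metric": "dau"}'))
print(resp.to_json())
```

Defining the contract as frozen dataclasses keeps request validation in one place and makes the shape of each message easy to review with the frontend team.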
What We're Looking For
- 6-7+ years of professional Python development experience
- Hands-on experience with Apache Airflow (authoring DAGs, operators, scheduling), a strong nice-to-have
- Strong working knowledge of Google Cloud Platform services (Compute Engine, Cloud Functions, BigQuery, Pub/Sub, IAM)
- Experience containerizing applications (Docker) and deploying with CI/CD (GitHub Actions, Cloud Build, etc.)
- Solid understanding of SQL and relational databases; bonus for NoSQL (Firestore/Datastore)
- Familiarity with RESTful API design
- Commitment to code quality: automated tests, linting, type checking
- Experience with Terraform or other IaC tools
- Knowledge of Kubernetes and serverless architectures
- Background in event-driven or streaming data systems (Dataflow, Kafka)
- Exposure to security best practices in cloud environments
- Experience performing statistical analysis and data modeling (e.g., using NumPy, pandas, SciPy)
- Familiarity with machine learning frameworks and workflows (e.g., scikit-learn, TensorFlow, PyTorch)
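For a concrete sense of the statistical-analysis bullet above, here is a minimal NumPy sketch of the kind of summary-plus-trend work implied. The daily-active-user numbers are fabricated for illustration:

```python
# Minimal statistical-analysis sketch with NumPy (hypothetical data).
import numpy as np

# Fabricated daily active user counts, one value per day.
dau = np.array([100.0, 104.0, 98.0, 110.0, 107.0])

mean = dau.mean()
std = dau.std(ddof=1)  # sample standard deviation

# Simple linear trend: least-squares fit of counts against day index.
days = np.arange(len(dau))
slope, intercept = np.polyfit(days, dau, 1)

print(f"mean={mean:.1f} std={std:.2f} slope={slope:.2f}")
```

In practice this kind of analysis would run inside a pipeline step (an Airflow task reading from BigQuery, say) rather than on a hard-coded array.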