Overview
On Site
$40 - $60
Contract - W2
Unable to Provide Sponsorship
Skills
ETL
Apache Airflow
Python
Job Details
We are seeking a skilled ETL Developer with strong expertise in Python and Apache Airflow to design, build, and optimize scalable data pipelines. In this role, you will develop production-ready workflows, ensure pipeline reliability, and collaborate with data teams to support business needs.
Responsibilities
- Develop, test, and deploy Python-based ETL pipelines using Apache Airflow.
- Write efficient, reusable Python scripts for transformations, validations, and data quality checks.
- Manage scheduling, orchestration, and monitoring of workflows in Airflow.
- Collaborate with data engineers and analysts to design pipelines aligned with business requirements.
- Troubleshoot, optimize, and scale existing ETL jobs.
- Implement best practices for code quality, testing, and CI/CD.
- Contribute to documentation, observability, and knowledge sharing.
Required Skills
- Strong experience with Python (pandas, SQLAlchemy, or similar libraries for ETL).
- Proficiency with Apache Airflow: DAG design, task orchestration, and operators.
- Hands-on experience with dependency/environment management (pipenv, poetry, or conda).
- Solid knowledge of SQL and relational databases.
- Understanding of data modeling and transformation patterns (star schema, SCDs, etc.).
- Familiarity with Git workflows and CI/CD pipelines.
- Ability to work independently and collaboratively in a fast-paced environment.
Nice to Have
- Cloud platform experience (AWS, Google Cloud Platform, or Azure) for pipelines and storage.
- Familiarity with containerization tools (Docker, Kubernetes).
- Exposure to data warehouses (Snowflake, BigQuery, Redshift).