Overview
Hybrid
$50 - $60
Contract - W2
Contract - 1 Year(s)
Skills
SQL
Python
Job Details
Key Responsibilities:
- Design, build, and maintain scalable and efficient ETL/ELT data pipelines
- Develop complex SQL queries to extract, transform, and analyze large datasets
- Write clean, efficient, and well-documented Python scripts for data processing
- Collaborate with data architects and stakeholders to ensure data quality, consistency, and governance
- Optimize performance of data workflows and queries for faster results and lower costs
- Integrate data from multiple sources including APIs, flat files, cloud storage, and databases
- Monitor and troubleshoot data flows and pipeline issues in production
- Implement data validation, error handling, and logging mechanisms
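The pipeline duties above (extract from a source, validate and transform, load, with error handling and logging) can be sketched as a minimal Python script. This is an illustrative assumption, not the employer's actual stack: the table names (`orders`, `orders_clean`), the 10% adjustment, and the use of an in-memory SQLite database as a stand-in for a real warehouse are all hypothetical.

```python
import logging
import sqlite3

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def extract(conn):
    """Extract raw rows from a hypothetical 'orders' source table."""
    return conn.execute("SELECT id, amount FROM orders").fetchall()

def transform(rows):
    """Validate and transform: drop rows with missing or non-positive
    amounts, logging each rejection, and apply a sample 10% uplift."""
    clean = []
    for row_id, amount in rows:
        if amount is None or amount <= 0:
            log.warning("Dropping invalid row id=%s amount=%s", row_id, amount)
            continue
        clean.append((row_id, round(amount * 1.1, 2)))
    return clean

def load(conn, rows):
    """Load transformed rows into a hypothetical target table."""
    conn.executemany("INSERT INTO orders_clean VALUES (?, ?)", rows)
    conn.commit()

# Demo run against an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE orders_clean (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 100.0), (2, -5.0), (3, 20.0)])
load(conn, transform(extract(conn)))
```

In a production pipeline the same extract/transform/load boundaries would typically be wired into an orchestrator such as Airflow, with each function becoming a task.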
Required Qualifications:
- 3-6 years of experience in data engineering or related roles
- Proficient in SQL for complex joins, window functions, and performance tuning
- Strong programming skills in Python, especially for data transformation and automation
- Experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server) and cloud platforms (AWS, Azure, or Google Cloud Platform)
- Familiarity with data pipeline tools such as Airflow, Apache NiFi, or dbt
- Strong understanding of data modeling, ETL concepts, and data warehousing
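As a concrete illustration of the SQL proficiency described above, the sketch below runs a window-function query from Python against an in-memory SQLite database (which supports window functions in versions 3.25+). The `sales` table and its data are hypothetical examples, not part of the role's actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 50.0), ("east", 80.0), ("west", 30.0)])

# Rank each sale within its region by amount, largest first --
# a typical PARTITION BY ... ORDER BY window-function pattern.
query = """
SELECT region,
       amount,
       ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
FROM sales
"""
rows = conn.execute(query).fetchall()
for region, amount, rnk in rows:
    print(region, amount, rnk)
```

The same `PARTITION BY` pattern carries over directly to PostgreSQL, MySQL 8+, and SQL Server.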
Preferred Qualifications:
- Experience with cloud data warehouses like Snowflake, Redshift, or BigQuery
- Exposure to containerization tools like Docker or orchestration with Kubernetes
- Knowledge of CI/CD tools for data engineering pipelines
- Prior experience working in hybrid or agile environments
Employers have access to artificial intelligence language tools ("AI") that help generate and enhance job descriptions, and AI may have been used to create this description. The position description has been reviewed for accuracy, and Dice believes it correctly reflects the job opportunity.