Overview
Hybrid - 3 days a week
Depends on Experience
Contract - W2
Contract - 12 Month(s)
No Travel Required
Unable to Provide Sponsorship
Skills
Python Developer
ETL/data integration
Pandas
NumPy
PySpark
AWS
Azure
GCP
SQL
Job Details
Role: ETL Python Developer
Location: Charlotte, NC (Day 1 Hybrid – 3 days per week in office)
Contract - W2. H-1B transfers are accepted.
Job Description:
We are looking for a highly experienced ETL Python Developer with 8+ years of expertise in designing, developing, and maintaining robust data pipelines and ETL workflows. The ideal candidate will have a strong background in Python programming, data integration, and processing large-scale datasets to support analytics and business intelligence initiatives.
Key Responsibilities:
- Design, develop, and optimize scalable ETL processes and data pipelines using Python.
- Build robust data workflows for extracting, transforming, and loading data from various sources into data warehouses or data lakes.
- Collaborate with data analysts, data engineers, and stakeholders to understand data requirements.
- Implement data quality checks and validation mechanisms within ETL workflows.
- Automate data ingestion, transformation, and reporting processes for efficiency and accuracy.
- Monitor, troubleshoot, and optimize existing ETL processes for performance and reliability.
- Maintain documentation of ETL processes, data flow diagrams, and technical specifications.
- Adhere to best practices for coding, version control, and deployment.
Required Skills & Experience:
- 8+ years of professional experience in Python development, with a focus on ETL/data integration.
- Strong proficiency in Python, including libraries such as Pandas, NumPy, PySpark, or similar.
- Hands-on experience with building and maintaining large-scale ETL workflows.
- Familiarity with data warehousing solutions (e.g., Amazon Redshift, Snowflake, Teradata).
- Experience working with databases (SQL and NoSQL), including complex querying and data modeling.
- Knowledge of cloud platforms (AWS, Azure, Google Cloud Platform) and cloud data services is a plus.
- Experience with workflow orchestration tools such as Apache Airflow, Luigi, or similar.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.
Preferred Qualifications:
- Experience with big data technologies (Spark, Hadoop) is advantageous.
- Familiarity with DevOps practices for deploying and scheduling ETL jobs.
- Knowledge of data governance and security best practices.