ETL Developer

Overview

Remote
$80,000 - $100,000
Full Time
No Travel Required

Skills

Data Flow
Collaboration
Conflict Resolution
Continuous Delivery
Continuous Integration
Data Governance
Apache Kafka
Apache Spark
Business Intelligence
Cloud Computing

Job Details

Job Title: ETL Developer

Experience: 5-6 Years
Location: Remote
Employment Type: Full-time

Visa: OPT


Job Summary:

We are looking for a skilled ETL Developer with 5-6 years of hands-on experience in designing, developing, and maintaining robust ETL solutions. The ideal candidate should have strong expertise in data integration, transformation, and loading processes, working with large-scale data platforms, cloud technologies, and modern ETL tools to support analytics and business intelligence initiatives.


Key Responsibilities:

  • Design, develop, and maintain scalable ETL pipelines and workflows.

  • Extract, transform, and load data from various structured and unstructured data sources.

  • Collaborate with Data Architects, Analysts, and BI teams to understand business requirements.

  • Optimize ETL processes for performance, scalability, and reliability.

  • Implement data quality checks, validation rules, and error handling.

  • Work with cloud platforms (AWS, Azure, Google Cloud Platform) for data integration.

  • Schedule and monitor ETL jobs, ensuring timely and accurate data delivery.

  • Troubleshoot and resolve ETL job failures and performance bottlenecks.

  • Document data flow, ETL design, and technical specifications.


Required Skills & Experience:

  • 5-6 years of ETL development experience in enterprise environments.

  • Strong expertise in ETL tools (e.g., Informatica, Talend, DataStage, SSIS, ADF, or similar).

  • Proficiency in SQL for complex queries, transformations, and performance tuning.

  • Experience with data warehousing concepts, dimensional modeling, and OLAP/OLTP systems.

  • Familiarity with cloud platforms (AWS Glue, Azure Data Factory, Google Cloud Platform Dataflow/Composer).

  • Knowledge of Python / Shell scripting for automation and workflow orchestration.

  • Experience with version control (Git) and CI/CD pipelines.

  • Strong analytical, debugging, and problem-solving skills.


Good to Have:

  • Exposure to Big Data tools (Spark, Hadoop).

  • Experience with real-time/streaming data integration (Kafka, Pub/Sub).

  • Knowledge of data governance, security, and compliance best practices.

  • Experience with BI tools (Power BI, Tableau, Looker) for data consumption.


Education:

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.


About Intellect Quest LLC