Job Description:
We are seeking a skilled ETL/ELT Developer to join our data engineering team. This role focuses on building, optimizing, and maintaining scalable data pipelines across cloud and hybrid environments using tools such as Snowflake, Azure Data Factory, and Fivetran.
Key Responsibilities:
Design, develop, and optimize data transformation workflows to cleanse, enrich, and aggregate data per business needs.
Load processed data into cloud data platforms like Snowflake, Azure Synapse, or BigQuery.
Ensure pipeline performance and resource optimization using cloud-native services.
Automate workflows using orchestration and automation tooling (e.g., Azure Data Factory, Snowpipe, Snowflake Streams & Tasks).
Integrate data from diverse sources such as APIs, on-prem systems, and cloud databases.
Collaborate with business analysts, developers, and data teams to gather and deliver data requirements.
Maintain documentation, including process guides, runbooks, and support materials.
Participate in change management and deployment processes.
Requirements:
Bachelor's degree in Information Systems/Technology or equivalent experience.
5-7 years of experience in ETL/ELT development in enterprise environments.
Strong working knowledge of Snowflake and cloud-based ETL tools (e.g., Fivetran, Azure Data Factory, Snowpipe).
Proficient in SQL; experience with Python and/or R is a plus.
Solid troubleshooting, communication, and multitasking skills.
Ability to work collaboratively in a fast-paced, dynamic environment.