Job Details
Job Title: ETL Developer (Snowflake + MDM)
Location: California (Hybrid – Onsite + Remote)
Experience Required: 12+ Years
About the Role
We are seeking an experienced ETL Developer with strong expertise in Snowflake and Master Data Management (MDM) to join our data engineering team. The ideal candidate will be responsible for designing and developing scalable ETL/ELT pipelines, integrating data from multiple systems, and ensuring the accuracy and consistency of master data across the enterprise.
Key Responsibilities
Design, develop, and maintain ETL/ELT workflows using Snowflake and industry-standard ETL tools.
Build scalable data pipelines to support analytics, reporting, and master data processes.
Work closely with business and data teams to implement MDM solutions, ensuring data quality and consistency.
Develop complex SQL scripts, stored procedures, and transformations in Snowflake.
Optimize ETL performance through query tuning, clustering and partitioning strategies, and efficient data modeling.
Integrate data from multiple sources including APIs, databases, and third-party systems.
Implement best practices for data governance, data quality, and metadata management.
Collaborate with Data Architects, Data Engineers, and QA to ensure end-to-end data reliability.
Required Skills & Qualifications
12+ years of experience in ETL development.
Strong hands-on experience with Snowflake, including schema design, virtual warehouses, and performance tuning.
Solid understanding of MDM concepts, including golden records, survivorship rules, hierarchies, and match/merge logic.
Proficiency in SQL, Python, or shell scripting.
Experience with ETL tools such as Informatica, Talend, Matillion, DataStage, or Azure Data Factory (ADF).
Experience with cloud platforms: AWS, Azure, or Google Cloud Platform.
Strong understanding of data warehousing, data modeling, and data integration.
Hands-on experience with CI/CD, Git, and Agile environments.
Preferred Qualifications
Experience with Informatica MDM, Reltio, or SAP MDM.
Familiarity with Snowflake-native features and partner tools (Snowpipe, Streams & Tasks, dbt).
Knowledge of Kafka, Airflow, or cloud orchestration services.
Background in finance, healthcare, retail, or enterprise-level environments.
Education
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.