Hi,
This is Ravi from eCom Solutions Inc. I was reviewing your resume online and would like to talk to you about an exciting opportunity we have with one of eCom's premier clients. Your experience looks like a good fit for the position, and I wanted to know if you would be interested in exploring this opportunity.
Job Title: Talend / Snowflake Data Engineer
Location: Atlanta, GA / Charlotte or Raleigh, NC (Note: remote only for the initial 2-3 weeks)
Duration: 12 Months
Job Summary
We are seeking a highly skilled Talend / Snowflake Data Engineer to design, develop, and maintain scalable data integration and data warehouse solutions. The ideal candidate will have strong experience in Talend ETL development, Snowflake data warehousing, and cloud-based data pipelines. This role requires expertise in building efficient data pipelines, optimizing performance, and supporting analytics and business intelligence initiatives.
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines using Talend to ingest and transform data from multiple sources.
- Develop and optimize data models and data pipelines in Snowflake.
- Implement data integration solutions for batch and real-time data processing.
- Build scalable data ingestion frameworks from APIs, databases, and flat files.
- Optimize Snowflake performance, including query tuning, clustering, and warehouse optimization.
- Implement data quality, validation, and monitoring mechanisms.
- Work closely with data analysts, BI developers, and business stakeholders to understand data requirements.
- Maintain and improve data governance, security, and compliance standards.
- Troubleshoot and resolve data pipeline failures and performance issues.
- Document data architecture, workflows, and ETL processes.
Required Skills & Experience
- 5+ years of experience in Data Engineering or ETL development.
- Strong hands-on experience with Talend (Talend Data Integration / Talend Cloud).
- Strong expertise in Snowflake Data Warehouse.
- Experience with SQL and advanced query optimization.
- Experience designing data pipelines and ETL/ELT workflows.
- Hands-on experience with data modeling (star schema, snowflake schema).
- Experience working with large datasets and high-volume data processing.
- Knowledge of cloud platforms (AWS / Azure).
- Experience with Git or version control systems.
Preferred Qualifications
- Experience with Python or Shell scripting for automation.
- Knowledge of ESP or other workflow orchestration tools.
- Experience with CI/CD pipelines for data engineering.
- Familiarity with data lake architectures and modern data stack.
- Understanding of data governance and security best practices.
Nice to Have
- Experience with streaming technologies (Kafka, Spark).
- Experience with BI tools such as Tableau or Power BI.
- Snowflake or Talend certifications.