Job Details
We have a contract role for a Data Engineer (Remote) with our client in New Delhi. Please let me know if you or any of your friends would be interested in this position.
Position Details:
Data Engineer (Remote) - New Delhi
Location: Remote
Project Duration: 6-month contract
Job Description:
We are seeking a skilled Data Engineer who is knowledgeable about and loves working with modern data integration frameworks, big data, and cloud technologies. Candidates must also be proficient in data programming languages (e.g., Python and SQL). The Yum! data engineer will build a variety of data pipelines and models to support advanced AI/ML analytics projects, with the intent of elevating the customer experience and driving revenue and profit growth in our restaurants globally. The candidate will work in our office in Gurgaon, India.
Key Responsibilities
As a data engineer, you will:
Partner with KFC, Pizza Hut, Taco Bell, and Habit Burger to build data pipelines that enable best-in-class restaurant technology solutions.
Play a key role in our Data Operations team developing data solutions responsible for driving Yum! growth.
Design and develop data pipelines, both streaming and batch, to move data from point-of-sale, back-of-house, operational platforms, and more to our Global Data Hub.
Contribute to standardizing and developing a framework that extends these pipelines across brands and markets.
Develop on the Yum! data platform by building applications using a mix of open-source frameworks (PySpark, Kubernetes, Airflow, etc.) and best-of-breed SaaS tools (Informatica Cloud, Snowflake, Domo, etc.).
Implement and manage production support processes around data lifecycle, data quality, coding utilities, storage, reporting, and other data integration points.
Skills and Qualifications:
Broad background in all things data-related
AWS platform development experience (EKS, S3, API Gateway, Lambda, etc.)
Experience with modern ETL tools such as Informatica, Matillion, or dbt; Informatica CDI is a plus
High level of proficiency with SQL (Snowflake a big plus)
Proficiency with Python for transforming data and automating tasks
Experience with Kafka, Pulsar, or other streaming technologies
Experience orchestrating complex task flows across a variety of technologies
Bachelor's degree from an accredited institution or relevant experience