Job Details
Bachelor's degree in Computer Science, Engineering, or a related field.
Develop and maintain data pipelines, ELT processes, and workflow orchestration using Apache Airflow, Python, and PySpark to ensure efficient and reliable data delivery.
8+ years of experience in data engineering, ELT development, and data modeling.
Proficiency in using Apache Airflow and Spark for data transformation, data integration, and data management.
Experience implementing workflow orchestration using tools like Apache Airflow, SSIS or similar platforms.
Demonstrated experience in developing custom connectors for data ingestion from various sources.
Strong understanding of SQL and database concepts, with the ability to write efficient queries and optimize performance.
Experience implementing DataOps principles and practices, including data CI/CD pipelines.
Excellent problem-solving and troubleshooting skills, with a strong attention to detail.
Effective communication and collaboration abilities, with a proven track record of working in cross-functional teams.
Familiarity with data visualization tools such as Apache Superset, and with dashboard development.
Understanding of distributed systems and working with large-scale datasets.
Familiarity with data governance frameworks and practices.
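To illustrate the SQL performance-tuning skill mentioned above, here is a minimal sketch of checking a query plan before and after adding an index. It uses SQLite purely for illustration; the table and column names (`orders`, `customer_id`) are hypothetical, and the same idea applies to the warehouse engines this role would actually use.

```python
import sqlite3

# Hypothetical table for illustration; not from the job description.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
cur.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT SUM(total) FROM orders WHERE customer_id = ?"

# Without an index on the filter column, the engine scans the whole table.
plan_before = cur.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
print(plan_before)  # plan detail typically mentions a full SCAN of orders

# Adding an index lets the engine seek directly to matching rows.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = cur.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
print(plan_after)  # plan detail typically mentions SEARCH ... USING INDEX
```

Reading the plan before and after an index change is the same habit one applies at scale with `EXPLAIN`/`EXPLAIN ANALYZE` in production databases.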