Job Details
MAJOR DUTIES AND RESPONSIBILITIES
- Design, develop, and maintain scalable ETL pipelines to ensure data quality and availability
- Implement monitoring and alerting solutions to ensure data pipeline reliability and performance
- Develop and manage deployment pipelines to facilitate continuous integration and delivery of data engineering solutions
- Document and communicate data engineering processes and standards to business-intelligence, data, and analytics professionals with varied backgrounds
- Continuously evaluate and improve data engineering tools and approaches to enhance performance and efficiency
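The first two duties above, building ETL pipelines that guard data quality and feeding monitoring signals, can be sketched in miniature. This is a minimal, illustrative sketch only; all names (`extract`, `transform`, `load`, the in-memory `warehouse`) are hypothetical stand-ins for real source systems and data stores, not anything specified in this posting.

```python
def extract():
    """Extract: pull raw records (an in-memory stand-in for a real source system)."""
    return [
        {"id": 1, "amount": "10.50"},
        {"id": 2, "amount": "7.25"},
        {"id": 3, "amount": None},  # bad record, caught by the quality check below
    ]

def transform(rows):
    """Transform: cast types and route records failing quality checks to a reject list."""
    good, rejected = [], []
    for row in rows:
        if row["amount"] is None:
            rejected.append(row)
        else:
            good.append({"id": row["id"], "amount": float(row["amount"])})
    return good, rejected

def load(rows, target):
    """Load: append transformed rows to the target store; return the row count."""
    target.extend(rows)
    return len(rows)

warehouse = []
good, rejected = transform(extract())
loaded = load(good, warehouse)
# The reject count is the kind of metric a monitoring/alerting system would track.
print(f"loaded={loaded} rejected={len(rejected)}")  # prints "loaded=2 rejected=1"
```

In production the same shape would typically be expressed with Spark jobs and cloud storage (e.g. S3 to Redshift), with the reject/volume counts exported to an alerting system rather than printed.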
REQUIRED QUALIFICATIONS
- Expertise in data engineering languages such as Scala (preferred) or Java, with proficiency in Python
- Experience with big data tools, particularly Spark, and familiarity with the Hadoop ecosystem
- Proficiency in building and managing ETL pipelines
- Experience with cloud platforms like AWS, including services such as S3, Lambda, Redshift, and EMR
- Strong understanding of relational databases and SQL, and familiarity with NoSQL databases
- Knowledge of data architecture, data warehousing, and data marts
- Experience in implementing monitoring solutions and deployment pipelines for data applications
- Ability to work with distributed systems and manage data storage solutions
- Demonstrated ability and desire to continually expand skill set, and learn from and teach others