Overview
- Remote
- Depends on Experience
- Full Time

Skills
- Collaborate
- Computer Science
- Attention to detail
- ETL
- EC2
- Java
- NoSQL
Job Details
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes to support data integration and analysis.
- Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Optimize and maintain data infrastructure on AWS, ensuring data quality, security, and availability.
- Implement best practices for data management, including data governance, data modelling, and data warehousing.
- Develop and maintain data workflows and processes using AWS services such as Glue, Redshift, S3, Lambda, and others.
- Monitor and troubleshoot data pipelines, ensuring high performance and reliability.
- Collaborate with cross-functional teams to ensure data accuracy and consistency across systems.
- Stay updated with the latest trends and technologies in data engineering and AWS cloud services.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field. Master's degree is a plus.
- 5+ years of experience in data engineering, with a strong focus on AWS cloud services.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with AWS services such as Glue, Redshift, S3, Lambda, EC2, RDS, and others.
- Strong understanding of data modelling, ETL processes, and data warehousing concepts.
- Experience with SQL and NoSQL databases.
- Familiarity with data visualisation tools and techniques.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills, with the ability to work effectively in a hybrid environment.
Employers have access to artificial intelligence language tools ("AI") that help generate and enhance job descriptions, and AI may have been used to create this description. The position description has been reviewed for accuracy, and Dice believes it correctly reflects the job opportunity.