Job Details
Key Responsibilities:
Assist in building and maintaining scalable data pipelines and ETL processes.
Support the integration of data from various sources into data warehouses or data lakes.
Collaborate with data scientists, analysts, and engineers to understand data requirements.
Monitor data pipelines for failures and contribute to incident resolution.
Write clean, efficient, and well-documented code (e.g., Python, SQL, or Scala).
Help maintain data quality, integrity, and security standards.
Participate in code reviews and learn best practices in data engineering.
Qualifications:
Required:
Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
Experience with SQL and basic programming skills in Python or another language.
Familiarity with relational databases (e.g., PostgreSQL, MySQL) and/or cloud platforms (e.g., AWS, Azure, Google Cloud Platform).
Understanding of basic data structures, APIs, and data modeling concepts.
Strong analytical and problem-solving skills.
Effective communication and collaboration skills.