Key Responsibilities
- Design, develop, and maintain robust ETL/ELT data pipelines
- Build and optimize data warehouses and data lakes
- Ensure data quality, integrity, security, and performance
- Integrate data from multiple sources (APIs, databases, files, streaming systems)
- Collaborate with analytics and data science teams to support reporting and modeling needs
- Monitor, troubleshoot, and optimize data workflows
- Implement best practices for data governance and documentation
- Support cloud-based data solutions and infrastructure
Required Skills & Qualifications (8 Years)
- Bachelor's degree in Computer Science, Engineering, or a related field
- Strong experience with SQL and relational databases
- Hands-on experience with Python, Java, or Scala
- Experience with ETL tools and data pipeline frameworks
- Knowledge of data warehousing concepts (star/snowflake schemas)
- Experience with cloud platforms (AWS, Azure, or Google Cloud Platform)
- Familiarity with Big Data technologies such as Spark, Hadoop, and Kafka (preferred)
- Understanding of version control (Git) and CI/CD practices
Preferred Qualifications
- Experience with tools such as Airflow, dbt, Snowflake, Redshift, and BigQuery
- Knowledge of data security and compliance standards
- Experience in Agile/Scrum environments