Data warehousing and modeling (ETL)

  • Wilmington, DE
  • Posted 47 days ago | Updated 2 days ago

Overview

On Site
Depends on Experience
Accepts corp to corp applications
Contract - W2

Skills

AWS
Talend
Snowflake
Oracle
Cassandra
Power BI
Python
ETL
Apache Sqoop

Job Details

  • Data professional responsible for designing, developing, and maintaining Extract, Transform, Load (ETL) processes and data pipelines within the Amazon Web Services (AWS) cloud environment.
  • Years of experience needed: 10+ years

Technical Skills:

  • Experience with ETL tools: Talend
  • Experience with databases: Snowflake, Oracle, Amazon RDS, and Cassandra
  • Experience with big data and AWS services: Apache Sqoop, Amazon S3, AWS CLI, Amazon EMR, Amazon MSK, Amazon SageMaker
  • Experience with reporting: Power BI
  • Experience with scripting: SQL, PL/SQL, Python, R
  • Experience with data modeling tools: ArchiMate, Erwin, Oracle Data Modeler (secondary/preferred)
  • Experience in the insurance domain

Responsibilities

  • Designing, building, and automating ETL processes using Apache Sqoop alongside AWS services such as Amazon S3, the AWS CLI, Amazon EMR, Amazon MSK, and Amazon SageMaker (a minimal pipeline sketch follows this list).
  • Developing and maintaining data pipelines to move and transform data from diverse sources into data warehouses or data lakes.
  • Ensuring data quality and integrity through validation, cleansing, and monitoring ETL processes.
  • Optimizing ETL workflows for performance, scalability, and cost efficiency within the AWS environment.
  • Troubleshooting and resolving issues related to data processing and ETL workflows.
  • Implementing and maintaining security measures and compliance standards for data pipelines and infrastructure.
  • Documenting ETL processes, data mappings, and system architecture.
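
For illustration only, a minimal Python sketch of the kind of pipeline step described in the responsibilities above: extract rows from a source database, apply a simple cleansing transformation, and load the result to Amazon S3 with boto3. The table, bucket, and field names are hypothetical placeholders, not details taken from this posting.

    # Minimal extract-transform-load sketch; all names are hypothetical.
    # Requires boto3 and configured AWS credentials.
    import csv
    import io
    import sqlite3

    import boto3

    def run_etl(db_path: str, bucket: str, key: str) -> None:
        # Extract: pull raw rows from a local source database.
        conn = sqlite3.connect(db_path)
        rows = conn.execute(
            "SELECT policy_id, premium, state FROM policies"
        ).fetchall()
        conn.close()

        # Transform: drop incomplete rows and normalize fields.
        cleaned = [
            (policy_id, round(float(premium), 2), state.strip().upper())
            for policy_id, premium, state in rows
            if policy_id is not None and premium is not None
        ]

        # Load: serialize to CSV in memory and upload the object to S3.
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(["policy_id", "premium", "state"])
        writer.writerows(cleaned)
        boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=buf.getvalue())

    if __name__ == "__main__":
        run_etl("source.db", "example-etl-bucket", "staging/policies.csv")

In practice a step like this would run under an orchestrator (for example a Talend job or a scheduled EMR workflow) rather than by hand; the sketch only shows the extract, transform, and load stages in one place.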

Required Qualifications:

  • Proficiency in big data tools and AWS services: including Apache Sqoop, Amazon S3, the AWS CLI, Amazon EMR, Amazon MSK, and Amazon SageMaker, as relevant to data storage and processing.
  • Strong SQL skills: for querying databases and manipulating data during the transformation process.
  • Programming and scripting proficiency: primarily Python, for automating tasks, developing custom transformations, and interacting with AWS services via SDKs and APIs.
  • Data warehousing and modeling expertise: understanding data warehousing concepts, dimensional modeling, and schema design to optimize data storage and retrieval.
  • Experience with ETL tools and technologies: Talend
  • Data quality management skills: ensuring data accuracy, completeness, and consistency throughout the ETL process (a validation sketch follows this list).
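
As a concrete illustration of the data quality management skills listed above, the sketch below runs completeness and consistency checks over staged records before load. The field names, the allowed-state set, and the report structure are assumptions made for this example only.

    # Sketch of pre-load validation: flag incomplete or inconsistent
    # records before they reach the warehouse. All names are illustrative.
    from dataclasses import dataclass, field

    VALID_STATES = {"DE", "PA", "NJ", "MD"}  # assumed allowed set

    @dataclass
    class ValidationReport:
        total: int = 0
        missing_fields: int = 0
        bad_state: int = 0
        errors: list = field(default_factory=list)

    def validate(records: list[dict]) -> ValidationReport:
        report = ValidationReport(total=len(records))
        for i, rec in enumerate(records):
            # Completeness: required fields must be present and non-empty.
            required = ("policy_id", "premium", "state")
            if any(rec.get(k) in (None, "") for k in required):
                report.missing_fields += 1
                report.errors.append(f"row {i}: missing required field")
                continue
            # Consistency: state codes must come from the allowed set.
            if rec["state"] not in VALID_STATES:
                report.bad_state += 1
                report.errors.append(f"row {i}: unknown state {rec['state']!r}")
        return report

    if __name__ == "__main__":
        sample = [
            {"policy_id": "P-1", "premium": 120.0, "state": "DE"},
            {"policy_id": "P-2", "premium": None, "state": "PA"},
            {"policy_id": "P-3", "premium": 80.0, "state": "XX"},
        ]
        print(validate(sample))

Checks like these would typically feed the monitoring responsibility above, so failures surface before data reaches downstream reporting in Power BI.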
