Cloud Data Engineer

Overview

Contract - W2
Contract - 4 day(s)

Skills

6+ years of relevant experience or equivalent education in ETL processing/data architecture. 4+ years of experience working with big data technologies on AWS/Azure/GCP. 3+ years of experience in the Apache Spark/Databricks framework (Python/Scala).

Job Details

Role: Cloud Data Engineer

Location: 100% Remote

Duration: Long Term

Client Location: On Shore, US

Mode of Work: Client or office environment; may work remotely. Occasional evening and weekend work.

Work Time Zone: EST or CST

About the job

Job Summary:

We're looking for a dynamic data engineer with Apache Spark and AWS experience to join the data analytics team.

You will have the opportunity to work as part of a cross-functional team to define, design and deploy frameworks for data collection, normalization, transformation, storage and reporting on AWS to support the analytic missions of the stakeholders.

Experience:

6+ years of relevant experience or equivalent education in ETL processing/data architecture.

4+ years of experience working with big data technologies on AWS/Azure/Google Cloud Platform

3+ years of experience in the Apache Spark/Databricks framework (Python/Scala)

5+ years of experience with end-to-end data center migration projects.

Detailed Job Description

Roles & Responsibilities:

Design, develop, and deploy data pipelines, including ETL processes for ingesting, processing, and delivering data using the Apache Spark framework.

Monitor, manage, validate and test data extraction, movement, transformation, loading, normalization, cleansing and updating processes. Build complex databases that are useful, accessible, safe and secure.

Coordinate with users to understand data needs and deliver data with a focus on data quality, data reuse, consistency, security, and regulatory compliance.

Collaborate with team members on data models and schemas in our data warehouse.

Collaborate with team members on documenting source-to-target mappings.

Conceptualize and visualize data frameworks.

Communicate effectively with various internal and external stakeholders.
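The core responsibility above is the classic extract-transform-load (ETL) pattern. As a rough illustration only, the sketch below shows that pattern in plain Python with in-memory lists; all function and field names are hypothetical, and a production pipeline for this role would instead use Spark DataFrames (e.g., reads from S3 and writes to a warehouse).

```python
# Minimal ETL sketch. Names are illustrative; a real pipeline would use
# Spark DataFrames (spark.read / df.write) rather than in-memory lists.

def extract(raw_rows):
    """Ingest raw records (stand-in for reading from S3, a database, etc.)."""
    return [dict(r) for r in raw_rows]

def transform(rows):
    """Normalize and cleanse: trim/lowercase strings, drop incomplete rows."""
    cleaned = []
    for row in rows:
        if row.get("id") is None:
            continue  # cleansing: skip records missing a required field
        cleaned.append({
            "id": int(row["id"]),
            "name": str(row.get("name", "")).strip().lower(),  # normalization
        })
    return cleaned

def load(rows, target):
    """Deliver transformed rows to a target store (stand-in for a warehouse write)."""
    target.extend(rows)
    return len(rows)

warehouse = []
raw = [
    {"id": "1", "name": "  Alice "},
    {"id": None, "name": "dropped"},
    {"id": "2", "name": "Bob"},
]
loaded = load(transform(extract(raw)), warehouse)  # loads 2 cleaned rows
```

The same three-stage separation (ingest, normalize/cleanse, deliver) is what makes pipelines testable and monitorable stage by stage, which the validation and monitoring responsibilities above depend on.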

Qualification & Experience:

Bachelor's degree in computer sciences or related field

6+ years of relevant experience or equivalent education in ETL processing/data architecture.

4+ years of experience working with big data technologies on AWS/Azure/Google Cloud Platform

3+ years of experience in the Apache Spark/Databricks framework (Python/Scala)

Databricks and AWS developer/architect certifications are a big plus

Technical Skills:

ETL Processing/data architecture or equivalent.

Big data technologies on AWS/Azure/Google Cloud Platform

Apache Spark/Databricks framework (Python/Scala)

Experience working with different database structures (e.g., transactional vs. data warehouse)

Databricks and AWS developer/architect certifications are a big plus

Additional Qualifications

Strong project planning and estimating skills related to area of expertise

Strong communication skills

Good leadership skills to guide and mentor the work of less experienced personnel

Ability to be a high-impact player on multiple simultaneous engagements

Ability to think strategically, balancing long and short-term priorities

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.