Cloud Data Engineer

Overview

Remote
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - 10 Month(s)
No Travel Required

Skills

Snowflake
SSIS
Databricks
Python
Azure

Job Details

Job Description

We have a position for a Cloud Data Engineer with one of our clients. The role is fully remote with an initial contract duration of 10 months. All those authorized to work in the US are encouraged to apply.

Summary: The Client is seeking a skilled Cloud Data Engineer to join the Data Office Team in driving the modernization of enterprise analytics. This role will focus on building scalable, high-performance data pipelines and models using modern cloud technologies such as Azure, Databricks, Snowflake, SQL, Python, Scala, and Power BI. The ideal candidate will bring a strong foundation in data engineering, a passion for solving complex data challenges, and experience working in agile environments.

Key Responsibilities:

Design and develop data pipelines and ELT processes to integrate large, diverse datasets from multiple sources.

Build and maintain data models and structures to support enterprise reporting and analytics.

Collaborate with cross-functional teams to deliver BI and analytics solutions that meet business needs.

Optimize data performance by troubleshooting and resolving issues related to large-scale data querying and transformation.

Participate in the design and documentation of data processes, including model development, validation, and implementation.

Contribute to a positive data safety culture by adhering to data governance and security policies.

Core Competencies:

Data Structures & Modeling: Design and implement scalable data architectures for structured and unstructured data.

Data Pipelines & ELT: Develop robust extraction, transformation, and loading processes using modern tools and frameworks.

Performance Optimization: Monitor and enhance data performance during development and production.

Required Qualifications:

Bachelor's degree in Data Analytics, MIS, Computer Science, or a related field.

5+ years of experience in data engineering or data warehouse development, including dimensional modeling.

5+ years of experience designing and developing ETL/ELT processes using tools like SSIS, Databricks, or Python.

We're looking for a self-motivated team member who can work independently in a fully remote Agile environment - someone who takes ownership of their work, communicates proactively, and drives progress without needing close supervision. The ideal candidate thrives in a distributed Scrum team, managing their own deliverables and staying accountable through clear, consistent communication.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.