Cloud Data Engineer

Overview

Remote
Depends on Experience
Contract - W2
Contract - 18 Month(s)

Skills

Azure
Databricks
Snowflake
SQL
Python
Tableau

Job Details

As a Data Engineer, you will use tools such as Azure, Databricks, Snowflake, SQL, Python, Scala, and Tableau to engineer data pipelines and data models to enhance enterprise reporting and analytics.

Specific competencies include:

  • Data Structures and Models - Designs and develops the overall database/data warehouse structure based on functional and technical requirements. Designs and develops data collection frameworks for structured and unstructured data.
  • Data Pipelines and ELT - Designs and applies data extraction, loading, and transformation techniques to connect large data sets from various sources.
  • Data Performance - Troubleshoots and fixes data performance issues that arise when querying and combining large volumes of data. Tunes solution performance accordingly during initial development.

RESPONSIBILITIES:

  • Complete analysis, design, and development of BI solutions
  • Database development primarily in SSIS, Databricks, and SQL
  • Collaborate with other developers to create and implement the best-fit solution(s)
  • Review queries and troubleshoot for performance issues
  • Assist with the analysis and extraction of relevant information from large amounts of historical business data to feed data science initiatives
  • Participate in and support the design and documentation of processes for large-scale data analyses, model development, model validation, and model implementation
  • Support and maintain a positive data safety culture by following all policies and procedures and actively contributing to a safe and secure working environment

QUALIFICATIONS:

  • A bachelor's degree in Data Analytics, MIS, Computer Science, or related area and significant experience with data analytics, business intelligence design, and development.
  • 5+ years of experience engineering within a data warehouse or related experience with dimensional data modeling.
  • 5+ years of experience designing and developing ETLs with tools like Microsoft SSIS (SQL Server Integration Services), Databricks, or Python.
  • 1 year of hands-on experience programming with Scala.
  • Excellent written and verbal communication skills.
  • Excellent organizational and time management skills.
  • Proven problem-solving and decision-making skills.
  • Experience working as part of an Agile Scrum team.