Data Engineer - Remote / Telecommute

  • Minneapolis, MN
  • Posted 16 hours ago | Updated 2 hours ago

Overview

Remote
$65 / hr
Contract - W2
Contract - 1 day(s)

Skills

Data Engineer

Job Details

Job Description:

Responsibilities:
  • Data Warehouse Design and Development (Snowflake): Design, develop, and maintain scalable and efficient data models and data warehouses within the Snowflake environment.
  • ETL/ELT Pipeline Development: Design, build, and optimize robust ETL/ELT pipelines using various tools and technologies to ingest, transform, and load data from diverse sources into Snowflake and other data stores.
  • Azure Data Services: Utilize and integrate various Azure data services, including but not limited to Azure Data Factory, Azure Synapse Analytics (SQL Data Warehouse, Data Lake), Azure Databricks, Azure Blob Storage, Azure SQL Database, and Azure Data Lake Storage (ADLS).
  • Data Integration: Develop and implement data integration solutions to connect various internal and external data sources.
  • Performance Optimization: Identify and resolve performance bottlenecks in data pipelines and Snowflake queries.
  • Data Quality and Governance: Implement and enforce data quality checks, data governance policies, and data security measures.
  • Monitoring and Alerting: Design and implement monitoring and alerting systems for data pipelines and data infrastructure to ensure reliability and timely issue resolution.
  • Collaboration and Communication: Work closely with data scientists, analysts, and business stakeholders to understand their data requirements, provide effective data solutions, and clearly communicate technical concepts and progress.
  • Documentation: Create and maintain comprehensive documentation for data models, ETL pipelines, and data infrastructure.
  • Problem Solving: Troubleshoot and resolve data-related issues in a timely and efficient manner.
  • Mentorship: Provide guidance and mentorship to junior data engineers, fostering their technical growth.
  • Staying Current: Keep abreast of the latest advancements in data warehousing, cloud technologies (especially Azure and Snowflake), and ETL/ELT methodologies.
Qualifications:
  • Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field.
  • Minimum of 7 years of professional experience in data engineering.
  • Deep expertise in Snowflake data warehousing.
  • Strong understanding of Snowflake architecture, features, and best practices.
  • Proven experience in designing and implementing data models and schemas in Snowflake.
  • Proficient in writing complex SQL queries and optimizing Snowflake performance.
  • Experience with Snowflake utilities and tools (e.g., SnowSQL, Snowpipe, Tasks).
  • Strong experience with Microsoft Azure data services.
  • Hands-on experience with Azure Data Factory for building ETL/ELT pipelines.
  • Familiarity with other Azure data services such as Azure Synapse Analytics, Azure Databricks, Azure Data Lake Storage, and Azure SQL Database.
  • Extensive experience with ETL/ELT tools and techniques.
  • Proven ability to design, develop, and maintain complex ETL/ELT workflows.
  • Experience with various data integration patterns and technologies.
  • Strong proficiency in SQL and experience working with different database systems.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.