Azure Data Engineer - Remote / Telecommute

Overview

On Site
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - 1 Month(s)

Skills

ADF
Agile
Analytical Skills
Analytics
Apache Flink
Apache Kafka
Business Intelligence

Job Details

We are looking for an Azure Data Engineer - Remote / Telecommute for our client in Dallas, TX.
Job Title: Azure Data Engineer - Remote / Telecommute
Job Location: Dallas, TX
Job Type: Contract
Job Description:
  • Develop, test, document, and support scalable data pipelines.
  • Build and evolve data integrations, including APIs, to handle increasing data volume and complexity.
  • Establish and follow data governance processes to ensure data availability, consistency, integrity, and security.
  • Build, implement, and maintain scalable solutions aligned with data governance standards and architectural roadmaps.
  • Collaborate with analytics and business teams to improve data models feeding business intelligence tools.
  • Design and develop data integrations and a data quality framework; write unit, integration, and functional tests.
  • Design, implement, and automate deployment of distributed systems for collecting and processing streaming events from multiple sources.
  • Perform data analysis to troubleshoot and resolve data-related issues.
  • Guide and mentor junior engineers on coding best practices and optimization.
Requirement/Must Have:
  • Bachelor's degree in Computer Science, Mathematics, Statistics, or a related technical field, or equivalent experience.
  • 5+ years of relevant experience in analytics, data engineering, business intelligence, or a related field.
  • Strong programming skills in Python, PySpark, and SQL.
  • Experience with Databricks.
  • Experience developing integrations across multiple systems and APIs.
  • Experience with cloud-based databases, specifically Azure technologies (Azure Data Lake, ADF, Azure DevOps, Azure Functions).
  • Experience writing SQL queries for large-scale, complex datasets.
  • Experience with data warehouse technologies and creating ETL/ELT jobs.
  • Strong problem-solving and troubleshooting skills.
  • Process-oriented with excellent documentation skills.
Preferred / Nice to Have:
  • Experience designing data schemas and operating SQL/NoSQL database systems.
  • Experience with Kafka, Flink, Fivetran, and Matillion.
  • Experience in Data Science and Machine Learning.
  • Software engineering experience.
  • Experience with Snowflake.
  • Familiarity with Agile software development methodologies.
Skills:
  • Strong analytical and problem-solving skills.
  • Ability to mentor and guide junior engineers.
  • Effective collaboration and communication skills.
  • Attention to detail and documentation discipline.