Overview
Remote
$40 - $50
Contract - W2
Contract - Independent
Contract - 24 Months
No Travel Required
Skills
Azure Databricks
Azure Data Factory
Job Details
Job Summary:
We are seeking a skilled Azure Data Engineer with hands-on experience in Azure Databricks and Azure Data Factory (ADF) to design, develop, and maintain scalable data pipelines and analytics solutions. The ideal candidate will have a strong understanding of cloud data engineering best practices and a passion for solving complex data problems.
Key Responsibilities:
- Design and implement ETL/ELT pipelines using Azure Data Factory and Databricks.
- Develop scalable data models and workflows that integrate with various data sources (SQL databases, Blob Storage, ADLS, etc.).
- Use PySpark / Scala / SQL to process and transform data within Azure Databricks.
- Create robust and secure data pipelines for batch and real-time data ingestion.
- Collaborate with data analysts, data scientists, and stakeholders to understand data requirements.
- Implement logging, monitoring, and alerting for data pipeline reliability.
- Optimize data pipelines for performance, scalability, and cost efficiency.
- Apply CI/CD principles and DevOps best practices to data engineering workflows.
- Ensure compliance with data governance, privacy, and security policies.
Required Skills & Qualifications:
- 4+ years of experience in data engineering or a similar role.
- Strong hands-on experience with Azure Databricks and Azure Data Factory.
- Proficient in PySpark, SQL, and/or Scala.
- Experience with Azure Synapse, ADLS Gen2, and Azure Key Vault is a plus.
- Strong understanding of data warehousing concepts and cloud data architecture.
- Experience with Git, CI/CD pipelines, and infrastructure as code (Terraform or ARM templates preferred).
- Knowledge of data security, data masking, and data lineage.
- Excellent problem-solving, communication, and collaboration skills.
Preferred Qualifications:
- Azure certifications (e.g., Azure Data Engineer Associate, Azure Solutions Architect).
- Experience with Power BI or other reporting/visualization tools.
- Familiarity with Delta Lake, MLflow, or Databricks Notebooks.
- Experience working in Agile or DevOps environments.