Analytics Engineer - Azure and Microsoft Fabric

Overview

Hybrid
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 1 Year(s)
Able to Provide Sponsorship

Skills

Continuous Integration
Communication
Computer Science
Conflict Resolution
Continuous Delivery
Analytical Skill
Analytics
Apache Spark
Auditing
Business Intelligence
Dashboard
Data Processing
Data Quality
Data Security
Database Administration
Data Engineering
Data Flow
Data Governance
Extract Transform Load (ETL)
Data Masking
Data Modeling
Databricks
Decision-making
DevOps
ELT
Management
Metadata Management
Microsoft
Supervision
Terraform
Regulatory Compliance
Reporting
SQL
Semantics
Soft Skills
Streaming
Pandas
Performance Tuning
Problem Solving
PySpark
Python
Real-time
Microsoft Azure
Microsoft Power BI
Microsoft SQL Server
Modeling
Optimization
Unity Catalog
Visualization
Workflow

Job Details

Under general supervision, combines strong technical skills with knowledge of database administration. Works on one or more projects of high complexity.

The Analytics Engineer will contribute to our modern data estate strategy by developing scalable data solutions using Microsoft Fabric and Azure Databricks. This role will be instrumental in building resilient data pipelines, transforming raw data into curated datasets, and delivering analytics-ready models that support enterprise-level reporting and decision-making.

Work Location & Attendance Requirements:

Must be physically located in Georgia

On-site: Tuesday to Thursday, per manager's discretion

Mandatory in-person meetings:

All Hands

Enterprise Applications on-site meetings

DECAL All Staff

Work arrangements are subject to management's decision.

Key Responsibilities:

Data Engineering & Pipeline Development

Build and maintain ETL/ELT pipelines using Azure Databricks and Microsoft Fabric.

Implement medallion architecture (Bronze, Silver, Gold layers) to support data lifecycle and quality.

Develop real-time and batch ingestion processes from IES Gateway and other source systems.

Ensure data quality, validation, and transformation logic is consistently applied.

Use Python, Spark, and SQL in Databricks and Fabric notebooks for data transformation.

Implement Delta Lake for data versioning, ACID transactions, and schema enforcement (a minimal sketch follows this list).

Integrate Databricks with other Azure services such as OneLake, Azure Data Lake Storage (ADLS) Gen2, and Microsoft Fabric.
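
For context, a minimal sketch of the kind of Bronze-to-Silver step described above, assuming a Databricks workspace with Delta Lake available; the lake paths, table, and column names (case_id, ingested_at) are illustrative, not DECAL's actual schema:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw data landed as-is from the source system (path is hypothetical)
bronze_df = spark.read.format("delta").load("/mnt/lake/bronze/ies_gateway/cases")

# Silver: apply validation and basic cleansing before curation
silver_df = (
    bronze_df
    .dropDuplicates(["case_id"])                       # enforce uniqueness
    .filter(F.col("case_id").isNotNull())              # basic validation rule
    .withColumn("ingested_at", F.current_timestamp())  # audit column
)

# Delta Lake writes provide ACID transactions, versioning, and schema enforcement
(
    silver_df.write
    .format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "false")   # reject incompatible schema changes
    .save("/mnt/lake/silver/ies_gateway/cases")
)
```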

Data Modeling & Curation

Collaborate with the Domain Owners to design dimensional and real-time data models.

Create analytics-ready datasets for Power BI and other reporting tools.

Standardize field naming conventions and schema definitions across datasets (illustrated in the sketch following this list).
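
A hedged sketch of the curation work described above: deriving a Gold-layer dimension table with standardized snake_case names for Power BI. The source path and columns (ProviderId, ProviderName, County) are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

silver = spark.read.format("delta").load("/mnt/lake/silver/providers")

# Gold: a conformed provider dimension with a surrogate key and
# standardized snake_case column names for downstream Power BI models
dim_provider = (
    silver
    .select(
        F.col("ProviderId").alias("provider_id"),
        F.col("ProviderName").alias("provider_name"),
        F.col("County").alias("county"),
    )
    .dropDuplicates(["provider_id"])
    .withColumn("provider_key", F.monotonically_increasing_id())  # surrogate key
)

dim_provider.write.format("delta").mode("overwrite").save("/mnt/lake/gold/dim_provider")
```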

Data Governance & Security

Apply data classification and tagging based on DECAL's data governance framework.

Implement row-level security, data masking, and audit logging per compliance requirements (a simplified masking sketch follows this list).

Support integration with Microsoft Purview for lineage and metadata management.
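
A simplified sketch of column masking and row-level filtering in PySpark; a hashed identifier and a region-based filter stand in for the actual compliance rules, and the table, column, and policy details are illustrative only:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

cases = spark.read.format("delta").load("/mnt/lake/silver/cases")

# Mask direct identifiers and restrict rows before exposing the data
# to reporting consumers (policy details here are placeholders)
masked_cases = (
    cases
    .withColumn("ssn_hash", F.sha2(F.col("ssn"), 256))   # one-way hash of the identifier
    .drop("ssn")
    .filter(F.col("region") == "GA")                      # simple row-level restriction
)

# Publish a masked table that downstream reports query instead of the raw data
masked_cases.write.format("delta").mode("overwrite").save("/mnt/lake/gold/cases_masked")
```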

Data Modeling:

Dimensional modeling

Real-time data modeling patterns

Reporting & Visualization Support

Partner with BI developers to ensure data models are optimized for Power BI.

Provide curated datasets that align with reporting requirements and business logic.

Create BI dashboards and train users.

DevOps & Automation

Support CI/CD pipelines for data workflows using Azure DevOps (a minimal test sketch follows this list).

Assist in monitoring, logging, and performance tuning of data jobs and clusters.
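
As one example of what CI for data workflows can look like, a minimal pytest-style check that could run in an Azure DevOps pipeline; the fixture and the inline transformation are assumptions standing in for the real pipeline code:

```python
import pytest
from pyspark.sql import SparkSession


@pytest.fixture(scope="session")
def spark():
    # Small local Spark session suitable for a CI agent
    return SparkSession.builder.master("local[1]").appName("ci-tests").getOrCreate()


def test_silver_rows_have_case_id(spark):
    # Hypothetical rule: the Silver transformation must drop rows without a case_id
    raw = spark.createDataFrame([("A1", "open"), (None, "open")], ["case_id", "status"])
    silver = raw.filter(raw.case_id.isNotNull())   # stand-in for the real transformation
    assert silver.count() == 1
```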

Required Qualifications:

Bachelor's degree in Computer Science, Data Engineering, or a related field.

3+ years of experience in data engineering or analytics engineering roles.

Advanced SQL: Proficiency in advanced SQL techniques for data transformation, querying, and optimization.

Hands-on experience with:

Azure Databricks (Spark, Delta Lake)

Microsoft Fabric (Dataflows, Pipelines, OneLake)

SQL and Python (Pandas, PySpark)

SQL Server 2019+

Familiarity with data modeling, data governance, and data security best practices.

Strong understanding of ETL/ELT processes, data quality, and schema design.

Preferred Skills:

Experience with Power BI datasets and semantic modeling.

Knowledge of Microsoft Purview, Unity Catalog, or similar governance tools.

Exposure to real-time data processing and streaming architectures.

Knowledge of federal/state compliance requirements for data handling.

Familiarity with Azure DevOps, Terraform, or CI/CD for data pipelines.

Certifications (preferred):

Microsoft Fabric Analytics Engineer.

Soft Skills:

Strong analytical and problem-solving abilities.

Excellent communication skills for technical and non-technical audiences.

Experience working with government stakeholders.

3+ years of experience in data engineering or analytics engineering roles - Required, 3 years

Advanced SQL: Proficiency in advanced SQL techniques for data transformation, querying, and optimization - Required, 3 years

Azure Databricks (Spark, Delta Lake) - Required, 3 years

Microsoft Fabric (Dataflows, Pipelines, OneLake) - Required, 3 years

SQL and Python (Pandas, PySpark) - Required, 3 years
