DECAL Analytics Engineer // Local to Georgia Candidates Only

  • Atlanta, GA
  • Posted 14 hours ago | Updated 14 hours ago

Overview

Hybrid
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 11 Month(s)

Skills

Analytical Skill
Analytics
Apache Spark
Auditing
Business Intelligence
Collaboration
Communication
Conflict Resolution
Continuous Delivery
Continuous Integration
Dashboard
Data Engineering
Data Flow
Data Governance
Data Masking
Data Modeling
Data Processing
Data Quality
Data Security
Database Administration
Databricks
Decision-making
DevOps
Dimensional Modeling
ELT
Extract, Transform, Load (ETL)
Management
Metadata Management
Microsoft
Microsoft Azure
Microsoft Power BI
Microsoft SQL Server
Modeling
Optimization
Pandas
Performance Tuning
Problem Solving
PySpark
Python
Real-time
Regulatory Compliance
Reporting
SQL
Semantics
Streaming
Supervision
Terraform
Unity Catalog
Visualization
Workflow

Job Details

Hi,
Please go through the job description below and let me know your thoughts.
Title: DECAL Analytics Engineer
Location: Atlanta, GA 30334
Agency Interview Type: Either Web Cam or In Person
Work Arrangement: Hybrid
Short Description:
Under general supervision, combines strong technical skills with knowledge of database administration. Works on one or more projects of high complexity.
Complete Description:
The Analytics Engineer will contribute to our modern data estate strategy by developing scalable data solutions using Microsoft Fabric and Azure Databricks. This role will be instrumental in building resilient data pipelines, transforming raw data into curated datasets, and delivering analytics-ready models that support enterprise-level reporting and decision-making.


Work Location & Attendance Requirements:
Must be physically located in Georgia
On-site: Tuesday to Thursday, per manager's discretion
Mandatory in-person meetings:
o All Hands
o Enterprise Applications
o On-site meetings
o DECAL All Staff
Work arrangements are subject to management's decision.

Key Responsibilities:
Data Engineering & Pipeline Development
Build and maintain ETL/ELT pipelines using Azure Databricks and Microsoft Fabric.
Implement medallion architecture (Bronze, Silver, Gold layers) to support data lifecycle and quality.
Develop real-time and batch ingestion processes from IES Gateway and other source systems.
Ensure data quality, validation, and transformation logic is consistently applied.
Use Python, Spark, and SQL in Databricks and Fabric notebooks for data transformation.
Implement Delta Lake for data versioning, ACID transactions, and schema enforcement.
Integrate Databricks with other Azure services such as OneLake, ADLS Gen2, and Microsoft Fabric.
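The medallion flow described above (Bronze raw, Silver cleaned, Gold analytics-ready) can be sketched as follows. This is a minimal illustration only: the column names `case_id` and `received_at` are hypothetical, and pandas stands in for the PySpark code that would actually run in Databricks or Fabric notebooks.

```python
import pandas as pd

def bronze_to_silver(bronze: pd.DataFrame) -> pd.DataFrame:
    """Bronze -> Silver: validate and clean raw ingested records."""
    silver = bronze.dropna(subset=["case_id"])           # drop rows missing the key
    silver = silver.drop_duplicates(subset=["case_id"])  # enforce key uniqueness
    silver["received_at"] = pd.to_datetime(silver["received_at"])
    return silver

def silver_to_gold(silver: pd.DataFrame) -> pd.DataFrame:
    """Silver -> Gold: aggregate into an analytics-ready dataset."""
    return (
        silver.assign(received_date=silver["received_at"].dt.date)
              .groupby("received_date", as_index=False)
              .agg(case_count=("case_id", "count"))
    )
```

In a real pipeline each layer would be persisted as a Delta table rather than returned in memory, which is where Delta Lake's ACID transactions and schema enforcement apply.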
Data Modeling & Curation
Collaborate with the Domain Owners to design dimensional and real-time data models.
Create analytics-ready datasets for Power BI and other reporting tools.
Standardize field naming conventions and schema definitions across datasets.
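As a minimal sketch of the field-naming standardization work, a helper like the following could normalize column names to a single snake_case convention across datasets. The function names and the convention itself are illustrative assumptions, not taken from the posting.

```python
import re
import pandas as pd

def to_snake_case(name: str) -> str:
    """Normalize a field name: 'Provider Name' -> 'provider_name', 'CaseID' -> 'case_id'."""
    name = re.sub(r"[\s\-]+", "_", name.strip())          # spaces/hyphens -> underscore
    name = re.sub(r"(?<=[a-z0-9])(?=[A-Z])", "_", name)   # split camelCase boundaries
    return name.lower()

def standardize_schema(df: pd.DataFrame) -> pd.DataFrame:
    """Apply the naming convention across an entire dataset."""
    return df.rename(columns={c: to_snake_case(c) for c in df.columns})
```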

Data Governance & Security
Apply data classification and tagging based on DECAL's data governance framework.
Implement row-level security, data masking, and audit logging as per compliance requirements.
Support integration with Microsoft Purview for lineage and metadata management.
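One way the masking requirement could look in practice is sketched below. This is an assumption-laden illustration: the `ssn` column and the keep-last-four pattern are hypothetical, and production masking would follow DECAL's actual compliance rules (and typically run in PySpark rather than pandas).

```python
import pandas as pd

def mask_column(df: pd.DataFrame, column: str, keep_last: int = 4) -> pd.DataFrame:
    """Replace all but the last `keep_last` characters of a sensitive column with '*'."""
    masked = df.copy()
    masked[column] = masked[column].astype(str).str.replace(
        rf".(?=.{{{keep_last}}})", "*", regex=True  # mask any char with >= keep_last chars after it
    )
    return masked
```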

Reporting & Visualization Support
Partner with BI developers to ensure data models are optimized for Power BI.
Provide curated datasets that align with reporting requirements and business logic.
Create BI dashboards and train users.

DevOps & Automation
Support CI/CD pipelines for data workflows using Azure DevOps.
Assist in monitoring, logging, and performance tuning of data jobs and clusters.

Required Qualifications:
Bachelor's degree in Computer Science, Data Engineering, or a related field.
3+ years of experience in data engineering or analytics engineering roles.
Advanced SQL: Proficiency in advanced SQL techniques for data transformation, querying, and optimization.
Hands-on experience with:
o Azure Databricks (Spark, Delta Lake)
o Microsoft Fabric (Dataflows, Pipelines, OneLake)
o SQL and Python (Pandas, PySpark)
o SQL Server 2019+
Familiarity with data modeling, data governance, and data security best practices.
Strong understanding of ETL/ELT processes, data quality, and schema design.

Preferred Skills:
Experience with Power BI datasets and semantic modeling.
Knowledge of Microsoft Purview, Unity Catalog, or similar governance tools.
Exposure to real-time data processing and streaming architectures.
Knowledge of federal/state compliance requirements for data handling.
Familiarity with Azure DevOps, Terraform, or CI/CD for data pipelines.

Certifications (preferred):
Microsoft Fabric Analytics Engineer.

Soft Skills:
Strong analytical and problem-solving abilities.
Excellent communication skills for technical and non-technical audiences.
Experience working with government stakeholders.


About Kloudhunt LLC