Job Title: Fabric Data Engineer
Job Location: Atlanta GA (Onsite)
Job Type: Full Time Permanent
Job Description:
- Contribute to the requirements elicitation process by documenting assigned parts of business requirements, in line with the guidance provided
- Facilitate software application design discussions and document design decisions to guide the technical team in building software solutions
- Participate in coding and integrate new features or updates into existing applications, with a focus on maintaining system stability
- Conduct code reviews, make changes to the codebase, and maintain code repositories
- Implement test strategies, analyse results, and coordinate bug fixes to uphold software quality standards
- Develop user training programs, documentation, and support frameworks to ensure a smooth transition to new software applications
- Actively participate in resolving production issues and recommend preventive strategies to enhance system reliability
- Maintain detailed records of code, testing techniques, and support activities to enrich the knowledge base and assist similar projects
Required Skills and Experience
- Experience in Data Engineering or Analytics Engineering roles.
- Strong expertise in analytical data modelling.
- Hands-on experience with Microsoft Fabric or Azure analytics services.
- Strong SQL skills for analytical workloads.
- Experience with Spark / PySpark.
- Understanding of how data models support Power BI semantic models.
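To illustrate the kind of analytical SQL and star-schema thinking these requirements point to, here is a minimal sketch using Python's built-in sqlite3 module; the table and column names (dim_date, fact_sales) are illustrative assumptions, not part of any specific Fabric workload:

```python
import sqlite3

# In-memory database standing in for a Lakehouse/Warehouse star schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A conformed date dimension and a sales fact joined on a surrogate date key.
cur.execute("CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER)")
cur.execute("CREATE TABLE fact_sales (date_key INTEGER, amount REAL)")
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, 2024, 1), (20240201, 2024, 2)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(20240101, 100.0), (20240101, 50.0), (20240201, 75.0)])

# Analytical query: monthly totals via a fact-to-dimension join,
# the shape a Power BI semantic model typically aggregates over.
rows = cur.execute("""
    SELECT d.year, d.month, SUM(f.amount) AS total
    FROM fact_sales AS f
    JOIN dim_date AS d ON d.date_key = f.date_key
    GROUP BY d.year, d.month
    ORDER BY d.year, d.month
""").fetchall()
print(rows)  # → [(2024, 1, 150.0), (2024, 2, 75.0)]
```

Keeping measures in a narrow fact table and descriptive attributes in dimensions is what makes queries like this, and the semantic models built on top of them, fast and predictable.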
Additional Required Qualifications
- Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
- This position may require relocation and/or travel to work/project location.
Additional Details:
- Work with Microsoft Fabric components including OneLake, Lakehouse, Warehouse, Data Engineering, and Data Factory.
- Implement shortcuts, partitioning strategies, and incremental processing.
- Support CI/CD practices for data pipelines and analytics artifacts.
- Design and implement robust data pipelines using Microsoft Fabric Data Factory, Dataflows Gen2, and Notebooks.
- Build ELT pipelines for Lakehouse and Warehouse workloads.
- Design and maintain analytical data models optimized for reporting and analytics.
- Apply dimensional modelling techniques (star/snowflake schemas).
- Implement fact tables, conformed dimensions, surrogate keys, and slowly changing dimensions (SCDs).
- Prepare analytics-ready datasets aligned to Power BI semantic models.
- Collaborate with Power BI Engineers to ensure DAX-friendly and performant models.
- Optimize data models for performance, scalability, and cost efficiency.
- Implement data quality checks, validations, and monitoring.
- Support governance practices including metadata, lineage, and documentation.
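The dimensional-modelling duties above (surrogate keys, slowly changing dimensions, incremental processing) can be sketched as a Type 2 SCD merge in plain Python; the record shapes and field names here are illustrative assumptions, not a Fabric API:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, today, next_key):
    """Type 2 SCD: expire the current row when a tracked attribute
    changes and insert a new version with a fresh surrogate key."""
    out = list(dim_rows)
    current = {r["customer_id"]: r for r in out if r["is_current"]}
    for rec in incoming:
        cur = current.get(rec["customer_id"])
        if cur is None:
            # Brand-new dimension member: insert its first version.
            out.append({"sk": next_key, "customer_id": rec["customer_id"],
                        "city": rec["city"], "valid_from": today,
                        "valid_to": None, "is_current": True})
            next_key += 1
        elif cur["city"] != rec["city"]:
            # Changed attribute: close out the old version, add a new one.
            cur["valid_to"] = today
            cur["is_current"] = False
            out.append({"sk": next_key, "customer_id": rec["customer_id"],
                        "city": rec["city"], "valid_from": today,
                        "valid_to": None, "is_current": True})
            next_key += 1
    return out

dim = [{"sk": 1, "customer_id": "C1", "city": "Atlanta",
        "valid_from": date(2024, 1, 1), "valid_to": None, "is_current": True}]
dim = apply_scd2(dim,
                 [{"customer_id": "C1", "city": "Austin"},
                  {"customer_id": "C2", "city": "Boston"}],
                 today=date(2024, 6, 1), next_key=2)
print(len(dim))  # → 3: the expired C1 row plus two current rows
```

In a Fabric Lakehouse this same pattern is typically expressed as a Spark or SQL MERGE over Delta tables, with the incoming batch limited by a watermark for incremental processing; the control flow, however, is exactly the compare-expire-insert logic shown here.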