Overview
On Site
BASED ON EXPERIENCE
Full Time
Skills
AZURE
DATABRICKS
PYTHON
PYSPARK
SPARK
LAKEHOUSE
Job Details
ROLE SUMMARY
The Azure Data Analytic Engineer will be the Azure SME tasked with the development and optimization of cloud-based Business Intelligence solutions. The role advances data analytics capabilities and drives innovative solutions, brings deep technical expertise in data engineering, and plays an instrumental role in managing data integrations from on-premises Oracle systems, Cloud CRM (Dynamics), and telematics. The engineer collaborates closely with Data Science and Enterprise Data Warehouse teams as well as business stakeholders.
PRIMARY RESPONSIBILITIES:
Data Ingestion and Storage:
- Designs, develops, and maintains scalable, efficient data pipelines using Azure Data Factory and Databricks, leveraging PySpark for complex data transformations and large-scale processing (an illustrative sketch follows this list).
- Builds and manages extract, transform, load (ETL) and extract, load, transform (ELT) processes that move data from on-premises Oracle systems, customer relationship management (CRM) platforms, and connected vehicles into data storage solutions such as Azure Data Lake Storage and Azure SQL Database.
- Creates high-code data engineering solutions using Databricks to clean, transform, and prepare data for in-depth analysis.
- Develops and manages data models, schemas, and data warehouses, utilizing Lakehouse Architecture to enhance advanced analytics and business intelligence.
- Leverages Unity Catalog to ensure unified data governance and management across the enterprise's data assets (see the governance sketch following this list).
- Optimizes data storage, retrieval strategies, and query performance to drive scalability and efficiency in all data operations.
- Integrates and harmonizes data from diverse sources, including on-premises databases, cloud services, APIs, and connected vehicle telematics.
- Ensures consistent data quality, accuracy, and reliability across all integrated data sources.
- Utilizes GitHub for version control and collaborative development, implementing best practices for code management, testing, and deployment.
- Develops workflows for continuous integration (CI) and continuous deployment (CD), ensuring efficient delivery and maintenance of data solutions.
- Works closely with Data Science, Enterprise Data Warehouse, and Data Visualization teams, as well as business stakeholders, to understand data requirements and deliver innovative solutions.
- Collaborates with cross-functional teams to troubleshoot and resolve data infrastructure issues, identifying and addressing performance bottlenecks.
- Provides technical leadership, mentorship, and guidance to junior data engineers, promoting a culture of continuous improvement and innovation.
- Technical Expertise: Extensive experience with Azure Data Factory, Databricks, and Azure Synapse, as well as proficiency in Python and PySpark.
- Data Integration: Experience integrating data from on-premises Oracle systems and connected vehicle data into cloud-based solutions.
- Lakehouse Architecture & Governance: Deep knowledge of Lakehouse Architecture and Unity Catalog for enterprise data governance.
- Version Control & Collaboration: Demonstrated proficiency in GitHub for development, collaboration, and deployment in large-scale environments.
- Infrastructure as Code (IaC): Experience with Infrastructure as Code tools such as Azure Resource Manager (ARM) templates or Terraform.
- Problem-Solving & Troubleshooting: Strong analytical skills with the ability to diagnose and resolve complex data infrastructure challenges.
- Collaboration: Proven ability to work effectively with Data Science teams, business stakeholders, and cross-functional teams to drive data-driven insights.
- Communication: Excellent verbal and written communication skills with the ability to translate technical concepts to non-technical stakeholders.
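For illustration only: the posting does not prescribe an implementation, but a minimal PySpark sketch of the kind of Databricks transformation and Delta Lake load referenced in the pipeline and ETL/ELT items above might look as follows. The storage account, container names, table layouts, and column names are hypothetical assumptions, not details from the role.

# Minimal sketch of a Databricks/PySpark transformation; all source and
# target names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("crm-telematics-merge").getOrCreate()

# Hypothetical bronze-layer inputs landed by an Azure Data Factory pipeline.
crm = spark.read.format("delta").load("abfss://bronze@datalake.dfs.core.windows.net/crm_accounts")
telematics = spark.read.format("delta").load("abfss://bronze@datalake.dfs.core.windows.net/vehicle_events")

# Clean and aggregate connected-vehicle events per account and day.
daily_usage = (
    telematics
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("account_id", "event_date")
    .agg(
        F.sum("distance_km").alias("distance_km"),
        F.countDistinct("vehicle_id").alias("active_vehicles"),
    )
)

# Join usage metrics onto CRM accounts for downstream BI consumption.
curated = crm.select("account_id", "account_name", "region").join(
    daily_usage, on="account_id", how="left"
)

# Write a silver-layer Delta table to Azure Data Lake Storage.
(
    curated.write.format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .save("abfss://silver@datalake.dfs.core.windows.net/account_daily_usage")
)

In practice, Azure Data Factory would typically orchestrate the landing of the bronze-layer inputs, with a Databricks job such as the one sketched above scheduled against them.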
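Similarly, a hedged sketch of the Unity Catalog governance mentioned above, assuming hypothetical catalog, schema, table, and group names (actual object names and access policies would be set by the enterprise governance team):

# Minimal sketch of Unity Catalog governance in Databricks; all names are
# hypothetical and CREATE CATALOG requires appropriate metastore privileges.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Register a governed namespace for curated analytics data.
spark.sql("CREATE CATALOG IF NOT EXISTS analytics")
spark.sql("CREATE SCHEMA IF NOT EXISTS analytics.telematics")

# Promote the curated dataset into the governed namespace as a managed table.
spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.telematics.account_daily_usage
    AS SELECT * FROM delta.`abfss://silver@datalake.dfs.core.windows.net/account_daily_usage`
""")

# Grant read access to an analyst group; Unity Catalog records lineage and audits access.
spark.sql("GRANT SELECT ON TABLE analytics.telematics.account_daily_usage TO `data-analysts`")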
Work Environment
- Hybrid Role: Remote work two days per week (after 90 days of onboarding)
- Travel Required: 0%