Local candidates in Las Vegas or Carson City, NV
Work Schedule: 8:00 am to 5:00 pm
Key Responsibilities
1. Data Engineering Integration & Pipeline Oversight
- Medallion Architecture Implementation: Drive the tactical execution of the Medallion data design (Bronze/Silver/Gold) to modernize the organization’s data lakehouse environment.
- ETL/ELT Logic Design: Architect and manage complex ETL/ELT specifications and transformation logic to automate multi-directional data flows.
- Persistent Staging & Data Lifecycle: Oversee the technical requirements for staging and landing zones to ensure high availability and historical data integrity for downstream consumption.
- Technical Scalability & Performance Tuning: Optimize data system performance by identifying bottlenecks in data processing and ensuring the technical framework supports high-volume, diverse data sets while adhering to security protocols.
2. Technical Data Governance & Mapping
- Source-to-Target Mapping: Execute source-to-target mappings and define the logic needed for data transformations, including type conversions and business rules.
- Automated Quality Checks: Set up automated quality checks to ensure all ingested data is accurate and auditable.
- Privacy & Regulatory Compliance: Integrate data privacy and regulatory standards into the system design.
- MDM & Data Cataloging: Use Master Data Management (MDM) and data cataloging tools to document technical metadata and maintain clear end-to-end data lineage.
3. Reporting & Advanced Analytics Enablement
- Technical Requirements Engineering: Bridge the gap between business ambiguity and technical execution by converting stakeholder needs into detailed Functional Specification Documents (FSD) and Technical Design Documents (TDD).
- BI Stack Optimization: Manage and optimize the semantic layers of reporting tools (Power BI and Business Objects) to ensure performant data modeling and self-service scalability.
- AI Analytics Readiness: Architect the data environment for AI/ML consumption by ensuring "clean-room" data availability, feature set readiness, and robust dashboarding.
- SQL Querying: Write and execute SQL queries to validate data sets and perform root-cause analysis on data discrepancies.
Required Technical Skills
Data Modeling: Modern Data Architectures, Azure Synapse, Snowflake, Databricks Lakehouse, and Delta Lake technologies.
Governance Tools: Hands-on experience with a data governance tool such as Microsoft Purview, Informatica Cloud Data Governance, Alation, or Collibra.
Security & Classification: Experience with Data Classification tools and implementation.
Languages: Advanced SQL, Python (preferred) for data analysis, and DAX/M-Code (for Power BI).
Technical Methodologies: Data Governance Frameworks, Data Lineage Documentation, and Agile Scrum.
- A minimum of 5 years of experience in Data Architecture and platforms such as Azure Synapse, Snowflake, or Databricks.
- 7 years of SQL experience for data validation and root-cause analysis.
- A minimum of 5 years of experience with ETL/ELT logic design and managing multi-stage data flows.
- 2 years of experience with data governance frameworks and tools such as Purview, Alation, or Collibra.
- 7 years of technical documentation (FSD/TDD) creation and 5 years of Power BI-related data research and mapping experience.