Job Description:
No Corp-to-Corp; only W2/1099 resumes will be considered
Location Requirement: 100% onsite
Local candidates required; non-local candidates will be considered if they are willing to relocate to Carson or Las Vegas.
The Systems Analyst serves as the primary technical liaison among the Business, the Business Process Analysts, the Data Management Office (DMO), and the execution teams. This role is responsible for translating high-level data strategies into tactical technical requirements.
The Systems Analyst will oversee the technical translation of data mapping, multi-stage data flows, and complex ETL/ELT logic. The ideal candidate possesses deep expertise in the Medallion Architecture (or similar) to ensure the delivery of scalable, governed, and high-fidelity data assets that serve as the foundation for the organization's data modernization projects.
Key Responsibilities
1. Data Engineering Integration & Pipeline Oversight
- Medallion Architecture Implementation: Drive the tactical execution of the Medallion data design (Bronze/Silver/Gold) to modernize the organization's data lakehouse environment.
- ETL/ELT Logic Design: Architect and manage complex Extract, Transform, Load (ETL) specifications and transformation logic to automate multi-directional data flows.
- Persistent Staging & Data Lifecycle: Oversee the technical requirements for staging and landing zones to ensure high-availability and historical data integrity for downstream consumption.
- Technical Scalability & Performance Tuning: Optimize data system performance by identifying bottlenecks in data processing and ensuring the technical framework supports high-volume, diverse data sets while adhering to security protocols.
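As a minimal sketch of the Bronze/Silver/Gold flow described above (all table names, fields, and quality rules here are hypothetical illustrations, not part of this posting):

```python
# Minimal sketch of a Medallion (Bronze/Silver/Gold) flow.
# All record shapes, fields, and rules are hypothetical examples.

# Bronze: raw records landed as-is from a source system.
bronze = [
    {"id": "1", "amount": " 100.50", "region": "west"},
    {"id": "2", "amount": "75", "region": "West"},
    {"id": "2", "amount": "75", "region": "West"},   # duplicate row
    {"id": "3", "amount": None, "region": "east"},   # fails quality check
]

def to_silver(rows):
    """Silver: deduplicate, enforce types, standardize values."""
    seen, silver = set(), []
    for r in rows:
        if r["id"] in seen or r["amount"] is None:
            continue  # drop duplicates and records failing quality checks
        seen.add(r["id"])
        silver.append({
            "id": int(r["id"]),                      # type conversion
            "amount": float(r["amount"]),            # type conversion
            "region": r["region"].strip().title(),   # standardization rule
        })
    return silver

def to_gold(rows):
    """Gold: business-level aggregate ready for reporting."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # region-level totals built only from governed Silver data
```

The point of the layering is that each stage consumes only the previous one: raw landings stay auditable in Bronze, cleansing rules live in one place in Silver, and reporting reads Gold.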
2. Technical Data Governance & Mapping
- Source-to-Target Mapping: Execute source-to-target mappings and define the logic needed for data transformations, including type conversions and business rules.
- Automated Data Quality: Set up automated quality checks to ensure all ingested data is accurate and auditable.
- Privacy & Compliance: Integrate data privacy and regulatory standards into the system design.
- Metadata & Lineage: Use Master Data Management (MDM) and data cataloging tools to document technical metadata and maintain clear end-to-end data lineage.
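A hedged sketch of what a source-to-target mapping with type conversions and a business rule can look like in practice (every field name and rule below is illustrative, not from any actual system in this posting):

```python
from datetime import datetime

# Illustrative source-to-target mapping: each target column names its source
# field, a type conversion, and an optional business rule.
MAPPING = {
    "customer_id": ("CUST_NO", int, None),
    "signup_date": ("SIGNUP_DT",
                    lambda v: datetime.strptime(v, "%m/%d/%Y").date(),
                    None),
    "status":      ("STATUS_CD", str,
                    lambda v: "active" if v == "A" else "inactive"),
}

def transform(source_row):
    """Apply the mapping to one source record."""
    target = {}
    for target_col, (source_col, convert, rule) in MAPPING.items():
        value = convert(source_row[source_col])        # type conversion
        target[target_col] = rule(value) if rule else value  # business rule
    return target

row = transform({"CUST_NO": "42", "SIGNUP_DT": "07/04/2023", "STATUS_CD": "A"})
print(row)
```

Keeping the mapping declarative like this is also what makes lineage documentation straightforward: the table itself records where each target column comes from.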
3. Reporting & Advanced Analytics Enablement
- Technical Requirements Engineering: Bridge the gap between business ambiguity and technical execution by converting stakeholder needs into detailed Functional Specification Documents (FSD) and Technical Design Documents (TDD).
- BI Stack Optimization: Manage and optimize the semantic layers of reporting tools (Power BI and Business Objects) to ensure performant data modeling and self-service scalability.
- AI Analytics Readiness: Architect the data environment for AI/ML consumption by ensuring "clean-room" data availability, feature set readiness, and robust dashboarding.
- SQL Querying: Execute SQL scripting to validate data sets and perform root-cause analysis on data discrepancies.
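A minimal sketch of the kind of SQL validation query this bullet refers to, run here against an in-memory SQLite database (the tables, columns, and data are hypothetical examples):

```python
import sqlite3

# Hypothetical root-cause check: find orders that reference a missing
# customer, i.e. orphaned foreign keys that signal a data discrepancy.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1, 99.0), (11, 2, 15.0), (12, 9, 40.0);
""")

# LEFT JOIN + IS NULL surfaces rows with no matching parent record.
orphans = conn.execute("""
    SELECT o.id, o.customer_id
    FROM orders o
    LEFT JOIN customers c ON c.id = o.customer_id
    WHERE c.id IS NULL
""").fetchall()

print(orphans)  # each row is a discrepancy to investigate
```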
4. Technical Oversight
- Data Flow & Pipeline Optimization: Analyze existing data pipelines to identify opportunities for latency reduction and systemic throughput efficiency.
- Technical Liaison: Act as the technical point of contact for data projects, synchronizing efforts between Business Process Analysts, Data Engineers, and DevOps teams.
Required Skills:
- Data Modeling: Modern Data Architectures, Azure Synapse, Snowflake, Databricks Lakehouse, and Delta Lake technologies.
- Governance Tools: Hands-on experience with a data governance tool such as Microsoft Purview, Informatica Cloud Data Governance, Alation, or Collibra.
- Security & Classification: Experience with Data Classification tools and implementation.
- Languages: Advanced SQL, Python (preferred) for data analysis, and DAX/M-Code (for Power BI).
- Technical Methodologies: Data Governance Frameworks, Data Lineage Documentation, and Agile Scrum.
- A minimum of 5 years of experience with Data Architecture and platforms such as Azure Synapse, Snowflake, or Databricks.
- A minimum of 7 years of SQL experience for data validation and root-cause analysis.
- A minimum of 5 years of experience with ETL/ELT logic design and managing multi-stage data flows.
- A minimum of 2 years of experience with data governance frameworks and tools such as Purview, Alation, or Collibra.
- A minimum of 7 years of technical documentation (FSD/TDD) creation and 5 years of Power BI-related data research and mapping experience.
Desired Skills:
- Python language skills for data analysis.
- Experience architecting data environments for AI/ML consumption and analytics readiness.