Application Systems Analyst Programmer 3
• Posted 3 hours ago • Updated 3 hours ago

Link Technologies
Job Details
Skills
- IT Consulting
- Data Management
- Translation
- Data Mapping
- Data Engineering
- High Availability
- Data Integrity
- Performance Tuning
- Data Processing
- Business Rules
- Privacy
- Systems Design
- Master Data Management
- Mobile Device Management
- Meta-data Management
- Advanced Analytics
- Requirements Engineering
- Functional Requirements
- Technical Drafting
- Business Intelligence
- Semantics
- Reporting
- Business Objects
- Scalability
- Analytics
- Artificial Intelligence
- Machine Learning (ML)
- Scripting
- Optimization
- Business Process
- DevOps
- Data Modeling
- Microsoft
- Informatica
- Cloud Computing
- Python
- Data Analysis
- DAX
- Documentation
- Agile
- Scrum
- Data Architecture
- Microsoft Azure
- Snowflake Schema
- Databricks
- SQL
- Data Validation
- Root Cause Analysis
- Extract
- Transform
- Load
- ELT
- Logic Synthesis
- Management
- Data Flow
- Data Governance
- Technical Writing
- FSD
- Test-driven Development
- Microsoft Power BI
- Research
- Mapping
- Law
Summary
- JOB-7531
- Application Systems Analyst Programmer 3
- Link Technologies (LinkTechConsulting.com), a Las Vegas-based IT consulting firm, is currently seeking an Application Systems Analyst / Programmer 3 to join our team.
Role Summary
The Systems Analyst serves as the primary technical liaison among the Business, Business Process Analysts, the Data Management Office (DMO), and the execution teams. This role is responsible for translating high-level data strategies into tactical technical requirements.
The Systems Analyst will oversee the technical translation of data mapping, multi-stage data flows, and complex ETL/ELT logic. The ideal candidate possesses deep expertise in the Medallion Architecture (or similar) to ensure the delivery of scalable, governed, and high-fidelity data assets that serve as the foundation for the organization's data modernization projects.
Key Responsibilities
- Data Engineering Integration & Pipeline Oversight
- Medallion Architecture Implementation: Drive the tactical execution of the Medallion data design (Bronze/Silver/Gold) to modernize the organization's data lakehouse environment (see the illustrative sketch following this list).
- ETL/ELT Logic Design: Architect and manage complex Extract, Transform, Load (ETL) specifications and transformation logic to automate multi-directional data flows.
- Persistent Staging & Data Lifecycle: Oversee the technical requirements for staging and landing zones to ensure high availability and historical data integrity for downstream consumption.
- Technical Scalability & Performance Tuning: Optimize data system performance by identifying bottlenecks in data processing and ensuring the technical framework supports high-volume, diverse data sets while adhering to security protocols.
- Technical Data Governance & Mapping
- Source-to-Target Mapping: Execute source-to-target mappings and define the logic needed for data transformations, including type conversions and business rules.
- Automated Quality Checks: Set up automated quality checks to ensure all ingested data is accurate and auditable.
- Privacy & Regulatory Integration: Integrate data privacy and regulatory standards into the system design.
- Metadata & Lineage: Use Master Data Management (MDM) and data cataloging tools to document technical metadata and maintain clear end-to-end data lineage.
- Reporting & Advanced Analytics Enablement
- Technical Requirements Engineering: Bridge the gap between business ambiguity and technical execution by converting stakeholder needs into detailed Functional Specification Documents (FSD) and Technical Design Documents (TDD).
- BI Stack Optimization: Manage and optimize the semantic layers of reporting tools (Power BI and Business Objects) to ensure high-performance data modeling and self-service scalability.
- AI Analytics Readiness: Architect the data environment for AI/ML consumption by ensuring "clean room" data availability, feature set readiness, and robust dashboarding.
- SQL Querying: Execute SQL scripting to validate data sets and perform root-cause analysis on data discrepancies.
- Technical Oversight
- Data Flow & Pipeline Optimization: Analyze existing data pipelines to identify opportunities for latency reduction and systemic throughput efficiency.
- Technical Liaison: Act as the technical point of contact for data projects, synchronizing efforts between Business Process Analysts, Data Engineers, and DevOps teams.
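For illustration only, below is a minimal sketch of how Bronze/Silver/Gold promotion and its transformation logic might look on a Databricks/Delta Lake stack; the table names, columns, and quality rules are hypothetical placeholders, not part of this posting.

```python
# Minimal Medallion sketch, assuming a Databricks/Delta Lake environment;
# all table names, columns, and rules are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land raw source data as-is for auditability.
bronze = spark.read.table("bronze.sales_orders")

# Silver: apply source-to-target mapping, type conversions, and basic quality rules.
silver = (
    bronze
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.sales_orders")

# Gold: aggregate into a governed, reporting-ready data asset.
gold = silver.groupBy("region", "order_date").agg(F.sum("amount").alias("daily_sales"))
gold.write.format("delta").mode("overwrite").saveAsTable("gold.daily_sales")
```

The pattern the posting describes generalizes from this sketch: raw data lands untouched in Bronze, Silver applies mappings and quality rules, and Gold exposes the curated assets consumed by reporting and analytics.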
Qualifications
- A minimum of 5 years of experience in Data Architecture and platforms such as Azure Synapse, Snowflake, or Databricks.
- 7 years of SQL experience for data validation and root-cause analysis (illustrated in the sketch following this list).
- A minimum of 5 years of experience with ETL/ELT logic design and managing multi-stage data flows.
- 2 years of experience with data governance frameworks and tools such as Purview, Alation, or Collibra.
- 7 years of technical documentation (FSD/TDD) creation and 5 years of Power BI-related data research and mapping experience.
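As a hedged illustration of the SQL validation and root-cause analysis work called out above, the snippet below reconciles a staging table against its target using Python's built-in sqlite3 module; the stg_orders and gold_orders tables and their columns are invented for the example.

```python
# Minimal data-validation sketch; tables and values are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders  (order_id INTEGER, amount REAL);
    CREATE TABLE gold_orders (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders  VALUES (1, 10.0), (2, 25.5), (3, 7.0);
    INSERT INTO gold_orders VALUES (1, 10.0), (2, 25.5);
""")

# Validation: row counts should match between staging and target.
row_counts = conn.execute("""
    SELECT (SELECT COUNT(*) FROM stg_orders)  AS stg_rows,
           (SELECT COUNT(*) FROM gold_orders) AS gold_rows
""").fetchone()

# Root-cause drill-down: which records exist in staging but never reached the target?
missing = conn.execute("""
    SELECT s.order_id, s.amount
    FROM stg_orders s
    LEFT JOIN gold_orders g ON g.order_id = s.order_id
    WHERE g.order_id IS NULL
""").fetchall()

print("staging vs target rows:", row_counts)   # (3, 2)
print("records missing from target:", missing) # [(3, 7.0)]
```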
Required Technical Skills
Data Modeling: Modern Data Architectures, Azure Synapse, Snowflake, Databricks Lakehouse, and Delta Lake technologies.
Governance Tools: Hands-on experience with a data governance tool such as Microsoft Purview, Informatica Cloud Data Governance, Alation, or Collibra.
Security & Classification: Experience with Data Classification tools and implementation.
Languages: Advanced SQL, Python (preferred) for data analysis, and DAX/M-Code (for Power BI).
Technical Methodologies: Data Governance Frameworks, Data Lineage Documentation, and Agile Scrum.
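To make the source-to-target mapping and data lineage documentation skills concrete, here is a small, self-contained Python sketch of how a mapping specification with type conversions and a lineage record might be captured; all field names, types, and rules are illustrative assumptions rather than anything specified by this role.

```python
# Illustrative source-to-target mapping spec with simple type conversions;
# all source/target names and rules are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class FieldMapping:
    source_field: str
    target_field: str
    target_type: str
    rule: str  # business rule or conversion note for the FSD/TDD

SALES_ORDER_MAPPING = [
    FieldMapping("ORD_ID",  "order_id",   "bigint",        "direct move, not null"),
    FieldMapping("ORD_DT",  "order_date", "date",          "parse yyyyMMdd string"),
    FieldMapping("AMT_LCL", "amount",     "decimal(18,2)", "round half-up, local currency"),
]

def lineage_record(source_table: str, target_table: str, mappings: list) -> dict:
    """Emit a simple lineage entry suitable for loading into a data catalog."""
    return {
        "source": source_table,
        "target": target_table,
        "fields": [
            {"from": m.source_field, "to": m.target_field, "type": m.target_type, "rule": m.rule}
            for m in mappings
        ],
        "documented_on": date.today().isoformat(),
    }

if __name__ == "__main__":
    print(lineage_record("SRC.SALES_ORD", "gold.sales_orders", SALES_ORDER_MAPPING))
```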
Link Technologies is an equal opportunity employer. All qualified applicants will receive consideration for employment without discrimination based on race, color, religion, sex, gender identity/expression, sexual orientation, national origin, protected veteran status, disability, or any other factors protected by law.
- Dice Id: linktech
- Position Id: JOB-7531
- Posted 3 hours ago
Company Info
Founded in 2000, Link Technologies is a certified SDB, DBE, WOSB, and 8(a) Graduate company delivering customized IT and cybersecurity solutions to top commercial, government, and hospitality clients. Specializing in infrastructure development, PCI compliance, QSA audits, and end-to-end project management, Link Technologies provides cost-effective, scalable solutions that align with its core values: Client Focus, Quality, and Satisfaction.
Link Technologies provides highly qualified professionals to meet evolving client needs in an industry with rigorous standards, enabling organizations to focus on what matters most: their core business. We operate our own fully managed Network Operations Center (NOC) and Security Operations Center (SOC), providing around-the-clock monitoring, SIEM services, and Tier 1 & 2 Help Desk support—all managed in-house to ensure quality, accountability, and rapid response.
Quality isn’t an add-on: it’s a standard. At Link Technologies, we believe every engagement, no matter how complex, should be executed with precision and consistency. The Link Technologies team is committed to delivering exceptional results the first time, every time. This commitment to quality is the foundation of our continued success and the driving force behind everything we do.