Respond by: 03/19/2026
Rate: DOE
Type: Contract
Work Mode: Remote
Location: Austin, TX
Please respond with a resume and three references, preferably supervisors (name, title, company, email, phone number).
A background check will be performed if a candidate is selected for placement and must be passed.
Job Description
This position focuses heavily on reverse engineering Mainframe-based data extracts and reports, integrating data into Snowflake, and supporting downstream analytics and reporting through Power BI.
This role bridges legacy systems and modern cloud-based data platforms, ensuring data accuracy, reliability, and availability for business intelligence needs.
- Major responsibilities include participating in detailed sessions with business stakeholders to understand reporting needs, data extract requirements, and downstream data consumption patterns.
- Develop reverse-engineering documentation for data extracts to support business reporting and extract needs.
- Design and implement ETL/ELT pipelines to move data into Snowflake using approved tools and frameworks (a minimal sketch of this kind of pipeline follows this list). Translate business requirements into clear functional and technical specifications for future self-service analytics development.
- Other responsibilities may include coordinating and overseeing the analysis, designs, plans, and diagrams, and verifying procedures for existing and proposed applications. Provide technical advice, assistance, and recommendations in matters related to report development.
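For illustration only, the sketch below shows the general shape of the Snowflake ELT work described above: landing a delimited mainframe extract from a named stage and shaping it into a typed table for analytics. Every stage, schema, table, and column name is hypothetical and not taken from this posting.

```sql
-- Illustrative only: all object names below are hypothetical.
-- Staging table; mainframe extract fields typically arrive as text.
CREATE OR REPLACE TABLE raw_extracts.permit_extract_stg (
    record_id     VARCHAR,
    permit_date   VARCHAR,
    operator_name VARCHAR,
    raw_amount    VARCHAR
);

-- Bulk-load extract files already placed in a named internal stage.
COPY INTO raw_extracts.permit_extract_stg
FROM @mainframe_stage/permits/
FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '|' SKIP_HEADER = 1);

-- Transform into a typed table that Power BI can query directly.
CREATE OR REPLACE TABLE analytics.permit_extract AS
SELECT
    record_id,
    TO_DATE(permit_date, 'YYYYMMDD')  AS permit_date,
    INITCAP(operator_name)            AS operator_name,
    TRY_TO_NUMBER(raw_amount, 12, 2)  AS amount
FROM raw_extracts.permit_extract_stg;
```

In practice the approved tools and frameworks named in this posting (e.g., Informatica, Matillion, or Azure Data Factory) would orchestrate this load rather than ad hoc SQL.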
Essential Job Duties
- Collaborate with data engineers and architects to ensure proper data modeling, schema design, and transformation logic in Snowflake.
- Validate data quality and reconcile source-to-target results across Mainframe, Lonestar, Open Systems, and Snowflake environments (see the reconciliation sketch after this list).
- Develop data conversion and system implementation plans.
- Prepare and obtain approval of system and programming documentation.
- Design new database schemas to persist data based on the requirements and technical specifications.
- Support the creation of Power BI datasets, semantic models, and dashboards using Snowflake as the primary data source.
- Troubleshoot data issues across the full pipeline, from Lonestar or Open Systems extract to Snowflake load to Power BI visualization.
- Document technical processes, data flows, and system dependencies.
- Work closely with business stakeholders to understand reporting requirements and translate them into technical solutions.
- Participate in code reviews, testing cycles, and production support activities.
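As a hedged illustration of the source-to-target reconciliation duty above, the query below compares row counts and a numeric total between the hypothetical staging and target tables from the earlier sketch. Real validation would span many more columns and the Mainframe, Lonestar, and Open Systems environments named in this posting.

```sql
-- Illustrative reconciliation query (hypothetical table names).
-- Row counts and amount totals should match between source and target.
SELECT
    (SELECT COUNT(*) FROM raw_extracts.permit_extract_stg) AS source_rows,
    (SELECT COUNT(*) FROM analytics.permit_extract)        AS target_rows,
    (SELECT SUM(TRY_TO_NUMBER(raw_amount, 12, 2))
       FROM raw_extracts.permit_extract_stg)               AS source_amount,
    (SELECT SUM(amount) FROM analytics.permit_extract)     AS target_amount;
```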
II. CANDIDATE SKILLS AND QUALIFICATIONS
Minimum Requirements: Candidates who do not meet or exceed the minimum stated requirements (skills/experience) will be displayed to customers but may not be chosen for this opportunity.
| Years | Required/Preferred | Experience |
| --- | --- | --- |
| 8+ | Required | Hands-on experience with Snowflake (SQL, data loading, transformations, performance tuning). |
| 8+ | Required | Understanding of ETL/ELT concepts, data pipelines, and data integration patterns. |
| 4+ | Required | Experience building or supporting Power BI reports, datasets, and data models. Strong experience gathering and documenting requirements for reporting, data extracts, or data-driven applications. |
| 5+ | Required | Ability to analyze complex data flows and troubleshoot issues across multiple systems. |
| 4+ | Required | Knowledge of enterprise information management processes and methodologies, relational database management systems, and metadata management. |
| 2+ | Preferred | Knowledge of local, state, and federal laws and regulations (e.g., PII) relevant to data management and data governance. |
| 2+ | Required | Experience working in fast-paced and ever-changing environments; meeting project delivery deadlines; and analyzing complex information and developing plans to address identified issues. |
| 2 | Required | Skill in interpersonal relationships, including the ability to work with people under pressure, negotiate among multiple parties, resolve conflicts, and establish and maintain effective working relationships with various levels of personnel. |
| 3+ | Preferred | Experience with Informatica, Matillion, Azure Data Factory, or similar data integration tools. |
| - | Preferred | Graduation from an accredited four-year college or university with major coursework in data processing, computer science, business administration, or a related field. |
III. TERMS OF SERVICE
Services are expected to start 4/01/2026 and to complete by 8/31/2026. Total estimated hours per Candidate shall not exceed 800 hours. This service may be amended, renewed, and/or extended, provided both parties agree to do so in writing; any extension will be determined in July or August of 2026 based on RRC funding availability for the next fiscal year.