Research Software Engineer I/II/III

Overview

On Site
Full Time

Skills

Computer Science
High Performance Computing
Unstructured Data
Sensors
Documentation
Frontend Development
Interfaces
Coaching
Biomedical Engineering
Innovation
Privacy
Artificial Intelligence
Software Design
Data Quality
Extract
Transform
Load
Provision
Database
Software Documentation
Testing
Reliability Engineering
Research
Collaboration
API
Workflow
Sustainability
Access Control
HIPAA
Auditing
Continuous Improvement
Professional Development
Science
Statistics
Bioinformatics
Analytics
Python
SQL
Analytical Skill
RDBMS
PostgreSQL
MySQL
Orchestration
Reporting
PySpark
Git
GitHub
Version Control
Docker
Computer Networking
Management
Continuous Integration
Continuous Delivery
Data Lake
Storage
Apache Parquet
Linux
Command-line Interface
Scripting
System Administration
DevOps
Data Governance
Meta-data Management
Regulatory Compliance
Web Applications
Dashboard
Snowflake Schema
Communication
Organizational Skills
Supervision
Mentorship
Attention To Detail
Data Engineering

Job Details

Research Software Engineer I/II/III

Job no: 537625
Work type: Staff Full-Time
Location: Main Campus (Gainesville, FL)
Categories: Computer Science, Allied Health, Grant or Research Administration, Artificial Intelligence, Engineering
Department: 19340000 - EG-BIOMEDICAL ENGINEERING

Classification Title:
Research Software Engineer I/II/III

Classification Minimum Requirements:
The position title and level will be commensurate with experience.

Level I - A Bachelor's degree in computer or physical science, statistics, bioinformatics, analytics, or a similar field and two years of relevant experience; or a Master's degree in one of those fields.

Level II - A Bachelor's degree in computer or physical science, statistics, bioinformatics, analytics, or a similar field and three years of relevant experience; a Master's degree in one of those fields and one year of experience; or a Doctoral degree in one of those fields.

Level III - A Bachelor's degree in computer or physical science, statistics, bioinformatics, analytics, or a similar field and five years of relevant experience; a Master's degree in one of those fields and three years of experience; or a Doctoral degree in one of those fields and one year of experience.

Job Description:
This position will support the AI-Powered Athletics project, a collaborative effort between the University of Florida (UF) and the University Athletics Association (UAA). A cornerstone of the project is an on-premises database built on HiPerGator, UF's AI supercomputer. The database integrates structured and unstructured data from UF student-athletes on health, nutrition, academic performance, and sports performance, including data from wearable sensors worn at practices and games. The role centers on building and maintaining this database: monitoring data ingestion, managing data pipelines, integrating new technologies and data types, managing role-based access control, ensuring industry-standard documentation practices, and implementing best practices for stability, security, and sustainability. Additionally, the role offers opportunities to collaborate on front-end development, such as creating dashboards and user interfaces that let coaching staff and researchers access, visualize, and interpret the data. The ability to interact with faculty and staff from Engineering, Research Computing, UF IT, Athletics, and other areas is essential.

This dynamic position will be integrated into the AI-Powered Athletics team, which is led by Dr. Jennifer Nichols (Biomedical Engineering) and Spencer Thomas (University Athletics Association). Our mission is to contribute to the research, education, and athletic communities at UF with innovation in athletic performance, AI, and technology.

Candidates will be expected to adhere to privacy regulations related to student-athletes, including but not limited to HIPAA, FERPA, NCAA compliance, and UAA reporting guidelines. Excellent communication skills and the ability to work on interdisciplinary teams are required.

Learn more about the AI-Powered Athletics project at
Learn more about HiPerGator at
  • Maintain and scale the UF Athletics Databank
    • Manage ingestion and storage of multi-modal structured and unstructured athlete data
    • Integrate new data types and new technologies following good software design principles
    • Build automated systems for ensuring data quality
    • Implement and improve workflow orchestration to ensure data pipeline efficiency
    • Provision, monitor, and troubleshoot the database infrastructure
    • Create and maintain comprehensive software documentation
    • Conduct thorough testing of the databank and workflows to ensure system reliability
  • Collaborate on athletic, research, and education projects
    • Collaborate with UF researchers, front-end developers, and athletics stakeholders to translate needs into technical requirements and features
    • Implement schema changes, feature flags, and API contracts to ensure smooth, reproducible data workflows
  • Ensure security, compliance, and sustainability
    • Enforce data-governance and security best practices: implement role-based access controls, encrypt data at rest and in transit, and ensure compliance with HIPAA, FERPA, NCAA reporting, and UAA guidelines.
    • Identify and communicate areas for improvement and continued development
    • Design and implement audit processes.
  • Continual improvement and professional development
    • Stay abreast of advancements in relevant technologies and tools.
    • May be required to perform other duties as assigned by supervisor, as needed.

Expected Salary:
Salary is commensurate with education and experience

Required Qualifications:
The position title and level will be commensurate with experience; degree and experience requirements for Levels I-III are as listed under Classification Minimum Requirements above.

Preferred:
Technical Skills

  • Strong programming skills in Python and SQL
  • Experience with analytical and relational database platforms (e.g. ClickHouse, PostgreSQL, MySQL, or similar)
  • Understanding of the full data lifecycle, including ingestion, orchestration, cleaning, and reporting using modern automation frameworks (e.g. Dagster, Airflow, PySpark, and similar)
  • Proficiency with Git/GitHub for collaborative development and version control
  • Containerization proficiency with Docker or Apptainer (including networking, volume management, and CI/CD integration)
  • Experience designing and implementing data lake/lakehouse architectures and working with columnar storage formats (e.g. Parquet, Delta Lake)
  • Strong Linux command-line, scripting, and systems administration skills
  • Understanding of DevOps practices, such as automated build, test, deploy pipelines and monitoring
  • Robust data-governance capabilities including metadata management, lineage, and adherence to regulatory compliance.
  • Experience developing user-facing web applications or dashboards (e.g. Streamlit) and integrating with Snowflake

Professional Skills

  • Excellent written and verbal communication and interpersonal skills
  • Excellent organizational skills and ability to prioritize and complete simultaneous projects with minimal supervision
  • Ability to mentor undergraduate students and less experienced developers in building, extending, or maintaining codebases
  • Accuracy, attention to detail, and commitment to developing efficient, robust, scalable, modular, and maintainable codebases.
  • Commitment to continuous learning and applying best practices in data engineering

Special Instructions to Applicants:
In order to be considered for this position, you must upload a cover letter and resume with your application.

This is a time-limited position.

Application must be submitted by 11:55 p.m. (ET) of the posting end date.

Health Assessment Required: No

Advertised: 21 Oct 2025 Eastern Daylight Time
Applications close: 04 Nov 2025 Eastern Standard Time

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.