University of Florida

Research Software Engineer I/II/III

University of Florida, Gainesville, Florida, US, 32635


Classification Title: Research Software Engineer I/II/III

Classification Minimum Requirements

Level I – A Bachelor’s degree in computer or physical science, statistics, bioinformatics, analytics, or a similar field and two years of relevant experience; or a Master’s degree in one of these fields.

Level II – A Bachelor’s degree in computer or physical science, statistics, bioinformatics, analytics, or a similar field and three years of relevant experience; a Master’s degree in one of these fields and one year of experience; or a Doctoral degree in one of these fields.

Level III – A Bachelor’s degree in computer or physical science, statistics, bioinformatics, analytics, or a similar field and five years of relevant experience; a Master’s degree in one of these fields and three years of experience; or a Doctoral degree in one of these fields and one year of experience.

Job Description: This position will support the AI-Powered Athletics project, a collaborative effort between the University of Florida (UF) and the University Athletics Association (UAA). A cornerstone of the project is an on-prem database built on HiPerGator, UF’s AI supercomputer. The database integrates structured and unstructured data on UF student‑athletes’ health, nutrition, academic performance, and sports performance, including data from wearable sensors worn at practices and games. The role is to build and maintain this database: monitoring data ingestion, managing data pipelines, integrating new technologies and data types, managing role‑based access control, ensuring industry‑standard documentation practices, and implementing best practices for stability, security, and sustainability.

Additionally, this role offers opportunities to collaborate on frontend development, such as creating dashboards and user interfaces that let coaching staff and researchers access, visualize, and interpret the data. The ability to interact with faculty and staff from Engineering, Research Computing, UF IT, Athletics, and other areas is essential.

This dynamic position will be integrated into the AI-Powered Athletics team, which is led by Dr. Jennifer Nichols (Biomedical Engineering) and Spencer Thomas (University Athletics Association). Our mission is to contribute to the research, education, and athletic communities at UF with innovation in athletic performance, AI, and technology.

Candidates will be expected to adhere to privacy regulations related to student‑athletes, including but not limited to HIPAA, FERPA, NCAA compliance, and UAA reporting guidelines. Excellent communication skills and the ability to work on interdisciplinary teams are required.

Learn more about the AI-Powered Athletics project at https://aipoweredathletics.eng.ufl.edu/.

Learn more about HiPerGator at https://ai.ufl.edu/research/hipergator/.

Responsibilities

Maintain and scale the UF Athletics Databank:

- Manage ingestion and storage of multi-modal structured and unstructured athlete data
- Integrate new data types and new technologies following good software design principles
- Build automated systems for ensuring data quality
- Implement and improve workflow orchestration to ensure data pipeline efficiency
- Provision, monitor, and troubleshoot the database infrastructure
- Create and maintain comprehensive software documentation
- Conduct thorough testing of the databank and workflows to ensure system reliability

Collaborate on athletic, research, and education projects:

- Collaborate with UF researchers, front‑end developers, and athletics stakeholders to translate needs into technical requirements and features
- Implement schema changes, feature flags, and API contracts to ensure smooth, reproducible data workflows

Ensure security, compliance, and sustainability:

- Enforce data‑governance and security best practices: implement role‑based access controls, encrypt data at rest and in transit, and ensure compliance with HIPAA, FERPA, NCAA reporting, and UAA guidelines
- Identify and communicate areas for improvement and continued development
- Design and implement audit processes

Continual improvement and professional development:

- Stay abreast of advancements in relevant technologies and tools
- Perform other duties as assigned by the supervisor, as needed

Expected Salary: Commensurate with education and experience.

Required Qualifications

Same as the Classification Minimum Requirements listed above.

Preferred Technical Skills

Strong programming skills in Python and SQL

Experience with analytical and relational database platforms (e.g., ClickHouse, PostgreSQL, MySQL, or similar)

Understanding of the full data lifecycle, including ingestion, orchestration, cleaning, and reporting using modern automation frameworks (e.g., Dagster, Airflow, PySpark, or similar)

Proficiency with Git/GitHub for collaborative development and version control

Containerization proficiency with Docker or Apptainer (including networking, volume management, and CI/CD integration)

Experience designing and implementing data lake/lakehouse architectures and working with columnar storage formats (e.g., Parquet, Delta Lake)

Strong Linux command‑line, scripting, and systems administration skills

Understanding of DevOps practices, such as automated build, test, deploy pipelines and monitoring

Robust data‑governance capabilities, including metadata management, lineage tracking, and adherence to regulatory compliance

Experience developing user‑facing web applications or dashboards (e.g., Streamlit) and integrating with Snowflake

Preferred Professional Skills

Excellent written and verbal communication and interpersonal skills

Excellent organizational skills and ability to prioritize and complete simultaneous projects with minimal supervision

Ability to mentor undergraduate students and less experienced developers in building, extending, or maintaining codebases

Accuracy, attention to detail, and commitment to developing efficient, robust, scalable, modular, and maintainable codebases

Commitment to continuous learning and applying best practices in data engineering

Special Instructions to Applicants: To be considered for this position, you must upload a cover letter and resume with your application.

This is a time‑limited position.

Application must be submitted by 11:55 p.m. (ET) of the posting end date.

Health Assessment Required: No
