This demo is designed to showcase the capabilities of LLMs in diagnosing I/O performance issues in HPC applications profiled using the Darshan I/O profiler as well as their ability to provide individualized performance guidance.
Each of the three example cases below represents a unique application profiled with Darshan and analyzed by an LLM. Click the "Interact with This Trace" button to see the original Darshan trace, the LLM's analysis of the trace, and a chat interface where you can ask the LLM questions about the trace.
This work was created by the DIRLab team at the University of Delaware in collaboration with Lawrence Berkeley National Laboratory, The Ohio State University, and NC State University.
This demo represents an ongoing extension of ION: Navigating the HPC I/O Optimization Journey using Large Language Models [HotStorage'24].
Any questions can be directed to the lead developer, Chris Egersdoerfer, at cegersdo@udel.edu.