Education and Support of Large Language Models in a Research Institution
Juan José García Mesa and Gil Speyer
Volume 16, Issue 1 (March 2025), pp. 28–30
https://doi.org/10.22369/issn.2153-4136/16/1/6

BibTeX
@article{jocse-16-1-6,
  author = {Juan Jos\'{e} García Mesa and Gil Speyer},
  title = {Education and Support of Large Language Models in a Research Institution},
  journal = {The Journal of Computational Science Education},
  year = 2025,
  month = mar,
  volume = 16,
  issue = 1,
  pages = {28--30},
  doi = {https://doi.org/10.22369/issn.2153-4136/16/1/6}
}
As the capabilities of large language models (LLMs) continue to expand, with more accurate and powerful models released monthly, researchers and educators are increasingly eager to incorporate these tools into their work. The growing demand for this technology reflects its transformative potential in natural language processing and its impact on scientific research. However, as more users seek to harness the power of LLMs, the need to provide comprehensive education and scalable support becomes ever more critical. Our institution has recognized this challenge and developed a support framework that educates users through regular educational events, consultations, and project support. To address the growing need for LLM support, we have implemented several key strategies, including deploying JupyterLab sessions through Open OnDemand for seamless HPC access and integrating cloud-based solutions via Jetstream2. We provide insights into our approach, detailing how we empower researchers and educators to leverage the capabilities of LLMs in their diverse applications.
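For illustration, a JupyterLab session of the kind described is typically launched on an HPC cluster through a Slurm batch script that Open OnDemand submits and then proxies to the user's browser. The sketch below shows the general shape of such a script; the partition, account, module, and port values are placeholders, not the institution's actual configuration:

```shell
#!/bin/bash
#SBATCH --job-name=jupyterlab-llm   # hypothetical job name
#SBATCH --partition=gpu             # placeholder GPU partition
#SBATCH --gres=gpu:1                # one GPU for LLM inference
#SBATCH --mem=64G
#SBATCH --time=04:00:00

# Load a Python environment (module name is an assumption)
module load python/3.11

# Start JupyterLab without opening a local browser; Open OnDemand
# forwards the chosen port so the session appears in the web portal.
jupyter lab --no-browser --ip=0.0.0.0 --port=8888
```

In practice, Open OnDemand's interactive-app templates generate a script like this from a web form, so users select resources (GPUs, memory, walltime) without writing Slurm directives by hand.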