Speaker
Daniel Schiller
(Institute for Theoretical Physics, Heidelberg University)
Description
Foundation models have proven highly successful for linguistic tasks, which naturally raises the desire to develop foundation models for physics data. However, existing physics networks are much smaller than publicly available Large Language Models (LLMs), which typically have billions of parameters. By applying pretrained LLMs in an unconventional way, we introduce large networks for cosmological data.
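The abstract does not specify how the pretrained LLMs are repurposed, but one common reading of "applying pretrained LLMs in an unconventional way" is to keep the pretrained transformer backbone frozen and train only new input and output projections that map numerical data into and out of the token-embedding space. The following is a minimal sketch under that assumption; the class name, `data_dim`, `out_dim`, and the choice of GPT-2 are illustrative placeholders, not the authors' actual setup.

```python
# Hypothetical sketch: reusing a frozen, text-pretrained LLM backbone
# for numerical (e.g. cosmological) data. Only the projections train.
import torch
import torch.nn as nn
from transformers import GPT2Model


class FrozenLLMRegressor(nn.Module):
    def __init__(self, data_dim: int, out_dim: int):
        super().__init__()
        # Backbone pretrained on text; its weights stay frozen.
        self.backbone = GPT2Model.from_pretrained("gpt2")
        for p in self.backbone.parameters():
            p.requires_grad = False
        hidden = self.backbone.config.n_embd
        # Trainable projections into and out of the LLM's embedding space.
        self.embed = nn.Linear(data_dim, hidden)
        self.head = nn.Linear(hidden, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, data_dim) -- a sequence of data "tokens".
        h = self.backbone(inputs_embeds=self.embed(x)).last_hidden_state
        return self.head(h.mean(dim=1))  # pool over sequence, predict targets


# Usage: e.g. map 16-dimensional data tokens to 2 regression targets.
model = FrozenLLMRegressor(data_dim=16, out_dim=2)
```

This keeps the billion-parameter-scale pretrained weights intact while adapting the model to a new data modality at low training cost, which is the general appeal of such an approach for physics data.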
Authors
Ayodele Ore
Caroline Heneka
Daniel Schiller
(Institute for Theoretical Physics, Heidelberg University)
Florian Nieser
Tilman Plehn