The term "foundation model" refers to machine learning models that are trained on vast datasets and can be applied to a wide range of situations. The large language model GPT-4 is an example.
The guest's group has recently presented a foundation model for optophysiological responses in mouse visual cortex, trained on recordings from 135,000 neurons in mice watching movies. We discuss the design, validation, and use of this and future neuroscience foundation models.
