Schedule

December 15th, 2024

See the full schedule on the NeurIPS website.

Time (PST) Event
8:50 - 9:00 Introduction and opening remarks
9:00 - 9:30 Keynote Talk - Tom Goldstein: Can transformers solve harder problems than they were trained on? Scaling up test-time computation via recurrence
9:30 - 10:00 Keynote Talk - Surya Ganguli: An analytic theory of creativity in convolutional diffusion models
10:00 - 10:30 Keynote Talk - Hanie Sedghi: Exploring and Improving Planning Capabilities of LLMs
10:30 - 10:50 Coffee Break
10:50 - 11:05 Contributed Talk - Christos Perivolaropoulos: Softmax is not enough (for sharp out-of-distribution)
11:05 - 11:20 Contributed Talk - David Krueger: Input Space Mode Connectivity in Deep Neural Networks
11:20 - 12:20 Poster Session 1
12:20 - 13:20 Lunch Break
13:20 - 13:35 Contributed Talk - Antonio Sclocchi: Unraveling the Latent Hierarchical Structure of Language and Images via Diffusion Models
13:35 - 13:50 Contributed Talk - Bao Pham: Memorization to Generalization: The Emergence of Diffusion Models from Associative Memory
13:50 - 14:20 Keynote Talk - Zico Kolter: Is this really science? A lukewarm defense of alchemy
14:20 - 14:50 Keynote Talk - Misha Belkin: Building on observations: some personal experience
14:50 - 15:10 Coffee Break
15:10 - 16:10 Panel Discussion: Yasaman Bahri, Andrew Gordon Wilson, Misha Belkin, Eero Simoncelli, Surya Ganguli
16:10 - 16:30 Winners Announcement + Closing Remarks
16:30 - 17:30 Poster Session 2