The Trillion Parameter Consortium – an emerging collective of national laboratories, universities, institutes, and companies – brings together individuals and groups who develop, train, and apply large-scale models, along with those who operate the high-performance computing systems required for model training.
TPC supports collaboration among innovators in artificial intelligence, supercomputing, and data science. To that end, we are excited to announce a forthcoming seminar series featuring some of the most prominent figures in these domains. The seminars will explore the potential of Large Language Models (LLMs) and their synergy with High-Performance Computing (HPC) techniques and technologies.
Upcoming Events
Sajal Dash
Research Scientist at Oak Ridge National Laboratory
Presenting on May 8, 2024
Kyle Lo
Research Scientist at the Allen Institute for AI in Seattle
Presenting on May 22, 2024
Bo Li
Neubauer Associate Professor, Department of Computer Science, University of Chicago
Presenting on June 5, 2024
Hosted by:
Dario Dematties
Postdoctoral Researcher at
Northwestern Argonne Institute of Science and Engineering
Past Events
Michael C. Frank
Stanford University
Presented on April 24, 2024
Dexter Pratt
Director of Software Development
Presented on April 10, 2024
Rio Yokota
Global Scientific Information and Computing Center, Tokyo Institute of Technology
Presented on March 20, 2024
Yuan-Sen Ting
Australian National University and Ohio State University
Presented on March 6, 2024
Large Language Models (LLMs): Tutorial Workshop
Several Presenters
Presented on February 12 & 13, 2024
Professor Irina Rish
Université de Montréal (UdeM)
Presented on February 7, 2024
Kshitij Gupta
MSc student at Mila through the Université de Montréal (UdeM)
Presented on January 25, 2024
Leon Song
Senior Principal Research Manager at Microsoft Research
Presented on December 4, 2023
Rick L. Stevens
Associate Lab Director and Distinguished Fellow at Argonne National Laboratory
Presented on November 28, 2023