Oxford LLMs 2023 Highlights
The first-ever Oxford school on Large Language Models for Social Science took place in September 2023. We were fortunate to attract significant interest in the workshop and welcomed around 30 exceptionally qualified and enthusiastic participants. This event brought together PhD students and postdocs from around the world, all eager to delve into advanced text analysis and AI for social science.
School Conveners
We were privileged to have two exceptional experts guiding us through the school's activities. Dr. Elena Voita, a Research Scientist at FAIR (Meta AI), delivered insightful lectures on the history and development of NLP. She walked us through everything from early rule-based systems to the latest Transformer models and model alignment techniques.
Meanwhile, Ilya Boytsov, the NLP lead at Wayfair, conducted engaging workshops that allowed us to apply what we learned in practical settings. His hands-on coding tutorials were packed with industry insights and tips, providing participants with a real-world perspective on deploying language models. Despite the challenges, including visa issues that prevented Ilya from attending in person, he conducted cyberpunk-like hybrid workshops, teaching online to a live audience. These sessions were an unforgettable learning experience.
Open Source Materials
To ensure that the knowledge gained extended beyond the event, we created the website you are currently viewing and a GitHub repository. These platforms became the central hubs for all workshop materials, including lecture notes, coding exercises, and presentation slides (and soon to be videos!). By making these resources publicly accessible, we aim to allow a wider audience to benefit from the workshop.
Looking Ahead
The success of the 2023 summer school has set a high bar for the future. We're already planning improvements based on the invaluable feedback we received. With exciting new content and enhanced logistics, the next workshop promises to be at least as good as the one before, and hopefully even better!
Keep an eye on our website and GitHub repository for updates and access to our extensive collection of resources!