This collection brings together lecture slides, workshop materials, and practical guides from past iterations of the Oxford LLMs summer school for social sciences. The content spans foundational concepts to advanced applications—from transformer architectures and model interpretability to fine-tuning techniques and LLM agent systems.

2024

Building on the 2023 program, the 2024 materials cover LLM fundamentals, applied tutorials, and recent advances. The program spans model architecture and evaluation, agent-based systems, advanced fine-tuning techniques, and practical implementations such as retrieval‑augmented generation and LLM observability.

Lectures

Seminars

2023

The 2023 program covers fundamental and applied aspects of transformer‑based language models, from the original “Attention is All You Need” paper to contemporary systems like ChatGPT. Lectures by Elena Voita focus on bias, interpretability, and alignment, while hands‑on seminars by Ilya Boytsov introduce fine‑tuning, parameter‑efficient methods, and classic NLP workflows.

Lectures

All lecture materials below were created by Elena Voita.

Seminars

Workshop materials below were designed and implemented by Ilya Boytsov.

GitHub Archive

All materials are also collected in the oxford-llms-workshop GitHub repository.