This workshop teaches you how to run open-weight large language models (LLMs) locally on your own computer for research applications. You'll learn to use Ollama, a simple but powerful tool for running models such as Llama, Qwen, and Mistral, to perform text classification, quality evaluation, and batch processing of research data, all without cloud APIs or usage limits.
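
As a small taste of what's ahead, here is a minimal sketch of the kind of call you'll make during the workshop. It assumes the `ollama` Python client is installed, the Ollama server is running locally, and a model has already been pulled; the model name (`llama3.1`), prompt, and example text are illustrative choices, not part of the workshop materials.

```python
# Minimal sketch: classify a short text with a locally running Ollama model.
# Assumes `pip install ollama`, a running Ollama server, and that a model
# (here "llama3.1", an illustrative choice) has been pulled via `ollama pull`.
import ollama

text = "The experiment failed to replicate the original findings."

response = ollama.chat(
    model="llama3.1",
    messages=[
        {
            "role": "user",
            "content": (
                "Classify the sentiment of this research note as "
                f"positive, negative, or neutral:\n\n{text}"
            ),
        }
    ],
)

# The model's reply text is available under message -> content.
print(response["message"]["content"])
```

Everything in this snippet runs on your own machine: the prompt never leaves your computer, and you can repeat the call across thousands of documents without worrying about per-request costs.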