This workshop offers a general introduction to the GPT (Generative Pre-trained Transformer) model. No technical background is required. We will explore the transformer architecture on which GPT models are built, how transformer models encode natural language into embeddings, and how GPT predicts text.
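For anyone curious before the session, here is a minimal sketch of GPT-style text prediction. It assumes the Hugging Face transformers library and the publicly available GPT-2 model; these are illustrative choices, not the workshop's own materials.

```python
# Minimal sketch: GPT-style next-token prediction.
# Assumes the Hugging Face `transformers` library and the public GPT-2
# model -- illustrative choices only, not the workshop's materials.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# The model extends the prompt by repeatedly predicting likely next tokens.
result = generator("Transformers encode natural language into", max_new_tokens=20)
print(result[0]["generated_text"])
```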
What You Should Know First
Python Fundamentals Parts 1-3, or equivalent knowledge.
Is Python not working on your laptop? Attend the workshop anyway; we can provide you with a cloud-based solution until you resolve the problems with your local installation.