Short course — Large Language Models


Are you a software developer, a bachelor's or master's graduate, or a lecturer/researcher looking to take your knowledge of AI and Large Language Models (LLMs) to the next level? This course offers an in-depth, hands-on introduction to the world of LLMs and Deep Learning.

In brief

  • Start date: 3 April 2025
  • Duration: 7 Thursday sessions
  • Class time: 10.00 - 16.00
  • Course fee: €1,750
  • Outcome: Certificate of participation

About the LLM Course

The rise of Artificial Intelligence (AI) is transforming the way companies operate, especially now that Large Language Models (LLMs) are making AI more accessible than ever. But what exactly can you achieve with LLMs, and, just as importantly, what are their limitations? We begin with an introduction to the fundamentals of LLMs before diving into practical implementation. Solid programming skills, preferably in Python, are therefore highly recommended.

By the end of this 7-day LLM course, you will have gained valuable insights into both the possibilities and constraints of LLMs. You will also have built your own Retrieval Augmented Generation (RAG) system to "chat" with your documents through an LLM. This course follows on from the AI in Practice course and requires additional programming experience.


Course programme

During this 7-day course, we blend theory with practice: you learn to build a fully functional RAG application.

Week 1: Introduction to Deep Learning and Natural Language Processing  

We explore questions such as: How does deep learning work? How does a computer learn to process language? What are embeddings?
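To give a flavour of the embeddings topic, here is a toy sketch. The vectors below are invented for illustration (real models learn embeddings with hundreds of dimensions from data); the point is that similar words get similar vectors, which we can measure with cosine similarity:

```python
import math

# Toy 3-dimensional "embeddings" (invented values for illustration only).
embeddings = {
    "cat": [0.9, 0.8, 0.1],
    "dog": [0.8, 0.9, 0.2],
    "car": [0.1, 0.2, 0.9],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Semantically related words end up closer together than unrelated ones.
print(cosine_similarity(embeddings["cat"], embeddings["dog"]))  # high
print(cosine_similarity(embeddings["cat"], embeddings["car"]))  # much lower
```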

Week 2: Transformers and Transfer Learning

The Transformer architecture has revolutionized LLMs. This week you learn how these models are constructed and how to transfer knowledge from pretrained models to new tasks.

Week 3: Prompting, API Calls and Dockerization

This week covers how to create effective prompts, launch a model in a Docker environment and make it accessible to other programs via an API call.
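As a sketch of what such an API call looks like, the snippet below builds the kind of JSON payload many chat-style LLM endpoints accept. The field names follow the widely used OpenAI-style convention; the model name and endpoint here are placeholders, not part of the course material:

```python
import json

def build_chat_request(system_prompt, user_message, model="local-llm"):
    """Assemble an OpenAI-style chat request body (a common convention)."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        "temperature": 0.2,  # low temperature -> more deterministic answers
    }

payload = build_chat_request(
    "You are a helpful assistant.",
    "Summarize this document in one sentence.",
)
print(json.dumps(payload, indent=2))
# This payload would then be POSTed to the model's HTTP endpoint,
# for example a model served from a Docker container on localhost.
```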

Week 4: Preprocessing, Vector Databases and Semantic Search

In week 4 we get started on the basic ingredients for RAG: chatting with your own documents. This includes preprocessing documents, storing them in a vector database, and using them for semantic search and RAG.
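The semantic-search step can be sketched in a few lines. This toy "vector database" is just an in-memory list with invented embeddings (a real system uses a proper vector store and an embedding model), but the ranking logic is the same idea:

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Tiny in-memory "vector database": each document chunk is stored together
# with a toy embedding (invented numbers for illustration).
store = [
    ("The invoice is due on 30 June.", [0.1, 0.9, 0.2]),
    ("Our office cat is named Whiskers.", [0.8, 0.1, 0.3]),
    ("Payment terms are 30 days.", [0.2, 0.8, 0.1]),
]

def search(query_embedding, k=2):
    """Return the k chunks whose embeddings are most similar to the query."""
    ranked = sorted(store, key=lambda item: cosine(query_embedding, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

# Pretend this vector is the embedding of "When do I have to pay?"
print(search([0.15, 0.85, 0.15]))
```

The two payment-related chunks rank above the unrelated one, which is exactly what RAG relies on in the next step.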

Week 5: Retrieval Augmented Generation, Autoencoders and JEPA

In week 5 we dive deeper into the different ways of working with embeddings. In addition to RAG, you explore Autoencoders (for tasks like anomaly detection) and developments in the Joint Embedding Predictive Architecture (JEPA).
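The core RAG idea can be sketched as prompt assembly: retrieved chunks are inserted into the prompt so the LLM answers grounded in your own documents. The retrieval step and the LLM call are stubbed out here; the chunks are invented examples:

```python
def build_rag_prompt(question, retrieved_chunks):
    """Combine retrieved document chunks and a question into one LLM prompt."""
    context = "\n".join(f"- {chunk}" for chunk in retrieved_chunks)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

chunks = ["Payment terms are 30 days.", "The invoice is due on 30 June."]
prompt = build_rag_prompt("When is the invoice due?", chunks)
print(prompt)
# This prompt would be sent to the LLM, which can now answer from the
# documents instead of relying only on what it memorized during training.
```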

Week 6: Automated Knowledge Graphs

Alongside RAG, you also learn how to work with automated knowledge graphs.
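At its simplest, a knowledge graph is a set of subject-predicate-object triples that can be queried directly. The triples below are hand-written examples; in the automated setting they would be extracted from text by a model:

```python
# A toy knowledge graph as subject-predicate-object triples.
triples = [
    ("HAN", "offers", "LLM course"),
    ("LLM course", "teaches", "RAG"),
    ("RAG", "uses", "vector database"),
]

def objects_of(subject, predicate):
    """Look up all objects matching a (subject, predicate) pattern."""
    return [o for s, p, o in triples if s == subject and p == predicate]

print(objects_of("LLM course", "teaches"))  # ['RAG']
```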

Week 7: User Interfaces

Ultimately, we want an interface that the user can interact with. In the last lesson we bring everything together: from preprocessing and the RAG model to the API and the user interface for interaction.

More information

Get to know the lecturer

Raoul Grouls is a lecturer-researcher at the HAN Research Center for AI & Data Science and holds a degree in Artificial Intelligence from Utrecht University. He has previously worked as a senior data scientist at various organizations such as Business & Decision and the international consulting firm Eraneos.


Contact us

Got a question? Contact us at ASK HAN. We're happy to help!

Opening hours

Monday to Friday: 08:00 - 17:00