From the course: Introduction to AI Orchestration with LangChain and LlamaIndex
What you should know
For this course, you should be comfortable reading and writing Python code. It will be helpful if you've at least played with apps like ChatGPT, but you don't need any prior experience programming with large language models (LLMs) or with orchestration frameworks like LangChain or LlamaIndex. In this course, we'll concentrate on locally running LLMs as well as cloud-hosted ones like those from OpenAI. You should have a fairly modern development machine with at least 16 GB of RAM and plenty of free disk space. If your environment has a GPU, we will make use of it; even without one, you'll still be able to work with local models, just more slowly. Apple silicon with 16 GB or more of shared memory will also work well. Even if you're only interested in running cloud models rather than local ones, almost everything in this course will still apply to you. As you work through the course, I'd encourage you to use the notebook feature here on LinkedIn Learning to keep track of your notes, ideas, and inspiration. After you finish the course, you can export all of these notes for later use. Also, take advantage of the Q&A feature to let us know what you think of the course and what you'd like to learn next. Most importantly of all, have fun.
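If you'd like to confirm that your machine meets these guidelines before starting, the short sketch below checks for system RAM, a CUDA GPU, and Apple silicon (MPS) support. It is an illustrative example only, not part of the course materials, and it assumes you have PyTorch and psutil installed; the course itself may use different tooling.

```python
# Quick environment check (illustrative sketch; assumes PyTorch and psutil are installed).
import psutil
import torch

# Report total system memory; 16 GB or more is recommended for local models.
ram_gb = psutil.virtual_memory().total / 1e9
print(f"System RAM: {ram_gb:.1f} GB (16 GB or more recommended)")

if torch.cuda.is_available():
    print("CUDA GPU detected; local models should run at full speed.")
elif torch.backends.mps.is_available():
    print("Apple silicon (MPS) detected; shared memory will be used.")
else:
    print("No GPU detected; local models will still run, just more slowly.")
```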
Practice while you learn with exercise files
Download the files the instructor uses to teach the course. Follow along and learn by watching, listening, and practicing.