Prompts and prompt templates - Python Tutorial
From the course: Hands-On AI: Building LLM-Powered Apps
- [Instructor] Let's continue our discussion about large language models. We now know they have emergent abilities, and we can instruct them to perform many different tasks. The way we instruct them is by using prompts. A prompt is a user-defined input to which the LLM is meant to respond. In the case of OpenAI models, we will be using a specific markup language called Chat Markup Language, or ChatML. In ChatML, we separate the prompt into a list of messages, each with a role and content. There are three types of roles: system, assistant, and user. System is the system prompt, where we provide instructions to the whole system; this is usually set at the beginning of a session and not changed. The assistant role is the language model's response. And the user role is for user inputs. So in a conversation, we can expect the list of messages to go from system to user, then the language model responds as assistant, then the user replies back as user, taking turns in the discussion.

To facilitate communication between users and language models, it is very common to use a prompt template. A prompt template is a predefined recipe for generating prompts, and LangChain provides an abstraction for building prompt templates. As an example, here is a prompt template for chat. In this prompt template, there is a system prompt and a user prompt. In the user prompt, we provide a variable called text using Python f-string syntax. When we use this prompt template, we substitute the text variable with the user-specified input, and the resulting string is filled into the template. Once substituted, we can send the whole thing to the large language model for it to generate answers.

So now we can grab an OpenAI API key and add a large language model to our chat-with-PDF application.
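As a concrete illustration of the three roles, here is a minimal sketch of a chat request, assuming the official openai Python package (v1 or later) and an OPENAI_API_KEY set in the environment; the model name and message contents are placeholders, not taken from the course.

```python
from openai import OpenAI

# The prompt is a list of messages, each with a role and a content string.
# Roles: "system" (instructions for the whole session, set once),
#        "user" (the person's input), "assistant" (the model's replies).
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize what a prompt template is."},
]

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The model answers in the "assistant" role; appending its reply to
# `messages` and adding a new "user" message continues the turn-taking.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=messages,
)
print(response.choices[0].message.content)
```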
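And here is a hedged sketch of such a prompt template using LangChain's ChatPromptTemplate abstraction; the system prompt wording is illustrative, but the `text` variable mirrors the one described above.

```python
from langchain.prompts import ChatPromptTemplate

# A prompt template is a predefined recipe: fixed instructions plus
# f-string-style placeholders that are filled in when the template is used.
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant that answers questions."),
    ("human", "{text}"),  # the user-specified input is substituted here
])

# Substitute the `text` variable with the user's input, producing the
# full list of messages to send to the large language model.
messages = prompt.format_messages(text="What is ChatML?")
print(messages)
```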
Contents
- Language models and tokenization (4m 53s)
- Large language model capabilities (1m 48s)
- Challenge: Introduction to Chainlit (2m 28s)
- Solution: Introduction to Chainlit solution (1m 18s)
- Prompts and prompt templates (3m)
- Obtaining an OpenAI token (1m 20s)
- Challenge: Adding an LLM to the Chainlit app (1m 31s)
- Solution: Adding an LLM to the Chainlit app (3m 20s)
- Large language model limitations (3m 43s)