From the course: Introduction to AI Orchestration with LangChain and LlamaIndex


LLM function calling

You've probably asked an AI chat engine a question before, but has an LLM ever asked you to do something? This simple concept, called task offloading, opens a new world of possibilities for LLMs and AI apps. The implementation of this feature, though, is unfortunately named function calling. I say the name is unfortunate because the LLM isn't actually calling a function in the traditional sense; at best, you could think of it as a kind of RPC, or Remote Procedure Call. When the need arises, the LLM pauses the conversation to make a specific request and returns control to the user to respond, for example, by calling a Python function. The next turn in the conversation then carries the answer back, and the LLM resumes whatever it was doing at the time. The key thing here is that this requires multiple conversational turns to complete. The OpenAI API expects a specific JSON description of the function, including its name, description, and parameters. The return type will get…
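To make the flow above concrete, here is a minimal sketch of the caller's side of one such exchange. The `get_weather` function, its parameters, and the `tool_call` payload are all hypothetical illustrations, not examples from the course; the JSON tool description follows the general shape the OpenAI Chat Completions API expects (name, description, parameters), and the "model's" tool call is simulated locally rather than fetched from the API.

```python
import json

# Hypothetical tool description in the shape the OpenAI API expects:
# a name, a description, and a JSON Schema for the parameters.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current temperature for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

def get_weather(city: str) -> str:
    # Stub standing in for a real weather lookup.
    return f"22C in {city}"

# Simulated tool-call message: when the model decides to offload a task,
# it pauses the conversation and emits a request shaped like this.
tool_call = {
    "id": "call_1",
    "function": {
        "name": "get_weather",
        "arguments": json.dumps({"city": "Oslo"}),
    },
}

# The user's side of the turn: decode the arguments, run the Python
# function, and package the result as the next message in the chat.
args = json.loads(tool_call["function"]["arguments"])
result = get_weather(**args)
tool_message = {
    "role": "tool",
    "tool_call_id": tool_call["id"],
    "content": result,
}
# Sending tool_message back is the extra conversational turn that lets
# the model resume with the answer in hand.
```

Note that nothing here executes inside the model; the LLM only describes the call it wants, and the surrounding program decides whether and how to run it.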
