Tool Use
How LLMs interact with external systems to extend their capabilities
What it is
Tool use (also called function calling) is an LLM's ability to invoke external code or APIs during a conversation. The model is given descriptions of the available tools, typically in its system prompt. When it decides a tool is needed, it emits a structured call; the calling code executes that call and returns the result to the model as context before it continues generating.
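The loop described above can be sketched in a few lines. This is a minimal illustration, not any provider's real API: the "model" here is a stub that requests a hypothetical `get_weather` tool and then answers once it sees the result, while the surrounding code plays the dispatcher role.

```python
import json

# Hypothetical tool registry: name -> callable.
# The dispatcher (this code), not the model, actually runs these.
TOOLS = {
    "get_weather": lambda city: f"18C and cloudy in {city}",
}

def stub_model(messages):
    """Stand-in for an LLM: request a tool, then answer using its result."""
    tool_results = [m for m in messages if m["role"] == "tool"]
    if tool_results:
        # A tool result is in context, so finish with a text answer.
        return {"type": "text",
                "text": f"The forecast: {tool_results[-1]['content']}"}
    # No tool result yet: emit a structured tool call.
    return {"type": "tool_call", "name": "get_weather",
            "arguments": {"city": "Paris"}}

def run_conversation(user_prompt):
    messages = [{"role": "user", "content": user_prompt}]
    while True:
        reply = stub_model(messages)
        if reply["type"] == "text":      # model is done generating
            return reply["text"]
        # Model asked for a tool: execute it, feed the result back as context.
        result = TOOLS[reply["name"]](**reply["arguments"])
        messages.append({"role": "assistant", "content": json.dumps(reply)})
        messages.append({"role": "tool", "content": result})

print(run_conversation("What's the weather in Paris?"))
```

Real systems follow the same shape: the only difference is that `stub_model` is replaced by a call to an LLM API, and the loop may run several tool calls before the model produces its final text.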
Common tools include web search, code execution, database queries, file operations, calendar access, and custom APIs. The model never runs code itself; it only decides what to call and interprets the results.
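A tool description given to the model usually pairs a name and a natural-language description with a JSON Schema for the parameters. Field names vary by provider; the sketch below uses common conventions and a hypothetical `web_search` tool, not any specific vendor's format.

```python
# Hypothetical tool description in the common name/description/schema shape.
# The model reads this to decide when and how to call the tool.
web_search_tool = {
    "name": "web_search",
    "description": "Search the web and return the top results.",
    "input_schema": {
        "type": "object",
        "properties": {
            "query": {
                "type": "string",
                "description": "The search query.",
            },
            "max_results": {
                "type": "integer",
                "description": "How many results to return.",
            },
        },
        "required": ["query"],
    },
}
```

The description field matters as much as the schema: it is the main signal the model uses to choose between tools, so vague descriptions lead to wrong or missed calls.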
This architecture is how Claude Code, ChatGPT's code interpreter, and virtually every AI product with external integrations work.