Tool Use and Function Calling
Definition: Tool Use (Function Calling)
Tool use allows LLMs to call external functions to retrieve data, perform computations, or take actions:
tools = [{
"name": "run_simulation",
"description": "Run a wireless simulation with given parameters",
"input_schema": {
"type": "object",
"properties": {
"snr_db": {"type": "number"},
"modulation": {"type": "string"},
"n_bits": {"type": "integer"},
},
"required": ["snr_db", "modulation"],
},
}]
The LLM decides when and how to call tools based on the conversation.
Tool use transforms LLMs from text generators into agents that can interact with code, databases, and external services.
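Because the model generates tool arguments as free-form JSON, they should be checked against the declared input_schema before execution. A minimal sketch of such a check (the `validate_args` helper is illustrative, not part of any SDK, and covers only the schema features used above):

```python
def validate_args(schema, args):
    """Check tool-call arguments against a JSON-Schema-style input_schema."""
    type_map = {"number": (int, float), "integer": int, "string": str}
    props = schema.get("properties", {})
    # Every required field must be present
    for field in schema.get("required", []):
        if field not in args:
            return False, f"missing required field: {field}"
    # Every supplied field must be declared, correctly typed, and in any enum
    for key, value in args.items():
        spec = props.get(key)
        if spec is None:
            return False, f"unexpected field: {key}"
        expected = type_map.get(spec["type"])
        if expected and not isinstance(value, expected):
            return False, f"wrong type for {key}"
        if "enum" in spec and value not in spec["enum"]:
            return False, f"invalid value for {key}: {value}"
    return True, "ok"
```

Rejecting a malformed call with a readable error message lets the LLM retry with corrected arguments instead of crashing the pipeline.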
Definition: LLM Agent
An LLM agent is a system where the LLM acts as a controller, using tools in a loop to accomplish complex tasks:
while not done:
action = llm.decide(context, tools)
result = execute(action)
context.append(result)
Agent frameworks: LangChain, LlamaIndex, Claude tool use, OpenAI function calling.
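The decide-execute-observe loop above can be made concrete with a stubbed controller. Everything here (the `StubLLM` class, the action dictionary shape, the `add` tool) is an illustrative sketch, not any framework's API:

```python
import json

def run_agent(llm, tools, handlers, goal, max_steps=5):
    """Drive the decide -> execute -> observe loop until the controller finishes."""
    context = [{"role": "user", "content": goal}]
    for _ in range(max_steps):          # bound the loop: agents can spin forever
        action = llm.decide(context, tools)
        if action["type"] == "final":   # controller signals completion
            return action["answer"]
        result = handlers[action["name"]](action["input"])
        context.append({"role": "tool", "name": action["name"],
                        "content": json.dumps(result)})
    return None                         # gave up after max_steps

class StubLLM:
    """Toy controller: call the tool once, then report its result."""
    def decide(self, context, tools):
        if context[-1]["role"] == "tool":
            return {"type": "final", "answer": json.loads(context[-1]["content"])}
        return {"type": "tool", "name": "add", "input": {"a": 2, "b": 3}}

answer = run_agent(StubLLM(), tools=[{"name": "add"}],
                   handlers={"add": lambda inp: inp["a"] + inp["b"]},
                   goal="what is 2 + 3?")
```

The `max_steps` bound matters in practice: a controller that never emits a final answer would otherwise loop indefinitely, burning tokens on every iteration.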
Theorem: Tool Call Reliability
For a pipeline with n sequential tool calls, each with independent success probability p, the overall success rate is p^n. At p = 0.95 and n = 10, p^n ≈ 0.60. Error handling and verification at each step are essential.
Each tool call is an opportunity for the LLM to generate invalid arguments or misinterpret results. Reliability compounds multiplicatively.
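The compounding effect is easy to check numerically:

```python
def pipeline_success(p, n):
    """Overall success rate of n sequential steps, each independent with probability p."""
    return p ** n

# A 95%-reliable step, chained ten times, succeeds only ~60% of the time
print(f"{pipeline_success(0.95, 10):.3f}")
```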
Example: Implementing Tool Use for Simulation
Build a tool-use system where the LLM can run BER simulations and analyze results.
Tool Definition and Handler
import anthropic
import json
tools = [{
"name": "ber_simulation",
"description": "Simulate BER for a given modulation at a given SNR",
"input_schema": {
"type": "object",
"properties": {
"modulation": {"type": "string", "enum": ["BPSK", "QPSK", "16QAM"]},
"snr_db": {"type": "number"},
"n_bits": {"type": "integer", "default": 100000},
},
"required": ["modulation", "snr_db"],
},
}]
def handle_tool_call(name, inputs):
    if name == "ber_simulation":
        # Run the actual simulation and report results as a JSON string
        ber = simulate_ber(**inputs)
        n_bits = inputs.get("n_bits", 100000)
        return json.dumps({"ber": ber, "n_errors": int(ber * n_bits)})
    # Surface unknown tool names instead of failing silently
    raise ValueError(f"unknown tool: {name}")
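The handler above assumes a `simulate_ber` function. A minimal Monte Carlo sketch for the BPSK case (the other modulations in the schema are omitted here for brevity; the noise model assumes unit-energy symbols over AWGN):

```python
import math
import random

def simulate_ber(modulation, snr_db, n_bits=100_000, seed=0):
    """Monte Carlo BER for BPSK over an AWGN channel."""
    if modulation != "BPSK":
        raise NotImplementedError(f"only BPSK sketched here, got {modulation}")
    rng = random.Random(seed)               # fixed seed for reproducibility
    snr_lin = 10 ** (snr_db / 10)           # Eb/N0 as a linear ratio
    sigma = math.sqrt(1 / (2 * snr_lin))    # noise std dev for unit-energy symbols
    errors = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        symbol = 1.0 if bit else -1.0       # BPSK mapping: 1 -> +1, 0 -> -1
        received = symbol + rng.gauss(0, sigma)
        errors += (received > 0) != bool(bit)
    return errors / n_bits
```

As a sanity check, the simulated BER should track the theoretical BPSK curve 0.5·erfc(√(Eb/N0)): roughly 0.079 at 0 dB and 0.002 at 6 dB.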
Quick Check
What is the primary purpose of tool use in LLM applications?
a) To speed up inference
b) To ground LLM responses in real-time data and computations (correct)
c) To reduce API costs
Tools give LLMs access to external information and computation capabilities they lack internally.
Tool Use / Function Calling
A capability where LLMs generate structured calls to external functions, enabling interaction with code, databases, and APIs.
LLM Agent
A system where an LLM acts as a controller, iteratively using tools and reasoning to accomplish complex multi-step tasks.