Define a Function for OpenAI Function Calling
Function calling (now called "tool use" in the OpenAI API) lets GPT models invoke developer-defined functions with structured arguments. Instead of returning free-form text, the model returns a JSON object matching your function's parameter schema, which you parse and execute as application logic. This example defines a weather lookup function with properly typed parameters, required-field constraints, and descriptive strings that help the model decide when and how to call the function.

The function definition follows JSON Schema for the parameters property. Each parameter needs a type, a description, and optionally an enum for constrained values. The description field is the most important part: it tells the model what the parameter means and which values are valid. A parameter described as "city name in English" produces different model behavior than one described as "full city name including country for disambiguation, e.g. Paris, France vs Paris, Texas".

Tool choice can be set to "auto" (the model decides whether to call a function), "required" (the model must call a function), or {"type": "function", "function": {"name": "..."}} (force a specific function). For extraction tasks, use "required" to guarantee structured output. For assistant tasks, use "auto" so the model can answer directly when no function is needed.
{
  "model": "gpt-4o",
  "messages": [
    {"role": "user", "content": "What's the weather like in Tokyo right now?"}
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get current weather conditions for a city",
        "parameters": {
          "type": "object",
          "properties": {
            "city": {
              "type": "string",
              "description": "City name in English, e.g. Tokyo"
            },
            "units": {
              "type": "string",
              "enum": ["celsius", "fahrenheit"],
              "description": "Temperature unit"
            }
          },
          "required": ["city"]
        }
      }
    }
  ],
  "tool_choice": "auto"
}
FAQ
- What is the difference between function calling and structured outputs?
- Function calling (tool use) is for invoking external actions — the model decides when to call a function and returns structured arguments. Structured outputs constrain the final response format. They serve different purposes but can be combined.
- How do I handle the model's function call response?
- Check if the response finish_reason is "tool_calls". Extract the function name and arguments from tool_calls[0]. Execute your function with those arguments, then send the result back in a tool role message to continue the conversation.
- Can I define multiple functions?
- Yes. Include multiple objects in the tools array. The model selects the appropriate function based on the user's request and each function's description. In parallel tool call mode, the model can call multiple functions simultaneously.
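The response-handling loop described in the FAQ above can be sketched in Python. This is a minimal sketch, not a definitive implementation: the local get_weather stub and the TOOL_REGISTRY dispatcher are hypothetical names introduced for illustration, and the commented-out flow assumes the openai Python SDK's chat.completions interface.

```python
import json

# Hypothetical local implementation that the model's tool call dispatches to.
# A real app would query an actual weather service here.
def get_weather(city, units="celsius"):
    return {"city": city, "temperature": 22, "units": units}

# Map each tool name from the "tools" array to its local implementation.
TOOL_REGISTRY = {"get_weather": get_weather}

def handle_tool_call(name, arguments_json):
    """Parse the model-supplied JSON argument string and run the matching function."""
    func = TOOL_REGISTRY[name]
    args = json.loads(arguments_json)  # the model returns arguments as a JSON string
    return func(**args)

# With the openai Python SDK (assumed, not imported here), the round trip looks like:
#
#   resp = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
#   choice = resp.choices[0]
#   if choice.finish_reason == "tool_calls":
#       for call in choice.message.tool_calls:   # may be several in parallel mode
#           result = handle_tool_call(call.function.name, call.function.arguments)
#           messages.append({"role": "tool", "tool_call_id": call.id,
#                            "content": json.dumps(result)})
#       # send messages back to the API for the final natural-language answer

result = handle_tool_call("get_weather", '{"city": "Tokyo"}')
print(result["city"], result["units"])  # → Tokyo celsius
```

Keeping the dispatch logic in a registry keyed by function name means adding a second tool is just another entry in the dict plus another object in the tools array.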
Related Examples
The Chat Completion API is the primary interface for all GPT models and the foun...
Build an OpenAI System Prompt for a Coding Assistant
A well-structured system prompt is the most important factor in GPT model output...
Build a JSON Schema for Structured AI Outputs
Structured output APIs from OpenAI (json_schema mode) and Anthropic (tool use) r...