Function Calling
Learn how to connect large language models to external tools using the Avian API.
Avian's API is OpenAI-compatible, so you can use the official OpenAI client libraries and request schema.
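Because the API follows the OpenAI schema, you can point the standard OpenAI client at Avian's endpoint. A minimal Python sketch; the base URL shown is an assumption, so confirm the exact endpoint in your Avian dashboard:

```python
from openai import OpenAI

# Standard OpenAI client pointed at Avian's OpenAI-compatible endpoint.
# NOTE: the base URL is an assumption -- check your Avian dashboard for the exact value.
client = OpenAI(
    base_url="https://api.avian.io/v1",
    api_key="your-api-key-here",
)
```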
In an API call, you can describe functions and have the model intelligently choose to output a JSON object containing arguments to call one or many functions. The Chat Completions API does not call the function; instead, the model generates JSON that you can use to call the function in your code.
The latest models have been trained to both detect when a function should be called (depending on the input) and to respond with JSON that adheres to the function signature more closely than previous models. This capability also comes with potential risks. We strongly recommend building in user confirmation flows before taking actions that impact the world on behalf of users (sending an email, posting something online, making a purchase, etc.).
Model Name | Function Calling | Parallel Function Calling |
---|---|---|
Meta-Llama-3.1-405B-Instruct | ✅ | ✅ |
Function calling allows you to more reliably get structured data back from the model. For example, you can:

- Create assistants that answer questions by calling external APIs (e.g., define functions like `send_email(to: string, body: string)` or `get_current_weather(location: string, unit: 'celsius' | 'fahrenheit')`)
- Convert natural language into API calls (e.g., convert "Who are my top customers?" to `get_customers(min_revenue: int, created_before: string, limit: int)` and call your internal API)
- Extract structured data from text (e.g., define a function called `extract_data(name: string, birthday: string)` or `sql_query(query: string)`)

...and much more!
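As an illustrative sketch, a function such as `get_current_weather` from the list above could be described to the model in the OpenAI-compatible `tools` format; the exact fields and descriptions below are assumptions you would replace with your own:

```python
# A hypothetical tool definition in the OpenAI-compatible "tools" schema.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "City and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }
]
```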
The basic sequence of steps for function calling is as follows:
1. Call the model with the user query and a set of functions defined in the `tools` parameter.
2. The model can choose to call one or more functions; if so, the content will be a stringified JSON object adhering to your custom schema (note: the model may hallucinate parameters).
3. Parse the string into JSON in your code, and call your function with the provided arguments if they exist.
4. Call the model again by appending the function response as a new message, and let the model summarize the results back to the user.
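Put together, the round trip might look like the sketch below. It reuses the `client` and `tools` definitions from the earlier snippets, uses the `Meta-Llama-3.1-405B-Instruct` model from the table above, and `get_current_weather` is a stand-in for your own implementation:

```python
import json

def get_current_weather(location, unit="celsius"):
    # Stand-in for a real weather lookup in your own code.
    return json.dumps({"location": location, "temperature": "22", "unit": unit})

messages = [{"role": "user", "content": "What's the weather like in Boston?"}]

# Step 1: call the model with the user query and the tool definitions.
response = client.chat.completions.create(
    model="Meta-Llama-3.1-405B-Instruct",
    messages=messages,
    tools=tools,
    tool_choice="auto",
)
message = response.choices[0].message

# Steps 2-3: if the model chose to call a function, parse the stringified
# JSON arguments and call the corresponding function in your code.
if message.tool_calls:
    messages.append(message)
    for tool_call in message.tool_calls:
        args = json.loads(tool_call.function.arguments)  # may be hallucinated -- validate
        result = get_current_weather(**args)
        messages.append(
            {"role": "tool", "tool_call_id": tool_call.id, "content": result}
        )

    # Step 4: call the model again with the function result appended so it can
    # summarize the outcome back to the user.
    final = client.chat.completions.create(
        model="Meta-Llama-3.1-405B-Instruct",
        messages=messages,
    )
    print(final.choices[0].message.content)
```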
The default behavior for `tool_choice` is `tool_choice: "auto"`. This lets the model decide whether to call functions and, if so, which functions to call.

We offer three ways to customize the default behavior depending on your use case:

- To force the model to always call one or more functions, you can set `tool_choice: "required"`. The model will then select which function(s) to call.
- To force the model to call only one specific function, you can set `tool_choice: {"type": "function", "function": {"name": "my_function"}}`.
- To disable function calling and force the model to only generate a user-facing message, you can set `tool_choice: "none"`.
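As a sketch, continuing with the `client`, `tools`, and `messages` from the example above:

```python
# Force the model to call at least one of the provided functions.
response = client.chat.completions.create(
    model="Meta-Llama-3.1-405B-Instruct",
    messages=messages,
    tools=tools,
    tool_choice="required",
)

# Other options:
#   force one specific function:
#     tool_choice={"type": "function", "function": {"name": "get_current_weather"}}
#   disable function calling and return a plain user-facing message:
#     tool_choice="none"
```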
Remember to replace "your-api-key-here" with your Avian API Key when using this code.