# Function Calling
Function calling with a Large Language Model (LLM) is a powerful feature that allows you to define functions that the LLM can call. The LLM will determine if and when a function is called.
There are many use cases for function calling. Here are a few that we've implemented in our Angular applications:
- Providing data to the LLM from state or an Angular service.
- Performing tasks on behalf of the user.
- Dispatching AI-scoped NgRx actions that perform tasks or provide suggestions.
## Demo

Run the function calling example in StackBlitz.
A few notes:

- First, you will need an OpenAI API key.
- Try the prompt "Who am I?". This will invoke the `getUser` function.
- Try the prompt "Show me all the lights". This will invoke the `getLights` function.
- Try the prompt "Turn off all lights". This will invoke the `controlLights` function.
## Defining a Function

Hashbrown provides the `createTool` function for defining a function that the LLM can call:

```typescript
createTool({
  name: 'getUser',
  description: 'Get information about the current user',
  handler: () => {
    const authService = inject(AuthService);
    return authService.getUser();
  },
});
```
Let's break down the example above:

- `name`: The name of the function that the LLM will call.
- `description`: A description of what the function does. The LLM uses this to determine whether it should call the function.
- `handler`: The function that is called when the LLM invokes the tool. This is where you perform any logic you need, such as fetching data from a service or performing a task. The handler is invoked with an `AbortSignal` and is expected to return a `Promise` of the `Result`.

The method signature for a `handler` is:

```typescript
(abortSignal: AbortSignal) => Promise<Result>;
```
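As a sketch, a framework-free handler matching this signature might look like the following. The `User` type and the stubbed return value are hypothetical, not part of hashbrown's API:

```typescript
// A minimal, framework-free sketch of a handler that matches the
// (abortSignal: AbortSignal) => Promise<Result> signature. The User type
// and the stubbed return value are hypothetical.
type User = { id: string; name: string };

const getUserHandler = (abortSignal: AbortSignal): Promise<User> => {
  // Bail out early if the chat run was cancelled before the handler started.
  if (abortSignal.aborted) {
    return Promise.reject(new Error('getUser aborted'));
  }
  // In a real app this would call an Angular service such as AuthService;
  // here we resolve a stub so the sketch is self-contained.
  return Promise.resolve({ id: 'user-1', name: 'Ada' });
};
```

The `AbortSignal` lets long-running work (such as an HTTP request) be cancelled if the LLM run is stopped.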
## Functions with Arguments

You can define a function's arguments using hashbrown's schema. The LLM will invoke the function with the specified arguments.

We'll be using Skillet - hashbrown's LLM-optimized schema language - for defining the function arguments. Let's look at an example function that enables the LLM to control the lights in our smart home client application.
```typescript
createTool({
  name: 'controlLight',
  description: 'Control a light',
  schema: s.object('Control light input', {
    lightId: s.string('The id of the light'),
    brightness: s.number('The brightness of the light'),
  }),
  handler: async (input) =>
    this.lightsStore.updateLight(input.lightId, {
      brightness: input.brightness,
    }),
});
```
Let's review the code above:

- `name`: The name of the function that the LLM will call.
- `description`: A description of what the function does. The LLM uses this to determine whether it should call the function.
- `schema`: The schema that defines the arguments the function requires. This is where you define the function's input parameters using Skillet.
- `handler`: The function that is called when the LLM invokes the tool. This is where you perform any logic you need, such as fetching data from a service or performing a task. The handler is invoked with an `AbortSignal` and is expected to return a `Promise` of the `Result`.

In this example, we expect the `input` to be an object with the properties `lightId` and `brightness`, which are defined in the `schema`.
Note that the `input` arguments are strongly-typed based on the provided schema.
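To illustrate what "strongly-typed" means here, consider this framework-free sketch. `ControlLightInput` is a hypothetical type standing in for what Skillet infers from `s.object(...)`; it is not hashbrown's API:

```typescript
// A framework-free sketch of the relationship between the schema and the
// handler's input type. ControlLightInput is a hypothetical stand-in for
// the type Skillet infers from s.object(...).
type ControlLightInput = { lightId: string; brightness: number };

const controlLightHandler = async (input: ControlLightInput) => {
  // Both properties are type-checked at compile time, so a typo such as
  // input.brightnes would be a compiler error rather than a runtime bug.
  return { id: input.lightId, brightness: input.brightness };
};
```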
## Providing the Functions

The next step is to provide the `tools` when using hashbrown's resource-based APIs.
```typescript
@Component({
  selector: 'app-chat',
  providers: [LightsStore],
  template: ` <!-- omitted for brevity - full code in stackblitz example --> `,
})
export class ChatComponent {
  lightsStore = inject(LightsStore);

  chat = chatResource({
    model: 'gpt-4.1',
    system: 'You are a helpful assistant that can answer questions and help with tasks',
    tools: [
      createTool({
        name: 'getUser',
        description: 'Get information about the current user',
        handler: () => {
          const authService = inject(AuthService);
          return authService.getUser();
        },
      }),
      createTool({
        name: 'getLights',
        description: 'Get the current lights',
        handler: async () => this.lightsStore.entities(),
      }),
      createTool({
        name: 'controlLight',
        description: 'Control a light',
        schema: s.object('Control light input', {
          lightId: s.string('The id of the light'),
          brightness: s.number('The brightness of the light'),
        }),
        handler: async (input) =>
          this.lightsStore.updateLight(input.lightId, {
            brightness: input.brightness,
          }),
      }),
    ],
  });

  sendMessage(message: string) {
    this.chat.sendMessage({ role: 'user', content: message });
  }
}
```
Let's review the code above:

- First, we define the `tools` array in the `chatResource` configuration.
- We use the `createTool` function to define the functions that the LLM can call.
- The `handler` functions perform the necessary logic, such as fetching data from services or updating state.
- Finally, we use the `sendMessage` method to send a message to the LLM, which can then invoke the defined functions as needed.
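Conceptually, when the LLM decides to call a tool it responds with the tool's name and arguments, and the matching handler is looked up and invoked. A simplified, framework-free sketch of that routing (illustrative only, not hashbrown's actual implementation) might look like:

```typescript
// A simplified sketch of routing a tool call by name to its handler.
// This is illustrative only, not hashbrown's internals.
type Tool = {
  name: string;
  handler: (input: unknown) => Promise<unknown>;
};

const tools: Tool[] = [
  { name: 'getUser', handler: async () => ({ name: 'Ada' }) },
  {
    name: 'controlLight',
    handler: async (input) => ({ ...(input as object), updated: true }),
  },
];

async function dispatchToolCall(name: string, input: unknown) {
  const tool = tools.find((t) => t.name === name);
  if (!tool) {
    throw new Error(`Unknown tool: ${name}`);
  }
  // The handler's result is sent back to the LLM as the tool call's output.
  return tool.handler(input);
}
```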
## Conclusion

Function calling is a powerful feature that allows you to define functions that the LLM can invoke. By using hashbrown's `createTool` function together with Skillet schemas, you can give the LLM strongly-typed tools that fetch data and perform tasks in your Angular application.