Store data outside LLM context
In this example, we demonstrate how to use set_meta_data to store metadata that can be referenced later.
The AI agent will store a user's name and then look up any additional stored information.
The benefit of storing information in meta_data is that it can be referenced from any function that shares the same meta_data_token,
while never being exposed to the language model. This allows the AI agent to store sensitive information without
the risk of it appearing in the language model's context.
1. Storing user information
The store_user function is used to store the user's name. The user's name, along with a secret associated with them, is stored as meta_data.
Calling the function sends a request to data_map.webhooks.url with the user's name as a query parameter.
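As a rough sketch, the webhook portion of the store_user function could be configured as below. The URL is hypothetical, and the %{args.name} variable-expansion syntax is an assumption about how the argument is interpolated:

```json
{
  "function": "store_user",
  "purpose": "Store the user's name and associated information",
  "argument": {
    "type": "object",
    "properties": {
      "name": { "type": "string", "description": "The user's name" }
    }
  },
  "data_map": {
    "webhooks": [
      {
        "method": "GET",
        "url": "https://example.com/users?name=%{args.name}"
      }
    ]
  }
}
```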
In this example, the server responds with a Records array containing information about the user.
Server Response:
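The exact response body depends on your server; a hypothetical response containing a Records array might look like:

```json
{
  "Records": [
    "a1b2c3-user-secret"
  ]
}
```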
Once the response has returned from the webhook, three things happen in the function's data_map.webhooks.output.action: the
meta_data is set using set_meta_data, the get_user_info function is toggled on using toggle_functions, and the user is notified that their information was stored.
The meta_data key is set to the user's name, and the value is set to the first entry in the Records array.
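A minimal sketch of that output section follows. The set_meta_data and toggle_functions action names come from the steps above; the %{args.name} and %{Records[0]} expansions are assumptions about the variable syntax:

```json
{
  "output": {
    "response": "Tell the user their information has been stored.",
    "action": [
      { "set_meta_data": { "%{args.name}": "%{Records[0]}" } },
      { "toggle_functions": [ { "function": "get_user_info", "active": true } ] }
    ]
  }
}
```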
2. Looking up user information
After storing the user information in meta_data, we can now look up the user information with the get_user_info function.
This function will check whether the meta_data is valid by checking if the user's name exists as a key in the meta_data object.
If the meta_data is valid, the user will be told their information. If the meta_data is invalid, the user will be told that their information has not been stored.
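One way to sketch that check is with a pair of data_map expressions, as below. The function names and responses mirror the description above, but the nested %{meta_data.%{args.name}} expansion and the pattern-matching syntax are assumptions, not taken from the original:

```json
{
  "function": "get_user_info",
  "purpose": "Look up the user's stored information",
  "active": false,
  "argument": {
    "type": "object",
    "properties": {
      "name": { "type": "string", "description": "The user's name" }
    }
  },
  "data_map": {
    "expressions": [
      {
        "string": "%{meta_data.%{args.name}}",
        "pattern": "\\w+",
        "output": { "response": "Tell the user their stored information." }
      },
      {
        "string": "%{meta_data.%{args.name}}",
        "pattern": ".*",
        "output": { "response": "Tell the user their information has not been stored." }
      }
    ]
  }
}
```

The first expression matches when the key holds a value; the catch-all second expression handles the case where nothing was stored.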