
AI agents: API

Creating a connection string

Your agent needs a connection string to connect to the LLM. Create a connection string using an AiConnectionString instance and the PutConnectionStringOperation operation.
(You can also create a connection string using Studio, see here.)

You can use a local Ollama model if your main considerations are speed, cost, open source, or security,
or a remote OpenAI service for its additional resources and capabilities.

Example:

using (var store = new DocumentStore())
{
    // Define the connection string to OpenAI
    var connectionString = new AiConnectionString
    {
        // Connection string name & identifier
        Name = "open-ai-cs",

        // OpenAI connection settings
        OpenAiSettings = new OpenAiSettings(
            apiKey: "your-api-key",
            endpoint: "https://api.openai.com/v1",
            // LLM model for text generation
            model: "gpt-4.1")
    };

    // Deploy the connection string to the server
    var operation = new PutConnectionStringOperation<AiConnectionString>(connectionString);
    var putConnectionStringResult = store.Maintenance.Send(operation);
}

Syntax:

public class AiConnectionString
{
    public string Name { get; set; }
    public string Identifier { get; set; }
    public OpenAiSettings OpenAiSettings { get; set; }
    ...
}

public class OpenAiSettings : AbstractAiSettings
{
    public string ApiKey { get; set; }
    public string Endpoint { get; set; }
    public string Model { get; set; }
    public int? Dimensions { get; set; }
    public string OrganizationId { get; set; }
    public string ProjectId { get; set; }
}

Defining and running an AI agent

Define agent configuration

To create an AI agent you need to prepare an agent configuration and populate it with your settings and tools.

Start by creating a new AiAgentConfiguration instance, passing its constructor the agent name, the connection string name, and the system prompt.

The agent will send the system prompt you define here to the LLM, to define the LLM's role and explain how this role should be fulfilled.

The example below shows the role we assign to the LLM.
Throughout the code, we are careful about the details we send to the LLM regarding employees and orders.

// Start setting an agent configuration
var agent = new AiAgentConfiguration("reward-productive-employee", connectionString.Name,
    "You work for a human experience manager. " +
    "The manager uses your services to find which employee earned the largest profit for the company " +
    "and suggest a reward for this employee. " +
    "The manager provides you with the name of a country, or with the word \"everywhere\" to indicate all countries. " +
    "Then you: " +
    "1. use a query tool to load all the orders sent to the selected country, " +
    "or a query tool to load all orders sent to all countries. " +
    "2. calculate which employee made the largest profit. " +
    "3. use a query tool to learn in what region the employee lives. " +
    "4. find suitable vacation sites or other rewards based on the employee's residence area. " +
    "5. use an action tool to store in the database the employee's ID, profit, and your reward suggestions. " +
    "When you're done, return these details in your answer to the user as well."
);
  • Method definition:
public AiAgentConfiguration(string name, string connectionStringName, string systemPrompt);
  • AiAgentConfiguration definition:
public class AiAgentConfiguration
{
    // A unique identifier given to the AI agent configuration
    public string Identifier { get; set; }

    // The name of the AI agent configuration
    public string Name { get; set; }

    // Connection string name
    public string ConnectionStringName { get; set; }

    // The system prompt that defines the role and purpose of the agent and the LLM
    public string SystemPrompt { get; set; }

    // An example object that sets the layout for the LLM's response to the user.
    // The object is translated to a schema before we send it to the LLM.
    public string SampleObject { get; set; }

    // A schema that sets the layout for the LLM's response to the user.
    // If both a sample object and a schema are defined, only the schema is used.
    public string OutputSchema { get; set; }

    // A list of Query tools that the LLM can use (through the agent) to access the database
    public List<AiAgentToolQuery> Queries { get; set; } = new List<AiAgentToolQuery>();

    // A list of Action tools that the LLM can use to trigger the client to action
    public List<AiAgentToolAction> Actions { get; set; } = new List<AiAgentToolAction>();

    // Agent parameters whose values the client passes to the LLM each time a chat is started,
    // for stricter control over queries initiated by the LLM and as a means of interaction
    // between the client and the LLM.
    public List<AiAgentParameter> Parameters { get; set; } = new List<AiAgentParameter>();

    // The trimming configuration defines if and how the chat history is summarized,
    // to minimize the amount of data passed to the LLM when a chat is started.
    public AiAgentChatTrimmingConfiguration ChatTrimming { get; set; } = new AiAgentChatTrimmingConfiguration(new AiAgentSummarizationByTokens());

    // Controls the number of times that the LLM is allowed to use agent tools to handle a user prompt.
    public int? MaxModelIterationsPerCall { get; set; }
}

Once the agent configuration is created, we need to add a few additional elements to it.

Set agent ID

Use the Identifier property to give the agent a unique ID by which the system will recognize it.

// Set agent ID
agent.Identifier = "reward-productive-employee";

Add agent parameters

Agent parameters are optional variables whose values are provided by the client (or by a user through the client) to the agent when a chat is started.
The agent then embeds these values in query tools when the LLM uses them. Users and clients can provide their selections and preferences through agent parameters, focusing the queries and the whole interaction on their needs.

In the example below, an agent parameter is used to determine what area of the world a query will handle.

To add an agent parameter, create an AiAgentParameter instance, initialize it with the parameter's name and description (explaining to the LLM what the parameter is for), and pass this instance to the agent.Parameters.Add method.

  • Example:
// Set agent parameters
agent.Parameters.Add(new AiAgentParameter("country", "A specific country that orders were shipped to, " +
    "or \"everywhere\" to look for orders shipped to all countries"));
  • AiAgentParameter Definition:
public AiAgentParameter(string name, string description);

Set maximum number of iterations

You can limit the number of times that the LLM is allowed to request the usage of agent tools in response to a single user prompt.
To change this limit use MaxModelIterationsPerCall.

  • Example:
// Limit the number of times the LLM can request tools in response to a single user prompt
agent.MaxModelIterationsPerCall = 3;
  • MaxModelIterationsPerCall Definition:
public int? MaxModelIterationsPerCall { get; set; }

Set chat trimming configuration

The LLM doesn't keep the history of previous chats. To allow a continuous conversation, we include in every new message we send to the LLM the entire conversation since it started.
To save traffic and tokens, you can summarize older messages.
This can be helpful when transfer rate and cost are a concern, or when the context may become too large to handle efficiently.

To summarize old messages, create an AiAgentChatTrimmingConfiguration instance, use it to configure your trimming strategy, and set the agent's ChatTrimming property with the instance.

When creating the instance, pass its constructor a summarization strategy using an AiAgentSummarizationByTokens instance.

The original conversation history, before it was summarized, can optionally be kept in the @conversations-history collection.
To determine whether to keep the original messages and for how long, also pass the AiAgentChatTrimmingConfiguration constructor an AiAgentHistoryConfiguration instance with your history settings.

  • Example:
// Set chat trimming configuration
AiAgentSummarizationByTokens summarization = new AiAgentSummarizationByTokens()
{
    // When the number of tokens stored in the conversation exceeds this limit,
    // summarization of old messages will be triggered.
    MaxTokensBeforeSummarization = 32768,
    // The maximum number of tokens that the conversation is allowed to contain
    // after summarization.
    MaxTokensAfterSummarization = 1024
};

agent.ChatTrimming = new AiAgentChatTrimmingConfiguration(summarization);
  • Syntax:
public class AiAgentSummarizationByTokens
{
    // The maximum number of tokens allowed before summarization is triggered.
    public long? MaxTokensBeforeSummarization { get; set; }

    // The maximum number of tokens allowed in the generated summary.
    public long? MaxTokensAfterSummarization { get; set; }
}

public class AiAgentHistoryConfiguration
{
    // Enables history for the AI agent's conversations.
    public AiAgentHistoryConfiguration()

    // Enables history for the AI agent's conversations.
    // <param name="expiration">The timespan after which history documents expire.</param>
    public AiAgentHistoryConfiguration(TimeSpan expiration)

    // The time (in seconds) after which history documents expire.
    public int? HistoryExpirationInSec { get; set; }
}

Add agent tools

You can enhance your agent with Query and Action tools that allow the LLM to query your database and trigger client actions.

After defining agent tools and submitting them to the LLM, it is up to the LLM to decide if and when to use them.

Query tools

Query tools provide the LLM with the ability to retrieve data from the database.

A query tool includes a natural-language description that explains to the LLM what the tool is for, and an RQL query.
To use a query tool, the LLM requests the agent to apply it; the agent then runs the query and passes the results to the LLM.

Both the user and the LLM can provide values to the RQL query, using parameters.
In the query, a parameter is defined with a $.
E.g., where Country == $country

  • To include user parameters in the query, embed agent parameters in it - the parameters whose values the user provides when the chat is started.
  • To include LLM parameters in the query, define them in the tool's parameters schema. The LLM will pass their values each time it requests the agent to run the tool.

You can define a parameters schema as either a sample object or a schema.
Be aware that if you define both a sample object and a schema, only the schema will be used.
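For illustration, a sample object like the one used later for the $employeeId parameter might correspond to a parameters schema along these lines. The exact schema dialect the agent generates is not specified in this article, so the JSON Schema form below is an assumption:

```json
{
  "type": "object",
  "properties": {
    "employeeId": {
      "type": "string",
      "description": "embed the employee's ID here"
    }
  },
  "required": ["employeeId"]
}
```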

Query tools may include only read operations in their RQL queries.
To make changes in the database, use an action tool instead.

  • Example:
    • The first query tool will be used by the LLM when it needs to retrieve all the orders sent to any place in the world (the system prompt instructs it to use this tool when the user enters "everywhere" as the chat is started).
    • The second query tool will be used by the LLM when it needs to retrieve all the orders that were sent to a particular country. It uses the "country" agent parameter.
    • The third tool retrieves from the database the general location of an employee.
      To do this it uses a $employeeId parameter, whose value is set by the LLM in its request to run this tool.
agent.Queries =
[
    // Set query tool that triggers the agent to retrieve all the orders sent everywhere
    new AiAgentToolQuery
    {
        Name = "retrieve-orders-sent-to-all-countries",
        Description = "a query tool that allows you to retrieve all orders sent to all countries.",
        Query = "from Orders as O select O.Employee, O.Lines.Quantity",
        ParametersSampleObject = "{}"
    },

    // Set query tool that triggers the agent to retrieve all the orders sent to a specific country
    new AiAgentToolQuery
    {
        Name = "retrieve-orders-sent-to-a-specific-country",
        Description = "a query tool that allows you to retrieve all orders sent to a specific country",
        Query = "from Orders as O where O.ShipTo.Country == $country select O.Employee, O.Lines.Quantity",
        ParametersSampleObject = "{}"
    },

    // Set query tool that triggers the agent to retrieve the performer's
    // residence region details (country, city, and region) from the database
    new AiAgentToolQuery
    {
        Name = "retrieve-performer-living-region",
        Description = "a query tool that allows you to retrieve an employee's country, city, and region, by the employee's ID",
        Query = "from Employees as E where id() == $employeeId select E.Address.Country, E.Address.City, E.Address.Region",
        ParametersSampleObject = "{" +
            "\"employeeId\": \"embed the employee's ID here\"" +
        "}"
    }
];
  • Syntax: Query tools are defined in a list of AiAgentToolQuery classes.
public class AiAgentToolQuery
{
    public string Name { get; set; }
    public string Description { get; set; }
    public string Query { get; set; }
    public string ParametersSampleObject { get; set; }
    public string ParametersSchema { get; set; }
}

Action tools

Action tools allow the LLM to trigger client actions, such as modifying or adding documents, or any other operation that the client is permitted to perform.

Unlike a query tool, an action tool does not include a query. It includes only a description and a parameters schema.
The description informs the LLM, in natural language, of what the tool is capable of.
The LLM fills the schema with values when it requests the agent to apply the action.

In the example below, the action tool is requested to store an employee's details in the database. The LLM will provide these details when it requests the agent to apply the tool.

When the client finishes performing the action, it is required to send the LLM a response that explains how it went, e.g. "done".

  • The following action tool sends to the client employee details that the tool needs to store in the database.
agent.Actions =
[
    // Set action tool that triggers the client to store the performer's details
    new AiAgentToolAction
    {
        Name = "store-performer-details",
        Description = "an action tool that allows you to store the ID of the employee that made " +
            "the largest profit, the profit, and your suggestions for a reward, in the database.",
        ParametersSampleObject = "{" +
            "\"employeeID\": \"embed the employee's ID here\"," +
            "\"profit\": \"embed the employee's profit here\"," +
            "\"suggestedReward\": \"embed your suggestions for a reward here\"" +
        "}"
    }
];
  • Syntax: Action tools are defined in a list of AiAgentToolAction classes.
public class AiAgentToolAction
{
    public string Name { get; set; }
    public string Description { get; set; }
    public string ParametersSampleObject { get; set; }
    public string ParametersSchema { get; set; }
}

Create a Response object and the Agent

The agent configuration is almost ready.
The only part still missing is an object for the LLM's response when it finishes its work and needs to reply.

Create a response object class with the fields that you want the LLM to fill in its response.
Then, create the agent using the CreateAgentAsync method and pass it a new instance of your response object.
Set each response-object property with a natural-language explanation to the LLM, indicating what the LLM should embed in it.

  • Example:
// Create the agent
// Pass it an object for its response
var createResult = await store.AI.CreateAgentAsync(agent, new Performer
{
    employeeID = "the ID of the employee that made the largest profit",
    profit = "the profit the employee made",
    suggestedReward = "your suggestions for a reward"
});

// An object for the LLM response
public class Performer
{
    public string employeeID;
    public string profit;
    public string suggestedReward;
}

Alternatively, you can set the agent configuration's SampleObject or OutputSchema properties with, respectively, a sample object or a schema string, and use a CreateAgentAsync overload that creates the agent without passing it a response object.

A sample object is a JSON object whose properties indicate in natural language what values the LLM should embed in them. A schema follows the formal format used by the LLM.

A sample object is normally easier to create. Note that if you define a sample object, the agent will translate it to a schema in any case before passing it to the LLM.

If you define both a sample object and a schema, only the schema will be used.
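If you prefer to set OutputSchema directly, the schema equivalent of a sample object might look roughly like the following. This is a hypothetical sketch: the schema dialect the agent actually generates is an assumption here, modeled on JSON Schema conventions:

```csharp
// Hypothetical sketch: an OutputSchema string for the Performer response,
// written in a JSON-Schema-like form (the exact dialect is an assumption).
agent.OutputSchema = "{" +
    "\"type\": \"object\", " +
    "\"properties\": {" +
        "\"employeeID\": { \"type\": \"string\", \"description\": \"the ID of the employee that made the largest profit\" }, " +
        "\"profit\": { \"type\": \"string\", \"description\": \"the profit the employee made\" }, " +
        "\"suggestedReward\": { \"type\": \"string\", \"description\": \"your suggestions for a reward\" }" +
    "}" +
    "}";
```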

  • Example:
// Set sample object
agent.SampleObject = "{" +
    "\"employeeID\": \"the ID of the employee that made the largest profit\", " +
    "\"profit\": \"the profit the employee made\", " +
    "\"suggestedReward\": \"your suggestions for a reward\"" +
    "}";
  • CreateAgentAsync overloads:
// Usage: create the agent with just the defined configuration
CreateAgentAsync(configuration);

// Create the agent with just the defined configuration
CreateAgentAsync(AiAgentConfiguration configuration, CancellationToken token = default(CancellationToken));

// Create the agent while passing it a response object
CreateAgentAsync<TSchema>(AiAgentConfiguration configuration, TSchema sampleObject, CancellationToken token = default(CancellationToken));

Conversations

A conversation is a communication session between the client, the agent, and the LLM, during which the LLM may use agent tools to interact with the database and the client.

If agent parameters were defined, the agent will start the conversation only after values are provided for them.

Continuous conversations

The LLM doesn't record its chats, but starts a new chat each time.
The AI agent allows a continuous conversation by storing conversation history in the @conversations collection. When a new chat is started using the ID of a stored conversation, the agent fetches the entire conversation and sends it to the LLM, and the conversation resumes where you left off.

Stored conversations' Prefix and IDs

Conversations are kept in the @conversations collection with a prefix (such as Chats/) that can be set when the conversation is initiated. The conversation ID that follows the prefix is created automatically by RavenDB, similarly to the creation of automatic IDs for documents.

You can:

  • Provide a full ID
  • Provide a prefix that ends with / or | to trigger automatic ID creation.
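For instance, the two options might be used like this (a sketch; the agent ID, the creation options, and the "Chats/" prefix are taken from the examples elsewhere in this article):

```csharp
// Resume an existing conversation by passing its full ID
var resumed = store.AI.Conversation(
    createResult.Identifier,
    "Chats/0000000000000008883-A",
    new AiConversationCreationOptions().AddParameter("country", "France"));

// Start a new conversation by passing a prefix that ends with '/':
// RavenDB appends an automatically generated ID after the prefix
var fresh = store.AI.Conversation(
    createResult.Identifier,
    "Chats/",
    new AiConversationCreationOptions().AddParameter("country", "France"));
```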

Set the conversation

  • Set a chat using the store.AI.Conversation method.
    Pass Conversation:
    • The agent ID
    • The conversation ID
      If you pass the method the ID of an existing conversation (e.g. "Chats/0000000000000008883-A"), the conversation will be retrieved from storage and continued where you left off.
      If you provide just a prefix (e.g. "Chats/"), a new conversation will start.
    • Agent parameter values, in an AiConversationCreationOptions instance.
  • Set the user prompt using the SetUserPrompt method.
    The user prompt informs the agent of the user's requests and expectations for this chat.
  • Use the value returned by the Conversation method to run the chat.
  • Example:
// Set the chat
// Pass it the agent's ID, a prefix for conversations stored in @conversations,
// and agent parameters' values.
// Here, the country is simply set to "France" for the example.
// A user would pick a country, or enter "everywhere" for all countries.
var chat = store.AI.Conversation(
    createResult.Identifier,
    "Performers/",
    new AiConversationCreationOptions().AddParameter("country", "France"));
  • Conversation Definition:
public IAiConversationOperations Conversation(string agentId, string conversationId, AiConversationCreationOptions creationOptions, string changeVector = null)
  • SetUserPrompt Definition:
void SetUserPrompt(string userPrompt);

Action tools handlers

Handle an action tool's request by passing the chat's Handle method the name of the action tool you want to handle. When the LLM sends your agent an action request, any data included in the request reaches the handler and the client can use it.

Pass Handle an object to populate with the request's data. The object should have the same structure you defined for the action tool's parameters schema.

When you finish handling the requested action, return an indication to the LLM that it was done.

// Handle action tool.
// In this example, the action tool is requested to store an employee's
// details in the database.
chat.Handle("store-performer-details", (Performer performer) =>
{
    using (var session1 = store.OpenSession())
    {
        // These values are passed to the client by the action tool
        Performer rewarded = new Performer
        {
            employeeID = performer.employeeID,
            profit = performer.profit,
            suggestedReward = performer.suggestedReward
        };

        // Store the values in the Performers collection in the database
        session1.Store(rewarded);
        session1.SaveChanges();
    }

    // Return to the agent an indication that the action went well.
    return "done";
});

// An object for the LLM response
public class Performer
{
    public string employeeID;
    public string profit;
    public string suggestedReward;
}

Conversation reply

LLM replies are returned by the agent to the client in an AiAnswer object.
The conversation status is indicated by AiAnswer.AiConversationResult.

  • AiAnswer syntax:
public class AiAnswer<TAnswer>
{
    // The answer content produced by the AI
    public TAnswer Answer;

    // The status of the conversation
    public AiConversationResult Status;
}

public enum AiConversationResult
{
    // The conversation has completed and a final answer is available
    Done,

    // Further interaction is required, such as responding to tool requests
    ActionRequired
}

Set user prompt and run the conversation

Set the user prompt using the SetUserPrompt method, and run the chat with the RunAsync method.

// Set the user prompt and run the chat
chat.SetUserPrompt("send a few suggestions to reward the employee that made the largest profit");

var LLMResponse = await chat.RunAsync<Performer>(CancellationToken.None);

if (LLMResponse.Status == AiConversationResult.Done)
{
    // The LLM successfully processed the user prompt and returned its response.
    // The performer's ID, profit, and suggested rewards were stored in the Performers
    // collection by the action tool, and are also returned in the final LLM response.
}

Full example

The agent in this example helps a human experience manager to reward employees.
Using query tools, it searches the orders sent to a certain country or (if the manager prompts it "everywhere") to all countries, and finds the employee that made the largest profit.
It then uses another query tool to find, by the employee's ID (fetched from the orders), the employee's residence region, and finds rewards based on this location.
Finally, it uses an action tool to store the employee's ID, profit, and reward suggestions in the Performers collection in the database, and returns the same details in its final response as well.

public async Task createAndRunAiAgent_full()
{
    var store = new DocumentStore();

    // Define the connection string to OpenAI
    var connectionString = new AiConnectionString
    {
        // Connection string name & identifier
        Name = "open-ai-cs",

        ModelType = AiModelType.Chat,

        // OpenAI connection settings
        OpenAiSettings = new OpenAiSettings(
            apiKey: "your-api-key",
            endpoint: "https://api.openai.com/v1",
            // LLM model for text generation
            model: "gpt-4.1")
    };

    // Deploy the connection string to the server
    var operation = new PutConnectionStringOperation<AiConnectionString>(connectionString);
    var putConnectionStringResult = store.Maintenance.Send(operation);

    using var session = store.OpenAsyncSession();

    // Start setting an agent configuration
    var agent = new AiAgentConfiguration("reward-productive-employee", connectionString.Name,
        "You work for a human experience manager. " +
        "The manager uses your services to find which employee earned the largest profit for the company " +
        "and suggest a reward for this employee. " +
        "The manager provides you with the name of a country, or with the word \"everywhere\" to indicate all countries. " +
        "Then you: " +
        "1. use a query tool to load all the orders sent to the selected country, " +
        "or a query tool to load all orders sent to all countries. " +
        "2. calculate which employee made the largest profit. " +
        "3. use a query tool to learn in what region the employee lives. " +
        "4. find suitable vacation sites or other rewards based on the employee's residence area. " +
        "5. use an action tool to store in the database the employee's ID, profit, and your reward suggestions. " +
        "When you're done, return these details in your answer to the user as well."
    );

    // Set agent ID
    agent.Identifier = "reward-productive-employee";

    // Set agent parameters
    agent.Parameters.Add(new AiAgentParameter("country", "A specific country that orders were shipped to, " +
        "or \"everywhere\" to look for orders shipped to all countries"));

    agent.Queries =
    [
        // Set query tool that triggers the agent to retrieve all the orders sent everywhere
        new AiAgentToolQuery
        {
            Name = "retrieve-orders-sent-to-all-countries",
            Description = "a query tool that allows you to retrieve all orders sent to all countries.",
            Query = "from Orders as O select O.Employee, O.Lines.Quantity",
            ParametersSampleObject = "{}"
        },

        // Set query tool that triggers the agent to retrieve all the orders sent to a specific country
        new AiAgentToolQuery
        {
            Name = "retrieve-orders-sent-to-a-specific-country",
            Description = "a query tool that allows you to retrieve all orders sent to a specific country",
            Query = "from Orders as O where O.ShipTo.Country == $country select O.Employee, O.Lines.Quantity",
            ParametersSampleObject = "{}"
        },

        // Set query tool that triggers the agent to retrieve the performer's
        // residence region details (country, city, and region) from the database
        new AiAgentToolQuery
        {
            Name = "retrieve-performer-living-region",
            Description = "a query tool that allows you to retrieve an employee's country, city, and region, by the employee's ID",
            Query = "from Employees as E where id() == $employeeId select E.Address.Country, E.Address.City, E.Address.Region",
            ParametersSampleObject = "{" +
                "\"employeeId\": \"embed the employee's ID here\"" +
            "}"
        }
    ];

    agent.Actions =
    [
        // Set action tool that triggers the client to store the performer's details
        new AiAgentToolAction
        {
            Name = "store-performer-details",
            Description = "an action tool that allows you to store the ID of the employee that made " +
                "the largest profit, the profit, and your suggestions for a reward, in the database.",
            ParametersSampleObject = "{" +
                "\"employeeID\": \"embed the employee's ID here\"," +
                "\"profit\": \"embed the employee's profit here\"," +
                "\"suggestedReward\": \"embed your suggestions for a reward here\"" +
            "}"
        }
    ];

    // Set chat trimming configuration
    AiAgentSummarizationByTokens summarization = new AiAgentSummarizationByTokens()
    {
        // When the number of tokens stored in the conversation exceeds this limit,
        // summarization of old messages will be triggered.
        MaxTokensBeforeSummarization = 32768,
        // The maximum number of tokens that the conversation is allowed to contain
        // after summarization.
        MaxTokensAfterSummarization = 1024
    };

    agent.ChatTrimming = new AiAgentChatTrimmingConfiguration(summarization);

    // Limit the number of times the LLM can request tools in response to a single user prompt
    agent.MaxModelIterationsPerCall = 3;

    var createResult = await store.AI.CreateAgentAsync(agent, new Performer
    {
        employeeID = "the ID of the employee that made the largest profit",
        profit = "the profit the employee made",
        suggestedReward = "your suggestions for a reward"
    });

    // Set chat ID, prefix, and agent parameters.
    // In this example the "country" agent parameter is set to "France",
    // to trigger a query that retrieves orders sent to a particular country.
    // Providing "everywhere" instead will trigger another query that retrieves all orders.
    var chat = store.AI.Conversation(
        createResult.Identifier,
        "Performers/",
        new AiConversationCreationOptions().AddParameter("country", "France"));

    // Handle the action tool that the LLM uses to store the performer's details in the database
    chat.Handle("store-performer-details", (Performer performer) =>
    {
        using (var session1 = store.OpenSession())
        {
            // These values are passed to the client by the action tool
            Performer rewarded = new Performer
            {
                employeeID = performer.employeeID,
                profit = performer.profit,
                suggestedReward = performer.suggestedReward
            };

            // Store the values in the Performers collection in the database
            session1.Store(rewarded);
            session1.SaveChanges();
        }
        return "done";
    });

    // Set the user prompt and run the chat
    chat.SetUserPrompt("send a few suggestions to reward the employee that made the largest profit");

    var LLMResponse = await chat.RunAsync<Performer>(CancellationToken.None);

    if (LLMResponse.Status == AiConversationResult.Done)
    {
        // The LLM successfully processed the user prompt and returned its response.
        // The performer's ID, profit, and suggested rewards were stored in the Performers
        // collection by the action tool, and are also returned in the final LLM response.
    }
}