Optional `deprecated` — Whether the function is deprecated or not.

If `deprecated` is `true`, the function is not recommended for use, and
LLMs (Large Language Models) may decline to call it.
Optional `description` — Description of the function.

`IHttpLlmFunction.description` is composed using the following rules:

- `@param` tags
- `@security` tags
- `@tag` tags
- `@deprecated` tag is added when the operation is deprecated

For reference, the description is a critical property for teaching the
purpose of the function to LLMs (Large Language Models). LLMs use this
description to determine which function to call.

Also, when the LLM converses with users, the description explains the
function to the user. Therefore, the `description` property has the
highest priority and should be carefully considered.
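To illustrate the rules above, a composed description might look like the following. This is a hypothetical example; the summary text, parameter names, security scheme, and tag values are all invented for illustration:

```
Get sale review detail.

@param saleId Target sale's identifier
@param id Target review's identifier
@security bearer
@tag Sales
@deprecated
```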
`method` — HTTP method of the endpoint.
`name` — Representative name of the function.

The name identifies the function within the `IHttpLlmApplication`. Its
value is composed simply by joining the `IHttpMigrateRoute.accessor`
elements with the underscore (`_`) character.

Here is the composition rule of the `IHttpMigrateRoute.accessor`:

The accessor is composed with the following rules. At first, namespaces
are composed from the static directory names in the path. Parametric
symbols represented by `:param` or `{param}` cannot be part of the
namespace. Instead, they become part of the function name. The function
name is composed from the HTTP method and the parametric symbols, like
`getByParam` or `postByParam`. If there are multiple path parameters,
they are concatenated with `And`, like `getByParam1AndParam2`.

For reference, if the operation's method is `delete`, the function name
is replaced with `erase` instead of `delete`. This is because `delete`
is a reserved keyword in many programming languages.
- Example 1
  - Path: `POST /shopping/sellers/sales`
  - Accessor: `shopping.sellers.sales.post`
- Example 2
  - Endpoint: `GET /shoppings/sellers/sales/:saleId/reviews/:reviewId/comments/:id`
  - Accessor: `shoppings.sellers.sales.reviews.getBySaleIdAndReviewIdAndCommentId`
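The naming rule above can be sketched roughly as follows. This is an illustrative simplification, not the library's actual implementation; in particular, it does not reproduce the parameter-name disambiguation visible in Example 2:

```typescript
// Rough sketch of the accessor/naming rule described above.
// Illustrative only; not the actual @samchon/openapi implementation.
function nameOf(method: string, path: string): string {
  const segments: string[] = path.split("/").filter((s) => s.length > 0);
  // Static directory names become namespaces.
  const namespaces: string[] = segments.filter(
    (s) => !s.startsWith(":") && !s.startsWith("{"),
  );
  // Parametric symbols (":param" or "{param}") join the function name.
  const params: string[] = segments
    .filter((s) => s.startsWith(":") || s.startsWith("{"))
    .map((s) => s.replace(/[:{}]/g, ""));
  // "delete" is a reserved keyword in many languages, so use "erase".
  const verb: string =
    method.toLowerCase() === "delete" ? "erase" : method.toLowerCase();
  const capitalize = (s: string): string => s[0].toUpperCase() + s.slice(1);
  const func: string =
    params.length === 0
      ? verb
      : `${verb}By${params.map(capitalize).join("And")}`;
  return [...namespaces, func].join("_");
}
```

For example, `nameOf("POST", "/shopping/sellers/sales")` yields `"shopping_sellers_sales_post"`, matching Example 1, and a `DELETE` method produces an `erase...` function name.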
`operation()` — Get the Swagger operation metadata of the source.

Returns: Swagger operation metadata.
Optional `output` — Expected return type.

If the target operation returns nothing (`void`), the output is
`undefined`.
`parameters` — List of parameter types.

If you've configured `IHttpLlmApplication.IOptions.keyword` as `true`,
the number of `IHttpLlmFunction.parameters` is always 1, and the first
parameter's type is always `ILlmSchemaV3.IObject`. The properties' rule
is:

- `pathParameters`: Path parameters of `IHttpMigrateRoute.parameters`
- `query`: Query parameter of `IHttpMigrateRoute.query`
- `body`: Body parameter of `IHttpMigrateRoute.body`

```ts
{
  ...pathParameters,
  query,
  body,
}
```

Otherwise, there may be multiple parameters, and their sequence follows
the rule below:

```ts
[
  ...pathParameters,
  ...(query ? [query] : []),
  ...(body ? [body] : []),
];
```
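The two layouts above can be demonstrated with a small sketch. The interface and function names here are hypothetical, written only to make the rules concrete:

```typescript
// Hypothetical shape of a route's argument sources, for illustration.
interface IRouteArguments {
  pathParameters: Record<string, unknown>;
  query?: Record<string, unknown>;
  body?: unknown;
}

// keyword === true: a single object parameter merging everything.
function toKeywordArguments(a: IRouteArguments): [Record<string, unknown>] {
  return [{ ...a.pathParameters, query: a.query, body: a.body }];
}

// keyword === false: positional parameters in the documented sequence.
function toPositionalArguments(a: IRouteArguments): unknown[] {
  return [
    ...Object.values(a.pathParameters),
    ...(a.query ? [a.query] : []),
    ...(a.body ? [a.body] : []),
  ];
}
```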
`path` — Path of the endpoint.
`route()` — Get the migration route metadata of the source.

Returns: Migration route metadata.
Optional `separated` — Collection of separated parameters.

Filled only when `IHttpLlmApplication.IOptions.separate` is configured.
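When parameters are separated, the LLM-composed half and the human-composed half must be merged back before execution, via `HttpLlm.mergeParameters`. Conceptually the merge behaves like the sketch below; this is a hypothetical helper for illustration, and the real function operates on the schema-separated structure with a different signature:

```typescript
// Conceptual sketch of merging LLM- and human-composed argument halves.
// Hypothetical helper; not the real HttpLlm.mergeParameters signature.
function mergeArgumentHalves(
  llm: Record<string, unknown>,
  human: Record<string, unknown>,
): Record<string, unknown> {
  // Human-provided values (e.g., secrets, uploaded files) take precedence.
  return { ...llm, ...human };
}
```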
Optional `tags` — Category tags for the function.

Same as `OpenApi.IOperation.tags`, indicating the category of the
function.
`validate` — Validate function for the arguments.

You know what? LLMs (Large Language Models) like OpenAI's frequently
make mistakes when composing arguments for function calling. Even with
simple types like `number` defined in the parameters schema, LLMs often
provide a `string` typed value instead.

In such cases, you should provide validation feedback to the LLM using
this `validate` function. The `validate` function returns detailed
information about type errors in the arguments.

Based on my experience, OpenAI's `gpt-4o-mini` model tends to construct
invalid function calling arguments about 50% of the time on the first
attempt. However, when corrected through this `validate` function, the
success rate jumps to 99% on the second attempt, and I've never seen a
failure on the third attempt.

If you have separated the parameters, use the
`IHttpLlmFunction.ISeparated.validate` function instead when validating
LLM-composed arguments. In that case, this `validate` function is
meaningful only after you've merged the LLM- and human-composed
arguments using the `HttpLlm.mergeParameters` function.

Parameters: `args` — Arguments to validate.

Returns: Validation result.
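The retry pattern described here (validate, feed the type errors back to the model, try again) can be sketched as below. The `IValidation` shape and the loop are illustrative assumptions, not the library's exact types:

```typescript
// Illustrative validation-feedback loop; the types are assumptions, not
// the exact @samchon/openapi definitions.
type IValidation<T> =
  | { success: true; data: T }
  | {
      success: false;
      errors: Array<{ path: string; expected: string; value: unknown }>;
    };

function callWithFeedback<T>(
  compose: (feedback?: IValidation<T>) => unknown, // stands in for an LLM request
  validate: (args: unknown) => IValidation<T>,
  maxAttempts: number = 3,
): T {
  let feedback: IValidation<T> | undefined;
  for (let i = 0; i < maxAttempts; i++) {
    const result = validate(compose(feedback));
    if (result.success) return result.data; // valid arguments: done
    feedback = result; // return the detailed type errors to the model
  }
  throw new Error("LLM failed to compose valid arguments");
}
```

The point of the design is that the error report is structured (path, expected type, actual value), so the model can repair exactly the fields it got wrong rather than regenerating blindly.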
LLM function calling schema from an HTTP (OpenAPI) operation.

`IHttpLlmFunction` is a data structure representing a function converted
from an OpenAPI operation, used for LLM (Large Language Model) function
calling. It's a typical RPC (Remote Procedure Call) structure containing
the function name, parameters, and return type.

If you provide this `IHttpLlmFunction` data to an LLM provider like
"OpenAI", the provider will compose the function arguments by analyzing
its conversations with the user. With the LLM-composed arguments, you
can execute the function through `LlmFetcher.execute` and get the
result.

For reference, the difference between `IHttpLlmFunction` and its origin
source `OpenApi.IOperation` is that `IHttpLlmFunction` has converted
every type schema from `OpenApi.IJsonSchema` to `ILlmSchemaV3` to escape
reference types, and has downgraded the JSON schema to the OpenAPI 3.0
version. This is because the LLM function calling feature understands
neither reference types nor the OpenAPI 3.1 specification.

Additionally, the properties' rule is:

- `pathParameters`: Path parameters of `OpenApi.IOperation.parameters`
- `query`: Query parameter of `IHttpMigrateRoute.query`
- `body`: Body parameter of `IHttpMigrateRoute.body`

Author: Jeongho Nam - https://github.com/samchon

Reference: https://platform.openai.com/docs/guides/function-calling