Configuration for the HTTP LLM application schema composition.

interface IConfig {
    equals: boolean;
    maxLength: number;
    reference: boolean;
    separate: null | ((schema: ILlmSchema) => boolean);
    strict: boolean;
}

Properties

equals: boolean

Whether to disallow superfluous properties or not.

false
maxLength: number

Maximum length of function name.

When a function name is longer than this value, it will be truncated.

If truncation is not possible because the truncated name would duplicate another, the function name is replaced with a randomly generated UUID (v4).

64
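The truncation rule above can be sketched as follows. Note that `truncateName` and the collision set are hypothetical names for illustration, not part of the actual library API:

```typescript
import { randomUUID } from "node:crypto";

// Hypothetical sketch of the naming rule: cut names that exceed maxLength,
// and fall back to a UUID v4 when the truncated name would collide.
function truncateName(
  name: string,
  maxLength: number,
  used: Set<string>,
): string {
  if (name.length <= maxLength) {
    used.add(name);
    return name;
  }
  const truncated: string = name.slice(0, maxLength);
  if (!used.has(truncated)) {
    used.add(truncated);
    return truncated;
  }
  return randomUUID(); // collision: replace with a random UUID v4
}
```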
reference: boolean

Whether to allow reference types everywhere.

If you configure this property to false, most reference types represented by ILlmSchema.IReference are escaped to plain types, except when a recursive type appears.

This is because some LLM models do not understand reference types well, and even modern LLMs sometimes hallucinate on them.

However, reference types make the schema smaller, which reduces the LLM token cost. Therefore, if you are using a modern LLM and want to reduce token cost, you can configure this property to true.

true
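The escaping behavior can be illustrated with a minimal sketch. The schema shape below is a simplified stand-in for ILlmSchema, not the real definition, and `escapeReference` is a hypothetical helper:

```typescript
// Simplified stand-in schema types (not the real ILlmSchema definitions).
type Schema =
  | { $ref: string }
  | { type: "string" | "number" | "boolean" }
  | { type: "object"; properties: Record<string, Schema> };

// Inline every reference into a plain type, but keep the reference
// when the type is recursive (already seen on the current path).
function escapeReference(
  schema: Schema,
  components: Record<string, Schema>,
  visited: Set<string> = new Set(),
): Schema {
  if ("$ref" in schema) {
    if (visited.has(schema.$ref)) return schema; // recursive: keep the reference
    visited.add(schema.$ref);
    return escapeReference(components[schema.$ref], components, visited);
  }
  if (schema.type === "object") {
    const properties: Record<string, Schema> = {};
    for (const [key, value] of Object.entries(schema.properties))
      properties[key] = escapeReference(value, components, new Set(visited));
    return { type: "object", properties };
  }
  return schema;
}
```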
separate: null | ((schema: ILlmSchema) => boolean)

Separator function for the parameters.

When composing parameter arguments through an LLM function call, there can be cases where some parameters must be composed by a human, or where the LLM cannot understand the parameter.

For example, if the parameter type has ILlmSchema.IString.contentMediaType configured, which indicates file uploading, it must be composed by a human, not by the LLM (Large Language Model).

In that case, if you configure this property with a function predicating whether the schema value must be composed by a human, the parameters are separated into two parts:

  • ILlmFunction.separated.llm
  • ILlmFunction.separated.human

When writing the function, note that returning true means the value is composed by a human, and returning false means it is composed by the LLM. Also, when predicating on the schema, it is better to utilize features like LlmTypeChecker.

null

Parameters

schema: Schema to be separated.

Returns whether the schema value must be composed by a human or not.
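A predicate of this kind might look like the following sketch, which routes file-upload strings to the human side. The schema shape here is a simplified stand-in for ILlmSchema:

```typescript
// Simplified stand-in schema types (not the real ILlmSchema definitions).
interface IStringSchema {
  type: "string";
  contentMediaType?: string; // e.g. "image/png" marks a file upload
}
interface IScalarSchema {
  type: "number" | "boolean";
}
type Schema = IStringSchema | IScalarSchema;

// Returning true => composed by a human; false => composed by the LLM.
const separate = (schema: Schema): boolean =>
  schema.type === "string" && schema.contentMediaType !== undefined;
```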

strict: boolean

Whether to apply the strict mode.

If you configure this property to true, the LLM function calling does not allow optional properties or dynamically keyed properties in the ILlmSchema.IObject type. In other words, when strict mode is enabled, ILlmSchema.IObject.additionalProperties is fixed to false, and every property must be listed in ILlmSchema.IObject.required.

However, strict mode actually shows lower performance in practice. If you utilize the typia.validate function and feed its validation feedback back to the LLM, the performance is much better than with strict mode.

Therefore, I recommend just turning off strict mode and utilizing the typia.validate function instead.

false
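Putting the defaults and the advice above together, a full configuration might look like this sketch; the interface is re-declared in simplified form so the snippet stands alone:

```typescript
// Simplified re-declaration of IConfig for self-containment.
interface IConfig {
  equals: boolean;
  maxLength: number;
  reference: boolean;
  separate: null | ((schema: unknown) => boolean);
  strict: boolean;
}

const config: IConfig = {
  equals: false,   // allow superfluous properties (default)
  maxLength: 64,   // default function-name limit
  reference: true, // smaller schema, lower token cost on modern models
  separate: null,  // no human/LLM parameter separation
  strict: false,   // prefer typia.validate feedback over strict mode
};
```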