Configuration for OpenAPI v3.1-based LLM schema composition.

interface IConfig {
    constraint: boolean;
    reference: boolean;
}
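
For example, the two flags can be combined as in the sketch below; the variable name and the selected values are illustrative only, not a recommended setting.

const config: IConfig = {
    constraint: true,  // keep constraint keywords such as "format" in the schemas
    reference: false,  // escape non-recursive references to plain types
};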

Properties

constraint: boolean

Whether to allow constraint properties or not.

If you configure this property to false, the composed schemas do not contain constraint properties such as format, pattern, minimum, or maximum. Instead, those constraints are written into the ILlmSchemaV3_1.__IAttribute.description property as a comment string like "@format uuid".

This is because some LLM schema models, such as IChatGptSchema, have banned such constraint properties: their target LLMs cannot understand the constraints well and tend to hallucinate on them.

Therefore, considering your LLM model's performance and capability, and the complexity of your parameter types, decide whether it is better to allow the constraint properties or not.

Default: true
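
As a rough sketch of the difference, the two plain objects below show how a single string schema might be rendered with constraint set to true versus false; the exact composer output may differ.

// With constraint: true, the constraint keyword is kept as-is.
const constrained = {
    type: "string",
    format: "uuid",
};

// With constraint: false, the same information is moved into the
// description property as a comment string.
const unconstrained = {
    type: "string",
    description: "@format uuid",
};
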
reference: boolean

Whether to allow the reference type everywhere.

If you configure this property to false, most reference types represented by ILlmSchemaV3_1.IReference are escaped to plain types, except in the case of recursive types.

This is because some small LLM models do not understand the reference type well, and even large LLM models sometimes hallucinate on it.

However, the reference type makes the schema smaller, which reduces the LLM token cost. Therefore, if you are using a large LLM model and want to reduce the token cost, you can configure this property to true.

Default: false
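
As a rough sketch, the two plain objects below contrast a schema that keeps a reference with its escaped (inlined) counterpart; the $defs layout here is an assumption for illustration, and the real composer output may differ.

// With reference: true, the named type stays as a $ref into $defs.
const withReference = {
    $defs: {
        Member: {
            type: "object",
            properties: {
                id: { type: "string" },
                name: { type: "string" },
            },
            required: ["id", "name"],
        },
    },
    type: "object",
    properties: {
        author: { $ref: "#/$defs/Member" },
    },
    required: ["author"],
};

// With reference: false, the non-recursive reference is escaped
// to a plain object type.
const escapedReference = {
    type: "object",
    properties: {
        author: {
            type: "object",
            properties: {
                id: { type: "string" },
                name: { type: "string" },
            },
            required: ["id", "name"],
        },
    },
    required: ["author"],
};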