Propagate the LLM function call.

HttpLlm.propagate() is a function that calls (propagates) the target API endpoint with the connection information and the arguments composed by a Large Language Model such as OpenAI (sometimes with human input).

By the way, if you've configured IHttpLlmApplication.IOptions.separate so that the parameters are separated into human and LLM sides, you have to merge the human and LLM sides' parameters into one through the HttpLlm.mergeParameters() function before calling this.
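To illustrate what such a merge conceptually does, here is a self-contained sketch. Only the function name HttpLlm.mergeParameters comes from the library; the merging logic below (human side overriding the LLM side, with recursive object merge) is a simplified assumption, not the library's actual implementation.

```typescript
// Simplified sketch of merging separated parameters.
// Assumption: human-side values take precedence over LLM-side values,
// and nested plain objects are merged recursively.
type Params = Record<string, unknown>;

function mergeParameters(llm: Params, human: Params): Params {
  const merged: Params = { ...llm };
  for (const [key, value] of Object.entries(human)) {
    const prev = merged[key];
    if (
      value !== null && typeof value === "object" && !Array.isArray(value) &&
      prev !== null && typeof prev === "object" && !Array.isArray(prev)
    ) {
      merged[key] = mergeParameters(prev as Params, value as Params);
    } else {
      merged[key] = value; // human side overrides the LLM side
    }
  }
  return merged;
}

// Example: the LLM composed the body, the human supplied the secret key.
const merged = mergeParameters(
  { body: { title: "Hello", secretKey: null } },
  { body: { secretKey: "sk-..." } },
);
// merged.body now contains both the LLM's title and the human's secretKey.
```

The point of the separation is that secrets or personal data never pass through the LLM; they are filled in by the human side and merged just before the actual API call.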

About the IHttpLlmApplication.IOptions.keyword option, you don't need to worry about anything. This HttpLlm.propagate() function automatically recognizes keyworded arguments and converts them into the proper positional sequence.
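The conversion can be sketched as follows. The shapes here (FunctionSchema, keywordToPositional) are assumptions for illustration, not the library's real IHttpLlmFunction schema: with the keyword option, the LLM passes a single object whose keys are parameter names, and the caller reorders its values into the declared parameter sequence.

```typescript
// Sketch: converting a keyworded argument object into a positional
// sequence, using an assumed, simplified function schema.
interface FunctionSchema {
  parameters: { name: string }[]; // declared parameter order
}

function keywordToPositional(
  schema: FunctionSchema,
  keyworded: Record<string, unknown>,
): unknown[] {
  // Pick each value by its parameter name, in declared order.
  return schema.parameters.map((p) => keyworded[p.name]);
}

const schema: FunctionSchema = {
  parameters: [{ name: "section" }, { name: "body" }],
};
const args = keywordToPositional(schema, {
  body: { title: "Hello" },   // order in the keyworded object
  section: "general",         // does not matter
});
// args → ["general", { title: "Hello" }]
```

Keyworded arguments tend to be more robust with LLMs (the model names each parameter explicitly), which is why the option exists at all.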

For reference, "propagation" means always returning the response from the API endpoint, even if the status is not 200/201. This is useful when you want to handle the response by yourself.

An error is thrown only when the connection itself fails.
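The propagation semantics above can be sketched with a self-contained mock. Everything here (mockRequest, HttpResult, the URLs) is made up for illustration; the point is only the control flow: non-2xx statuses are returned as ordinary values, and only a connection failure throws.

```typescript
// Sketch of propagation semantics with a mocked HTTP transport.
interface HttpResult {
  status: number;
  body: unknown;
}

// Mocked transport: simulates an endpoint that answers 404,
// and a host that is unreachable.
async function mockRequest(url: string): Promise<HttpResult> {
  if (url.startsWith("https://down.example.com"))
    throw new Error("connection failed"); // the only case that throws
  return { status: 404, body: { message: "Not Found" } };
}

async function propagate(url: string): Promise<HttpResult> {
  // No status check here: 4xx/5xx responses are returned as-is,
  // so the caller can handle them by itself.
  return mockRequest(url);
}

async function main(): Promise<void> {
  const res = await propagate("https://api.example.com/bbs/articles");
  if (res.status !== 200 && res.status !== 201) {
    // Handle the error response yourself, e.g. feed it back to the LLM.
    console.log("non-success status:", res.status);
  }
}
```

A "fetch"-style helper would instead throw on any non-success status; propagation trades that convenience for full control over error responses.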