Interface: LLM<AdditionalChatOptions, AdditionalMessageOptions>
Unified language model interface
Extends
LLMChat<AdditionalChatOptions>
Type Parameters
• AdditionalChatOptions extends object = object
• AdditionalMessageOptions extends object = object
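Both type parameters default to object, so a provider-specific implementation can narrow them to its own option shapes. A minimal sketch, assuming LLM is importable from @llamaindex/core/llms (inferred from the "Defined in" path); the option field names below are hypothetical:

```ts
import type { LLM } from "@llamaindex/core/llms";

// Hypothetical provider-specific option shapes (not part of the core package).
type MyChatOptions = { temperature?: number; topP?: number };
type MyMessageOptions = { cacheHint?: boolean };

// A concrete provider would implement this specialized interface.
type MyProviderLLM = LLM<MyChatOptions, MyMessageOptions>;
```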
Properties
metadata
metadata: LLMMetadata
Defined in
packages/core/llms/dist/index.d.ts:20
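The metadata property exposes static information about the underlying model. A minimal sketch, assuming LLMMetadata carries model and contextWindow fields (check the type in your installed version):

```ts
import type { LLM } from "@llamaindex/core/llms";

// Log basic model information for any LLM implementation.
// Field names (model, contextWindow) are assumed here for illustration.
function describeModel(llm: LLM): void {
  const meta = llm.metadata;
  console.log(`model: ${meta.model}, context window: ${meta.contextWindow}`);
}
```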
Methods
chat()
chat(params)
chat(params): Promise<AsyncIterable<ChatResponseChunk<object>, any, any>>
Get a chat response from the LLM
Parameters
• params: LLMChatParamsStreaming<AdditionalChatOptions, AdditionalMessageOptions>
Returns
Promise<AsyncIterable<ChatResponseChunk<object>, any, any>>
Overrides
LLMChat.chat
Defined in
packages/core/llms/dist/index.d.ts:24
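Passing stream: true selects this overload, which resolves to an async iterable of chunks. A usage sketch, assuming each ChatResponseChunk exposes the newly generated text as a delta string (verify against your version):

```ts
import type { LLM } from "@llamaindex/core/llms";

// Stream a chat response and print each incremental piece as it arrives.
async function streamChat(llm: LLM): Promise<void> {
  const stream = await llm.chat({
    messages: [{ role: "user", content: "Write a haiku about the sea." }],
    stream: true, // selects the streaming overload
  });
  for await (const chunk of stream) {
    process.stdout.write(chunk.delta); // delta: the latest generated text fragment
  }
}
```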
chat(params)
chat(params): Promise<ChatResponse<AdditionalMessageOptions>>
Parameters
• params: LLMChatParamsNonStreaming<AdditionalChatOptions, AdditionalMessageOptions>
Returns
Promise<ChatResponse<AdditionalMessageOptions>>
Overrides
LLMChat.chat
Defined in
packages/core/llms/dist/index.d.ts:25
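Without stream, this overload resolves to a single ChatResponse whose message holds the full assistant reply. A sketch under the same assumptions as above:

```ts
import type { LLM } from "@llamaindex/core/llms";

// Request one complete chat response and return the assistant message content.
async function askOnce(llm: LLM, question: string): Promise<string> {
  const response = await llm.chat({
    messages: [{ role: "user", content: question }],
  });
  // message.content may be a plain string or structured content, depending on the model.
  return String(response.message.content);
}
```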
complete()
complete(params)
complete(params): Promise<AsyncIterable<CompletionResponse, any, any>>
Get a prompt completion from the LLM
Parameters
• params: LLMCompletionParamsStreaming
Returns
Promise<AsyncIterable<CompletionResponse, any, any>>
Defined in
packages/core/llms/dist/index.d.ts:29
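The streaming overload of complete takes a prompt plus stream: true and yields CompletionResponse chunks. A sketch, assuming each streamed chunk's text field holds the incremental piece (an assumption; check CompletionResponse in your version):

```ts
import type { LLM } from "@llamaindex/core/llms";

// Stream a prompt completion and print text as it arrives.
async function streamComplete(llm: LLM, prompt: string): Promise<void> {
  const stream = await llm.complete({ prompt, stream: true });
  for await (const chunk of stream) {
    process.stdout.write(chunk.text); // assumed: text carries the new fragment per chunk
  }
}
```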
complete(params)
complete(params): Promise<CompletionResponse>
Parameters
• params: LLMCompletionParamsNonStreaming
Returns
Promise<CompletionResponse>
Defined in
packages/core/llms/dist/index.d.ts:30
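The non-streaming overload returns the whole completion at once. A minimal sketch, assuming CompletionResponse exposes the generated text as a text field:

```ts
import type { LLM } from "@llamaindex/core/llms";

// Request a full completion in one call and return the generated text.
async function completeOnce(llm: LLM, prompt: string): Promise<string> {
  const response = await llm.complete({ prompt });
  return response.text;
}
```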