# Class: ChatModel
## Extends

- `AbstractModel`
## Constructors

### new ChatModel(args?)

`new ChatModel(args?): ChatModel`

#### Parameters
| Parameter | Type | Description |
| --- | --- | --- |
| `args`? | `object` | - |
| `args.cache`? | `CacheStorage<string, Response>` | Enables caching for model responses. Must implement `.get(key)` and `.set(key, value)`, both of which can be either sync or async. Some examples include: `new Map()`, quick-lru, or any keyv adapter. |
| `args.cacheKey`? | `CacheKey<Run & Config, string>` | A function that returns a cache key for the given params. A simple example would be: `(params) => JSON.stringify(params)`. The default `cacheKey` function uses hash-object to create a stable SHA-256 hash of the params. |
| `args.client`? | `Client` | - |
| `args.context`? | `Ctx` | - |
| `args.debug`? | `boolean` | Whether or not to add default `console.log` event handlers. |
| `args.events`? | `Events<Run & Config, Response, any>` | - |
| `args.params`? | `Config & Partial<Run>` | - |
#### Returns

`ChatModel`

#### Overrides

`AbstractModel.constructor`

#### Source

src/model/chat.ts:29
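#### Example

A minimal construction sketch, not taken from the library itself: the import path and the `model` value are assumptions, so adjust them to your installation and configuration.

```ts
// Import path assumed; adjust to your install.
import { ChatModel } from '@dexaai/dexter';

const chatModel = new ChatModel({
  // Config & Partial<Run>: the model name is an example value.
  params: { model: 'gpt-4' },
  // CacheStorage<string, Response>: anything with get/set works (Map, quick-lru, a keyv adapter).
  cache: new Map(),
  // Simple cache key, as described above; the default uses a stable hash of the params.
  cacheKey: (params) => JSON.stringify(params),
  // Adds default console.log event handlers.
  debug: true,
});
```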
## Properties

| Property | Type | Description | Inheritance | Source |
| --- | --- | --- | --- | --- |
| `modelProvider` | `"openai"` | - | `AbstractModel.modelProvider` | src/model/chat.ts:27 |
| `modelType` | `"chat"` | - | `AbstractModel.modelType` | src/model/chat.ts:26 |
| `tokenizer` | `ITokenizer` | - | `AbstractModel.tokenizer` | src/model/model.ts:65 |
## Methods

### addEvents()

`addEvents(events): ChatModel`

Add event handlers to the model.

#### Parameters

| Parameter | Type |
| --- | --- |
| `events` | `Events<Run & Config, Response, ChatCompletion>` |
#### Returns

`ChatModel`

#### Inherited from

`AbstractModel.addEvents`

#### Source

src/model/model.ts:235
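#### Example

A sketch of registering handlers; the `onComplete` handler name and its array shape are assumptions about the `Events` type, so check that type for the names it actually supports.

```ts
// Import path assumed; adjust to your install.
import { ChatModel } from '@dexaai/dexter';

const chatModel = new ChatModel({ params: { model: 'gpt-4' } });

// 'onComplete' is an assumed event name, used purely for illustration.
chatModel.addEvents({
  onComplete: [(event) => console.log('run complete', event)],
});
```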
### addParams()

`addParams(params): ChatModel`

Add to the model's params. Overrides existing keys.

#### Parameters

| Parameter | Type |
| --- | --- |
| `params` | `Partial<Config & Partial<Run>>` |
#### Returns

`ChatModel`

#### Inherited from

`AbstractModel.addParams`

#### Source

src/model/model.ts:213
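#### Example

A small sketch of merging extra params into an existing model; `temperature` and `max_tokens` are run params documented under `run()` below, and the import path is assumed as in the earlier sketch.

```ts
// Import path assumed; adjust to your install.
import { ChatModel } from '@dexaai/dexter';

const chatModel = new ChatModel({ params: { model: 'gpt-4' } });

// Merge extra params into the existing ones; keys that already exist are overridden.
chatModel.addParams({ temperature: 0.2, max_tokens: 256 });
```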
### clone()

`clone(args?): ChatModel`

Clone the model and merge/override the given properties.

#### Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| `args`? | `object` | - |
| `args.cache`? | `CacheStorage<string, Response>` | Enables caching for model responses. Must implement `.get(key)` and `.set(key, value)`, both of which can be either sync or async. Some examples include: `new Map()`, quick-lru, or any keyv adapter. |
| `args.cacheKey`? | `CacheKey<Run & Config, string>` | A function that returns a cache key for the given params. A simple example would be: `(params) => JSON.stringify(params)`. The default `cacheKey` function uses hash-object to create a stable SHA-256 hash of the params. |
| `args.client`? | `Client` | - |
| `args.context`? | `Ctx` | - |
| `args.debug`? | `boolean` | Whether or not to add default `console.log` event handlers. |
| `args.events`? | `Events<Run & Config, Response, any>` | - |
| `args.params`? | `Config & Partial<Run>` | - |
#### Returns

`ChatModel`

#### Overrides

`AbstractModel.clone`

#### Source

src/model/chat.ts:187
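#### Example

A sketch of deriving a variant model without mutating the original; the param values are illustrative only, and the import path is assumed.

```ts
// Import path assumed; adjust to your install.
import { ChatModel } from '@dexaai/dexter';

const baseModel = new ChatModel({ params: { model: 'gpt-4' }, debug: true });

// The clone keeps the base configuration and merges in the overrides given here.
const creativeModel = baseModel.clone({ params: { model: 'gpt-4', temperature: 1 } });
```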
### getClient()

`getClient(): Client`

Get the current client.

#### Returns

`Client`

#### Inherited from

`AbstractModel.getClient`

#### Source

src/model/model.ts:180
### getContext()

`getContext(): Ctx`

Get the current context.

#### Returns

`Ctx`

#### Inherited from

`AbstractModel.getContext`

#### Source

src/model/model.ts:191
### getEvents()

Get the current event handlers.

#### Returns

`Events<Run & Config, Response, ChatCompletion>`

#### Inherited from

`AbstractModel.getEvents`

#### Source

src/model/model.ts:230
### getParams()

Get the current params.

#### Returns

#### Inherited from

`AbstractModel.getParams`

#### Source

src/model/model.ts:208
### run()

`run(params, context?): Promise<Response>`

#### Parameters

| Parameter | Type |
| --- | --- |
| `params` | `object` |
| `params.frequency_penalty`? | `null \| number` |
| `params.function_call`? | `"none" \| "auto" \| ChatCompletionFunctionCallOption` |
| `params.functions`? | `Function[]` |
| `params.handleUpdate`? | `(chunk) => void` |
| `params.logit_bias`? | `null \| Record<string, number>` |
| `params.max_tokens`? | `null \| number` |
| `params.messages`? | `ChatMessage[]` |
| `params.model`? | `"gpt-4" \| "gpt-4-32k" \| "gpt-3.5-turbo" \| "gpt-3.5-turbo-16k" \| string & object \| "gpt-4-0314" \| "gpt-4-0613" \| "gpt-4-32k-0314" \| "gpt-4-32k-0613" \| "gpt-3.5-turbo-0301" \| "gpt-3.5-turbo-0613" \| "gpt-3.5-turbo-16k-0613"` |
| `params.presence_penalty`? | `null \| number` |
| `params.response_format`? | `ResponseFormat` |
| `params.seed`? | `null \| number` |
| `params.stop`? | `null \| string \| string[]` |
| `params.temperature`? | `null \| number` |
| `params.tool_choice`? | `ChatCompletionToolChoiceOption` |
| `params.tools`? | `ChatCompletionTool[]` |
| `params.top_p`? | `null \| number` |
| `context`? | `Ctx` |
#### Returns

`Promise<Response>`

#### Inherited from

`AbstractModel.run`

#### Source

src/model/model.ts:78
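#### Example

A sketch of a one-shot call and a streaming call. The `role`/`content` message shape and the contents of the streamed chunk are assumptions, and the response is logged as-is since its exact shape is not documented here.

```ts
// Import path assumed; adjust to your install.
import { ChatModel } from '@dexaai/dexter';

const chatModel = new ChatModel({ params: { model: 'gpt-4' } });

async function main() {
  // One-shot completion.
  const response = await chatModel.run({
    messages: [{ role: 'user', content: 'Tell me a joke about type systems.' }],
  });
  console.log(response);

  // Streaming: handleUpdate is called with incremental chunks as they arrive.
  await chatModel.run({
    messages: [{ role: 'user', content: 'Write a haiku about caching.' }],
    handleUpdate: (chunk) => console.log(chunk),
  });
}

main().catch(console.error);
```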
### setCache()

`setCache(cache): ChatModel`

Set the cache to a new cache. Set to `undefined` to remove the existing cache.

#### Parameters

| Parameter | Type |
| --- | --- |
| `cache` | `undefined \| CacheStorage<string, Response>` |
#### Returns

`ChatModel`

#### Inherited from

`AbstractModel.setCache`

#### Source

src/model/model.ts:174
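#### Example

A sketch of swapping the cache in and out; `new Map()` is one of the storage options mentioned in the constructor docs above, and the import path is assumed.

```ts
// Import path assumed; adjust to your install.
import { ChatModel } from '@dexaai/dexter';

const chatModel = new ChatModel({ params: { model: 'gpt-4' } });

// Enable caching with a simple in-memory Map.
chatModel.setCache(new Map());

// Later, remove the cache entirely.
chatModel.setCache(undefined);
```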
### setClient()

`setClient(client): ChatModel`

Set the client to a new OpenAI API client.

#### Parameters

| Parameter | Type |
| --- | --- |
| `client` | `Client` |

#### Returns

`ChatModel`

#### Inherited from

`AbstractModel.setClient`

#### Source

src/model/model.ts:185
### setContext()

`setContext(context): ChatModel`

Set the context to a new context. Removes all existing values.

#### Parameters

| Parameter | Type |
| --- | --- |
| `context` | `Ctx` |

#### Returns

`ChatModel`

#### Inherited from

`AbstractModel.setContext`

#### Source

src/model/model.ts:202
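#### Example

A sketch of replacing and reading back the context; the keys are purely illustrative, since the shape of `Ctx` is not documented here, and the import path is assumed.

```ts
// Import path assumed; adjust to your install.
import { ChatModel } from '@dexaai/dexter';

const chatModel = new ChatModel({ params: { model: 'gpt-4' } });

// Replace the context wholesale; any previous values are removed.
chatModel.setContext({ userId: 'user_123', requestId: 'req_456' });

console.log(chatModel.getContext());
```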
### setEvents()

`setEvents(events): ChatModel`

Set the event handlers to a new set of events. Removes all existing event handlers. Set to an empty object `{}` to remove all events.

#### Parameters

| Parameter | Type |
| --- | --- |
| `events` | `Events<Run & Config, Response, ChatCompletion>` |
#### Returns

`ChatModel`

#### Inherited from

`AbstractModel.setEvents`

#### Source

src/model/model.ts:244
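#### Example

A sketch of replacing and then clearing all handlers; the `onError` handler name and its array shape are assumptions about the `Events` type, while passing `{}` to remove all handlers comes straight from the description above.

```ts
// Import path assumed; adjust to your install.
import { ChatModel } from '@dexaai/dexter';

const chatModel = new ChatModel({
  params: { model: 'gpt-4' },
  // 'onError' is an assumed event name, used purely for illustration.
  events: { onError: [(event) => console.error(event)] },
});

// Replace every existing handler with a new set.
chatModel.setEvents({ onError: [(event) => console.error('run failed:', event)] });

// Remove all handlers.
chatModel.setEvents({});
```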
### setParams()

`setParams(params): ChatModel`

Set the params to a new set of params. Removes all existing values.

#### Parameters

| Parameter | Type |
| --- | --- |
| `params` | `Config & Partial<Run>` |
#### Returns

`ChatModel`

#### Inherited from

`AbstractModel.setParams`

#### Source

src/model/model.ts:223
### updateContext()

`updateContext(context): ChatModel`

Add to the existing context. Overrides existing keys.

#### Parameters

| Parameter | Type |
| --- | --- |
| `context` | `Ctx` |