"glm-3-turbo"
Whether to stream the results or not. Defaults to false.
Optional maxTokens: Max value is 8192; defaults to 1024.
Optional messages: Messages to pass as a prefix to the prompt.
Optional requestId: Unique identifier for the request. Defaults to a random UUID.
Optional stop
Optional temperature: Amount of randomness injected into the response. Ranges from 0 to 1 (0 is not included). Use a temperature closer to 0 for analytical / multiple-choice tasks and closer to 1 for creative and generative tasks. Defaults to 0.95.
Optional topP: Total probability mass of tokens to consider at each step. Ranges from 0 to 1. Defaults to 0.7.
Optional zhipuAIApiKey: API key to use when making requests. Defaults to the value of the ZHIPUAI_API_KEY environment variable.
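Taken together, these options are passed to the chat model constructor. A minimal usage sketch, assuming the ChatZhipuAI class from @langchain/community/chat_models/zhipuai and HumanMessage from @langchain/core/messages (adjust the imports to your installed package versions):

import { ChatZhipuAI } from "@langchain/community/chat_models/zhipuai";
import { HumanMessage } from "@langchain/core/messages";

// Configure the model with the options documented above.
// zhipuAIApiKey may be omitted if ZHIPUAI_API_KEY is set in the environment.
const model = new ChatZhipuAI({
  modelName: "glm-3-turbo", // defaults to "glm-3-turbo"
  temperature: 0.95,        // (0, 1], defaults to 0.95
  topP: 0.7,                // 0 to 1, defaults to 0.7
  maxTokens: 1024,          // max 8192, defaults to 1024
  streaming: false,         // defaults to false
});

const response = await model.invoke([new HumanMessage("Hello")]);
console.log(response.content);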
Get the identifying parameters for the model
Get the parameters used to invoke the model
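For example, the parameters used to invoke the model can be inspected through the standard LangChain invocationParams() hook inherited from the base chat model; this is a sketch only, and it assumes ZHIPUAI_API_KEY is set in the environment:

import { ChatZhipuAI } from "@langchain/community/chat_models/zhipuai";

// The identifying parameters are essentially the model name plus the
// invocation parameters below; the returned object uses the provider's
// request field names (assumed here, e.g. temperature, top_p, max_tokens).
const chat = new ChatZhipuAI({ temperature: 0.95, topP: 0.7, maxTokens: 1024 });
console.log(chat.invocationParams());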
Interface defining the input to the ZhipuAI chat model.
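Put together, the documented input is roughly the following shape. This is an illustrative sketch only (the ZhipuAIChatInputSketch name and the exact types are assumptions); consult the published typings for the authoritative definition.

// Approximate shape of the options documented above; property names follow
// the docs, types are assumptions for illustration.
interface ZhipuAIChatInputSketch {
  modelName?: string;       // defaults to "glm-3-turbo"
  streaming?: boolean;      // defaults to false
  maxTokens?: number;       // max value 8192, defaults to 1024
  messages?: Array<{ role: string; content: string }>; // prefix messages (shape assumed)
  requestId?: string;       // defaults to a random UUID
  stop?: string[];          // stop sequences (type assumed)
  temperature?: number;     // (0, 1], defaults to 0.95
  topP?: number;            // 0 to 1, defaults to 0.7
  zhipuAIApiKey?: string;   // defaults to the ZHIPUAI_API_KEY environment variable
}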