InferParams

Represents the parameters for the inference process.

Optional frequency_penalty: The penalty factor for repeated tokens in predictions.
Optional model: The model config to use.
Optional n_predict: The number of predictions to generate.
Optional presence_penalty: The penalty factor for tokens already present in predictions.
Optional repeat_penalty: The penalty factor for repeated sequences in predictions.
Optional stop: The list of stop words used to stop the inference process.
Optional stream: Whether the inference should be performed in streaming mode.
Optional temperature: The temperature value controlling the randomness of predictions.
Optional tfs_z: The tail-free sampling z-score threshold for filtering predictions.
Optional threads: The number of threads to use for the inference process.
Optional top_k: The number of top predictions to consider.
Optional top_p: The cumulative probability threshold for predictions.
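The options above can be collected into a typed parameter object. A minimal sketch, assuming the field names reconstructed from the truncated labels (frequency_penalty, n_predict, tfs_z, top_k, top_p, and so on) and a hypothetical buildInferParams helper that is not part of the library; the actual interface may differ:

```typescript
// Sketch of the InferParams shape described above.
// Field names are assumptions reconstructed from the descriptions.
interface InferParams {
  frequency_penalty?: number;       // penalty factor for repeated tokens
  model?: Record<string, unknown>;  // model config to use (shape assumed)
  n_predict?: number;               // number of predictions to generate
  presence_penalty?: number;        // penalty for tokens already present
  repeat_penalty?: number;          // penalty for repeated sequences
  stop?: string[];                  // stop words that end inference
  stream?: boolean;                 // streaming mode on/off
  temperature?: number;             // randomness of predictions
  tfs_z?: number;                   // tail-free sampling z-score threshold
  threads?: number;                 // threads used for inference
  top_k?: number;                   // number of top predictions considered
  top_p?: number;                   // cumulative probability threshold
}

// Hypothetical helper (not from the library): merge user overrides
// onto a set of conservative defaults.
function buildInferParams(overrides: InferParams = {}): InferParams {
  const defaults: InferParams = {
    n_predict: 128,
    temperature: 0.8,
    top_k: 40,
    top_p: 0.95,
    repeat_penalty: 1.1,
    stream: false,
  };
  return { ...defaults, ...overrides };
}

const params = buildInferParams({ temperature: 0.2, stop: ["</s>"] });
console.log(params.temperature, params.top_k); // overridden value, then default
```

Because every field is optional, callers pass only the parameters they want to change and the defaults cover the rest.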