712:configure_stages: changes between revisions 2026/03/02 12:40 and 2026/03/25 09:44 (current), by Prinz, Patrick
</cg>
</
**When using models newer than gpt4o (like gpt5)**

>The temperature has to be adjusted to the default value of 1. Add <

>The <
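The exact property name is cut off in the note above, so the following is only an illustrative sketch: the element names and attribute syntax are assumptions, not the actual Stages configuration keys. Newer OpenAI models such as gpt5 reject non-default temperature values, which is why the value must stay at the default of 1.

```xml
<!-- hypothetical sketch: pin the temperature to its default of 1 for gpt5-class
     models; the property name and XML layout are assumptions, consult the
     product documentation for the real key -->
<cg>
  <property name="temperature" value="1"/>
</cg>
```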
**AI content generation feature (OpenAI compatible):**
===== AI Feature Tokenizer Configuration =====
LLMs that do not use the standard byte-pair-encoding (BPE) algorithm cannot be used with the Azure tokenizer. Stages implements the HuggingFaceTokenizer from the Deep Java Library (DJL) to support customized tokenizers.
Add the property “tokenizerFolder” to the cg-host section
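A minimal sketch of what this could look like; only the key name “tokenizerFolder” comes from the text above, while the section syntax and the example path are assumptions:

```ini
# hypothetical cg-host section; only the "tokenizerFolder" key is from the
# documentation, the path is an example. The folder would hold the custom
# tokenizer files (e.g. a HuggingFace tokenizer.json) that DJL's
# HuggingFaceTokenizer loads in place of the Azure BPE tokenizer.
[cg-host]
tokenizerFolder = /opt/stages/tokenizers/my-model
```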