Differences
This shows you the differences between two versions of the page.
Page 712:configure_stages, previous revision 2026/03/05 09:19 (Prinz, Patrick) vs. current revision 2026/03/25 09:44 [AI Feature Tokenizer Configuration] (Prinz, Patrick)
Line 529:

  **When using models newer than gpt4o (like gpt5)**
- >Most likely the temperature has to be adjusted. Add <
+ >The temperature has to be adjusted
+ >The <
  **AI content generation feature (OpenAI compatible):**
Line 693:

  ===== AI Feature Tokenizer Configuration =====
- LLMs that do NOT use the standard byte-pair-encoding algorithm CANNOT be used with the Azure tokenizer. Stages implements the HuggingFaceTokenizer from the Deep Java Library to support customized tokenizers.
+ LLMs that do NOT use the standard byte-pair-encoding algorithm
  Add the property “tokenizerFolder” to the cg-host section.
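A minimal sketch of what the cg-host entry might look like. Only the property name `tokenizerFolder` comes from the text above; the file format, the surrounding keys, and the folder path are assumptions and must be checked against the actual Stages configuration:

```yaml
# Hypothetical cg-host section (format and keys assumed, not from the text):
cg-host:
  url: https://llm.example.com/v1          # assumed key for the model endpoint
  tokenizerFolder: /opt/stages/tokenizer   # assumed path to the tokenizer files
```

The folder would typically hold the model's tokenizer files (e.g. a `tokenizer.json` exported from Hugging Face), which DJL's HuggingFaceTokenizer can load.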