712:configure_stages [2026/03/25 09:44] (current) – [AI Feature Tokenizer Configuration] Prinz, Patrick
</cg>
</code>

**When using models newer than gpt4o (like gpt5)**

> The temperature has to be set to its default value of 1. Add <cg-property name="temperature" value="1" /> to the relevant <cg-hosts> section.

> The property <chatbot-property name="maxTokens" value="500"></chatbot-property> has to be replaced with <chatbot-property name="maxCompletionTokens" value="500"></chatbot-property>.
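
Taken together, a host entry for a gpt5-class model could look like the following sketch. This is an illustration only: the surrounding element structure and any additional host attributes depend on your installation, and only the two properties shown are taken from the notes above.

<code>
<cg-hosts>
  <!-- hypothetical host entry; real attributes depend on your configuration -->
  <cg-host>
    <!-- newer models reject non-default temperatures, so pin it to 1 -->
    <cg-property name="temperature" value="1" />
    <!-- maxTokens is replaced by maxCompletionTokens for models newer than gpt4o -->
    <chatbot-property name="maxCompletionTokens" value="500"></chatbot-property>
  </cg-host>
</cg-hosts>
</code>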
  
**AI content generation feature (OpenAI compatible):**
===== AI Feature Tokenizer Configuration =====
  
LLMs that do NOT use the standard byte-pair-encoding algorithm, or that are provided via the openAICompatible adapter, CANNOT be used with the Azure tokenizer. Stages implements the HuggingFaceTokenizer from the Deep Java Library to support customized tokenizers.
  
Add the property “tokenizerFolder” to the cg-host section.
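
As a sketch, assuming the same property mechanism as the snippets above (the folder path is a placeholder for a directory containing your downloaded HuggingFace tokenizer files):

<code>
<cg-host>
  <!-- hypothetical example; point value to the directory holding the tokenizer files -->
  <cg-property name="tokenizerFolder" value="/opt/stages/tokenizers/my-model" />
</cg-host>
</code>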