MaxKB 1.10.9: an error is reported after the MCP call finishes and the AI answer completes

exception: Attempted to exit cancel scope in a different task than it was entered in
+++++++++++++++++++++++++ get_num_tokens_from_messages() is not presently implemented for model cl100k_base. See https://platform.openai.com/docs/guides/text-generation/managing-tokens for information on how messages are converted to tokens.
E:\fzy_pro\MaxKB-1.10.9-lts.venv\Lib\site-packages\huggingface_hub\file_download.py:945: FutureWarning: resume_download is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use force_download=True.
warnings.warn(
Traceback (most recent call last):
  File "E:\fzy_pro\MaxKB-1.10.9-lts\apps\setting\models_provider\impl\base_chat_open_ai.py", line 98, in get_num_tokens_from_messages
    return super().get_num_tokens_from_messages(messages)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\fzy_pro\MaxKB-1.10.9-lts.venv\Lib\site-packages\langchain_openai\chat_models\base.py", line 1264, in get_num_tokens_from_messages
    raise NotImplementedError(
NotImplementedError: get_num_tokens_from_messages() is not presently implemented for model cl100k_base. See https://platform.openai.com/docs/guides/text-generation/managing-tokens for information on how messages are converted to tokens.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "E:\fzy_pro\MaxKB-1.10.9-lts\apps\application\flow\workflow_manage.py", line 506, in hand_event_node_result
    for r in result:
  File "E:\fzy_pro\MaxKB-1.10.9-lts\apps\application\flow\step_node\ai_chat_step_node\impl\base_chat_node.py", line 103, in write_context_stream
    _write_context(node_variable, workflow_variable, node, workflow, answer, reasoning_content)
  File "E:\fzy_pro\MaxKB-1.10.9-lts\apps\application\flow\step_node\ai_chat_step_node\impl\base_chat_node.py", line 47, in _write_context
    message_tokens = chat_model.get_num_tokens_from_messages(node_variable.get('message_list'))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\fzy_pro\MaxKB-1.10.9-lts\apps\setting\models_provider\impl\base_chat_open_ai.py", line 101, in get_num_tokens_from_messages
    tokenizer = TokenizerManage.get_tokenizer()
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\fzy_pro\MaxKB-1.10.9-lts\apps\common\config\tokenizer_manage_config.py", line 18, in get_tokenizer
    TokenizerManage.tokenizer = GPT2TokenizerFast.from_pretrained(
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\fzy_pro\MaxKB-1.10.9-lts.venv\Lib\site-packages\transformers\tokenization_utils_base.py", line 2069, in from_pretrained
    return cls._from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^
  File "E:\fzy_pro\MaxKB-1.10.9-lts.venv\Lib\site-packages\transformers\tokenization_utils_base.py", line 2107, in _from_pretrained
    slow_tokenizer = (cls.slow_tokenizer_class)._from_pretrained(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\fzy_pro\MaxKB-1.10.9-lts.venv\Lib\site-packages\transformers\tokenization_utils_base.py", line 2315, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\fzy_pro\MaxKB-1.10.9-lts.venv\Lib\site-packages\transformers\models\gpt2\tokenization_gpt2.py", line 153, in __init__
    with open(vocab_file, encoding="utf-8") as vocab_handle:
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: expected str, bytes or os.PathLike object, not NoneType
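From the traceback, the failure chain appears to be: `get_num_tokens_from_messages()` in `base_chat_open_ai.py` calls the langchain_openai implementation, which raises `NotImplementedError` for model `cl100k_base`; the except branch then falls back to `TokenizerManage.get_tokenizer()`, which tries to build a local `GPT2TokenizerFast` and dies because no vocab file can be resolved. A minimal sketch of that fallback, reconstructed only from the frames above (the real MaxKB code, the `"gpt2"` repo id, and the exact counting logic are my assumptions):

from langchain_openai import ChatOpenAI
from transformers import GPT2TokenizerFast


class TokenizerManage:
    tokenizer = None

    @staticmethod
    def get_tokenizer():
        if TokenizerManage.tokenizer is None:
            # If the GPT-2 tokenizer files (vocab.json / merges.txt) cannot be
            # found, from_pretrained() ends up constructing the slow tokenizer
            # with vocab_file=None, which is the TypeError at the bottom of the
            # traceback.
            TokenizerManage.tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
        return TokenizerManage.tokenizer


class BaseChatOpenAI(ChatOpenAI):
    def get_num_tokens_from_messages(self, messages):
        try:
            # langchain_openai raises NotImplementedError for cl100k_base here
            return super().get_num_tokens_from_messages(messages)
        except NotImplementedError:
            # Fallback: count tokens with a locally cached GPT-2 tokenizer
            tokenizer = TokenizerManage.get_tokenizer()
            return sum(len(tokenizer.encode(str(m.content))) for m in messages)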

Based on the community answers, it looks like a model needs to be downloaded, but I don't know which model to download. Could you please let me know?
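My guess from the traceback is that it is the GPT-2 tokenizer used by the token-counting fallback. If that is right, something like the sketch below should pre-download its files into the local Hugging Face cache; the `gpt2` repo id and the cache location are assumptions on my part, not a confirmed fix.

import os

# Optional: point the cache at a directory the MaxKB process can read.
# os.environ["HF_HOME"] = r"E:\hf_cache"  # hypothetical location

from transformers import GPT2TokenizerFast

# Downloads vocab.json / merges.txt / tokenizer.json into the local cache so
# that a later from_pretrained() call can resolve a real vocab_file instead of None.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
print(tokenizer("hello world")["input_ids"])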

Is this a standard deployment? Which deployment method are you using?