Is there a way to configure a local OpenAI-format embedding model for offline use?

I've found that any embedding model using the OpenAI API format (no matter which provider is selected) ends up making a request to an OpenAI-hosted URL, and when that host is unreachable the following error appears:
HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x7f3c70c1c450>, 'Connection to openaipublic.blob.core.windows.net timed out. (connect timeout=None)'))
Is there any way to fix this? :cry:

You can refer to this solution.
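For reference, the timeout in the first post is tiktoken trying to download its cl100k_base BPE file from openaipublic.blob.core.windows.net. A common offline workaround is to pre-seed tiktoken's on-disk cache and point the MaxKB process at it via TIKTOKEN_CACHE_DIR. The sketch below assumes the cache-key scheme used by current tiktoken versions (the SHA-1 hash of the blob URL) and uses a made-up cache path, so adjust both to your deployment.

```python
# Sketch: pre-seed tiktoken's cache so cl100k_base.tiktoken never has to be
# downloaded at runtime. Assumes the file was already fetched once on a
# machine with internet access and copied next to this script.
import hashlib
import os
import shutil

# URL tiktoken tries to fetch (matches the host in the error above)
blob_url = "https://openaipublic.blob.core.windows.net/encodings/cl100k_base.tiktoken"

# Directory the MaxKB Python process will read the cache from (example path)
cache_dir = "/opt/maxkb/tiktoken_cache"
os.makedirs(cache_dir, exist_ok=True)

# tiktoken names cached files after the SHA-1 hash of the source URL
cache_key = hashlib.sha1(blob_url.encode()).hexdigest()
shutil.copy("cl100k_base.tiktoken", os.path.join(cache_dir, cache_key))

# The environment variable must be visible to the process before tiktoken is
# imported, e.g. exported in the container or systemd unit, not only here.
os.environ["TIKTOKEN_CACHE_DIR"] = cache_dir
```

With the cache seeded and TIKTOKEN_CACHE_DIR exported for the MaxKB service, tiktoken loads the encoding from disk and never opens a connection to openaipublic.blob.core.windows.net.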

A similar problem exists with models that need to reach Hugging Face.
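For the Hugging Face case, a minimal sketch (assuming the model files are already present in a local cache) is to force the hub client into offline mode before the libraries are imported; the cache path and mirror URL below are illustrative values, not MaxKB settings.

```python
# Sketch: run Hugging Face-based models without network access, assuming the
# model weights were downloaded into the cache beforehand.
import os

# Tell huggingface_hub / transformers not to make any network calls
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

# Point the libraries at a cache that already contains the model files
# (example path; copy the cache over from a machine with internet access)
os.environ["HF_HOME"] = "/opt/maxkb/hf_cache"

# Alternatively, if huggingface.co is blocked but a mirror is reachable,
# HF_ENDPOINT can redirect the hub client to it.
# os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"
```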

2025-02-18 09:08:09 [DEBUG] 
Received request: POST to /v1/embeddings with body  {
  "input": [
    [
      57668,
      53901,
      6447
    ]
  ],
  "model": "text-embedding-bge-m3",
  "encoding_format": "base64"
}
2025-02-18 09:08:09 [ERROR] 
'input' field must be a string or an array of strings

Allowing internet access temporarily got past the download, but then I hit another problem: why does MaxKB send an array like the one in the request above when it tests a text embedding model?
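That array is expected behaviour for OpenAI's own /v1/embeddings endpoint, which accepts arrays of token IDs as well as strings: the OpenAI-format client tokenizes the text with tiktoken first (which is also why cl100k_base.tiktoken gets downloaded at all) and posts the token IDs, e.g. "input": [[57668, 53901, 6447]]. Many local OpenAI-compatible servers only accept strings, hence the "'input' field must be a string or an array of strings" error. If MaxKB's OpenAI provider is built on langchain's OpenAIEmbeddings (an assumption here), a sketch of forcing raw strings instead of token IDs looks like this; constructor argument names can vary slightly across langchain_openai versions.

```python
# Sketch: send plain strings to a local OpenAI-compatible embedding server
# instead of pre-tokenized token IDs. Endpoint URL and API key are placeholders.
from langchain_openai import OpenAIEmbeddings

embeddings = OpenAIEmbeddings(
    model="text-embedding-bge-m3",        # model name exposed by the local server
    base_url="http://localhost:1234/v1",  # assumed local endpoint
    api_key="not-needed",                 # placeholder; local servers usually ignore it
    # With the default True, the client tokenizes text via tiktoken and posts
    # token IDs; False makes it post the raw strings, which local servers accept.
    check_embedding_ctx_length=False,
)

vectors = embeddings.embed_documents(["你好"])
print(len(vectors[0]))  # embedding dimension reported by the local model
```

Whether MaxKB exposes an equivalent switch for its OpenAI-format provider is something the maintainers would have to confirm; otherwise the local server has to accept token-ID arrays the way the official endpoint does.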