
2025-09-25 09:51 Response: 1.95 s Status: success

🚨 Error Message

Traceback (most recent call last):
  File "D:\新建文件夹\langchain-base\langchain_learn.py", line 20, in <module>
    model.invoke(message)
  File "D:\software\Python\Lib\site-packages\langchain_core\language_models\chat_models.py", line 395, in invoke
    self.generate_prompt(
        [self._convert_input(input)],
        ...<6 lines>...
        **kwargs,
    ).generations[0][0],
  File "D:\software\Python\Lib\site-packages\langchain_core\language_models\chat_models.py", line 1023, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
  File "D:\software\Python\Lib\site-packages\langchain_core\language_models\chat_models.py", line 840, in generate
    self._generate_with_cache(
        m,
        ...<2 lines>...
        **kwargs,
    )
  File "D:\software\Python\Lib\site-packages\langchain_core\language_models\chat_models.py", line 1089, in _generate_with_cache
    result = self._generate(messages, stop=stop, run_manager=run_manager, **kwargs)
  File "D:\software\Python\Lib\site-packages\langchain_openai\chat_models\base.py", line 1184, in _generate
    raise e
  File "D:\software\Python\Lib\site-packages\langchain_openai\chat_models\base.py", line 1179, in _generate
    raw_response = self.client.with_raw_response.create(**payload)
  File "D:\software\Python\Lib\site-packages\openai\_legacy_response.py", line 364, in wrapped
    return cast(LegacyAPIResponse[R], func(*args, **kwargs))
  File "D:\software\Python\Lib\site-packages\openai\_utils\_utils.py", line 286, in wrapper
    return func(*args, **kwargs)
  File "D:\software\Python\Lib\site-packages\openai\resources\chat\completions\completions.py", line 1147, in create
    return self._post(
        "/chat/completions",
        ...<46 lines>...
        stream_cls=Stream[ChatCompletionChunk],
    )
  File "D:\software\Python\Lib\site-packages\openai\_base_client.py", line 1259, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "D:\software\Python\Lib\site-packages\openai\_base_client.py", line 1047, in request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'The model `gpt-3.5-turbo-16k-0613` has been deprecated, learn more here: https://platform.openai.com/docs/deprecations', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

🤖 AI Solution

openai.NotFoundError: Error code: 404 - {'error': {'message': 'The model `gpt-3.5-turbo-16k-0613` has been deprecated, learn more here: https://platform.openai.com/docs/deprecations', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

Cause Analysis

The specified model gpt-3.5-turbo-16k-0613 has been officially deprecated by OpenAI.

Solution

Update the model name in your code to a currently available OpenAI model. For example, replace it with gpt-3.5-turbo or a newer model.

# Example: suppose your model was instantiated like this
# model = ChatOpenAI(model="gpt-3.5-turbo-16k-0613")

# Change it to:
from langchain_openai import ChatOpenAI

# Pick a currently available model, e.g.:
model = ChatOpenAI(model="gpt-3.5-turbo")
# or a newer model such as "gpt-4", "gpt-4o", etc.

Preventive Measures

  • Check model status regularly: follow OpenAI's official documentation and deprecation announcements to learn about planned model retirements and updates.
  • Use the latest recommended models: when developing, prefer the latest stable models recommended by LangChain and OpenAI to reduce the risk of breakage caused by model deprecations.
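Since deprecations recur, one way to harden the code is to resolve the model name through a small, hand-maintained mapping of known-deprecated names before constructing the client. A minimal sketch (the mapping below is illustrative, not an official list; always confirm against the OpenAI deprecations page):

```python
# Hand-maintained mapping of deprecated model names to suggested
# replacements. Illustrative only -- verify the real status at
# https://platform.openai.com/docs/deprecations before relying on it.
DEPRECATED_MODELS = {
    "gpt-3.5-turbo-16k-0613": "gpt-3.5-turbo",
    "gpt-3.5-turbo-0613": "gpt-3.5-turbo",
}

def resolve_model(name: str) -> str:
    """Return a replacement for a known-deprecated model name,
    or the name unchanged if it is not in the mapping."""
    replacement = DEPRECATED_MODELS.get(name)
    if replacement is not None:
        print(f"Model {name!r} is deprecated; using {replacement!r} instead.")
        return replacement
    return name

# Usage with the LangChain code above (hypothetical call site):
# model = ChatOpenAI(model=resolve_model("gpt-3.5-turbo-16k-0613"))
```

This keeps the substitution logic in one place, so when the next deprecation notice arrives only the mapping needs an update, not every call site.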