
0.16.0


@pchalasani released this 13 Sep 14:18

feat: Support OpenAI o1-preview, o1-mini

To use these you can set the LLM config as follows:

from langroid.language_models import OpenAIGPTConfig, OpenAIChatModel

config = OpenAIGPTConfig(
    chat_model=OpenAIChatModel.O1_MINI,  # or OpenAIChatModel.O1_PREVIEW
)
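
A config like the one above can then be plugged into an agent, following langroid's usual ChatAgent pattern (a minimal sketch; the prompt text is just a placeholder and details may differ across langroid versions):

import langroid as lr
from langroid.language_models import OpenAIGPTConfig, OpenAIChatModel

config = OpenAIGPTConfig(chat_model=OpenAIChatModel.O1_MINI)
agent = lr.ChatAgent(lr.ChatAgentConfig(llm=config))
response = agent.llm_response("In one sentence, what is a language model?")
print(response.content)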

Or, in many of the example scripts, you can specify the model directly with -m o1-preview or -m o1-mini, e.g.:

python3 examples/basic/chat.py -m o1-mini

Also, any pytest that runs against a real LLM (i.e., not a MockLM) can be run with these models using --m o1-preview or --m o1-mini, e.g.:

pytest -xvs tests/main/test_llm.py --m o1-mini

Note that these models (as of Sep 12, 2024):

  • do not support streaming, so langroid sets stream to False even if you request streaming,
  • do not support a system message, so langroid maps any supplied system message to a message with role user, and
  • do not allow setting the temperature, so any temperature setting is ignored when using langroid (the models use the default temperature of 1); a sketch of these adjustments is shown below.
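
As a rough, illustrative sketch of the kind of adjustment described above (the message class, function name, and parameters here are hypothetical, not langroid's actual internals):

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class LLMMessage:
    # Hypothetical message type, for illustration only.
    role: str      # "system", "user", or "assistant"
    content: str

def adjust_for_o1(
    messages: List[LLMMessage],
    stream: bool,
    temperature: Optional[float],
) -> Tuple[List[LLMMessage], bool, Optional[float]]:
    # o1 models do not support streaming: force it off.
    stream = False
    # o1 models reject a system message: re-send its content with role "user".
    messages = [
        LLMMessage(role="user", content=m.content) if m.role == "system" else m
        for m in messages
    ]
    # o1 models only use the default temperature (1): drop any supplied value.
    temperature = None
    return messages, stream, temperature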