
Issue integrating o1 model with AzureOpenAI - chat_completion() argument mismatch error #470

@brunosavoca

Description


Hi everyone!

I'm implementing the o1 model through the AzureOpenAI integration. So far, I've updated openai.py and the TaskWeaver config JSON. When launching the application with Chainlit (chainlit run app.py), I'm encountering the following error:

TypeError: OpenAIService.chat_completion() takes from 2 to 6 positional arguments but 7 were given
I suspect this might be due to recent API changes or to mismatched parameters between the o1 model integration and the Azure API interface.
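For context, this kind of TypeError is raised by Python itself when a call site passes more positional arguments than the installed method's signature accepts (the count includes self). Below is a minimal sketch of the mechanism, with an illustrative signature and an invented extra argument (reasoning_effort); these names are assumptions for demonstration, not TaskWeaver's actual code:

```python
import inspect

class OpenAIService:
    # Hypothetical: suppose the installed version defines five parameters
    # after self, so calls may pass 2 to 6 positional arguments in total.
    def chat_completion(self, messages, stream=False, temperature=0.0,
                        max_tokens=1024, top_p=1.0):
        return "ok"

service = OpenAIService()

# A modified caller passing one extra positional value (e.g. a new
# o1-specific argument) makes the total 7, reproducing the error:
try:
    service.chat_completion([], True, 0.0, 1024, 1.0, "high")
except TypeError as e:
    print(e)  # ... takes from 2 to 6 positional arguments but 7 were given

# inspect.signature shows what the installed method actually accepts,
# which helps pinpoint the mismatched call site:
print(inspect.signature(OpenAIService.chat_completion))
```

Running inspect.signature against the real OpenAIService.chat_completion in your installed TaskWeaver should reveal whether your edited openai.py passes an argument the method does not declare.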

To Reproduce
Steps to reproduce the behavior:

1. Start the service with chainlit run app.py.
2. Type any user query (e.g. "hello" or "what can you do?").
3. Wait for the response.
4. See the error message in the terminal logs immediately after the query is processed:

TypeError: OpenAIService.chat_completion() takes from 2 to 6 positional arguments but 7 were given

Expected behavior
The service should process the query and respond normally without raising a parameter-count error.

Environment Information:

OS: Windows
Python version: 3.13.2
LLM that you're using: o1 through AzureOpenAI
Other configurations except the LLM API/key related: default TaskWeaver config, no changes other than adjusting the model integration

Additional context
I'm going to continue debugging, but I wanted to check early in case others in the community have encountered similar integration issues, or in case a recent change to the AzureOpenAI API introduced breaking changes. If anyone has insights or has faced a similar issue, please share!
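While debugging, one defensive pattern that may help isolate the mismatch is to filter the arguments against the target method's actual signature before calling it, so a version drift surfaces as a readable log line instead of a TypeError. This is a generic sketch, not TaskWeaver's API; the helper name and the dropped argument are assumptions:

```python
import inspect

def call_with_supported_kwargs(func, /, **kwargs):
    """Call func with only the keyword arguments its signature declares,
    logging anything dropped. Hypothetical debugging helper."""
    params = inspect.signature(func).parameters
    supported = {k: v for k, v in kwargs.items() if k in params}
    dropped = set(kwargs) - set(supported)
    if dropped:
        print(f"dropping unsupported arguments: {sorted(dropped)}")
    return func(**supported)

def chat_completion(messages, stream=False):
    # Stand-in for the real method under test.
    return "ok"

# An o1-specific argument the installed signature doesn't declare
# (reasoning_effort is illustrative) gets dropped with a log line
# rather than crashing the call:
result = call_with_supported_kwargs(
    chat_completion, messages=[], stream=False, reasoning_effort="high"
)
print(result)
```

Passing arguments by keyword (rather than positionally) also tends to produce clearer error messages, since Python names the unexpected parameter instead of reporting only a count.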
