
Unable to get Structured Response with Agentchat Swarm and AzureOpenAI gpt models #6638

@sharanyabhat

Description


What happened?

I have a Swarm implementation with 5 agents using AzureOpenAIChatCompletionClient.
I have configured structured output via a Pydantic class passed as `output_content_type` in the AssistantAgent configs.
When I fetch the response using `task_result = await Console(team.run_stream(task=query))`, I see the failure below:

Traceback (most recent call last):
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_single_threaded_agent_runtime.py", line 533, in _on_message
    return await agent.on_message(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_base_agent.py", line 113, in on_message
    return await self.on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_sequential_routed_agent.py", line 67, in on_message_impl
    return await super().on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_routed_agent.py", line 485, in on_message_impl
    return await h(self, message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_routed_agent.py", line 268, in wrapper
    return_value = await func(self, message, ctx)  # type: ignore
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_chat_agent_container.py", line 79, in handle_request
    async for msg in self._agent.on_messages_stream(self._message_buffer, ctx.cancellation_token):
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/agents/_assistant_agent.py", line 827, in on_messages_stream
    async for inference_output in self._call_llm(
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/agents/_assistant_agent.py", line 939, in _call_llm
    async for chunk in model_client.create_stream(
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_ext/models/openai/_openai_client.py", line 811, in create_stream
    async for chunk in chunks:
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_ext/models/openai/_openai_client.py", line 1032, in _create_stream_chunks_beta_client
    event = await event_future
            ^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 192, in __anext__
    return await self._iterator.__anext__()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 241, in __stream__
    events_to_fire = self._state.handle_chunk(sse_event)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 348, in handle_chunk
    return self._build_events(
           ^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 576, in _build_events
    choice_state.get_done_events(
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 608, in get_done_events
    self._content_done_events(choice_snapshot=choice_snapshot, response_format=response_format)
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 649, in _content_done_events
    parsed = maybe_parse_content(
             ^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/_parsing/_completions.py", line 161, in maybe_parse_content
    return _parse_content(response_format, message.content)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/_parsing/_completions.py", line 221, in _parse_content
    return cast(ResponseFormatT, model_parse_json(response_format, content))
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/_compat.py", line 169, in model_parse_json
    return model.model_validate_json(data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/pydantic/main.py", line 744, in model_validate_json
    return cls.__pydantic_validator__.validate_json(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for MyResponse
  Invalid JSON: trailing characters at line 2 column 1 [type=json_invalid, input_value='{"responses":[{"Markdown... - **Answers**: None"}]}', input_type=str]
    For further information visit https://errors.pydantic.dev/2.11/v/json_invalid
ERROR:autogen_core:Error processing publish message for agent1_agent_71cba8a1-71b6-49b6-810a-426a80fdad39/71cba8a1-71b6-49b6-810a-426a80fdad39
Traceback (most recent call last):
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_single_threaded_agent_runtime.py", line 533, in _on_message
    return await agent.on_message(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_base_agent.py", line 113, in on_message
    return await self.on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_sequential_routed_agent.py", line 72, in on_message_impl
    return await super().on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_routed_agent.py", line 486, in on_message_impl
    return await self.on_unhandled_message(message, ctx)  # type: ignore
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_chat_agent_container.py", line 133, in on_unhandled_message
    raise ValueError(f"Unhandled message in agent container: {type(message)}")
ValueError: Unhandled message in agent container: <class 'autogen_agentchat.teams._group_chat._events.GroupChatError'>
ERROR:autogen_core:Error processing publish message for agent3_agent_71cba8a1-71b6-49b6-810a-426a80fdad39/71cba8a1-71b6-49b6-810a-426a80fdad39
Traceback (most recent call last):
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_single_threaded_agent_runtime.py", line 533, in _on_message
    return await agent.on_message(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_base_agent.py", line 113, in on_message
    return await self.on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_sequential_routed_agent.py", line 72, in on_message_impl
    return await super().on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_routed_agent.py", line 486, in on_message_impl
    return await self.on_unhandled_message(message, ctx)  # type: ignore
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_chat_agent_container.py", line 133, in on_unhandled_message
    raise ValueError(f"Unhandled message in agent container: {type(message)}")
ValueError: Unhandled message in agent container: <class 'autogen_agentchat.teams._group_chat._events.GroupChatError'>
ERROR:autogen_core:Error processing publish message for agent2_agent_71cba8a1-71b6-49b6-810a-426a80fdad39/71cba8a1-71b6-49b6-810a-426a80fdad39
Traceback (most recent call last):
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_single_threaded_agent_runtime.py", line 533, in _on_message
    return await agent.on_message(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_base_agent.py", line 113, in on_message
    return await self.on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_sequential_routed_agent.py", line 72, in on_message_impl
    return await super().on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_routed_agent.py", line 486, in on_message_impl
    return await self.on_unhandled_message(message, ctx)  # type: ignore
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_chat_agent_container.py", line 133, in on_unhandled_message
    raise ValueError(f"Unhandled message in agent container: {type(message)}")
ValueError: Unhandled message in agent container: <class 'autogen_agentchat.teams._group_chat._events.GroupChatError'>
ERROR:autogen_core:Error processing publish message for agent4_agent_71cba8a1-71b6-49b6-810a-426a80fdad39/71cba8a1-71b6-49b6-810a-426a80fdad39
Traceback (most recent call last):
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_single_threaded_agent_runtime.py", line 533, in _on_message
    return await agent.on_message(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_base_agent.py", line 113, in on_message
    return await self.on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_sequential_routed_agent.py", line 72, in on_message_impl
    return await super().on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_core/_routed_agent.py", line 486, in on_message_impl
    return await self.on_unhandled_message(message, ctx)  # type: ignore
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_chat_agent_container.py", line 133, in on_unhandled_message
    raise ValueError(f"Unhandled message in agent container: {type(message)}")
ValueError: Unhandled message in agent container: <class 'autogen_agentchat.teams._group_chat._events.GroupChatError'>
ERROR:llm.api:Unable to process the query 'list devices' for org_id:bdd8043c-c38f-402d-8711-163502d1fecd, chat_id:9ceb2a8c-4250-11f0-9e0e-22925462b1c9 err:ValidationError: 1 validation error for MyResponse
  Invalid JSON: trailing characters at line 2 column 1 [type=json_invalid, input_value='{"responses":[{"Markdown... - **Answers**: None"}]}', input_type=str]
    For further information visit https://errors.pydantic.dev/2.11/v/json_invalid
Traceback:
Traceback (most recent call last):
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_chat_agent_container.py", line 79, in handle_request
    async for msg in self._agent.on_messages_stream(self._message_buffer, ctx.cancellation_token):
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/agents/_assistant_agent.py", line 827, in on_messages_stream
    async for inference_output in self._call_llm(
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/agents/_assistant_agent.py", line 939, in _call_llm
    async for chunk in model_client.create_stream(
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_ext/models/openai/_openai_client.py", line 811, in create_stream
    async for chunk in chunks:
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_ext/models/openai/_openai_client.py", line 1032, in _create_stream_chunks_beta_client
    event = await event_future
            ^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 192, in __anext__
    return await self._iterator.__anext__()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 241, in __stream__
    events_to_fire = self._state.handle_chunk(sse_event)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 348, in handle_chunk
    return self._build_events(
           ^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 576, in _build_events
    choice_state.get_done_events(
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 608, in get_done_events
    self._content_done_events(choice_snapshot=choice_snapshot, response_format=response_format)
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 649, in _content_done_events
    parsed = maybe_parse_content(
             ^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/_parsing/_completions.py", line 161, in maybe_parse_content
    return _parse_content(response_format, message.content)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/_parsing/_completions.py", line 221, in _parse_content
    return cast(ResponseFormatT, model_parse_json(response_format, content))
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/_compat.py", line 169, in model_parse_json
    return model.model_validate_json(data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/pydantic/main.py", line 744, in model_validate_json
    return cls.__pydantic_validator__.validate_json(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for MyResponse
  Invalid JSON: trailing characters at line 2 column 1 [type=json_invalid, input_value='{"responses":[{"Markdown... - **Answers**: None"}]}', input_type=str]
    For further information visit https://errors.pydantic.dev/2.11/v/json_invalid
,
 stack trace:Traceback (most recent call last):
  File "/Users/sharanyab/my_project/llm/api.py", line 94, in chat
    response, input_tokens, output_tokens, model_name = get_llm_chat_response(org_id=str(org_id), query=chat.query,
                                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/llm/api_functions.py", line 51, in get_llm_chat_response
    resp, input_tokens, output_tokens, model_name = chat(query=query, messages=messages, temperature=temperature,
                                                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/utils/lm_calls/main.py", line 284, in chat
    response, input_tokens, output_tokens, model_id = asyncio.run(get_response_from_autogen(
                                                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/base_events.py", line 654, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/utils/lm_calls/autogen/autogen.py", line 98, in get_response_from_autogen
    json_resp, tool_calls, output_tokens, input_tokens = await (
                                                         ^^^^^^^
  File "/Users/sharanyab/my_project/utils/lm_calls/autogen/autogen.py", line 37, in run_team_stream
    task_result = await Console(team.run_stream(task=query))
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/ui/_console.py", line 117, in Console
    async for message in stream:
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_base_group_chat.py", line 518, in run_stream
    raise RuntimeError(str(message.error))
RuntimeError: ValidationError: 1 validation error for MyResponse
  Invalid JSON: trailing characters at line 2 column 1 [type=json_invalid, input_value='{"responses":[{"Markdown... - **Answers**: None"}]}', input_type=str]
    For further information visit https://errors.pydantic.dev/2.11/v/json_invalid
Traceback:
Traceback (most recent call last):
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_chat_agent_container.py", line 79, in handle_request
    async for msg in self._agent.on_messages_stream(self._message_buffer, ctx.cancellation_token):
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/agents/_assistant_agent.py", line 827, in on_messages_stream
    async for inference_output in self._call_llm(
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_agentchat/agents/_assistant_agent.py", line 939, in _call_llm
    async for chunk in model_client.create_stream(
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_ext/models/openai/_openai_client.py", line 811, in create_stream
    async for chunk in chunks:
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/autogen_ext/models/openai/_openai_client.py", line 1032, in _create_stream_chunks_beta_client
    event = await event_future
            ^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 192, in __anext__
    return await self._iterator.__anext__()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 241, in __stream__
    events_to_fire = self._state.handle_chunk(sse_event)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 348, in handle_chunk
    return self._build_events(
           ^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 576, in _build_events
    choice_state.get_done_events(
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 608, in get_done_events
    self._content_done_events(choice_snapshot=choice_snapshot, response_format=response_format)
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/streaming/chat/_completions.py", line 649, in _content_done_events
    parsed = maybe_parse_content(
             ^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/_parsing/_completions.py", line 161, in maybe_parse_content
    return _parse_content(response_format, message.content)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/lib/_parsing/_completions.py", line 221, in _parse_content
    return cast(ResponseFormatT, model_parse_json(response_format, content))
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/openai/_compat.py", line 169, in model_parse_json
    return model.model_validate_json(data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sharanyab/my_project/.venv/lib/python3.11/site-packages/pydantic/main.py", line 744, in model_validate_json
    return cls.__pydantic_validator__.validate_json(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for MyResponse
  Invalid JSON: trailing characters at line 2 column 1 [type=json_invalid, input_value='{"responses":[{"Markdown... - **Answers**: None"}]}', input_type=str]
    For further information visit https://errors.pydantic.dev/2.11/v/json_invalid

We are using `AzureOpenAIChatCompletionClient`:

common_args = {
    "timeout": int(os.getenv("LLM_TIMEOUT", "500")),  # in seconds
    "stream_options": {"include_usage": True},
}

model_client = AzureOpenAIChatCompletionClient(
    azure_deployment=model_id,
    model=model_id,
    api_version=api_version,
    azure_endpoint=base_url,
    api_key=api_key,
    top_p=top_p,
    temperature=temperature,
    **common_args,
)

The Swarm config is as follows:

termination = HandoffTermination(target="user") | TextMentionTermination("TERMINATE")
team = Swarm(
    participants=agents,
    termination_condition=termination,
    emit_team_events=True,
)

The structured response model looks like this:

class MyResponse(BaseModel):
    class Response(BaseModel):
        Markdown: str

    responses: list[Response]
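For context, the `json_invalid` error in the trace can be reproduced directly: when the content string contains valid JSON followed by extra text, Pydantic's `model_validate_json` rejects the trailing characters. A minimal illustration (the payload here is made up, not the actual model output):

```python
from pydantic import BaseModel, ValidationError


class MyResponse(BaseModel):
    class Response(BaseModel):
        Markdown: str

    responses: list[Response]


# Valid JSON followed by extra trailing text, mimicking the failure
# mode in the traceback (illustrative payload only).
raw = '{"responses": [{"Markdown": "ok"}]}\nextra trailing text'

try:
    MyResponse.model_validate_json(raw)
except ValidationError as e:
    # Pydantic v2 reports this as a "json_invalid" error.
    print(e.errors()[0]["type"])
```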

The agent configs are all similar to the following:
class HistoryMessages(ChatCompletionContext):
    """
    A simple implementation of the ChatCompletionContext abstract class that stores messages in memory.
    """
    async def get_messages(self) -> List[LLMMessage]:
        """Retrieve all messages from the context."""
        return self._messages

common_args = {
    "model_client": model_client,
    "reflect_on_tool_use": False,
    "output_content_type": MyResponse,
    "model_client_stream": True,
    "model_context": HistoryMessages(initial_messages=messages),
}

agent1 = AssistantAgent(
    "agent1",
    handoffs=["agent2", "agent3", "agent4", "agent5"],
    system_message=AGENT1_SYSTEM_MESSAGE,
    **common_args,
)

The workflow occasionally succeeds, but most of the time I hit this error. Is this a known bug, or is there a mistake in my implementation?
The failure is not consistent: it occurs intermittently, but frequently.
It also occurs on 0.5.7, even more frequently.
The issue looks similar to #6480.
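As a temporary client-side mitigation, we considered stripping any trailing text before validating, since `json.JSONDecoder.raw_decode` parses a leading JSON value and ignores whatever follows. A minimal sketch (the helper function is hypothetical, not an AutoGen API):

```python
import json

from pydantic import BaseModel


class MyResponse(BaseModel):
    class Response(BaseModel):
        Markdown: str

    responses: list[Response]


def parse_leading_json(content: str) -> MyResponse:
    """Parse the first JSON value in `content`, discarding trailing text.

    raw_decode returns the decoded object and the index where parsing
    stopped, so extra characters after the JSON are simply ignored.
    It does not tolerate leading whitespace, hence the lstrip().
    """
    obj, _end = json.JSONDecoder().raw_decode(content.lstrip())
    return MyResponse.model_validate(obj)


result = parse_leading_json('{"responses":[{"Markdown":"ok"}]}\nextra text')
print(result.responses[0].Markdown)  # ok
```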

Which packages was the bug in?

Python AgentChat (autogen-agentchat>=0.4.0)

AutoGen library version.

Autogen 0.5.6

Other library version.

autogen-agentchat==0.5.6
autogen-core==0.5.6
autogen-ext==0.5.6
azure-ai-inference==1.0.0b9
azure-ai-projects==1.0.0b10
azure-common==1.1.28
azure-core==1.34.0
azure-identity==1.21.0
azure-search-documents==11.5.2

Model used

gpt-4o

Model provider

Azure OpenAI

Other model provider

No response

Python version

Python 3.11

.NET version

None

Operating system

None
