
issue: Ask/Explain buttons in chat context do not use the user's OpenAI direct connections #15579

Open
@wilstdu

Description

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

0.6.14

Ollama Version (if applicable)

No response

Operating System

macOS, Ubuntu

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

If a user has a direct connection to an OpenAI-compatible service configured, OpenWebUI should also use that connection's settings when fetching any OpenAI-format information.

In my case, it only affected the Ask and Explain buttons, but there are likely other places where this is a problem.
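
In other words, when the backend builds an OpenAI-format request, I would expect the user's direct connection to be considered before the system-wide environment variables. A minimal sketch of that lookup order (the function and field names here are hypothetical, not OpenWebUI's actual code; only the two environment variable names are real):

```python
import os
from typing import Optional


def resolve_openai_connection(user_settings: dict) -> Optional[dict]:
    """Pick the base URL / API key to use for an OpenAI-format request.

    Sketch only: prefer the user's direct connection (Settings ->
    Connections -> Manage Direct Connections), and only fall back to the
    system-wide OPENAI_API_BASE_URLS / OPENAI_API_KEYS variables.
    """
    # Per-user direct connections; the field names are illustrative.
    direct = user_settings.get("directConnections") or {}
    urls = direct.get("base_urls") or []
    keys = direct.get("api_keys") or []
    if urls and keys:
        return {"base_url": urls[0], "api_key": keys[0]}

    # System-wide fallback; split on ";" to allow multiple endpoints.
    env_urls = [u for u in os.environ.get("OPENAI_API_BASE_URLS", "").split(";") if u]
    env_keys = [k for k in os.environ.get("OPENAI_API_KEYS", "").split(";") if k]
    if env_urls and env_keys:
        return {"base_url": env_urls[0], "api_key": env_keys[0]}

    # Nothing configured -> this is where the reported error would surface.
    return None


if __name__ == "__main__":
    # A user whose direct connection points at LiteLLM.
    settings = {
        "directConnections": {
            "base_urls": ["https://litellm.example.com/v1"],
            "api_keys": ["sk-user-key"],
        }
    }
    print(resolve_openai_connection(settings))
```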

Actual Behavior

Our setup:

  • LiteLLM
  • OpenWebUI (0.6.14 in the cluster; the same problem occurs when started locally with the latest version, 0.6.15)
  • Kubernetes cluster

Users create their tokens in LiteLLM and configure them in OpenWebUI - Settings - Connections - Manage Direct Connections (a quick sanity check of such a connection is sketched after this list):

  • URL - LiteLLM domain address
  • Key - LiteLLM user API key
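
Such a direct connection is just an OpenAI-compatible base URL plus a per-user key. Outside of OpenWebUI, a plain OpenAI-format request like the following (placeholder URL, key, and model name) against the LiteLLM endpoint is expected to succeed, just as regular chat over the same direct connection does, so the connection itself is not the problem:

```python
import requests

# Placeholder values - substitute the LiteLLM domain and the user's own key.
BASE_URL = "https://litellm.example.com/v1"
API_KEY = "sk-user-key"

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gpt-4o-mini",  # any model routed by LiteLLM
        "messages": [{"role": "user", "content": "ping"}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```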

Behaviour:
When trying to continue a discussion about certain points of the generated response in the chat via these buttons:
[screenshot of the Ask and Explain buttons]

This results in an error:
[screenshot of the error message]

After setting these environment variables, it works:

  • OPENAI_API_BASE_URLS
  • OPENAI_API_KEYS

The problem with those variables is that they are set system-wide, meaning a technical (service) user would be needed in LiteLLM, and every individual user's requests would be billed to that technical user's account (which doesn't make much sense).
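
To illustrate: with this workaround the whole instance shares a single connection, so every user who triggers Ask/Explain ends up on the same key (illustrative values only):

```python
import os

# System-wide workaround: one shared "technical user" key for the whole instance.
os.environ["OPENAI_API_BASE_URLS"] = "https://litellm.example.com/v1"
os.environ["OPENAI_API_KEYS"] = "sk-technical-user-key"

# Any user without a working direct connection falls back to this shared key,
# so LiteLLM attributes all of that usage (and cost) to the technical user.
shared_key = os.environ["OPENAI_API_KEYS"].split(";")[0]
print("All Ask/Explain requests are billed to key:", shared_key)
```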

Steps to Reproduce

  1. Make sure that OPENAI_API_BASE_URLS and OPENAI_API_KEYS are not set
  2. Start a fresh local instance of OpenWebUI (it is reproducible everywhere: Docker, pip install, the local dev.sh script)
  3. Set up the admin user if needed, and log in
  4. Go to OpenWebUI - Settings - Connections - Manage Direct Connections:
    URL - your OpenAI domain address
    Key - your OpenAI API key
  5. Start a chat, ask anything, and wait for the response
  6. After getting the response, double-click any text in the response
  7. Click Ask or Explain - an error will show up after the request is sent

Logs & Screenshots

OpenWebUI logs: openwebui-problem.log

Additional Information

This is the location where the error is thrown:
[screenshot of the code location where the error is thrown]

The main root cause:
[screenshot of the root cause in the code]

Metadata

Labels: bug (Something isn't working)