Configuring Open WebUI for AI Interaction: Setting base AI models
Applies to SUSE AI 1.0

2 Setting base AI models

Open WebUI needs at least one base AI model configured before you can interact with it or create custom AI models as described in Chapter 5, Managing custom AI models. You can download models from repositories such as Ollama or Hugging Face, or access models via OpenAI-compatible APIs.

2.1 Downloading AI models from Ollama

Ollama (https://ollama.com/) is an online repository that hosts open source AI models. This procedure describes how to download and use these models from the Open WebUI interface.

Requirements
  • You must have Open WebUI administrator privileges to access configuration screens or settings mentioned in this section.

    1. In the bottom left of the Open WebUI window, click your avatar icon to open the user menu and select Admin Panel.

    2. Click the Settings tab and select Models from the left menu.

    3. In the top-right corner of the screen, click the small "download" icon to open the Manage Models screen.

      Figure 2.1: Downloading Ollama AI models
    4. On the Ollama library page at https://ollama.com/library, identify the model tag of the AI model that you want to download, for example, gemma3:4b.

    5. Paste the model tag into the Pull a model from Ollama.com input field and confirm by clicking the small download button on the right.

      Figure 2.2: Entering model tag to download

      The model download starts. After it finishes, the model is available as a base model for AI interaction.
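If you prefer the command line, the same pull can be triggered against the Ollama REST API directly. The commands below are a sketch: they assume the Ollama service is reachable at http://localhost:11434, which you may need to adjust for your SUSE AI deployment.

```shell
# Pull the gemma3:4b model through the Ollama REST API.
# Adjust the host and port to match your deployment.
curl http://localhost:11434/api/pull -d '{"model": "gemma3:4b"}'

# List the locally available models to verify the pull succeeded.
curl http://localhost:11434/api/tags
```

Models pulled this way appear in the Open WebUI model selector the same as models downloaded through the Manage Models screen.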

2.2 Downloading AI models from Hugging Face

Hugging Face (https://huggingface.co/models) is an online repository for AI models. Among others, it hosts models in the GGUF binary format, which is optimized for fast loading and saving. This procedure describes how to download and use these GGUF models from the Open WebUI interface.

Requirements
  • You must have Open WebUI administrator privileges to access configuration screens or settings mentioned in this section.

    1. In your Web browser, navigate to https://huggingface.co/models and, in the Libraries section on the left, click GGUF to limit the list to GGUF models, for example, gemma-7b.

      Figure 2.3: Finding models on Hugging Face
    2. Click the model name to open its model card and select Use this model › Ollama.

      Figure 2.4: Hugging Face model card
    3. From the How to use pop-up, copy the full path to the model, for example, hf.co/google/gemma-7b.

      Figure 2.5: Ollama download on Hugging Face
    4. In the bottom left of the Open WebUI window, click your avatar icon to open the user menu and select Admin Panel.

    5. Click the Settings tab and select Connections from the left menu.

    6. In the Manage Ollama API Connections section, click the small "download" icon on the right to open the Manage Ollama screen.

      Figure 2.6: Downloading Ollama models from Hugging Face
    7. Paste the full model path from Hugging Face into the Pull a model from Ollama.com input field and confirm by clicking the small download button on the right.

      Figure 2.7: Entering model tag to download

      The model download starts. After it finishes, the model is available as a base model for AI interaction.
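The Hugging Face pull can also be performed against the Ollama REST API. This is a sketch: it reuses the example model path from the procedure above and assumes Ollama is reachable at http://localhost:11434, which you may need to adjust for your deployment.

```shell
# Pull a GGUF model hosted on Hugging Face through the Ollama REST API.
# The hf.co/<user>/<repository> path is the one copied from the model card.
curl http://localhost:11434/api/pull -d '{"model": "hf.co/google/gemma-7b"}'
```

The model is then listed alongside models pulled from the Ollama library.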

2.3 Using AI models from OpenAI-compatible providers

Instead of downloading AI models locally, you can use an OpenAI-compatible API to access models from external model providers. These providers include both local server instances, such as LocalAI or llamafile, and cloud providers, such as Groq or OpenAI. This procedure describes how to configure the Groq provider API from the Open WebUI interface.

Requirements
  • You must have Open WebUI administrator privileges to access configuration screens or settings mentioned in this section.

    1. Create a Groq cloud account at https://console.groq.com/.

    2. Create an API key at https://console.groq.com/keys.

    3. In the bottom left of the Open WebUI window, click your avatar icon to open the user menu and select Admin Panel.

    4. Click the Settings tab and select Connections from the left menu.

    5. In the Manage OpenAI API Connections section, click the small "plus" icon on the right to open the Add connection screen.

    6. Enter https://api.groq.com/openai/v1 as the base URL and the API key that you created previously. Confirm with Save.

      Figure 2.8: Adding the Groq API

      Open WebUI now shows all the available models from Groq in the model selector drop-down list.

      Figure 2.9: List of AI models from the Groq API
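Before or after adding the connection, you can verify that the API key and base URL work by querying the provider's model list directly. The command below is a sketch against Groq's OpenAI-compatible endpoint; it assumes your key is exported in the GROQ_API_KEY environment variable.

```shell
# List the models your Groq API key can access via the OpenAI-compatible API.
# GROQ_API_KEY holds the key created at https://console.groq.com/keys.
curl https://api.groq.com/openai/v1/models \
  -H "Authorization: Bearer $GROQ_API_KEY"
```

A JSON list of model IDs confirms that the key is valid; these are the models that Open WebUI shows in its model selector after the connection is saved.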