Run an LLM chatbot within a column

Hi,
Is there a way I can run an LLM chatbot within a column or sidebar?

You can do it, but you won't be able to use st.chat_input there.

Hi @kattapug,

Thanks for posting!

You can build one, but you would have to use st.text_input for chat input, because st.chat_input can only be used once per app page and only in the main area of the app. It cannot be used in the sidebar, columns, expanders, forms, or tabs.

However, we do plan to support this in the future.

Here’s a basic example code snippet for a sidebar implementation:

import streamlit as st
from openai import OpenAI

client = OpenAI(
    api_key='YOUR-OPENAI-API-KEY' # this can also be stored in your secrets.toml file
)


def llm_chatbot_response(user_input):
    try:
        # Send the prompt to OpenAI's chat completions API
        response = client.chat.completions.create(
            model="gpt-4-1106-preview",  # You can swap in a different model
            messages=[
                {
                "role": "user",
                "content": f"{user_input}"
                }
            ],
            temperature=1,
            max_tokens=4095,
            top_p=1,
            frequency_penalty=0,
            presence_penalty=0
        )
        print(response.choices[0].message.content)
        return response.choices[0].message.content
    except Exception as e:
        return f"An error occurred: {str(e)}"

# Sidebar for chatbot interaction
with st.sidebar:
    st.title("LLM Chatbot")
    user_input = st.text_input("Ask your question...")
    if user_input:
        response = llm_chatbot_response(user_input)
        st.write(response)
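
As the comment above notes, the API key can also live in your secrets.toml file rather than being hard-coded. A minimal sketch, assuming you have added an entry named OPENAI_API_KEY to .streamlit/secrets.toml (the key name is just an example):

import streamlit as st
from openai import OpenAI

# .streamlit/secrets.toml would contain a line like:
# OPENAI_API_KEY = "sk-..."
# Read the key from Streamlit's secrets store instead of hard-coding it
client = OpenAI(api_key=st.secrets["OPENAI_API_KEY"])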
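
If you want the sidebar to feel more like a running conversation, you can keep the exchanged messages in st.session_state and replay them on each rerun. A rough sketch building on the snippet above (llm_chatbot_response is the function defined earlier; the last_question bookkeeping is just one way to avoid re-appending the same question on unrelated reruns):

with st.sidebar:
    st.title("LLM Chatbot")

    # Keep the conversation across reruns
    if "messages" not in st.session_state:
        st.session_state.messages = []

    user_input = st.text_input("Ask your question...")
    if user_input and user_input != st.session_state.get("last_question"):
        st.session_state["last_question"] = user_input
        answer = llm_chatbot_response(user_input)
        st.session_state.messages.append(("You", user_input))
        st.session_state.messages.append(("Bot", answer))

    # Replay the stored history
    for speaker, text in st.session_state.messages:
        st.write(f"**{speaker}:** {text}")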
