Is there any way to create an HTML prompt with Streamlit? I have an initial text box with a question, but sometimes I need more info from the user. A JavaScript prompt would be best, I think.
I think just another textbox would be better UX-wise and also easier to code.
Great, I could try another textbox. But, how can I make the UI wait for user input?
I don’t think I understand your question. Once the UI is drawn, nothing happens until the user interacts with a widget. So the UI is always waiting for user input.
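For example, the whole script reruns from top to bottom on every interaction, and whatever the widgets currently hold decides what gets drawn next. A minimal sketch of that rerun model:

import streamlit as st

# The entire script reruns each time the user interacts with a widget.
order = st.text_input("What's your order?")

# This branch is only reached on the rerun that happens after the user submits text.
if order:
    st.write(f"Got it: {order}")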
I need to get additional information from the user after the first interaction. But, only sometimes.
For example:
case 1:
Example pizza company, what’s your order:
… (text input)
[optional input text] Great, you want a pepperoni pizza, what’s your address?
[customer adds address]
case 2:
Example pizza company, what’s your order:
… (text input); address included
Great, sending 1 pepperoni pizza to 123 Fake st., Mobile, AL 12345
In case 1, I have an interaction that needs some updated info (e.g. an address), so I need to present a modal dialog for user input.
In case 2, I have an interaction that already has the address included, so no new user input dialog should be presented.
I don’t get it. You can have just a text input for the address; the user only has to type the address once, even if they are ordering several pizzas. What is wrong with that? There’s no way out of typing the address once.
OK, I’m not doing a good job of explaining it…
Imagine that this is a “pizza company conversational chat bot”. In most cases, it will be sufficient to just have a single text box as input to drive the conversation. However, in some cases we will need to get additional information back from the user (e.g. “What is your address?”, “I noticed that you ordered two mediums, would you like to make it a meal for $5 more?”, “We’re currently out of black olives”, etc.).
So, I have a form with the original prompt. But, in some cases I need to gather more information from the user. I’m not sure how I would go about that.
That’s just part of the conversation. The chatbot asks questions (What’s your order? What’s your address?) or provides information (we’re out of black olives), and the user submits an answer or a reaction in a text input. Why would some questions require a different UI?
How is this a conversation you couldn’t have in whatsapp? There are no popups in whatsapp.
Yep, a popup maybe isn’t necessary. I guess I’d need to reuse the existing textbox then? Or create a new one?
How would you implement such an interface in streamlit? Is it possible? I mean, just the basic input text box.
It seems that once you create a text input, you can’t get that element again? One would need to get this element, get its text, clear it, etc. Is this possible?
Well, I’m guessing by the lack of an updated response that what I’m asking for is not possible with this library. I’ve been beating my head against the wall with this library for two days now, trying to get even the simplest of interactions to work well.
I did find the chatbot component, but even in the most basic apps the UI is pretty much unusable for anything other than the simplest of use cases.
I finally got that the lib revolves around state. Fine. I tried to roll my own chat functionality using state instead and I get “bad format” errors.
This code seems to just do nothing at all:
import streamlit as st
from streamlit_chat import message

st.title("ChatGPT-like Web App")

top = st.empty()
bottom = st.empty()

def text_changed():
    input = st.session_state["input"]
    st.session_state["input"] = ""
    bottom.write(input)
    bottom.info(input)

user_input = top.text_input("You:", key='input', on_change=text_changed)
And if you change the “empty” calls to “container” you get the “bad format” bug.
Ugh, I’ve wasted enough time with this lib, moving on to something else.
Hey @javamonkey,
sorry for the frustration. You can find some examples for how to build chatbots with Streamlit on our new page about generative AI: https://streamlit.io/generative-ai
We’re also launching some dedicated chat features very soon, which will make this a lot simpler. Feel free to follow roadmap.streamlit.io and our social accounts for updates. Once this is launched, we plan to add a more official tutorial on how to build chatbots/LLM apps to our docs.
Again, sorry this doesn’t feel as smooth as it should be yet. Obviously, LLM apps are a pretty new field for all of us – we’re working hard to make Streamlit better for it, but we also wanted to take some time to properly think through what we want to implement, so that we don’t have to rip out everything again in 6 months because the field moved on so quickly.
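For reference, here is a minimal sketch of what a chat app looks like with those dedicated chat elements (st.chat_input and st.chat_message, available from Streamlit 1.24 on), with a placeholder echo reply standing in for a real LLM:

import streamlit as st

st.title("Pizza bot")

# Keep the conversation in session state so it survives reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

# st.chat_input returns None until the user submits something.
if prompt := st.chat_input("What's your order?"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    # Placeholder reply; a real app would call an LLM here.
    reply = f"You said: {prompt}"
    st.session_state.messages.append({"role": "assistant", "content": reply})
    with st.chat_message("assistant"):
        st.write(reply)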
I think this is what you are trying to do:
import streamlit as st

def text_changed():
    # Copy the submitted text into its own key, then clear the input.
    st.session_state["last"] = st.session_state["input"]
    st.session_state["input"] = ""

st.session_state.setdefault("last", "")

st.text_input("You:", key='input', on_change=text_changed)
st.write(st.session_state["last"])
st.info(st.session_state["last"])
Which is a start but far from enough. At a minimum you want to preserve and display the whole conversation.
import streamlit as st

def text_changed():
    # Append the submitted text to the conversation history, then clear the input.
    st.session_state["history"].append(st.session_state["input"])
    st.session_state["input"] = ""

st.session_state.setdefault("history", [])

st.text_input("You:", key='input', on_change=text_changed)
for msg in st.session_state["history"]:
    st.info(msg)
A conversation with yourself won’t get you a pizza, so let’s add a “pizza company conversational chat bot”.
import requests
import streamlit as st

def generate_output(history):
    # Placeholder "bot": fetch one sentence of hipster ipsum as the reply.
    url = "http://hipsum.co/api/?type=hipster-centric&sentences=1"
    r = requests.get(url)
    return r.json()[0]

def display_history(history):
    for msg in history:
        st.info(f"**Bot**: {msg['out']}")
        st.info(f"**You**: {msg['in']}")

def text_changed(output):
    # Store the bot's message and the user's reply as one history entry, then clear the input.
    st.session_state["history"].append({"out": output, "in": st.session_state["input"]})
    st.session_state["input"] = ""

st.session_state.setdefault("history", [])

with st.spinner(text="Thinking..."):
    output = generate_output(history=st.session_state["history"])

st.write(output)
st.text_input("You:", key='input', on_change=text_changed, kwargs={"output": output})
display_history(history=st.session_state["history"])
Now that you have the basic functionality for a conversation, you need to make the bot smarter, find a way to decide when to stop, and decide what to do next.
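For example, the case 1 / case 2 distinction from the original question just becomes a matter of what generate_output returns. Here is a rough, rule-based sketch of a drop-in replacement (purely illustrative: in a real app the LLM would decide whether the address is still missing, and the regex here is only a stand-in for that check):

import re

def generate_output(history):
    # Everything the user has typed so far.
    said = " ".join(msg["in"] for msg in history)

    if not history:
        return "Example pizza company, what's your order?"
    if not re.search(r"\d+\s+\w+.*(st|ave|rd)\b", said, re.IGNORECASE):
        # Case 1: no address in the conversation yet, so ask for it.
        return "Great, what's your delivery address?"
    # Case 2: the address is already included, so confirm the order.
    return "Great, sending your order to the address you gave us."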