How to place the text box at the bottom?

I want to make a chat app with Streamlit (without using streamlit-chat). How can I put my text box at the bottom? Please help!


Hey @sahil_bhatt,

Thanks for sharing your question! Please edit your post to include a code snippet so we can see how you’ve tried to implement this and make suggestions.

Hi there Caroline! New here, hope I can jump in with a similar question.
I'm wondering if it is possible to create a 'chat' environment like WhatsApp/Telegram where the text input is at the bottom of the page and the messages appear above the textbox.

Currently my solution looks like this:

But the ideal solution is something like this:

This is my current streamlit code:

st.set_page_config(
    page_title="Assistant",
    page_icon=":robot:"
)

st.header("Welcome to Assistant")

if 'generated' not in st.session_state:
    st.session_state['generated'] = []

if 'past' not in st.session_state:
    st.session_state['past'] = []

def get_text():
    input_text = st.text_input("Input Message: ","", key="input")
    return input_text 

user_input = get_text()

if user_input:
    output = chatquery({
        "inputs": {
            "past_user_inputs": st.session_state.past,
            "generated_responses": st.session_state.generated,
            "text": user_input,
        },"parameters": {"repetition_penalty": 1.33},
    })

    st.session_state.past.append(user_input)
    st.session_state.generated.append(output)

if st.session_state['generated']:
    # Display the history in chronological order
    for i in range(len(st.session_state['generated'])):
        message(st.session_state['past'][i], is_user=True, key=str(i) + '_user')
        message(st.session_state['generated'][i], key=str(i))

To sum up:

  1. How can the textbox be moved to the bottom of the page?
  2. How can a large collection of messages be displayed in chat format? (When the messages exceed the screen, they should scroll without pushing the text input off the page.)

Thank you 🙂

Your code has undefined variables that prevent me from running it, and I cannot understand everything it does just by reading it. This is the general idea:

  1. Display the history (stored in session_state).
  2. Ask for user input using text_input with a callback.
  3. In the callback, modify the history as appropriate and clear the input. You can also generate the bot answers here.
import streamlit as st

def on_message_change():
    # Append the submitted message to the history, then clear the input
    st.session_state.history.append(st.session_state.message)
    st.session_state.message = ""

if "history" not in st.session_state:
    st.session_state.history = []

for message in st.session_state.history:
    st.write(message)

st.text_input(label="Message", key="message", on_change=on_message_change)

Then what should they do?

I just had a quick attempt at your solution but I couldn't figure it out. It's already late, so I will take a better look at it tomorrow.

Here is my full code:

import streamlit as st
from streamlit_chat import message
########################################________LANGCHAIN________####################
import os

###############################################____LLM____################
# Using OPENAI LLM's
from langchain.llms import OpenAI
# Creating Prompt Templates
from langchain.prompts import PromptTemplate
# Creating Chains
from langchain.chains import LLMChain
def query(payload):
    llm = OpenAI(temperature=0.9)
    prompt = PromptTemplate(input_variables=["Product"],
                            template="{Product}")
    chain = LLMChain(llm=llm, prompt=prompt)
    response = chain.run(payload["inputs"]["text"])
    # response is a string
    return response 
###############################################____CHAT MODEL____################
from langchain.chat_models import ChatOpenAI
from langchain.prompts.chat import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    HumanMessagePromptTemplate,
)
def chatquery(payload):
    chat = ChatOpenAI(temperature=0, streaming=True)

    template="You are a helpful assistant"
    system_message_prompt = SystemMessagePromptTemplate.from_template(template)
    human_template="{text}"
    human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)
    chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])

    chain = LLMChain(llm=chat, prompt=chat_prompt)
    result = chain.run(text=payload["inputs"]["text"])
    # Result is a string
    return result

st.set_page_config(
    page_title="Assistant",
    page_icon=":robot:"
)

st.header("Welcome to Assistant")

# state to hold generated output of llm
if 'generated' not in st.session_state:
    st.session_state['generated'] = []

# state to hold past user messages
if 'past' not in st.session_state:
    st.session_state['past'] = []

# streamlit text input
def get_text():
    input_text = st.text_input("Input Message: ","", key="input")
    return input_text 

user_input = get_text()

# check if text input has been filled in
if user_input:
    # run langchain llm function returns a string as output
    output = chatquery({
        "inputs": {
            "past_user_inputs": st.session_state.past,
            "generated_responses": st.session_state.generated,
            "text": user_input,
        },"parameters": {"repetition_penalty": 1.33},
    })

    # append user_input and output to state
    st.session_state.past.append(user_input)
    st.session_state.generated.append(output)

# If responses have been generated by the model
# If responses have been generated by the model
if st.session_state['generated']:
    # Display the history in chronological order
    for i in range(len(st.session_state['generated'])):
        # message from streamlit_chat
        message(st.session_state['past'][i], is_user=True, key=str(i) + '_user')
        message(st.session_state['generated'][i], key=str(i))

# I would expect get_text() needs to be called here as a callback
# But I have issues with retrieving user_input

And when it comes to point 2, I mean this: the text input needs to be sticky, but the messages can scroll. In the image you can see the text input disappearing from the screen.

I will be back tomorrow with an update. Thanks for the help, much appreciated!

Use a placeholder, then you can move the text box to the bottom.

Fixed the textbox. I used the on_change callback in user_input and retrieved the data stored in the state through the key param in text_input:

# callback: runs when the text input changes
def inputchange():
    # run the langchain llm function; returns a string as output
    output = chatquery({
        "inputs": {
            "past_user_inputs": st.session_state.past,
            "generated_responses": st.session_state.generated,
            "text": st.session_state.input,
        },"parameters": {"repetition_penalty": 1.33},
    })

    # append user input and output to state
    st.session_state.past.append(st.session_state.input)
    st.session_state.generated.append(output)

# If responses have been generated by the model
if st.session_state['generated']:
    # Display the history in chronological order
    for i in range(len(st.session_state['generated'])):
        # message from streamlit_chat
        message(st.session_state['past'][i], is_user=True, key=str(i) + '_user')
        message(st.session_state['generated'][i], key=str(i))

user_input = st.text_input("Input Message: ","", key="input", on_change=inputchange)

Now the only problem left is the textbox disappearing after too many messages.


Hi @sahil_bhatt, were you able to solve the disappearing input text box? The chat history should scroll up while the input text remains in its position.

I guess you could hack your way using CSS or JS or both, but I can’t help with that.


Hi @Goyo, is there anyone who could help me achieve this?

styl = """
<style>
    .stTextInput {
      position: fixed;
      bottom: 3rem;
    }
</style>
"""
st.markdown(styl, unsafe_allow_html=True)

Thank you very much @alanjo, it works like a charm. Do you know how to make the last message appear above the text input?