Streamlit LLM-generated answer overflows the message box or becomes scrollable

Hi! I am running a GPT4All model to build a study bot, but for some reason the LLM answer overflows the message box or becomes horizontally scrollable instead of wrapping inside it.

I am running the model and app locally.

I wrote my code using the `message` function from streamlit-chat, but the same thing happens with markdown as well.

        from streamlit_chat import message

        message(user_query, is_user=True)
        message(generated_response)
        # The same overflow happens with the native API:
        # st.chat_message("user").markdown(user_query)
        # st.chat_message("ai").markdown(generated_response)

Hi @chrisim,

Thanks for sharing this question!

It seems the same issue has already been logged by another user in the GitHub repo of the streamlit-chat component.

I would recommend using Streamlit's native chat elements, `st.chat_input` and `st.chat_message`. That way the behavior is always standard, and you can get direct support from us for any issues you run into.

You can also reach out to the creator of Streamlit-Chat for support.

