A step-by-step guide using the unofficial HuggingChat API (no API key required)
Hey, Streamlit-ers! 👋
My name is Chanin Nantasenamat, PhD. I work as a Senior Developer Advocate, creating educational content on building Streamlit data apps. In my spare time, I love creating coding and data science tutorials on my YouTube channel, Data Professor.
Are you looking to build an AI-powered chatbot using LLMs but without the heavy API costs? If you answered yes, then keep reading!
You'll build a chatbot that can generate responses to user-provided prompts (i.e., questions) using an open-source, no-cost LLM, OpenAssistant/oasst-sft-6-llama-30b-xor, served through HugChat, the unofficial HuggingChat API. You'll deploy the chatbot as a Streamlit app that can be shared with the world!
In this post, you’ll learn how to:
- Set up the app on the Streamlit Community Cloud
- Build the chatbot
What the HugChat app can do
Before we proceed with the tutorial, let's quickly grasp the app's functionality. Head over to the app and get familiar with its layout—(1) the sidebar provides app info, and (2) the main panel displays conversational messages:
Interact with it by (1) entering your prompt into the text input box and (2) reading the human/bot messages.
Set up the app on the Streamlit Community Cloud
Start by going to the app-starter-kit repo to use as the template for creating the chatbot app. Then click on "Use this template":
Give the repo a name (such as mychatbot). Next, click "Create repository from template." A copy of the repo will be placed in your account:
Next, follow this blog post to get the newly cloned repo deployed on the Streamlit Community Cloud. When done, you should be able to see the deployed app:
Next, edit the `requirements.txt` file by adding the following prerequisite Python libraries:
```
streamlit
hugchat
streamlit-chat
streamlit-extras
```
This will spin up a server with these prerequisites pre-installed.
Let's take a look at the contents of `streamlit_app.py`:
```python
import streamlit as st

st.title('🎈 App Name')
st.write('Hello world!')
```
In subsequent sections, you will modify the contents of this file with code snippets about the chatbot.
Finally, before proceeding with app building, let's take a look at how the user will interact with it:
- Front-end: The user submits an input prompt (a string of text entered into the text box via `st.text_input()`), and the app generates a response.
- Back-end: The input prompt is sent to `hugchat` (the unofficial port of the HuggingChat API) to generate a response.
- Front-end: The generated response is displayed in the app via `streamlit-chat`'s `message()` command.
Build the chatbot
Fire up the `streamlit_app.py` file and replace the original content with the code snippets mentioned below.
1. Required libraries
Import prerequisite Python libraries:
```python
import streamlit as st
from streamlit_chat import message
from streamlit_extras.colored_header import colored_header
from streamlit_extras.add_vertical_space import add_vertical_space
from hugchat import hugchat
```
2. Page config
Name the app using the `page_title` input argument in the `st.set_page_config` method (it'll be used as the app title and as the title in the preview when sharing on social media):
```python
st.set_page_config(page_title="HugChat - An LLM-powered Streamlit app")
```
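Optionally, `st.set_page_config` accepts a few other parameters. Here's a minimal sketch (the icon and layout values are illustrative choices, not from the original tutorial). Note that Streamlit allows only one `st.set_page_config` call per app, and it must be the first Streamlit command, so this would replace the line above rather than follow it:

```python
# Optional: set an emoji icon and widen the layout (illustrative values).
st.set_page_config(
    page_title="HugChat - An LLM-powered Streamlit app",
    page_icon="🤗",                     # shown in the browser tab
    layout="wide",                      # use the full page width
    initial_sidebar_state="expanded",   # open the sidebar on load
)
```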
3. Sidebar
Create a sidebar with some information about your chatbot:
```python
with st.sidebar:
    st.title('🤗💬 HugChat App')
    st.markdown('''
    ## About
    This app is an LLM-powered chatbot built using:
    - [Streamlit](https://streamlit.io/)
    - [HugChat](https://github.com/Soulter/hugging-chat-api)
    - [OpenAssistant/oasst-sft-6-llama-30b-xor](https://huggingface.co/OpenAssistant/oasst-sft-6-llama-30b-xor) LLM model

    💡 Note: No API key required!
    ''')
    add_vertical_space(5)
    st.write('Made with ❤️ by [Data Professor](https://youtube.com/dataprofessor)')
```
Note the use of the `with` statement to confine the constituent contents to the sidebar. They include:
- The app title, specified via `st.title()`
- A short description of the app via `st.markdown()`
- Vertical space added via `add_vertical_space()`
- A short credit message via `st.write()`
4. Session state
Initialize the chatbot by giving it a starter message at the first app run:
```python
if 'generated' not in st.session_state:
    st.session_state['generated'] = ["I'm HugChat, How may I help you?"]
if 'past' not in st.session_state:
    st.session_state['past'] = ['Hi!']
```
Here, `past` denotes the human user's input and `generated` indicates the bot's response.
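The `not in st.session_state` guard matters because Streamlit reruns the whole script on every interaction; without it, both lists would be reset to their starter values on each rerun. Here's a minimal standalone sketch of that behavior (the `count` key is just for illustration):

```python
import streamlit as st

# Initialize only once; session state survives reruns within a browser session.
if 'count' not in st.session_state:
    st.session_state['count'] = 0

st.session_state['count'] += 1
st.write(f"This script has rerun {st.session_state['count']} time(s).")
st.button('Rerun')  # interacting with any widget triggers a rerun
```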
5. App layout
Give the app a general layout. The main panel will display the chat query and responses:
```python
input_container = st.container()
colored_header(label='', description='', color_name='blue-30')
response_container = st.container()
```
Here, `st.container()` serves as a placeholder, where the `input_container` and `response_container` variables correspond to the human user and chatbot, respectively.
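If containers are new to you, the key property is that they reserve a region on the page at the point where they're declared, even if you write to them later in the script. A minimal standalone sketch (names are illustrative):

```python
import streamlit as st

top = st.container()  # reserves a region at the top of the page
st.write("Written second in the script, rendered below the container.")
top.write("Written last in the script, but rendered first on the page.")
```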
6. Human user input
Create the `get_text()` custom function that will take prompts provided by the human user as input using `st.text_input()`. This custom function displays a text box in the `input_container` placeholder:
```python
# User input
## Function for taking user provided prompt as input
def get_text():
    input_text = st.text_input("You: ", "", key="input")
    return input_text

## Applying the user input box
with input_container:
    user_input = get_text()
```
7. Bot response output
Create the `generate_response(prompt)` custom function that takes the user's input prompt as an argument and generates an AI response using the HuggingChat API via the `hugchat.ChatBot()` method (this LLM model can be swapped with any other one):
```python
# Response output
## Function for taking user prompt as input followed by producing AI generated responses
def generate_response(prompt):
    chatbot = hugchat.ChatBot()
    response = chatbot.chat(prompt)
    return response
```
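One thing to note: as written, a fresh `hugchat.ChatBot()` is created on every call, i.e., on every rerun. A possible optimization (not part of the original tutorial) is to cache the client with `st.cache_resource` so it's created only once. A minimal sketch, assuming `hugchat.ChatBot()` takes no required arguments as in the snippet above:

```python
# Optional: cache the HugChat client so it isn't recreated on every rerun.
@st.cache_resource
def get_chatbot():
    return hugchat.ChatBot()

def generate_response(prompt):
    chatbot = get_chatbot()  # reuses the cached instance
    return chatbot.chat(prompt)
```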
Finally, populate the `response_container` with the AI-generated response via the two underlying `if` statements:
- If the user has entered an input query, the `if user_input` statement will become `True` and the underlying statements will run.
- The user-provided prompt (`user_input`) will serve as an input argument to `generate_response()` to make the AI-generated response.
- Subsequently, the generated output will be assigned to the `response` variable.
- Both `user_input` and `response` will be saved to the session state via the `append()` method.
- When there are bot-generated messages, the `if st.session_state['generated']` statement returns `True` and the underlying statements will run.
- The `for` loop iterates through the list of generated messages in `st.session_state['generated']`.
- The human (`st.session_state['past']`) and bot (`st.session_state['generated']`) messages are displayed via the `message()` command from the `streamlit_chat` library.
```python
## Conditional display of AI generated responses as a function of user provided prompts
with response_container:
    if user_input:
        response = generate_response(user_input)
        st.session_state.past.append(user_input)
        st.session_state.generated.append(response)

    if st.session_state['generated']:
        for i in range(len(st.session_state['generated'])):
            message(st.session_state['past'][i], is_user=True, key=str(i) + '_user')
            message(st.session_state['generated'][i], key=str(i))
```
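As an optional extension (not in the original tutorial), you could let users reset the conversation by restoring both session-state lists to their starter values. A minimal sketch using a hypothetical sidebar button:

```python
# Optional: a sidebar button that clears the chat history (hypothetical extension).
with st.sidebar:
    if st.button('Clear chat history'):
        st.session_state['past'] = ['Hi!']
        st.session_state['generated'] = ["I'm HugChat, How may I help you?"]
```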
Wrapping up
In this post, I've shown you how to create a chatbot app using an open-source LLM from the unofficial HuggingChat API and Streamlit. You can create your own AI-powered chatbot in only a few lines of code, without needing API keys.
I hope this tutorial encourages you to explore the endless possibilities of chatbot development using different models and techniques. The sky is the limit!
If you have any questions, please leave them in the comments below or contact me on Twitter at @thedataprof or on LinkedIn. Share your app creations on social media and tag me or the Streamlit account, and I'll be happy to provide feedback or help retweet!
Happy Streamlit-ing! 🎈
This is a companion discussion topic for the original entry at https://blog.streamlit.io/how-to-build-an-llm-powered-chatbot-with-streamlit/