In the rapidly evolving world of artificial intelligence (AI), chatbots have emerged as powerful tools for businesses to enhance customer interactions and streamline operations. Leveraging the capabilities of Language Models, chatbots are becoming increasingly intelligent, capable of understanding and responding to human language with remarkable accuracy.
In this blog, we will explore the fascinating world of building a chatbot using LLMs (Large Language Models) and two popular tools: HugChat and Streamlit. LLMs, such as OpenAI’s GPT-3.5, have revolutionized natural language processing and understanding, enabling chatbots to converse more naturally and provide contextually relevant responses.
We will dive into a step-by-step process of developing an LLM-powered chatbot using HugChat, a powerful Python library that simplifies the integration of LLMs into chatbot applications. Furthermore, we will leverage Streamlit, a user-friendly framework for creating interactive web applications, to provide a seamless user interface and deployment platform for our chatbot.
Getting Started
Table of contents
- What is HugChat
- Creating a basic chatbot
- What is Streamlit
- Creating a basic Streamlit app
- Building an LLM-powered chatbot
What is HugChat
HugChat is an innovative and versatile Python package designed to simplify the development of chatbot applications. With HugChat, developers can quickly and effortlessly create intelligent conversational agents that interact with users in a natural and engaging manner.
HuggingChat is a freely available and open source option that serves as an alternative to commercial chat platforms like ChatGPT. It utilizes the LLaMa 30B SFT 6 (oasst-sft-6-llama-30b) model developed by OpenAssistant. Although the model may not be on par with GPT-4 in terms of capabilities, it is a highly competent LLM with a fascinating training background that is worth exploring.
hugchat – HuggingChat Python API | pypi.org
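At its core, the package exposes a small API: construct a `ChatBot` with your exported HuggingChat cookies and call `chat()` with a prompt. Here is a minimal sketch, assuming the package is installed and a valid `cookies.json` exists (both are covered in the next sections):

```python
from hugchat import hugchat

# Assumes cookies.json was exported from a logged-in HuggingChat session
# (see "Setting up the authentication" below).
chatbot = hugchat.ChatBot(cookie_path="cookies.json")
print(chatbot.chat("Explain HuggingChat in one sentence."))
```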
Creating a basic chatbot
In this section, we will create a basic chatbot using the hugchat Python package.
Installing the dependencies
Install the hugchat package using the following commands.
- Create and activate a virtual environment by executing the following command.
```bash
python -m venv venv
source venv/bin/activate   # for Ubuntu
venv/Scripts/activate      # for Windows
```
- Install hugchat package using pip.
```bash
pip install hugchat
```
Setting up the authentication
The HugChat `ChatBot` class requires a `cookie_path` parameter to authenticate with the server. Use the following steps, provided by the package author, to create a `cookies.json` file.
- Install the Cookie-Editor extension for Chrome or Firefox
- Go to HuggingChat and login
- Open the extension
- Click Export on the bottom right, then Export as JSON (this saves your cookies to the clipboard)
- Create a `cookies.json` file and paste the clipboard content into it (a quick sanity check is sketched below).
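Cookie-Editor’s “Export as JSON” produces a JSON array of cookie objects (name, value, domain, and so on). As an optional sanity check before wiring the file into the chatbot, you can confirm it parses; the snippet below is a small helper of our own, not part of the hugchat package:

```python
import json

# Optional check that cookies.json is valid JSON in the expected
# Cookie-Editor format (a list of cookie objects).
with open("cookies.json") as f:
    cookies = json.load(f)

print(f"Loaded {len(cookies)} cookies")
print(sorted(c.get("name", "?") for c in cookies))
```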
Creating the app
Create a Python file `basic.py` and add the following code to it.
```python
from hugchat import hugchat

# Authenticate with HuggingChat using the exported cookies and start a conversation
chatbot = hugchat.ChatBot(cookie_path="cookies.json")
id = chatbot.new_conversation()
chatbot.change_conversation(id)

print('Welcome to chatMATE')
print('\'q\' or \'quit\' to exit')
print('\'c\' or \'change\' to change conversation')
print('\'n\' or \'new\' to start a new conversation')

# Simple command-line chat loop
while True:
    user_input = input('> ')
    if user_input.lower() == '':
        pass
    elif user_input.lower() in ['q', 'quit']:
        break
    elif user_input.lower() in ['c', 'change']:
        print('Choose a conversation to switch to:')
        print(chatbot.get_conversation_list())
    elif user_input.lower() in ['n', 'new']:
        print('Clean slate!')
        id = chatbot.new_conversation()
        chatbot.change_conversation(id)
    else:
        print(chatbot.chat(user_input))
```
Running the app
Execute the following command to run the app.
```bash
python basic.py
```
You will get the output as below.
What is Streamlit
Streamlit is an open-source Python library that simplifies the process of building interactive web applications for data science and machine learning tasks. It provides a user-friendly framework for creating and sharing data-focused applications, without requiring extensive web development knowledge. With Streamlit, developers can quickly prototype and deploy interactive dashboards, visualizations, and data-driven applications.
Streamlit – A faster way to build and share data apps | streamlit.io
- Install streamlit using the following command
```bash
pip install streamlit
```
- Run Streamlit's built-in hello demo using the following command
```bash
streamlit hello
```
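With Streamlit installed, a short, illustrative sketch of the kind of data app it is built for might look like the following (it uses numpy and pandas, which are installed alongside Streamlit; the chart data is just random numbers for demonstration):

```python
import numpy as np
import pandas as pd
import streamlit as st

# A tiny dashboard: a slider that controls how many points the chart shows.
st.title("Streamlit in a nutshell")
points = st.slider("Number of points", 10, 200, 50)
data = pd.DataFrame(np.random.randn(points, 2), columns=["series A", "series B"])
st.line_chart(data)
```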
Creating a basic Streamlit app
In this section, we will create a basic Streamlit app.
Installing the dependencies
Install the streamlit package using the following commands.
- Create and activate a virtual environment by executing the following command.
```bash
python -m venv venv
source venv/bin/activate   # for Ubuntu
venv/Scripts/activate      # for Windows
```
- Install streamlit package using pip.
```bash
pip install streamlit
```
Creating the app
Create a Python file `greeting.py` and add the following code to it.
```python
import streamlit as st

st.title("ChatMATE")
st.write("Your new virtual assistant")

name = st.text_input(label="", placeholder="Enter your name", max_chars=50)

if st.button('Submit'):
    st.write('Hi ' + name + ". How can I assist you today?")
```
For more information on input controls and methods, check out the following Streamlit documentation reference.
Get started – Streamlit Docs | docs.streamlit.io
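As a quick illustration of a few other commonly used input widgets (the widget choices and labels below are our own, just to show the pattern):

```python
import streamlit as st

# A few common Streamlit input widgets beyond st.text_input
language = st.selectbox("Preferred language", ["English", "French", "German"])
experience = st.slider("Years of experience", 0, 30, 3)
subscribe = st.checkbox("Subscribe to updates")
notes = st.text_area("Anything else?", placeholder="Optional notes")

if st.button("Save"):
    st.write(f"{language}, {experience} years, subscribed: {subscribe}")
    if notes:
        st.write("Notes:", notes)
```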
Running the app
Execute the following command to run the app.
```bash
streamlit run greeting.py
```
You will get the output as below.
Building an LLM-powered chatbot
So far, we have discussed the HugChat package and Streamlit. In this section, we will combine them to build a chatbot that generates responses to user input using a free, open-source LLM, OpenAssistant/oasst-sft-6-llama-30b-xor, accessed through HugChat, the unofficial HuggingChat API.
Installing the dependencies
Install the hugchat and streamlit packages using the following commands.
- Create and activate a virtual environment by executing the following command.
```bash
python -m venv venv
source venv/bin/activate   # for Ubuntu
venv/Scripts/activate      # for Windows
```
- Install streamlit, streamlit-chat, streamlit-extras, hugchat packages using pip.
```bash
pip install streamlit streamlit-chat streamlit-extras hugchat
```
- Generate a `cookies.json` file using the steps mentioned in the “Setting up the authentication” section, or reuse the existing `cookies.json` file. Place the `cookies.json` file in the root folder where `app.py` is located.
- Create a file `app.py` and add the following code to it.
```python
import streamlit as st
from streamlit_chat import message
from streamlit_extras.colored_header import colored_header
from streamlit_extras.add_vertical_space import add_vertical_space
from hugchat import hugchat

st.set_page_config(page_title="ChatMATE - Chat Multi-purpose AI Technology")

# Initialize session state for bot_response and user_input
## bot_response stores AI generated responses
if 'bot_response' not in st.session_state:
    st.session_state['bot_response'] = ["I'm ChatMATE, How may I help you?"]
## user_input stores the user's questions
if 'user_input' not in st.session_state:
    st.session_state['user_input'] = ['Hi!']

# Layout of input/response containers
input_container = st.container()
colored_header(label='', description='', color_name='blue-30')
response_container = st.container()

# User input
## Function for taking a user-provided prompt as input
def get_input():
    input_text = st.text_input("You: ", "", key="input")
    return input_text

## Applying the user input box
with input_container:
    user_input = get_input()

# Response output
## Function that takes the user prompt and produces an AI-generated response
def generate_response(prompt):
    chatbot = hugchat.ChatBot(cookie_path="cookies.json")
    response = chatbot.chat(prompt)
    return response

## Conditional display of AI-generated responses as a function of user-provided prompts
with response_container:
    if user_input:
        response = generate_response(user_input)
        st.session_state.user_input.append(user_input)
        st.session_state.bot_response.append(response)

    if st.session_state['bot_response']:
        for i in range(len(st.session_state['bot_response'])):
            message(st.session_state['user_input'][i], is_user=True, key=str(i) + '_user')
            message(st.session_state['bot_response'][i], key=str(i))
```
Understanding the code:
- Import prerequisite Python libraries such as `streamlit` and `hugchat`.
- Name the app using the `page_title` input argument of the `st.set_page_config` method.
- Initialize the chatbot by giving it a starter message, keeping the data in the `bot_response` and `user_input` session states.
- Create a general app layout using `st.container()` as a placeholder, where the `input_container` and `response_container` variables correspond to the user and the chatbot respectively.
- The `get_input()` method takes input from the user using `st.text_input()`.
- The `generate_response(prompt)` method takes the user input as an argument and generates a response through the HuggingChat API via `hugchat.ChatBot()` (an optional caching refinement is sketched after this list).
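One optional refinement, sketched below: `generate_response()` builds a new `hugchat.ChatBot` (re-reading the cookies and re-authenticating) on every prompt. In recent Streamlit versions, the `st.cache_resource` decorator can build the client once and reuse it across reruns; the helper name here is our own addition, not part of the tutorial code:

```python
import streamlit as st
from hugchat import hugchat

@st.cache_resource
def get_chatbot():
    # Create the HugChat client once and reuse it across Streamlit reruns
    # instead of re-authenticating inside generate_response() for every prompt.
    return hugchat.ChatBot(cookie_path="cookies.json")

def generate_response(prompt):
    return get_chatbot().chat(prompt)
```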
Running the app
Execute the following command to run the app.
```bash
streamlit run app.py
```
You will get the output as below.
There you have it! Your first free, open-source, LLM-based chatbot 🙂
Thanks for reading this article.
Thanks Gowri M Bhatt for reviewing the content.
If you enjoyed this article, please click on the heart button and share to help others find it!
The full source code for this tutorial can be found here,
GitHub – codemaker2015/chatMATE: An LLM-powered ChatBot with HugChat and Streamlit | github.com
The article is also available on Medium.
Here are some useful links:
- hugchat – HuggingChat Python API | pypi.org
- Streamlit – A faster way to build and share data apps | streamlit.io
Original article: Build LLM-powered chatbot in 5 minutes using HugChat and Streamlit