How to build a Chatbot in Rust with Large Language Models on Cloud GPU


Introduction

Large language models (LLMs) are artificial intelligence models that process natural language input and generate human-like output. LLMs power applications such as chatbots, virtual assistants, and customer service tools. Rust is a statically and strongly typed programming language that focuses on performance and safety, which makes it a good choice for developing chatbots and other LLM-powered applications.

This article explains how to build a chatbot in Rust with Large Language Models on a Rcs Cloud GPU server. You will use the Leptos web framework to build a web application in Rust, then integrate a Large Language Model to generate the chatbot's responses.

Prerequisites

Before you begin:

Set Up the Development Server

  1. Update the server.

    console
    $ sudo apt-get update
    
  2. Install build-essential and libssl-dev dependency packages.

    console
    $ sudo apt-get install -y build-essential libssl-dev
    
  3. Install the latest Rust toolchain and Cargo packages.

    console
    $ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
    

    When prompted for your desired installation type, enter 1 to select the Proceed with installation (default) option.

  4. Activate Rust in your server session.

    console
    $ source "$HOME/.cargo/env"
    
  5. View the installed Rust and Cargo versions.

    console
    $ rustc --version && cargo --version
    
  6. Install the cargo-leptos package using cargo.

    console
    $ cargo install cargo-leptos@0.2.4
    
  7. Create a new project rust-chatbot using Cargo and the start-axum-workspace template.

    console
    $ cargo leptos new --git https://github.com/quanhua92/start-axum-workspace --name rust-chatbot
    
  8. Switch to the rust-chatbot directory.

    console
    $ cd rust-chatbot
    
  9. Install the wasm32 target using rustup. Leptos compiles the frontend application to WebAssembly, which requires this target.

    console
    $ rustup target add wasm32-unknown-unknown
    

Build the Rust Chatbot Application

  1. Edit the Cargo.toml file in the app directory using a text editor such as Nano.

    console
    $ nano app/Cargo.toml
    
  2. Add the serde crate to the [dependencies] section of the file.

    toml
    [dependencies]
    serde = { version = "1.0.188", features = ["derive"] }
    
  3. Create a models sub-directory in the app/src directory.

    console
    $ mkdir -p app/src/models
    
  4. Create a new file conversation.rs in the app/src/models directory to define the conversation data structures.

    console
    $ nano app/src/models/conversation.rs
    
  5. Add the following code to the file.

    rust
    use serde::Deserialize;
    use serde::Serialize;
    
    #[derive(Serialize, Deserialize, Clone, Debug)]
    pub struct Message {
        pub text: String,
        pub sender: String,
    }
    
    #[derive(Serialize, Deserialize, Clone, Debug)]
    pub struct Conversation {
        pub messages: Vec<Message>,
    }
    
    impl Conversation {
        pub fn new() -> Self {
            Self {
                messages: Vec::new(),
            }
        }
    }
    

    Save and close the file.

  6. Create a new mod.rs in the app/src/models directory.

    console
    $ nano app/src/models/mod.rs
    
  7. Add the following code to the file.

    rust
    pub mod conversation;
    pub use conversation::{Conversation, Message};
    

    The above code declares the conversation module and re-exports its Conversation and Message structs, which manage the chat data.
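To see how these structs behave on their own, here is a minimal, dependency-free sketch. The serde derives are omitted so the snippet compiles without external crates; in the application they enable serialization for the server function.

```rust
// Simplified stand-ins for the app's Message and Conversation structs.
// The serde derives are omitted so this compiles with no external crates.
#[derive(Clone, Debug)]
pub struct Message {
    pub text: String,
    pub sender: String,
}

#[derive(Clone, Debug)]
pub struct Conversation {
    pub messages: Vec<Message>,
}

impl Conversation {
    pub fn new() -> Self {
        Self { messages: Vec::new() }
    }
}

fn main() {
    let mut conversation = Conversation::new();
    conversation.messages.push(Message {
        text: "Hello".to_string(),
        sender: "User".to_string(),
    });
    // Each message records who sent it alongside the text.
    println!("{} message(s)", conversation.messages.len());
}
```

A Conversation is just an append-only list of Message values; the server function you add later pushes an AI reply onto the same Vec.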

Create the Conversation Area

  1. Create a components sub-directory in the app/src directory.

    console
    $ mkdir -p app/src/components
    
  2. Create a new file conversation_area.rs in the app/src/components directory.

    console
    $ nano app/src/components/conversation_area.rs
    
  3. Add the following code to the file.

    rust
    use crate::models::Conversation;
    use leptos::html::Div;
    use leptos::logging::log;
    use leptos::*;
    
    #[component]
    pub fn ConversationArea(conversation: ReadSignal<Conversation>) -> impl IntoView {
        let div_ref = create_node_ref::<Div>();
    
        create_effect(move |_| {
            let c = conversation.get();
            log!("ConversationArea: {:?}", c);
            if let Some(div) = div_ref.get() {
                request_animation_frame(move || {
                    div.set_scroll_top(div.scroll_height());
                });
            }
        });
    
        view! {
            <div class="conversation-area" node_ref=div_ref>
                { move || conversation.get().messages.iter().map(move |message| {
                    view! {
                        <div class="message">
                            <span class="message-sender">{message.sender.clone()}</span>
                            <p class="message-text">{message.text.clone()}</p>
                        </div>
                    }
                })
                .collect::<Vec<_>>()
                }
    
            </div>
        }
    }
    

    Save and close the file.

    This creates a component named ConversationArea that displays all messages in the conversation and scrolls to the newest message whenever the conversation updates.

Create the Application Input Area

  1. Create a new file input_area.rs in the app/src/components directory.

    console
    $ nano app/src/components/input_area.rs
    
  2. Add the following code to the file.

    rust
    use crate::models::Conversation;
    use leptos::html::Input;
    use leptos::*;
    
    #[component]
    pub fn InputArea(submit: Action<String, Result<Conversation, ServerFnError>>) -> impl IntoView {
        let text_ref = create_node_ref::<Input>();
        view! {
            <form class="input-area" on:submit=move |ev| {
                ev.prevent_default();
                let input = text_ref.get().expect("input exists");
                let user_input = input.value();
                let user_input = user_input.trim();
                if !user_input.is_empty() {
                    submit.dispatch(user_input.to_string());
                    input.set_value("");
                }
            }>
                <input type="text" class="input-area-text" placeholder="Enter a prompt here" node_ref=text_ref/>
                <input type="submit" class="input-area-button" value="Send"/>
            </form>
        }
    }
    

    Save and close the file.

    The above code creates a new component InputArea that displays a form with a text input field and a submit button. When the form is submitted with a non-empty prompt, the handler dispatches the submit action to pass the prompt to the parent component, then clears the input field.
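The validation in the submit handler, trimming the raw input and ignoring prompts that are empty afterwards, can be expressed as a standalone function. The clean_prompt helper below is a hypothetical illustration, not part of the tutorial code:

```rust
// Hypothetical helper mirroring the InputArea submit handler:
// trim the raw input and drop prompts that are empty after trimming.
fn clean_prompt(raw: &str) -> Option<String> {
    let trimmed = raw.trim();
    if trimmed.is_empty() {
        None
    } else {
        Some(trimmed.to_string())
    }
}

fn main() {
    // Whitespace-only input is rejected; everything else is trimmed.
    assert_eq!(clean_prompt("  hello  "), Some("hello".to_string()));
    assert_eq!(clean_prompt("   "), None);
    println!("prompt validation ok");
}
```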

  3. Create a new mod.rs file in the app/src/components/ directory to export the components.

    console
    $ nano app/src/components/mod.rs
    
  4. Add the following code to the file.

    rust
    pub mod conversation_area;
    pub mod input_area;
    
    pub use conversation_area::ConversationArea;
    pub use input_area::InputArea;
    

    Save and close the file.

Apply CSS Styling to the Application Interface

  1. Back up the original style/main.scss file.

    console
    $ mv style/main.scss style/main.scss.ORIG
    
  2. Create a new style/main.scss file to include the application style elements.

    console
    $ nano style/main.scss
    
  3. Add the following CSS code to the file.

    css
    body {
      font-family: sans-serif;
      text-align: center;
      margin: 0;
      padding: 0;
    }
    
    .chat-area {
      display: flex;
      flex-direction: column;
      height: 100vh;
      justify-content: space-between;
    }
    
    .conversation-area {
      overflow: auto;
      display: flex;
      flex-direction: column;
      padding: 0.25rem;
    }
    
    .conversation-area > .message {
      display: flex;
      align-items: center;
      gap: 0.5rem;
      border-bottom: 1px solid hsl(0, 0%, 0%, 10%);
    }
    
    .conversation-area > .message > .message-sender {
      min-width: 40px;
      height: 40px;
      border-radius: 20px;
      background-color: hsl(0, 0%, 0%, 10%);
      display: flex;
      align-items: center;
      justify-content: center;
      font-size: 0.7em;
    }
    
    .input-area {
      display: flex;
      justify-content: space-between;
      gap: 0.5rem;
      padding: 0.25rem;
    }
    
    .input-area-text {
      flex-grow: 1;
      min-height: 2em;
    }
    

    Save and close the file.

Create the Application Server Function

In this section, create a server function to handle the chat conversation. A Leptos server function runs on the backend but can be called from frontend code: when the frontend invokes it, Leptos sends a fetch request to the server, serializing the arguments and deserializing the return value from the response.

  1. Create a new file api.rs in the app/src directory.

    console
    $ nano app/src/api.rs
    
  2. Add the following code to the file.

    rust
    use crate::models::Conversation;
    use crate::models::Message;
    use leptos::logging::log;
    use leptos::*;
    
    #[server(ProcessConversation, "/api")]
    pub async fn process_conversation(
        conversation: Conversation,
    ) -> Result<Conversation, ServerFnError> {
        log!("process_conversation {:?}", conversation);
        let mut conversation = conversation;
    
        conversation.messages.push(Message {
            text: "Response from AI".to_string(),
            sender: "AI".to_string(),
        });
        Ok(conversation)
    }
    

    Save and close the file.

    The above code creates a server function process_conversation that appends a placeholder message Response from AI to the conversation. A later section replaces this placeholder with output from a language model.
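Stripped of the #[server] plumbing, the function's logic is a plain append-and-return. Here is a standalone sketch, with the structs repeated minus their serde derives so the snippet compiles alone:

```rust
#[derive(Clone, Debug)]
struct Message {
    text: String,
    sender: String,
}

#[derive(Clone, Debug)]
struct Conversation {
    messages: Vec<Message>,
}

// Mirrors the server function: take the conversation, append a
// placeholder AI reply, and hand the updated conversation back.
fn process_conversation(mut conversation: Conversation) -> Conversation {
    conversation.messages.push(Message {
        text: "Response from AI".to_string(),
        sender: "AI".to_string(),
    });
    conversation
}

fn main() {
    let c = process_conversation(Conversation { messages: Vec::new() });
    println!("last sender: {}", c.messages.last().unwrap().sender);
}
```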

  3. Back up the original lib.rs file in the app/src directory.

    console
    $ mv app/src/lib.rs app/src/lib.ORIG
    
  4. Create the lib.rs file.

    console
    $ nano app/src/lib.rs
    
  5. Add the following contents to the file.

    rust
    use leptos::*;
    use leptos_meta::*;
    use leptos_router::*;
    
    pub mod api;
    pub mod components;
    pub mod error_template;
    pub mod models;
    
    use crate::api::process_conversation;
    use crate::components::{ConversationArea, InputArea};
    use crate::models::{Conversation, Message};
    
    #[component]
    pub fn App() -> impl IntoView {
        // Provides context that manages stylesheets, titles, meta tags, etc.
        provide_meta_context();
    
        view! {
            // injects a stylesheet into the document <head>
            // id=leptos means cargo-leptos will hot-reload this stylesheet
            <Stylesheet id="leptos" href="/pkg/start-axum-workspace.css"/>
    
            // sets the document title
            <Title text="Welcome to Rust Chatbot"/>
    
            // content for this welcome page
            <Router>
                <main>
                    <Routes>
                        <Route path="" view=|| view! { <HomePage/> }/>
                    </Routes>
                </main>
            </Router>
        }
    }
    
    /// Renders the home page of your application.
    #[component]
    fn HomePage() -> impl IntoView {
        // Creates a reactive value to update the button
        let (conversation, set_conversation) = create_signal(Conversation::new());
        let send_message = create_action(move |input: &String| {
            let message = Message {
                text: input.clone(),
                sender: "User".to_string(),
            };
            set_conversation.update(move |c| {
                c.messages.push(message);
            });
    
            process_conversation(conversation.get())
        });
    
        create_effect(move |_| {
            if let Some(_) = send_message.input().get() {
                set_conversation.update(move |c| {
                    c.messages.push(Message {
                        text: "...".to_string(),
                        sender: "AI".to_string(),
                    });
                });
            }
        });
    
        create_effect(move |_| {
            if let Some(Ok(response)) = send_message.value().get() {
                set_conversation.set(response);
            }
        });
    
        view! {
            <div class="chat-area">
                <ConversationArea conversation />
                <InputArea submit=send_message />
            </div>
        }
    }
    

    Save and close the file.

    In the above code, the web application renders the HomePage component, which contains the ConversationArea and InputArea. The send_message action calls the process_conversation server function, while the effects first insert a temporary ... placeholder and then replace the conversation with the server's response.
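The update sequence in HomePage can be followed without any Leptos machinery: push the user's message, show a ... placeholder, then overwrite the conversation with the server's response. Below is a plain-Rust sketch of that sequence, with reactive signals replaced by a mutable Vec of (sender, text) pairs:

```rust
// Each entry is (sender, text); reactive signals are replaced by a Vec.
type Chat = Vec<(String, String)>;

// Step 1: record the user's message immediately (optimistic update).
fn push_user(chat: &mut Chat, text: &str) {
    chat.push(("User".to_string(), text.to_string()));
}

// Step 2: show a placeholder while the server call is in flight.
fn push_placeholder(chat: &mut Chat) {
    chat.push(("AI".to_string(), "...".to_string()));
}

// Step 3: the server's response replaces the whole conversation,
// mirroring set_conversation.set(response) in the effect.
fn apply_response(chat: &mut Chat, response: Chat) {
    *chat = response;
}

fn main() {
    let mut chat: Chat = Vec::new();
    push_user(&mut chat, "Hello");
    push_placeholder(&mut chat);
    assert_eq!(chat.last().unwrap().1, "...");

    // Simulated server reply: the user message plus a real AI answer.
    let response = vec![
        ("User".to_string(), "Hello".to_string()),
        ("AI".to_string(), "Hi there!".to_string()),
    ];
    apply_response(&mut chat, response);
    println!("final AI text: {}", chat.last().unwrap().1);
}
```

Replacing the whole conversation with the server's copy keeps the client and server views consistent, at the cost of resending the full message history on each prompt.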

  6. Build the application using cargo.

    console
    $ cargo leptos build
    

    Verify that the build completes without errors.

  7. By default, UFW is active on Rcs Ubuntu servers. To enable the application interface, allow the HTTP server port 3000 through the firewall.

    console
    $ sudo ufw allow 3000
    
  8. Allow the web socket port 3001.

    console
    $ sudo ufw allow 3001
    
  9. Reload the firewall rules to apply the changes.

    console
    $ sudo ufw reload
    
  10. Run the application using cargo to test the application interface.

    console
    $ LEPTOS_SITE_ADDR=0.0.0.0:3000 cargo leptos watch
    
  11. Visit your Server IP on port 3000 to access the application.

    http://<SERVER-IP>:3000

    The Rust Chatbot Application Interface

  12. In your server terminal session, press Ctrl + C on your keyboard to stop the running application process.

Add a Language Model to the Application

To generate real responses, integrate a pre-trained Large Language Model (LLM) that produces replies within the application. The model loads o

