
Solution - KNIME AI Learnathon - Build a Chat Bot on a Knowledge Base

Learn how to build your own AI-powered chatbot without writing a line of code.

During this event we will run a hands-on session in which you will get familiar with KNIME Analytics Platform v5.1 and learn how to use the new KNIME AI Extension (Labs). The event will be held in English.


D) [OPTIONAL] Deploy Chat Bot Data App to KNIME Business Hub

(continues from the steps outside the component)

Step 1: Connect to KNIME Business Hub

Save the workflow (optionally, you can delete the vector store creation part) and close it.
Then, in the KNIME Analytics Platform explorer, connect to KNIME Business Hub with the credentials given to you during the AI Learnathon.

Step 2: Upload this Data App Workflow

Upload the workflow to your personal space in the "Learnathon" team. You can do this by right-clicking the workflow in the explorer. Make sure to check the "Reset workflow" box.

Step 3: Open the Data App in your Web Browser

Log in from the web browser, locate the workflow in your space, and execute it. The data app should open in a new tab.

Step 4: Version and Deploy the Data App to Other Users

You can version the workflow with the "History" button on the workflow page. Then you can deploy it as a data app.
Finally, you can add users who, upon authentication, can find the data app and use it from their web browsers.

B) Use an LLM for a Completion Task via the Knowledge Base

Step 1: Authenticate with OpenAI and select an LLM model

Similarly to the previous exercise, use the OpenAI Authenticator with the Credentials Configuration flow variable input. Then add an OpenAI LLM Connector and select "gpt-3.5-turbo".

Step 2: LLM Hyperparameter Tuning

Inside the OpenAI LLM Connector, customize the "max tokens" and "temperature" parameters.
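Although the node exposes these as configuration fields, it helps to understand what "temperature" actually does to the model's output distribution. Below is a minimal, self-contained Python sketch (not part of the workflow, and not OpenAI's actual implementation) showing how temperature rescales logits before they are turned into sampling probabilities: low values make the top choice dominate, high values flatten the distribution.

```python
import math

def softmax_with_temperature(logits, temperature):
    """Turn raw logits into probabilities, rescaled by temperature.

    Lower temperature -> sharper, more deterministic distribution;
    higher temperature -> flatter, more varied sampling.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]                     # illustrative token scores
cool = softmax_with_temperature(logits, 0.2) # near-deterministic
warm = softmax_with_temperature(logits, 2.0) # flatter, more random
```

With temperature 0.2 the first token takes almost all the probability mass; with 2.0 the three options are much closer, which is why higher temperatures produce more creative but less predictable answers. "Max tokens" simply caps the length of the generated answer.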

Step 3: Adopt the Knowledge Base from the Vector Store

The Model Reader node reads the vector store you previously created from the workflow data area. Reading the vector store from disk again is good practice for learning how to deploy with a fixed knowledge base. Alternatively, you can use the output of the FAISS Vector Store Creator from the previous exercise.

Step 4: Connect to the Vector Store Retriever and the LLM Prompter

Drag in the Vector Store Retriever node and the LLM Prompter node;
in between, add a String Manipulation node for prompt engineering.
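The String Manipulation node's job here is prompt engineering: it wraps the retrieved sentences and the user's question into a single grounded prompt for the LLM Prompter. A minimal Python sketch of that string assembly is below; the template wording is purely illustrative, not the one used in the workflow.

```python
def build_prompt(context_sentences, question):
    """Assemble a grounded prompt from retrieved sentences and a question.

    Hypothetical template: the real workflow builds an equivalent string
    with a String Manipulation expression.
    """
    context = "\n".join(f"- {s}" for s in context_sentences)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\n"
        "Answer:"
    )

prompt = build_prompt(
    ["The reset button is on the back panel."],
    "Where is the reset button?",
)
```

Instructing the model to answer "using only the context" is what keeps the completions anchored to the knowledge base rather than the model's general training data.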

Step 5: Export Results to an Excel File

You can save the table via an Excel Writer node. Optionally, you can use a Table View node to compare the answers given by the LLM with the reference answers we imported together with the questions.

A) Create Knowledge Base

Step 1: Authenticate with OpenAI

Search for, then drag and drop the OpenAI Authenticator node, and add the flow variable connection from the Credentials Configuration node. Configure the node by selecting "credentials", then execute.

Step 2: Select embedding model to create vectors

Search for, then drag and drop the OpenAI Embeddings Connector node and execute it with default settings. This selects the embedding model "text-embedding-ada-002" to create vectors from the input sentences.

Step 3: Divide PDF text into sentences

Search for, then drag and drop the Sentence Extractor node and execute it on the "Document" column from the "PDF Parser" node. This splits the document cell into multiple rows: one row per sentence. Then use a Row Filter node to remove all sentences with fewer than 5 terms.
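Conceptually, these two nodes perform a split followed by a length filter. The sketch below is a naive stand-in (the Sentence Extractor uses a proper NLP tokenizer, and the split rule here is a hypothetical simplification), but it shows the document-to-rows transformation and the "fewer than 5 terms" filter.

```python
import re

def split_sentences(document):
    """Naive sentence split on '.', '!' or '?' followed by whitespace.

    Illustrative only: the KNIME Sentence Extractor uses a real
    sentence tokenizer, not this regex.
    """
    parts = re.split(r"(?<=[.!?])\s+", document.strip())
    return [p for p in parts if p]

def keep_long_sentences(sentences, min_terms=5):
    """Mimic the Row Filter: drop sentences with fewer than min_terms words."""
    return [s for s in sentences if len(s.split()) >= min_terms]

doc = "Short one. This sentence has more than five terms in it."
kept = keep_long_sentences(split_sentences(doc))
```

Filtering out very short fragments matters because tiny sentences ("Short one.") carry almost no meaning for the embedding model and would only add noise to the vector store.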

Step 4: Create the vector store

Search for, then drag and drop the FAISS Vector Store Creator node, connect it to the OpenAI Embeddings Connector and to the output containing your sentence strings. Execute the node on the string column to create the vector store.
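Under the hood, a vector store pairs each sentence with its embedding vector and answers queries by similarity search. The toy Python class below is a deliberately simplified stand-in: it uses a bag-of-words "embedding" and cosine similarity in place of OpenAI's dense text-embedding-ada-002 vectors and FAISS's optimized index, but the store-then-retrieve mechanics are the same idea.

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words embedding (word -> count). Stand-in for a real
    dense embedding model such as text-embedding-ada-002."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b.get(w, 0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class ToyVectorStore:
    """Minimal in-memory analogue of a FAISS index: store (vector, text)
    pairs and return the most similar entries for a query."""
    def __init__(self, sentences):
        self.entries = [(embed(s), s) for s in sentences]

    def retrieve(self, query, k=1):
        qv = embed(query)
        ranked = sorted(self.entries,
                        key=lambda e: cosine(qv, e[0]),
                        reverse=True)
        return [text for _, text in ranked[:k]]

store = ToyVectorStore([
    "The printer supports duplex printing.",
    "Charge the battery for two hours.",
])
```

The Vector Store Retriever used later in the exercise plays the role of `retrieve()`: embed the question, then return the k nearest sentences from the store.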

Step 5: Save the vector store

Save the vector store by adding a Model Writer node. To save it properly, use a relative path: select "Relative to", then "Workflow Data Area", and specify a name for the vector store such as "vector_store.model".
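The Model Writer / Model Reader pair is essentially object serialization to a path relative to the workflow. A rough Python analogue (hypothetical, using `pickle` and a temp directory rather than KNIME's own model format and workflow data area) looks like this:

```python
import os
import pickle
import tempfile

# Hypothetical stand-in for the vector store object being persisted.
store = {"sentences": ["Press the power button for three seconds."]}

# Illustrative path; the workflow uses "Relative to" -> "Workflow Data Area".
path = os.path.join(tempfile.gettempdir(), "vector_store.model")

with open(path, "wb") as f:      # Model Writer analogue: serialize to disk
    pickle.dump(store, f)

with open(path, "rb") as f:      # Model Reader analogue: load it back
    reloaded = pickle.load(f)
```

Using a relative path is what makes the saved store travel with the workflow when you upload it to KNIME Business Hub, so the deployed data app can find its knowledge base.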

C) Create Chat Bot Data App on Knowledge Base

Step 1: Authenticate with OpenAI and select a chat model

Similarly to the previous exercise, use the OpenAI Authenticator with the Credentials Configuration flow variable input. Then add an OpenAI Chat Model Connector and select "gpt-3.5-turbo".

Step 2: Chat Model Hyperparameter Tuning

Inside the OpenAI Chat Model Connector, customize the "max tokens" and "temperature" parameters.

Step 3: Adopt the Knowledge Base from the Vector Store

Add a Model Reader node to read the vector store you previously created from the workflow data area. Reading the vector store from disk again is good practice for learning how to deploy a data app with a fixed knowledge base.

Step 4: Connect to the input of the Component

Connect the Chat Model Connector output to the top input of the component and the Model Reader output to the bottom input.

Then open the component (Ctrl + Double Left Click on it) for the next steps.
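What distinguishes the chat data app from the one-shot completion task in section B is that it threads a running message history through every request, so the model can use earlier turns as context. The sketch below illustrates that mechanism with a stubbed model function (`fake_chat_model` is a made-up placeholder; in the workflow the OpenAI chat model handles the actual call).

```python
def fake_chat_model(messages):
    """Stand-in for the OpenAI chat model: just reports which user turn
    it is answering. Purely illustrative."""
    turns = sum(1 for m in messages if m["role"] == "user")
    return f"(answer to user turn {turns})"

class ChatSession:
    """Keep the running message history a chat data app carries between
    turns, in the role/content format used by chat-style LLM APIs."""
    def __init__(self, system_prompt):
        self.messages = [{"role": "system", "content": system_prompt}]

    def ask(self, question):
        self.messages.append({"role": "user", "content": question})
        reply = fake_chat_model(self.messages)
        self.messages.append({"role": "assistant", "content": reply})
        return reply

session = ChatSession("Answer only from the manual's knowledge base.")
first = session.ask("How do I reset the device?")
second = session.ask("And how long does that take?")
```

Because the whole history is sent each turn, follow-up questions like "And how long does that take?" can be resolved against the earlier exchange; the system prompt pins the bot to the knowledge base.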
