
Ollama - Chat with your PDF and Llama3

Ollama - Chat with your PDF or Log Files - create and use a local vector store

To keep up with the fast pace of local LLMs, I use generic nodes and Python code to access Ollama and Llama3 - this workflow will run with KNIME 4.7.
The Chroma vector store is persisted in a local SQLite3 database.
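Below is a minimal Python sketch of this step, assuming the py3_knime_llama environment provides langchain-community, chromadb and pypdf; the file name, the chunk sizes and the nomic-embed-text embedding model are placeholders you can swap for your own setup.

```python
# Minimal sketch: build a Chroma vector store from a PDF with Ollama embeddings
# (assumptions: langchain-community, chromadb and pypdf are installed;
#  "my_document.pdf", chunk sizes and "nomic-embed-text" are placeholders)
from langchain_community.document_loaders import PyPDFLoader
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import Chroma
from langchain.text_splitter import RecursiveCharacterTextSplitter

# load the PDF and split it into overlapping chunks
documents = PyPDFLoader("my_document.pdf").load()
chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100
).split_documents(documents)

# embed the chunks with a local Ollama embedding model and persist the store;
# Chroma writes its data into a chroma.sqlite3 file in the persist directory
vector_store = Chroma.from_documents(
    chunks,
    embedding=OllamaEmbeddings(model="nomic-embed-text"),
    persist_directory="./vector_store",
)
```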

To get this to work, you will have to install Ollama and a Python environment with the necessary packages (py3_knime_llama), and download the Llama3 model as well as an embedding model (https://ollama.com/blog/embedding-models).
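A quick way to check that the local setup works is sketched below with the ollama Python package; the package use and model names are assumptions, and the same downloads can be done on the command line with "ollama pull llama3" and "ollama pull nomic-embed-text".

```python
# Sketch of a setup check, assuming the "ollama" Python package is installed
# and the Ollama server is running locally; model names are examples
import ollama

# download the chat model and an embedding model (same as "ollama pull ..." on the CLI)
ollama.pull("llama3")
ollama.pull("nomic-embed-text")

# one quick chat turn with the local Llama3 model
response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
print(response["message"]["content"])
```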

---
Medium: Llama3 and KNIME - Build your local Vector Store from PDFs and other Documents
https://medium.com/p/237eda761c1c

Medium - Chat with local Llama3 Model via Ollama in KNIME Analytics Platform - Also extract Logs into structured JSON Files
https://medium.com/p/aca61e4a690a

---
You can find more examples of how to work with your documents in these Python scripts, which you can then adapt:
https://github.com/ml-score/
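As a rough illustration of such an adaptation, here is a hedged sketch that re-opens the persisted Chroma store and lets the local Llama3 model answer a question over the retrieved chunks; the directory, model names and the question are placeholders.

```python
# Hedged sketch: question answering over the persisted vector store
# (assumes the store was created as in the sketch above; names and paths are placeholders)
import ollama
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import Chroma

embeddings = OllamaEmbeddings(model="nomic-embed-text")
vector_store = Chroma(
    persist_directory="./vector_store", embedding_function=embeddings
)

question = "What is this document about?"
hits = vector_store.similarity_search(question, k=4)  # retrieve the best-matching chunks
context = "\n\n".join(doc.page_content for doc in hits)

answer = ollama.chat(
    model="llama3",
    messages=[{
        "role": "user",
        "content": f"Answer using only this context:\n\n{context}\n\nQuestion: {question}",
    }],
)
print(answer["message"]["content"])
```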

P.S.: Yes, I am aware of the large empty white space, but I have no idea how to remove it in KNIME 4 and have already contacted KNIME support.

URL: Blog: Ask Questions from your CSV with an Open Source LLM, LangChain & a Vector DB https://www.tetranyde.com/blog/langchain-vectordb
URL: Blog: Document Loaders in LangChain https://medium.com/@varsha.rainer/document-loaders-in-langchain-7c2db9851123
URL: Blog: Unleashing Conversational Power: A Guide to Building Dynamic Chat Applications with LangChain, Qdrant, and Ollama (or OpenAI’s GPT-3.5 Turbo) https://medium.com/@ingridwickstevens/langchain-chat-with-your-data-qdrant-ollama-openai-913020ec504b
URL: Medium: Chat with local Llama3 Model via Ollama in KNIME Analytics Platform — Also extract Logs into structured JSON Files https://medium.com/@mlxl/chat-with-local-llama3-model-via-ollama-in-knime-analytics-platform-also-extract-logs-into-aca61e4a690a
URL: KNIME - LLM Workspace on the Hub https://hub.knime.com/mlauber71/spaces/LLM_Space/~17k4zAECNryrZw1X/
URL: Blog: Running models with Ollama step-by-step https://medium.com/@gabrielrodewald/running-models-with-ollama-step-by-step-60b6f6125807
URL: GitHub - Work with Ollama and Llama models https://github.com/ml-score/ollama/
URL: Medium: Llama3 and KNIME - Build your local Vector Store from PDFs and other Documents https://medium.com/p/237eda761c1c
URL: Chroma Vector Store https://github.com/chroma-core/chroma
URL: Ollama - Embedding models https://ollama.com/blog/embedding-models
