
local_llm_try_on_01

<p><strong>Overview</strong></p><p>This workflow demonstrates how to run Large Language Models (LLMs) locally using <strong>Ollama</strong> within KNIME Analytics Platform. It bypasses cloud-based API calls by pointing the <strong>OpenAI Authenticator</strong> at a local Ollama server, ensuring data privacy and zero API costs.</p><p></p><p><strong>Prerequisites</strong></p><ol><li><p><strong>Ollama Installed</strong>: Download from https://ollama.com/download</p></li><li><p><strong>Model Downloaded</strong>: Run `ollama pull gpt-oss:20b` (or your preferred model) in your terminal.</p></li><li><p><strong>Server Running</strong>: Ensure the server is running by executing `ollama serve`.</p><p></p></li></ol><p><strong>Workflow Steps</strong></p><ol><li><p><strong>Credentials Configuration</strong>:</p><ul><li><p>Provides "dummy" credentials.</p></li><li><p><strong>Username</strong>: dummy | <strong>Password</strong>: any_text (Ollama requires a key in the expected format but doesn't validate its content).</p></li></ul></li><li><p><strong>OpenAI Authenticator</strong>:</p><ul><li><p>Set to use the credentials flow variable.</p></li><li><p><strong>Advanced Settings</strong>: The "OpenAI base URL" is redirected to your local instance: http://localhost:11434/v1.</p></li></ul></li><li><p><strong>OpenAI LLM Selector</strong>:</p><ul><li><p>Connects to the Authenticator.</p></li><li><p>Enter your local model ID (e.g., gpt-oss:20b) manually if it doesn't appear in the dropdown.</p></li></ul></li><li><p><strong>Table Creator (Prompt + Query)</strong>:</p><ul><li><p>Defines your input data. The first column (e.g., column1) should contain your instructions or questions.</p></li></ul></li><li><p><strong>LLM Prompter</strong>:</p><ul><li><p>Receives the model connection and the input table.</p></li><li><p>Select column1 as the <strong>Prompt column</strong> and define your <strong>Response column</strong> name.</p></li><li><p>Execute to receive the AI-generated output locally.</p></li></ul></li></ol>
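The redirection the workflow performs can be sketched outside KNIME with plain Python (standard library only). This is a minimal illustration, not part of the workflow: it builds the request that Ollama's OpenAI-compatible `/v1/chat/completions` endpoint expects, using the same base URL and a dummy key, since Ollama requires the Authorization header to be present but ignores its value. The model name and prompt are placeholders.

```python
import json
import urllib.request

# Default local Ollama endpoint, matching the base URL set in the
# OpenAI Authenticator's advanced settings.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str, api_key: str = "any_text"):
    """Build a POST request for Ollama's OpenAI-compatible chat endpoint.

    Ollama requires a Bearer token in the Authorization header but does
    not validate it, so any non-empty string works (hence the dummy
    credentials in the workflow).
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# With Ollama running (`ollama serve`), the request can be sent like this:
# with urllib.request.urlopen(build_chat_request("gpt-oss:20b", "Hello!")) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

This mirrors what the Authenticator + LLM Selector pair does internally: same endpoint shape as OpenAI's API, just a different base URL.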
Set dummy creds
Credentials Configuration
Point to Ollama server: http://localhost:11434/v1
OpenAI Authenticator
Select local model: gpt-oss:20b
OpenAI LLM Selector
Response
LLM Prompter
Prompt + Query
Table Creator
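The Table Creator + LLM Prompter pair above amounts to a per-row loop: read the prompt from `column1`, call the model, write the answer to a new `Response` column. A hypothetical sketch of that pattern (the `complete` callable stands in for the actual Ollama call; column names follow the workflow):

```python
def prompt_rows(rows, complete):
    """Mimic the LLM Prompter node: for each input row, send the value of
    the prompt column ("column1") to the model via `complete(prompt)` and
    store the answer in a new "Response" column."""
    return [{**row, "Response": complete(row["column1"])} for row in rows]

# Example with a stub model that just echoes the prompt:
table = [{"column1": "What is KNIME?"}, {"column1": "Summarize Ollama."}]
result = prompt_rows(table, lambda p: f"(model answer to: {p})")
```

In the real workflow, `complete` would be the HTTP call to the local Ollama server; the table structure is what KNIME passes between the nodes.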
