Tasks Extractor

This workflow is referenced in the tutorial on building a Local Task Extraction Workflow. It provides a secure environment for processing sensitive documents locally by following this logical sequence:

  1. Connect to Ollama: Establishes a local bridge between KNIME and your Ollama server, ensuring all data remains on your machine.

  2. Read the transcript data: Uses the Tika Parser to automatically extract text from unstructured formats like PDF or DOCX.

  3. Get structured tasks: Employs the LLM Prompter with its "Structured Output" settings, transforming raw text into the output columns defined below.
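
Under the hood, these three steps amount to a single chat-completion call against Ollama's OpenAI-compatible endpoint. A minimal sketch of the request the workflow effectively sends (the endpoint and model come from the workflow annotations; the function name and prompt wording are hypothetical, not KNIME internals):

```python
# Endpoint and model taken from the workflow; everything else is a sketch.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_request(transcript_text: str, model: str = "llama3.2:3b") -> dict:
    """Assemble the chat-completion payload; sending it requires a running Ollama server."""
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Extract every task from this meeting transcript as JSON."},
            {"role": "user", "content": transcript_text},
        ],
        # Ask for JSON-only output; the LLM Prompter additionally enforces
        # its column schema on top of this, which this sketch does not reproduce.
        "response_format": {"type": "json_object"},
    }

payload = build_request("Anna will draft the Q3 report by Friday.")
print(payload["model"])  # → llama3.2:3b
```

Because the URL points at localhost, the transcript never leaves your machine, which is the point of the local setup.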

Open the LLM Prompter and configure the following Output Columns:

  - Project (String, Single): The project this task belongs to.
  - Responsible (String, Single): Person owning the task (can be null).
  - Deadline (String, Single): Due date or time reference (can be null).
  - Stakeholders (String, Single): People or groups involved or impacted.
  - Status (String, Single): Current task state (open, planned, in progress, tentative, confirmed).
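
For comparison, the same five columns can be expressed as a JSON Schema. This is a hypothetical equivalent (the LLM Prompter builds its schema internally); the two nullable columns are modeled here as string-or-null:

```python
# Hypothetical JSON Schema mirroring the Output Columns above;
# KNIME's internal representation may differ.
task_schema = {
    "type": "object",
    "properties": {
        "Project": {"type": "string",
                    "description": "The project this task belongs to."},
        "Responsible": {"type": ["string", "null"],
                        "description": "Person owning the task (can be null)."},
        "Deadline": {"type": ["string", "null"],
                     "description": "Due date or time reference (can be null)."},
        "Stakeholders": {"type": "string",
                         "description": "People or groups involved or impacted."},
        "Status": {"type": "string",
                   "enum": ["open", "planned", "in progress",
                            "tentative", "confirmed"],
                   "description": "Current task state."},
    },
    "required": ["Project", "Stakeholders", "Status"],
}

print(len(task_schema["properties"]))  # → 5
```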

Nodes used in the workflow:

  - Tika Parser (reads transcript.docx)
  - Credentials Configuration (dummy credentials)
  - OpenAI LLM Selector (llama 3.2:3B)
  - OpenAI Authenticator (Ollama local server at http://localhost:11434/v1)
  - LLM Prompter
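
Downstream of the LLM Prompter, the model's JSON reply becomes one table row per task. A minimal sketch of that parsing step (hypothetical helper, not KNIME internals; assumes the reply is either a single task object or a `{"tasks": [...]}` wrapper):

```python
import json

def reply_to_rows(reply: str) -> list:
    """Parse the model's JSON reply into task rows (list of dicts)."""
    data = json.loads(reply)
    # Accept a bare task object, a {"tasks": [...]} wrapper, or a plain list.
    tasks = data.get("tasks", [data]) if isinstance(data, dict) else data
    # Fill missing nullable columns so every row carries the same headers.
    columns = ["Project", "Responsible", "Deadline", "Stakeholders", "Status"]
    return [{c: t.get(c) for c in columns} for t in tasks]

rows = reply_to_rows('{"tasks": [{"Project": "Q3 report", "Status": "open"}]}')
print(rows[0]["Project"])  # → Q3 report
```

Columns the model leaves out (here Responsible, Deadline, Stakeholders) come back as null, matching the "can be null" annotations on the output columns.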
