Transcribes audio into the input language.
The model to use for transcription: gpt-4o-transcribe, gpt-4o-mini-transcribe, or whisper-1 (which is powered by OpenAI's open source Whisper V2 model).

The language of the input audio. Supplying the input language in ISO-639-1 format (e.g. en) will improve accuracy and latency.

The format of the transcript output: json, text, srt, verbose_json, or vtt. For gpt-4o-transcribe and gpt-4o-mini-transcribe, the only supported format is json.
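As a rough sketch of how these options map onto the OpenAI transcription endpoint that this node wraps, a minimal request with the official openai Python SDK could look like the following; the file name audio.mp3 and the API key handling are placeholders, not part of this page:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Transcribe a local audio file; language and response_format are optional.
with open("audio.mp3", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="gpt-4o-mini-transcribe",
        file=audio_file,
        language="en",            # ISO-639-1 hint improves accuracy and latency
        response_format="json",   # the only supported format for the gpt-4o models
    )

print(transcript.text)
```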
If set to true, the model response data will be streamed to the client as it is generated using server-sent events. See the Streaming section of the Speech-to-Text guide for more information. Note: Streaming is not supported for the whisper-1 model and will be ignored.
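For illustration only, streaming with the Python SDK might look like the sketch below; the event type names are an assumption based on OpenAI's streaming transcription events and are not taken from this page:

```python
from openai import OpenAI

client = OpenAI()

with open("audio.mp3", "rb") as audio_file:
    stream = client.audio.transcriptions.create(
        model="gpt-4o-transcribe",
        file=audio_file,
        response_format="json",
        stream=True,  # ignored for whisper-1, which does not support streaming
    )
    # Assumed event shapes: delta events carry incremental text,
    # and a final "done" event carries the complete transcript.
    for event in stream:
        if event.type == "transcript.text.delta":
            print(event.delta, end="", flush=True)
        elif event.type == "transcript.text.done":
            print()
```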
"auto"
, the server first normalizes loudness and then uses voice activity detection (VAD) to choose boundaries. server_vad
object can be provided to tweak VAD detection parameters manually. If unset, the audio is transcribed as a single block.response_format
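A hedged sketch of how this chunking option could be passed through the Python SDK; the "auto" value and the server_vad object are taken from the description above, but the exact fields of such an object are not documented here and are therefore left out:

```python
from openai import OpenAI

client = OpenAI()

with open("long_recording.mp3", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="gpt-4o-mini-transcribe",
        file=audio_file,
        # "auto": the server normalizes loudness and uses VAD to pick boundaries.
        # A server_vad object can be passed instead to tune VAD detection manually;
        # omitting the argument transcribes the audio as a single block.
        chunking_strategy="auto",
    )

print(transcript.text)
```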
response_format must be set to verbose_json to use timestamp granularities. Either or both of these options are supported: word or segment. Note: There is no additional latency for segment timestamps, but generating word timestamps incurs additional latency.
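For example, requesting segment-level timestamps with whisper-1 might look like this sketch (it assumes the verbose_json response exposes a segments list with start, end, and text fields):

```python
from openai import OpenAI

client = OpenAI()

with open("audio.mp3", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
        response_format="verbose_json",       # required for timestamp granularities
        timestamp_granularities=["segment"],  # adding "word" incurs extra latency
    )

# Print each segment with its start/end time in seconds.
for segment in transcript.segments:
    print(f"{segment.start:7.2f} - {segment.end:7.2f}  {segment.text}")
```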
logprobs will return the log probabilities of the tokens in the response to understand the model's confidence in the transcription. logprobs only works with response_format set to json and only with the models gpt-4o-transcribe and gpt-4o-mini-transcribe.
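A sketch of requesting token log probabilities via the include array with the Python SDK; the exact shape of the returned logprobs entries (token and logprob fields) is an assumption, not something stated on this page:

```python
from openai import OpenAI

client = OpenAI()

with open("audio.mp3", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="gpt-4o-transcribe",
        file=audio_file,
        response_format="json",   # logprobs only works with json
        include=["logprobs"],
    )

# Assumed response shape: a list of tokens with their log probabilities.
for entry in transcript.logprobs:
    print(entry.token, entry.logprob)
```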
Specify how the response should be mapped to the table output. The following formats are available:
Structured Table: Returns a parsed table with data split into rows and columns. Log probabilities are only included for gpt-4o-transcribe and gpt-4o-mini-transcribe if logprobs is added to the include array.

Raw Response: Returns the raw response in a single row with the following columns:
To use this node in KNIME, install the OpenAI Nodes extension from the update site below, following our NodePit Product and Node Installation Guide:
A zipped version of the software site can be downloaded here.