1 LLM API Library

procedure

(prompt! strs ...)  (or/c void? string?)

  strs : string?
Writes the values strs to current-prompt-port, then sends the accumulated prompt to the current LLM backend via current-send-prompt!. If current-prompt-port is empty, no prompt is sent. The result is the string returned by the LLM, or (void) if no prompt is sent.

Examples:
> (require llm llm/ollama/phi3)
> (display
   (prompt!
    "Please write a haiku about the reliability and performance of Phi3 for use in software engineering."
    "Make it a short haiku."))

Phi3, swiftly reliable,

Engineering trust built solid—

Code flows like truth's river.
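
The no-prompt case described above can be seen with a fresh, empty prompt port; this sketch relies only on the documented behavior that prompt! returns (void) when nothing has been written:

> (parameterize ([current-prompt-port (open-output-string)])
    (prompt!))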

parameter

(current-prompt-port)  string-port?

(current-prompt-port port)  void?
  port : string-port?
A parameter holding the port to which the prompt is written before it is sent to the current backend. The default value is a new output string port.
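
Because the default is an output string port, the buffered prompt can be inspected before anything is sent; this sketch uses only standard Racket port operations and does not contact a backend:

> (parameterize ([current-prompt-port (open-output-string)])
    (write-string "Draft a commit message. " (current-prompt-port))
    (get-output-string (current-prompt-port)))

"Draft a commit message. "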

parameter

(current-send-prompt!)  (-> string? ... void?)

(current-send-prompt! prompt!)  void?
  prompt! : (-> string? ... void?)
A parameter that defines how to send a prompt to the current backend. It is typically configured by requiring a backend module, such as llm/ollama/phi3, rather than set manually.
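
For testing without a live backend, the parameter can in principle be replaced by a stub; this is a sketch that assumes only the documented (-> string? ... void?) contract, and the exact strings prompt! passes to the stub are not specified here:

> (define (discarding-send-prompt! . strs)
    (printf "Would send: ~a\n" (apply string-append strs)))
> (parameterize ([current-send-prompt! discarding-send-prompt!])
    (prompt! "Say hi."))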

parameter

current-response-timeout
A parameter that defines how many seconds to wait for a response from the LLM after sending a prompt.
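
For example, assuming the parameter accepts a number of seconds, a slow local model can be given more time to respond:

> (parameterize ([current-response-timeout 300])
    (prompt! "Explain tail calls in one short paragraph."))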

value

llm-lang-logger : logger?
A logger? that reports debug and cost information about llm.
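
Debug and cost messages can be observed with Racket's standard log receivers; this sketch relies only on llm-lang-logger satisfying logger?, and sync/timeout returns #f if no message has been logged yet:

> (define llm-events (make-log-receiver llm-lang-logger 'debug))
> (sync/timeout 0 llm-events)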