OpenAI

Contents

1. Responses
2. Moderations
3. Vector Stores
4. Headers

Responses

OpenAI’s responses API is an alternative to the standard chat completions API, and it has a number of advantages over it. Perhaps most notably, it maintains message state on OpenAI’s servers by default, and it also provides access to remote tools such as the web search and file search tools.

The following example stores message state on OpenAI’s servers – in turn, a client can avoid maintaining state manually, and it can avoid resending the entire conversation with each turn:

#!/usr/bin/env ruby
require "llm"

llm  = LLM.openai(key: ENV["KEY"])
bot  = LLM::Bot.new(llm)
url  = "https://en.wikipedia.org/wiki/Special:FilePath/Cognac_glass.jpg"

bot.respond "Your task is to answer all user queries", role: :developer
bot.respond ["Tell me about this URL", URI(url)], role: :user
bot.respond ["Tell me about this PDF", File.open("handbook.pdf", "rb")], role: :user
bot.respond "Are the URL and PDF similar to each other?", role: :user

# At this point, we execute a single request
bot.messages.each { print "[#{_1.role}] ", _1.content, "\n" }
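Note that each bot.respond call buffers a message locally – a single request is sent to OpenAI when the conversation is read through bot.messages, which is why the example prints the messages at the end.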

The next example performs a web search with OpenAI’s web search tool – a remote tool that is available through the responses API:

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(key: ENV["KEY"])
res = llm.responses.create("Summarize today's news", tools: [{type: "web_search"}])
print "[assistant] ", res.output_text, "\n"

Moderations

OpenAI’s moderations API offers a service that can determine whether a piece of text or an image URL is harmful, across multiple categories. The interface is similar to the one provided by the official OpenAI Python and JavaScript libraries:

#!/usr/bin/env ruby
require "llm"

# Text
llm = LLM.openai(key: ENV["KEY"])
mod = llm.moderations.create input: "I hate you"
print "categories: ", mod.categories, "\n"
print "category scores: ", mod.scores, "\n"

# Image
mod = llm.moderations.create input: URI("https://example.com/image.png")
print "categories: ", mod.categories, "\n"
print "category scores: ", mod.scores, "\n"

Vector Stores

OpenAI’s Vector Stores API offers a vector database as a managed service. It allows a client to store a set of files, which are automatically indexed and made searchable through vector queries, with the option to apply filters to refine the results:

#!/usr/bin/env ruby
require "llm"

pdfs = ["handbook.pdf"]
llm  = LLM.openai(key: ENV["KEY"])
files = pdfs.map { llm.files.create(file: _1) }
store = llm.vector_stores.create(name: "PDF Store", file_ids: files.map(&:id))
store = llm.vector_stores.poll(vector: store)
puts "Vector store is online"

puts "Search the vector store"
res = llm.vector_stores.search(vector: store, query: "What is FreeBSD?")
chunks = res.flat_map { _1["content"] }
puts "Found #{chunks.size} chunks"
files.each { llm.files.delete(file: _1) }
llm.vector_stores.delete(vector: store)

##
# Vector store is online
# Search the vector store
# Found 10 chunks
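
Building on the example above, a search can also be narrowed with a filter. This is a sketch rather than a definitive interface: it assumes the search method forwards a filters option to the underlying endpoint, and that the stored files carry a hypothetical "category" attribute:

res = llm.vector_stores.search(
  vector: store,
  query: "What is FreeBSD?",
  filters: {type: "eq", key: "category", value: "handbook"}
)
chunks = res.flat_map { _1["content"] }
puts "Found #{chunks.size} chunks"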

Headers

Project, Organization

The LLM::Provider#with method can add client headers to all requests. It works for all providers, but it might be especially useful with OpenAI since it allows the client to set the OpenAI-Organization and OpenAI-Project headers:

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(key: ENV["KEY"])
llm.with(headers: {"OpenAI-Project" => ENV["PROJECT"]})
   .with(headers: {"OpenAI-Organization" => ENV["ORG"]})
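
Since the headers option is a Hash, both headers can also be set in a single call. A short sketch that builds on the responses example from earlier:

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(key: ENV["KEY"])
      .with(headers: {"OpenAI-Project" => ENV["PROJECT"],
                      "OpenAI-Organization" => ENV["ORG"]})
res = llm.responses.create("Hello!")
print "[assistant] ", res.output_text, "\n"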