Module: LLM
- Defined in:
- lib/llm.rb,
lib/llm/bot.rb,
lib/llm/error.rb,
lib/llm/buffer.rb,
lib/llm/client.rb,
lib/llm/message.rb,
lib/llm/version.rb,
lib/llm/response.rb,
lib/llm/eventhandler.rb,
lib/llm/providers/xai.rb,
lib/llm/providers/zai.rb,
lib/llm/providers/gemini.rb,
lib/llm/providers/ollama.rb,
lib/llm/providers/openai.rb,
lib/llm/providers/deepseek.rb,
lib/llm/providers/llamacpp.rb,
lib/llm/providers/anthropic.rb
Defined Under Namespace
Modules: Client Classes: Anthropic, Bot, Buffer, DeepSeek, Error, File, Function, Gemini, LlamaCpp, Message, Object, Ollama, OpenAI, Provider, Response, ResponseError, Schema, ServerTool, Tool, XAI, ZAI
Constant Summary

- RateLimitError = Class.new(ResponseError)
  HTTPTooManyRequests
- ServerError = Class.new(ResponseError)
  HTTPServerError
- NoImageError = Class.new(ResponseError)
  When no images are found in a response
- FormatError = Class.new(Error)
  When given an input object that is not understood
- PromptError = Class.new(FormatError)
  When given a prompt object that is not understood
- VERSION = "1.0.0"
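The Class.new values above are Ruby's idiom for declaring an error hierarchy: each constant is assigned an anonymous subclass. A stdlib-only sketch of the same idiom (the class names here are illustrative, not the gem's):

```ruby
# Build an error hierarchy with anonymous subclasses, as the
# constants above do. StandardError is the conventional root.
AppError     = Class.new(StandardError)
ResponseErr  = Class.new(AppError)
RateLimitErr = Class.new(ResponseErr)

# A rescue for the root class catches every subclass:
begin
  raise RateLimitErr, "429 Too Many Requests"
rescue AppError => e
  puts e.message
end
```

Because the hierarchy is plain inheritance, callers can rescue broadly (the root) or narrowly (one leaf) as needed.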
Class Method Summary

- .File(obj) ⇒ LLM::File
- .gemini ⇒ Gemini
  A new instance of Gemini.
- .ollama(key: nil) ⇒ Ollama
  A new instance of Ollama.
- .llamacpp(key: nil) ⇒ LLM::LlamaCpp
- .deepseek ⇒ LLM::DeepSeek
- .openai ⇒ OpenAI
  A new instance of OpenAI.
- .xai ⇒ XAI
  A new instance of XAI.
- .zai ⇒ ZAI
  A new instance of ZAI.
- .function(key, &b) ⇒ LLM::Function
  Define a function.
- .lock(name) ⇒ void
  Provides a thread-safe lock.
- .anthropic ⇒ Anthropic
  A new instance of Anthropic.
Class Method Details
.File(obj) ⇒ LLM::File
# File 'lib/llm/file.rb', line 82

def LLM.File(obj)
  case obj
  when File
    obj.close unless obj.closed?
    LLM.File(obj.path)
  when LLM::File, LLM::Response then obj
  when String then LLM::File.new(obj)
  else raise TypeError, "don't know how to handle #{obj.class} objects"
  end
end
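The method above is a coercion function: a case/when funnels several input types into one canonical class. A self-contained sketch of that shape (the names are illustrative; LLM::File itself does more):

```ruby
CanonicalFile = Struct.new(:path)

# Coerce a String path into the canonical class, pass an existing
# instance through untouched, and reject anything else loudly,
# mirroring the TypeError in the method above.
def to_canonical(obj)
  case obj
  when CanonicalFile then obj
  when String then CanonicalFile.new(obj)
  else raise TypeError, "don't know how to handle #{obj.class} objects"
  end
end

file = to_canonical("report.pdf")
```

The pass-through branch makes the coercion idempotent, so callers can apply it unconditionally at API boundaries.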
.gemini ⇒ Gemini
Returns a new instance of Gemini.
# File 'lib/llm.rb', line 41

def gemini(**)
  lock(:require) { require_relative "llm/providers/gemini" unless defined?(LLM::Gemini) }
  LLM::Gemini.new(**)
end
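Each constructor below follows the same pattern as .gemini above: the require_relative runs inside lock(:require) so that two threads constructing providers concurrently cannot race the first load. A stdlib-only sketch of that lazy-require-under-a-lock pattern (the constant and method names are illustrative, not the gem's):

```ruby
require "monitor"

# One shared lock serializing feature loading across threads.
REQUIRE_LOCK = Monitor.new

# Load a feature at most once, safely from any thread. Monitor is
# reentrant, so a nested call from the same thread will not deadlock.
def load_once(feature)
  REQUIRE_LOCK.synchronize { require feature }
end

# Four threads racing the same require; "set" stands in for a provider file.
threads = 4.times.map { Thread.new { load_once("set") } }
threads.each(&:join)
```

Ruby's require is already idempotent; the lock's job is to keep partially-loaded files from being observed mid-require.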
.ollama(key: nil) ⇒ Ollama
Returns a new instance of Ollama.
# File 'lib/llm.rb', line 49

def ollama(key: nil, **)
  lock(:require) { require_relative "llm/providers/ollama" unless defined?(LLM::Ollama) }
  LLM::Ollama.new(key:, **)
end
.llamacpp(key: nil) ⇒ LLM::LlamaCpp
# File 'lib/llm.rb', line 57

def llamacpp(key: nil, **)
  lock(:require) { require_relative "llm/providers/llamacpp" unless defined?(LLM::LlamaCpp) }
  LLM::LlamaCpp.new(key:, **)
end
.deepseek ⇒ LLM::DeepSeek
# File 'lib/llm.rb', line 65

def deepseek(**)
  lock(:require) { require_relative "llm/providers/deepseek" unless defined?(LLM::DeepSeek) }
  LLM::DeepSeek.new(**)
end
.openai ⇒ OpenAI
Returns a new instance of OpenAI.
# File 'lib/llm.rb', line 73

def openai(**)
  lock(:require) { require_relative "llm/providers/openai" unless defined?(LLM::OpenAI) }
  LLM::OpenAI.new(**)
end
.xai ⇒ XAI
Returns a new instance of XAI.
# File 'lib/llm.rb', line 82

def xai(**)
  lock(:require) { require_relative "llm/providers/xai" unless defined?(LLM::XAI) }
  LLM::XAI.new(**)
end
.zai ⇒ ZAI
Returns a new instance of ZAI.
# File 'lib/llm.rb', line 91

def zai(**)
  lock(:require) { require_relative "llm/providers/zai" unless defined?(LLM::ZAI) }
  LLM::ZAI.new(**)
end
.function(key, &b) ⇒ LLM::Function
Define a function
# File 'lib/llm.rb', line 111

def function(key, &b)
  LLM::Function.new(key, &b)
end
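LLM.function pairs a name with a block. A minimal sketch of that shape (this Struct stands in for LLM::Function, which carries more state than shown here):

```ruby
# A named callable: the key identifies the function, the block is its body.
NamedFunction = Struct.new(:name, :block) do
  def call(*args) = block.call(*args)
end

def function(key, &b) = NamedFunction.new(key, b)

double = function(:double) { |n| n * 2 }
double.call(3)  # => 6
```

Capturing the block as data (rather than yielding it immediately) is what lets a function be defined in one place and invoked later, e.g. when a model requests a tool call.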
.lock(name) ⇒ void
This method returns an undefined value.
Provides a thread-safe lock
# File 'lib/llm.rb', line 120

def lock(name, &) = @monitors[name].synchronize(&)
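The @monitors instance variable behind lock is presumably a map from a name to a Monitor. A stdlib sketch of such a named-lock registry, using a Hash default proc to create monitors on demand (an assumption about the internals, not the gem's exact code):

```ruby
require "monitor"

# One Monitor per name, created lazily. Note that Hash default procs
# are not themselves thread-safe, so a real implementation would
# pre-register names or guard creation; this sketch keeps it simple.
monitors = Hash.new { |h, k| h[k] = Monitor.new }

# Run a block while holding the lock registered under `name`.
def with_lock(monitors, name, &blk) = monitors[name].synchronize(&blk)

result = with_lock(monitors, :require) { 21 * 2 }  # => 42
```

Keying locks by name lets unrelated critical sections (e.g. :require vs. some other resource) proceed in parallel instead of contending on one global mutex.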