Module: LLM
- Defined in:
- lib/llm.rb,
lib/llm/bot.rb,
lib/llm/error.rb,
lib/llm/buffer.rb,
lib/llm/client.rb,
lib/llm/message.rb,
lib/llm/version.rb,
lib/llm/response.rb,
lib/llm/eventhandler.rb,
lib/llm/providers/xai.rb,
lib/llm/providers/gemini.rb,
lib/llm/providers/ollama.rb,
lib/llm/providers/openai.rb,
lib/llm/providers/deepseek.rb,
lib/llm/providers/llamacpp.rb,
lib/llm/providers/anthropic.rb
Defined Under Namespace
Modules: Client
Classes: Anthropic, Bot, Buffer, DeepSeek, Error, File, Function, Gemini, LlamaCpp, Message, Object, Ollama, OpenAI, Provider, Response, ResponseError, Schema, Tool, XAI
Constant Summary

- RateLimitError = Class.new(ResponseError)
  HTTPTooManyRequests
- ServerError = Class.new(ResponseError)
  HTTPServerError
- FormatError = Class.new(Error)
  When given an input object that is not understood
- PromptError = Class.new(FormatError)
  When given a prompt object that is not understood
- VERSION = "0.15.0"
Class Method Summary

- .File(obj) ⇒ LLM::File
- .gemini ⇒ Gemini
  A new instance of Gemini.
- .ollama(key: nil) ⇒ Ollama
  A new instance of Ollama.
- .llamacpp(key: nil) ⇒ LLM::LlamaCpp
- .deepseek ⇒ LLM::DeepSeek
- .openai ⇒ OpenAI
  A new instance of OpenAI.
- .xai ⇒ XAI
  A new instance of XAI.
- .function(name, &b) ⇒ LLM::Function
  Define or get a function.
- .functions ⇒ Hash<String,LLM::Function>
  Returns all known functions.
- .anthropic ⇒ Anthropic
  A new instance of Anthropic.
Class Method Details
.File(obj) ⇒ LLM::File
# File 'lib/llm/file.rb', line 82

def LLM.File(obj)
  case obj
  when File
    obj.close unless obj.closed?
    LLM.File(obj.path)
  when LLM::File, LLM::Response then obj
  when String then LLM::File.new(obj)
  else raise TypeError, "don't know how to handle #{obj.class} objects"
  end
end
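A quick usage sketch of the coercion above (the path is hypothetical): a String is wrapped in an LLM::File, while an existing LLM::File passes through unchanged.

require "llm"

file = LLM.File("docs/diagram.png")  # hypothetical path: String -> LLM::File
LLM.File(file).equal?(file)          #=> true, LLM::File objects are returned as-is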
.gemini ⇒ Gemini
Returns a new instance of Gemini.
# File 'lib/llm.rb', line 36

def gemini(**)
  require_relative "llm/providers/gemini" unless defined?(LLM::Gemini)
  LLM::Gemini.new(**)
end
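A minimal construction sketch. It assumes the keyword splat forwards a key: option to LLM::Gemini.new, and the environment variable name is an assumption. The other hosted providers (.openai, .deepseek, .xai) follow the same pattern.

require "llm"

llm = LLM.gemini(key: ENV["GEMINI_API_KEY"])  # key: assumed to be forwarded to LLM::Gemini.new
llm.class #=> LLM::Gemini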
.ollama(key: nil) ⇒ Ollama
Returns a new instance of Ollama.
# File 'lib/llm.rb', line 44

def ollama(key: nil, **)
  require_relative "llm/providers/ollama" unless defined?(LLM::Ollama)
  LLM::Ollama.new(key:, **)
end
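Because key: defaults to nil, a local Ollama server can be used without credentials. This sketch assumes an Ollama instance is already running locally; .llamacpp behaves analogously.

require "llm"

llm = LLM.ollama   # key: nil by default; assumes a locally running Ollama server
llm.class #=> LLM::Ollama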
.llamacpp(key: nil) ⇒ LLM::LlamaCpp
# File 'lib/llm.rb', line 52

def llamacpp(key: nil, **)
  require_relative "llm/providers/llamacpp" unless defined?(LLM::LlamaCpp)
  LLM::LlamaCpp.new(key:, **)
end
.deepseek ⇒ LLM::DeepSeek
# File 'lib/llm.rb', line 60

def deepseek(**)
  require_relative "llm/providers/deepseek" unless defined?(LLM::DeepSeek)
  LLM::DeepSeek.new(**)
end
.openai ⇒ OpenAI
Returns a new instance of OpenAI.
# File 'lib/llm.rb', line 68

def openai(**)
  require_relative "llm/providers/openai" unless defined?(LLM::OpenAI)
  LLM::OpenAI.new(**)
end
.xai ⇒ XAI
Returns a new instance of XAI.
# File 'lib/llm.rb', line 77

def xai(**)
  require_relative "llm/providers/xai" unless defined?(LLM::XAI)
  LLM::XAI.new(**)
end
.function(name, &b) ⇒ LLM::Function
Define or get a function
# File 'lib/llm.rb', line 97

def function(name, &b)
  if block_given?
    functions[name.to_s] = LLM::Function.new(name, &b)
  else
    functions[name.to_s]
  end
end
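A sketch of the define-or-get behaviour: calling with a block registers (and returns) an LLM::Function under the stringified name, while calling without a block looks it up. What LLM::Function itself does with the block is outside this excerpt, so the block below is a placeholder.

require "llm"

fn = LLM.function(:greet) { |_fn| }  # placeholder block: registers and returns an LLM::Function
LLM.function(:greet).equal?(fn)      #=> true, no block means lookup
LLM.function("greet").equal?(fn)     #=> true, names are stringified via name.to_s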
.functions ⇒ Hash<String,LLM::Function>
Returns all known functions
# File 'lib/llm.rb', line 108

def functions
  @functions ||= {}
end
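The registry is a plain Hash keyed by stringified function names, so it can be inspected directly (continuing the hypothetical :greet function from the previous sketch).

LLM.functions.keys      #=> ["greet"]
LLM.functions["greet"]  #=> the LLM::Function registered above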