Class: LLM::Gemini
- Defined in:
- lib/llm/providers/gemini.rb,
lib/llm/providers/gemini/audio.rb,
lib/llm/providers/gemini/files.rb,
lib/llm/providers/gemini/format.rb,
lib/llm/providers/gemini/images.rb,
lib/llm/providers/gemini/models.rb,
lib/llm/providers/gemini/error_handler.rb,
lib/llm/providers/gemini/stream_parser.rb
Overview
The Gemini class implements a provider for Gemini. The Gemini provider can accept multiple input modalities (text, images, audio, and video). Inputs can be provided inline via the prompt for files under 20MB, or through the Gemini Files API for files over 20MB.
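As a rough sketch of the 20MB cutoff described above (the constant and helper below are illustrative, not part of the gem):

```ruby
# Illustrative only: the documented 20MB inline limit, as a helper that
# decides how an input file should reach the Gemini API.
INLINE_LIMIT = 20 * 1024 * 1024

def delivery_for(bytesize)
  bytesize <= INLINE_LIMIT ? :inline_prompt : :files_api
end

delivery_for(5 * 1024 * 1024)   # small file: send inline with the prompt
delivery_for(50 * 1024 * 1024)  # large file: upload via the Files API first
```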
Defined Under Namespace
Modules: Response
Classes: Audio, Files, Images, Models
Constant Summary
- HOST = "generativelanguage.googleapis.com"
Instance Method Summary
- #default_model ⇒ String
  Returns the default model for chat completions.
- #embed(input, model: "text-embedding-004", **params) ⇒ LLM::Response
  Provides an embedding.
- #complete(prompt, params = {}) ⇒ LLM::Response
  Provides an interface to the chat completions API.
- #audio ⇒ LLM::Gemini::Audio
  Provides an interface to Gemini’s audio API.
- #images ⇒ LLM::Gemini::Images
  Provides an interface to Gemini’s image generation API.
- #files ⇒ LLM::Gemini::Files
  Provides an interface to Gemini’s file management API.
- #models ⇒ LLM::Gemini::Models
  Provides an interface to Gemini’s models API.
- #assistant_role ⇒ String
  Returns the role of the assistant in the conversation.
- #initialize ⇒ Gemini (constructor)
  A new instance of Gemini.
Methods inherited from Provider
#chat, #chat!, #inspect, #moderations, #respond, #respond!, #responses, #schema, #vector_stores, #with
Instance Method Details
#default_model ⇒ String
Returns the default model for chat completions
# File 'lib/llm/providers/gemini.rb', line 124
def default_model
  "gemini-2.5-flash"
end
#embed(input, model: "text-embedding-004", **params) ⇒ LLM::Response
Provides an embedding
# File 'lib/llm/providers/gemini.rb', line 48
def embed(input, model: "text-embedding-004", **params)
  model = model.respond_to?(:id) ? model.id : model
  path = ["/v1beta/models/#{model}", "embedContent?key=#{@key}"].join(":")
  req = Net::HTTP::Post.new(path, headers)
  req.body = JSON.dump({content: {parts: [{text: input}]}})
  res = execute(request: req)
  LLM::Response.new(res).extend(LLM::Gemini::Response::Embedding)
end
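Following the source above, the embedding request is a POST whose path joins the model name and the action with a colon. A minimal sketch of that path and body construction, using a placeholder key:

```ruby
require "json"

key   = "demo-key"  # placeholder, not a real credential
model = "text-embedding-004"

# The path format Gemini uses: /v1beta/models/<model>:<action>?key=<key>
path = ["/v1beta/models/#{model}", "embedContent?key=#{key}"].join(":")
body = JSON.dump({content: {parts: [{text: "Hello, world"}]}})

path  # => "/v1beta/models/text-embedding-004:embedContent?key=demo-key"
```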
#complete(prompt, params = {}) ⇒ LLM::Response
Provides an interface to the chat completions API
# File 'lib/llm/providers/gemini.rb', line 67
def complete(prompt, params = {})
  params = {role: :user, model: default_model}.merge!(params)
  params = [params, format_schema(params), format_tools(params)].inject({}, &:merge!).compact
  role, model, stream = [:role, :model, :stream].map { params.delete(_1) }
  action = stream ? "streamGenerateContent?key=#{@key}&alt=sse" : "generateContent?key=#{@key}"
  model = model.respond_to?(:id) ? model.id : model
  path = ["/v1beta/models/#{model}", action].join(":")
  req = Net::HTTP::Post.new(path, headers)
  messages = [*(params.delete(:messages) || []), LLM::Message.new(role, prompt)]
  body = JSON.dump({contents: format(messages)}.merge!(params))
  set_body_stream(req, StringIO.new(body))
  res = execute(request: req, stream:)
  LLM::Response.new(res).extend(LLM::Gemini::Response::Completion)
end
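Note how the source above picks a different action for streaming (`alt=sse`) versus one-shot completions. A standalone sketch of that branch, with a placeholder key:

```ruby
# Illustrative: the action segment differs for streaming vs. one-shot
# completions; the model and action are joined into the path with a colon.
key   = "demo-key"  # placeholder, not a real credential
model = "gemini-2.5-flash"

action = ->(stream) do
  stream ? "streamGenerateContent?key=#{key}&alt=sse" : "generateContent?key=#{key}"
end

sse_path  = ["/v1beta/models/#{model}", action.(true)].join(":")
once_path = ["/v1beta/models/#{model}", action.(false)].join(":")

sse_path   # => "/v1beta/models/gemini-2.5-flash:streamGenerateContent?key=demo-key&alt=sse"
once_path  # => "/v1beta/models/gemini-2.5-flash:generateContent?key=demo-key"
```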
#audio ⇒ LLM::Gemini::Audio
Provides an interface to Gemini’s audio API
# File 'lib/llm/providers/gemini.rb', line 86
def audio
  LLM::Gemini::Audio.new(self)
end
#images ⇒ LLM::Gemini::Images
Provides an interface to Gemini’s image generation API
# File 'lib/llm/providers/gemini.rb', line 94
def images
  LLM::Gemini::Images.new(self)
end
#files ⇒ LLM::Gemini::Files
Provides an interface to Gemini’s file management API
# File 'lib/llm/providers/gemini.rb', line 102
def files
  LLM::Gemini::Files.new(self)
end
#models ⇒ LLM::Gemini::Models
Provides an interface to Gemini’s models API
# File 'lib/llm/providers/gemini.rb', line 110
def models
  LLM::Gemini::Models.new(self)
end
#assistant_role ⇒ String
Returns the role of the assistant in the conversation. Usually “assistant” or “model”.
# File 'lib/llm/providers/gemini.rb', line 116
def assistant_role
  "model"
end