Class: LLM::Model

Inherits: Object
Defined in: lib/llm/model.rb
Overview
The LLM::Model class provides a normalized view of a provider model record returned by the models API.
Instance Attribute Summary

- #raw ⇒ LLM::Object (readonly)
  The provider-specific model payload.

Instance Method Summary

- #initialize(raw) ⇒ Model (constructor)
  A new instance of Model.
- #id ⇒ String?
  Returns a normalized identifier suitable for API calls.
- #name ⇒ String?
  Returns a display-friendly model name.
- #chat? ⇒ Boolean
  Best-effort predicate for chat support.
- #to_h ⇒ Hash
  Returns a Hash representation of the normalized model.
Constructor Details
#initialize(raw) ⇒ Model
Returns a new instance of Model.
# File 'lib/llm/model.rb', line 14

def initialize(raw)
  @raw = raw
end
Instance Attribute Details
#raw ⇒ LLM::Object (readonly)
The provider-specific model payload.
# File 'lib/llm/model.rb', line 10

def raw
  @raw
end
Instance Method Details
#id ⇒ String?
Returns a normalized identifier suitable for API calls.
# File 'lib/llm/model.rb', line 21

def id
  normalize_id(raw.id || raw.model || raw.name)
end
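The fallback chain above can be sketched with an illustrative payload: some providers return the identifier under `model` rather than `id` (the Google-style payload below is an assumed example, and `normalize_id` is internal to the gem, so it is not reproduced here).

```ruby
require "ostruct"

# Hypothetical raw payload where the identifier lives under "model".
raw = OpenStruct.new(model: "models/gemini-1.5-pro")

# Unset OpenStruct attributes return nil, so the chain falls through.
candidate = raw.id || raw.model || raw.name
# candidate == "models/gemini-1.5-pro"
```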
#name ⇒ String?
Returns a display-friendly model name.
# File 'lib/llm/model.rb', line 28

def name
  raw.display_name || raw.displayName || id
end
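A minimal sketch of that fallback, assuming a camelCase payload in Google's shape; the snake_case `display_name` key is tried first, then `displayName`, then the id (the value used here is illustrative):

```ruby
require "ostruct"

# Payload with only the camelCase field set.
raw = OpenStruct.new(displayName: "Gemini 1.5 Pro")

# display_name is unset (nil), so displayName wins; the id string is
# only reached when neither display field is present.
label = raw.display_name || raw.displayName || "models/gemini-1.5-pro"
# label == "Gemini 1.5 Pro"
```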
#chat? ⇒ Boolean
Best-effort predicate for chat support.
# File 'lib/llm/model.rb', line 35

def chat?
  return true if anthropic?
  return [*(raw.supportedGenerationMethods || [])].include?("generateContent") if google?
  openai_compatible_chat?
end
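The Google branch can be exercised in isolation: chat support is inferred from `supportedGenerationMethods`, and the splat-over-nil idiom guards against the field being absent (payload values below are illustrative, and the lambda is a stand-in for the method body):

```ruby
require "ostruct"

google_raw = OpenStruct.new(supportedGenerationMethods: ["generateContent", "countTokens"])
embed_raw  = OpenStruct.new(supportedGenerationMethods: ["embedContent"])

# [*nil] == [], so a payload with no capability list reads as non-chat
# instead of raising NoMethodError on nil.
supports_chat = ->(raw) { [*(raw.supportedGenerationMethods || [])].include?("generateContent") }
# supports_chat.call(google_raw) == true
# supports_chat.call(embed_raw)  == false
```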
#to_h ⇒ Hash
Returns a Hash representation of the normalized model.
# File 'lib/llm/model.rb', line 44

def to_h
  {id:, name:, chat?: chat?}.compact
end
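The compact-hash pattern above can be shown standalone: `Hash#compact` drops nil values, so fields a sparse provider payload never supplied simply disappear from the serialized form (the values here are illustrative, not from a real payload):

```ruby
id = "gpt-4o-mini"
name = nil

# Shorthand-style keys plus #compact: the nil name entry is removed.
h = {id: id, name: name, chat?: true}.compact
# h == {id: "gpt-4o-mini", chat?: true}
```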