Class: LLM::Anthropic
- Defined in:
- lib/llm/providers/anthropic.rb,
lib/llm/providers/anthropic/format.rb,
lib/llm/providers/anthropic/error_handler.rb,
lib/llm/providers/anthropic/response_parser.rb
Overview
The Anthropic class implements a provider for the Anthropic API
Constant Summary
- HOST =
"api.anthropic.com"
Instance Method Summary
-
#initialize(secret) ⇒ Anthropic
constructor
A new instance of Anthropic.
-
#embed(input, token:, model: "voyage-2", **params) ⇒ LLM::Response::Embedding
Provides an embedding via VoyageAI per Anthropic’s recommendation.
-
#complete(prompt, role = :user, model: "claude-3-5-sonnet-20240620", max_tokens: 1024, **params) ⇒ LLM::Response::Completion
Provides an interface to the chat completions API.
-
#assistant_role ⇒ String
Returns the role of the assistant in the conversation.
-
#models ⇒ Hash<String, LLM::Model>
Returns a hash of available models.
Methods inherited from Provider
#audio, #chat, #chat!, #files, #images, #inspect, #respond, #respond!, #responses
Constructor Details
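The constructor takes the Anthropic API secret as its only positional argument. A minimal construction sketch, assuming the gem is required as "llm" and the key is read from an ANTHROPIC_SECRET environment variable (both assumptions, not part of the documented API):

require "llm"

llm = LLM::Anthropic.new(ENV["ANTHROPIC_SECRET"])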
Instance Method Details
#embed(input, token:, model: "voyage-2", **params) ⇒ LLM::Response::Embedding
Provides an embedding via VoyageAI per Anthropic’s recommendation
# File 'lib/llm/providers/anthropic.rb', line 33

def embed(input, token:, model: "voyage-2", **params)
  # Delegate to the VoyageAI provider, per Anthropic's recommendation
  llm = LLM.voyageai(token)
  llm.embed(input, **params.merge(model:))
end
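A usage sketch, assuming the provider is constructed with LLM::Anthropic.new and that the VoyageAI key is kept in a VOYAGEAI_SECRET environment variable (both names are illustrative). Note that token: is the VoyageAI key, since the call is delegated to the VoyageAI provider:

llm = LLM::Anthropic.new(ENV["ANTHROPIC_SECRET"])
res = llm.embed("The quick brown fox", token: ENV["VOYAGEAI_SECRET"])
# res is an LLM::Response::Embedding returned via the VoyageAI provider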
#complete(prompt, role = :user, model: "claude-3-5-sonnet-20240620", max_tokens: 1024, **params) ⇒ LLM::Response::Completion
Provides an interface to the chat completions API
# File 'lib/llm/providers/anthropic.rb', line 49

def complete(prompt, role = :user, model: "claude-3-5-sonnet-20240620", max_tokens: 1024, **params)
  params = {max_tokens:, model:}.merge!(params)
  req = Net::HTTP::Post.new("/v1/messages", headers)
  messages = [*(params.delete(:messages) || []), Message.new(role, prompt)]
  req.body = JSON.dump({messages: format(messages)}.merge!(params))
  res = request(@http, req)
  Response::Completion.new(res).extend(response_parser)
end
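A usage sketch, assuming the same construction as above; only the parameters documented in the signature are used:

llm = LLM::Anthropic.new(ENV["ANTHROPIC_SECRET"])

# Single-turn completion with the default model and max_tokens
res = llm.complete("Write a haiku about the sea", :user)

# Overriding the documented keyword parameters
res = llm.complete("Summarize this in one sentence", :user,
                   model: "claude-3-5-sonnet-20240620", max_tokens: 256)
# res is an LLM::Response::Completion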
#assistant_role ⇒ String
Returns the role of the assistant in the conversation. Usually “assistant” or “model”
# File 'lib/llm/providers/anthropic.rb', line 60

def assistant_role
  "assistant"
end
#models ⇒ Hash<String, LLM::Model>
Returns a hash of available models
# File 'lib/llm/providers/anthropic.rb', line 66

def models
  @models ||= load_models!("anthropic")
end
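Since the return value is a plain Hash keyed by model name, ordinary Hash methods apply. A small sketch (the attributes of LLM::Model are not documented here, so only the keys are printed):

llm = LLM::Anthropic.new(ENV["ANTHROPIC_SECRET"])
llm.models.each_key { |name| puts name }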