Class: LLM::Anthropic

Inherits:
Provider
Defined in:
lib/llm/providers/anthropic.rb,
lib/llm/providers/anthropic/format.rb,
lib/llm/providers/anthropic/models.rb,
lib/llm/providers/anthropic/error_handler.rb,
lib/llm/providers/anthropic/response_parser.rb

Overview

The Anthropic class implements a provider for the Anthropic API

Defined Under Namespace

Classes: Models

Constant Summary

HOST =
"api.anthropic.com"

Instance Method Summary

Methods inherited from Provider

#audio, #chat, #chat!, #files, #images, #inspect, #respond, #respond!, #responses, #schema, #with

Constructor Details

#initialize ⇒ Anthropic

Returns a new instance of Anthropic.

Parameters:

  • key (String, nil)

    The secret key for authentication



# File 'lib/llm/providers/anthropic.rb', line 19

def initialize(**)
  super(host: HOST, **)
end
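The constructor above pins the host and forwards everything else to the superclass. A minimal sketch in plain Ruby (using stand-in `ProviderSketch`/`AnthropicSketch` classes, not the gem's actual `Provider`, whose internals are not shown here) of what that wiring accomplishes:

```ruby
# Stand-in for the Provider superclass: stores the key and host that
# later HTTP requests are built against.
class ProviderSketch
  attr_reader :host, :key

  def initialize(key: nil, host:)
    @key = key
    @host = host
  end
end

# Stand-in for LLM::Anthropic: the subclass supplies the host, the
# caller supplies the key.
class AnthropicSketch < ProviderSketch
  HOST = "api.anthropic.com"

  def initialize(**opts)
    super(host: HOST, **opts)
  end
end

llm = AnthropicSketch.new(key: "sk-test")
puts llm.host # => "api.anthropic.com"
```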

Instance Method Details

#embed(input, key:, model: "voyage-2", **params) ⇒ LLM::Response::Embedding

Provides an embedding via VoyageAI per Anthropic’s recommendation

Parameters:

  • key (String)

    Valid key for the VoyageAI API

  • model (String) (defaults to: "voyage-2")

    The embedding model to use

  • params (Hash)

    Other embedding parameters

  • input (String, Array<String>)

    The input to embed

Returns:

  • (LLM::Response::Embedding)

Raises:



# File 'lib/llm/providers/anthropic.rb', line 35

def embed(input, key:, model: "voyage-2", **params)
  llm = LLM.voyageai(key:)
  llm.embed(input, **params.merge(model:))
end
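Note the merge order in the body above: `params.merge(model:)` lets the keyword argument win, so a `:model` key inside `params` cannot override the `model:` parameter. A plain-Ruby sketch (no gem required; the `:truncation` key is an illustrative extra parameter, not necessarily one VoyageAI accepts):

```ruby
# Hash#merge gives precedence to the argument's keys, so the model:
# keyword ("voyage-2" by default) overrides any :model key that a
# caller placed into the params hash.
params = {truncation: true, model: "stale-model"}
model  = "voyage-2"
merged = params.merge(model: model)
# merged => {truncation: true, model: "voyage-2"}
```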

#complete(prompt, params = {}) ⇒ LLM::Response::Completion

Provides an interface to the chat completions API

Examples:

llm = LLM.anthropic(key: ENV["KEY"])
messages = [{role: "system", content: "Your task is to answer all of my questions"}]
res = llm.complete("5 + 2 ?", messages:)
print "[#{res.choices[0].role}]", res.choices[0].content, "\n"

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation. Any parameter the provider supports may be included, not only those listed here.

Returns:

  • (LLM::Response::Completion)

Raises:

See Also:



# File 'lib/llm/providers/anthropic.rb', line 50

def complete(prompt, params = {})
  params = {role: :user, model: default_model, max_tokens: 1024}.merge!(params)
  params = [params, format_tools(params)].inject({}, &:merge!).compact
  role = params.delete(:role)
  req = Net::HTTP::Post.new("/v1/messages", headers)
  messages = [*(params.delete(:messages) || []), Message.new(role, prompt)]
  body = JSON.dump({messages: [format(messages)].flatten}.merge!(params))
  set_body_stream(req, StringIO.new(body))
  res = request(@http, req)
  Response::Completion.new(res).extend(response_parser)
end
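The method above folds the defaults, tool definitions, and caller params into a single hash, extracts the role, and serializes the rest as the POST body for `/v1/messages`. A sketch of the resulting body shape using only the Ruby standard library (the exact keys beyond `model`, `max_tokens`, and `messages` depend on what the caller passes):

```ruby
require "json"

# Approximate request body that #complete builds for a single user
# prompt with the defaults shown above.
body = {
  model: "claude-3-5-sonnet-20240620",
  max_tokens: 1024,
  messages: [{role: "user", content: "5 + 2 ?"}]
}
puts JSON.dump(body)
```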

#models ⇒ LLM::Anthropic::Models

Provides an interface to Anthropic’s models API



# File 'lib/llm/providers/anthropic.rb', line 66

def models
  LLM::Anthropic::Models.new(self)
end

#assistant_role ⇒ String

Returns the role of the assistant in the conversation. Usually “assistant” or “model”

Returns:

  • (String)

    Returns the role of the assistant in the conversation. Usually “assistant” or “model”



# File 'lib/llm/providers/anthropic.rb', line 72

def assistant_role
  "assistant"
end

#default_model ⇒ String

Returns the default model for chat completions

Returns:

  • (String)

See Also:



# File 'lib/llm/providers/anthropic.rb', line 80

def default_model
  "claude-3-5-sonnet-20240620"
end