Class: LLM::LlamaCpp

Inherits:
OpenAI
Defined in:
lib/llm/providers/llamacpp.rb

Overview

The LlamaCpp class implements a provider for llama.cpp through the OpenAI-compatible API exposed by the llama-server binary. Like the Ollama provider, it supports a wide range of models and is straightforward to run on your own hardware.

Examples:

#!/usr/bin/env ruby
require "llm"

llm = LLM.llamacpp(key: nil)
ses = LLM::Session.new(llm)
ses.talk ["Tell me about this photo", ses.local_file("/images/photo.png")]
ses.messages.select(&:assistant?).each { print "[#{_1.role}]", _1.content, "\n" }

Constant Summary

Constants inherited from OpenAI

OpenAI::HOST

Instance Method Summary

Methods inherited from OpenAI

#assistant_role, #complete, #embed, #models, #server_tools, #web_search

Methods inherited from Provider

#assistant_role, #chat, clients, #complete, #developer_role, #embed, #inspect, #models, #persist!, #respond, #schema, #server_tool, #server_tools, #system_role, #tool_role, #tracer, #tracer=, #user_role, #web_search, #with

Constructor Details

#initialize(host: "localhost", port: 8080, ssl: false) ⇒ LLM::LlamaCpp

Parameters:

  • key (String, nil)

    The secret key for authentication

  • host (String) (defaults to: "localhost")

    The host address of the LLM provider

  • port (Integer) (defaults to: 8080)

    The port number

  • timeout (Integer)

    The number of seconds to wait for a response

  • ssl (Boolean) (defaults to: false)

    Whether to use SSL for the connection

  • persistent (Boolean)

    Whether to use a persistent connection. Requires the net-http-persistent gem.



# File 'lib/llm/providers/llamacpp.rb', line 26

def initialize(host: "localhost", port: 8080, ssl: false, **)
  super
end
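The constructor forwards all keywords to the OpenAI superclass via `super`. As a rough sketch of how the `host`, `port`, and `ssl` keywords from the signature above would typically compose into the server's base URL (the `build_endpoint` helper below is hypothetical and not part of the gem):

```ruby
# Hypothetical helper, not part of llm.rb: illustrates how the host/port/ssl
# keyword defaults from the signature above combine into a base URL.
def build_endpoint(host: "localhost", port: 8080, ssl: false)
  scheme = ssl ? "https" : "http"
  "#{scheme}://#{host}:#{port}"
end

puts build_endpoint                                         # => http://localhost:8080
puts build_endpoint(host: "gpu-box", port: 443, ssl: true)  # => https://gpu-box:443
```

With the defaults, the provider targets a local llama-server on port 8080 over plain HTTP, which matches how llama-server runs out of the box.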

Instance Method Details

#default_model ⇒ String

Returns the default model for chat completions

Returns:

  • (String)


# File 'lib/llm/providers/llamacpp.rb', line 77

def default_model
  "qwen3"
end

#name ⇒ Symbol

Returns the provider's name

Returns:

  • (Symbol)



# File 'lib/llm/providers/llamacpp.rb', line 33

def name
  :llamacpp
end

#files ⇒ Object

Raises:

  • (NotImplementedError)


# File 'lib/llm/providers/llamacpp.rb', line 39

def files
  raise NotImplementedError
end

#images ⇒ Object

Raises:

  • (NotImplementedError)


# File 'lib/llm/providers/llamacpp.rb', line 45

def images
  raise NotImplementedError
end

#audio ⇒ Object

Raises:

  • (NotImplementedError)


# File 'lib/llm/providers/llamacpp.rb', line 51

def audio
  raise NotImplementedError
end

#moderations ⇒ Object

Raises:

  • (NotImplementedError)


# File 'lib/llm/providers/llamacpp.rb', line 57

def moderations
  raise NotImplementedError
end

#responses ⇒ Object

Raises:

  • (NotImplementedError)


# File 'lib/llm/providers/llamacpp.rb', line 63

def responses
  raise NotImplementedError
end

#vector_stores ⇒ Object

Raises:

  • (NotImplementedError)


# File 'lib/llm/providers/llamacpp.rb', line 69

def vector_stores
  raise NotImplementedError
end
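Because #files, #images, #audio, #moderations, #responses, and #vector_stores all raise NotImplementedError, callers can feature-detect endpoints at runtime rather than guarding on the provider's name. A minimal sketch of that pattern (the StubProvider class and supports? helper below are illustrative, not part of the gem):

```ruby
# Illustrative only: mimics this provider's behavior of raising
# NotImplementedError for endpoints that llama.cpp does not expose.
class StubProvider
  def files = raise NotImplementedError
  def models = [:qwen3]
end

# Returns true when the provider implements the given endpoint,
# false when calling it raises NotImplementedError.
def supports?(provider, endpoint)
  provider.public_send(endpoint)
  true
rescue NotImplementedError
  false
end

provider = StubProvider.new
puts supports?(provider, :files)   # => false
puts supports?(provider, :models)  # => true
```

This keeps calling code portable across providers: the same check works whether the backend is llama.cpp, Ollama, or OpenAI itself.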