About

llm-shell is an extensible, developer-oriented command-line console that can interact with multiple Large Language Models (LLMs). It serves as both a demo of the llmrb/llm library and a tool to help improve the library through real-world usage and feedback. Jump to the Demos section to see it in action!

Features

General

  • 🌟 Unified interface for multiple Large Language Models (LLMs)
  • 🤝 Supports Gemini, OpenAI, Anthropic, DeepSeek, LlamaCpp and Ollama

Customize

  • 📤 Attach local files as conversation context
  • 🔧 Extend with your own functions and tool calls
  • 🚀 Extend with your own console commands

Shell

  • 🤖 Built-in auto-complete powered by Readline
  • 🎨 Built-in syntax highlighting powered by Coderay
  • 📄 Deploys the less pager for long outputs
  • 📝 Advanced Markdown formatting and output

Demos

1. Tools: "system" function
2. Files: import at runtime
3. Files: import at boot time

Customization

Functions

For security and safety reasons, a user must confirm the execution of all function calls before they happen, and a function must be added to an allowlist before llm-shell will load it automatically at boot time.

The ~/.llm-shell/tools/ directory can contain one or more llmrb/llm functions that the LLM can call once you confirm you are okay with executing the code locally (along with any arguments it provides). See the earlier demo for an example:

LLM.function(:system) do |fn|
  fn.description "Run a shell command"
  fn.params do |schema|
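    # Declare a single required string parameter named "command"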
    schema.object(command: schema.string.required)
  end
  fn.define do |params|
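    # Run the command with stdout and stderr captured through pipes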
    ro, wo = IO.pipe
    re, we = IO.pipe
    Process.wait Process.spawn(params.command, out: wo, err: we)
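    # Close the write ends so the reads below can reach EOF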
    [wo, we].each(&:close)
    {stderr: re.read, stdout: ro.read}
  end
end
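Once you are happy with a function, save it in the tools directory (e.g. as ~/.llm-shell/tools/system.rb – the file name is illustrative) and allowlist it so it loads at boot, either with the -t/--tools option or through the tools key of the YAML file described in the Settings section below:

tools:
  - system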

Commands

llm-shell can be extended with your own console commands. This can be done by creating a Ruby file in the ~/.llm-shell/commands/ directory – with one file per command. The commands are loaded at boot time. See the file-import, dir-import, show-history, clear-screen and system-prompt commands for more realistic examples:

LLM.command "say-hello" do |cmd|
  cmd.description "Say hello to somebody"
  cmd.define do |name|
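    # "io" is provided by llm-shell for writing to the console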
    io.rewind.print "Hello #{name}!"
  end
end
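Once loaded, the command is invoked by name from the console. For example (the prompt shown is illustrative):

> say-hello John
Hello John!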

Prompts

llm-shell assumes the LLM will emit markdown, so it is recommended that custom prompts instruct the LLM to do so; otherwise you might see unexpected results.

The first message in a conversation is sometimes known as a “system prompt”, and it defines the expectations and rules to be followed by an LLM throughout a conversation. The default prompt used by llm-shell can be found at default.txt.

The prompt can be changed by adding a file to the ~/.llm-shell/prompts/ directory and choosing it at boot time with the -r PROMPT, --prompt PROMPT option. Generally you will want to fork default.txt to preserve the original prompt rules around markdown and files, then modify the copy to suit your own needs and preferences.
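For example, assuming you forked default.txt to ~/.llm-shell/prompts/mine.txt (the file name is illustrative):

llm-shell -p openai -r mine.txt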

Settings

YAML

The console client can be configured at the command line through option switches, or through a YAML file. The YAML file can contain the same options that can be specified at the command line. For cloud providers the key option is the only required parameter; everything else has defaults. The YAML file is read from ${HOME}/.llm-shell/config.yml and it has the following format:

# ~/.llm-shell/config.yml
openai:
  key: YOURKEY
  model: gpt-4o-mini
gemini:
  key: YOURKEY
  model: gemini-2.0-flash-001
anthropic:
  key: YOURKEY
  model: claude-3-7-sonnet-20250219
deepseek:
  key: YOURKEY
  model: deepseek-chat
ollama:
  host: localhost
  model: deepseek-coder:6.7b
llamacpp:
  host: localhost
  model: qwen3
tools:
  - system

Usage

CLI

Usage: llm-shell [OPTIONS]
    -p, --provider NAME      Required. Options: gemini, openai, anthropic, deepseek, ollama or llamacpp.
    -k, --key [KEY]          Optional. Required by gemini, openai, anthropic and deepseek.
    -m, --model [MODEL]      Optional. The name of a model.
    -h, --host [HOST]        Optional. Sometimes required by ollama.
    -o, --port [PORT]        Optional. Sometimes required by ollama.
    -f, --files [GLOB]       Optional. Glob pattern(s) separated by a comma.
    -t, --tools [TOOLS]      Optional. One or more tool names to load automatically.
    -r, --prompt [PROMPT]    Optional. The prompt to use.
    -v, --version            Optional. Print the version and exit.
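For example, the following starts a session against OpenAI with files attached as context (the key and glob patterns are placeholders):

llm-shell -p openai -k YOURKEY -f '*.rb,docs/*.md'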

Install

llm-shell can be installed via rubygems.org:

gem install llm-shell

License

BSD Zero Clause
See LICENSE