ai-cli-lib

Add AI capabilities to any readline-enabled command-line program

The ai-cli-lib is a library designed to enhance interactive command-line editing programs by integrating with GPT large language model servers. It allows users to obtain AI help from servers like Anthropic's or OpenAI's, or a llama.cpp server. The library acts as a command line copilot, providing natural language prompts and responses to enhance user experience and productivity. It supports various platforms such as Debian GNU/Linux, macOS, and Cygwin, and requires specific packages for installation and operation. Users can configure the library to activate during shell startup and interact with command-line programs like bash, mysql, psql, gdb, sqlite3, and bc. Additionally, the library provides options for configuring API keys, setting up llama.cpp servers, and ensuring data privacy by managing context settings.

README:

ai-cli-lib: AI help for CLI programs

The ai-cli library detects programs that offer interactive command-line editing through the readline library, and modifies their interface to allow obtaining help from a GPT large language model server, such as Anthropic's or OpenAI's, or one provided through a llama.cpp server. Think of it as a command line copilot.

Demonstration

Build

The ai-cli library has been built and tested on the following platforms:

  • Debian GNU/Linux (bullseye): natively under version 11 on the x86_64 and armv7l architectures, and under Windows Subsystem for Linux version 2. In addition to make, a C compiler, and the GNU C library, the following packages are required: libcurl4-openssl-dev libjansson-dev libreadline-dev.
  • macOS (Ventura 13.4) on the arm64 architecture, using Homebrew packages and executables linked against GNU Readline (not the macOS-supplied editline compatibility layer). In addition to an Xcode installation, the following Homebrew packages are required: jansson readline.
  • Cygwin (3.4.7). In addition to make, a C compiler, and the GNU C library, the following packages are required: libcurl-devel, libjansson-devel, libreadline-devel.

Package names may be different on other systems.

cd src
make

Test

Unit testing

cd src
make unit-test

End-to-end testing

cd src
make e2e-test

This will provide you with a simple read-print loop where you can test the ai-cli library's ability to link with the Readline API of third-party programs.

Install

cd src

# Global installation for all users
sudo make install

# Local installation for the user executing the command
make install PREFIX=~

Run

  • Configure the ai-cli library to be activated when your bash shell starts up by adding the following lines to your .bashrc file (ideally near its beginning, for performance reasons). Adjust the provided path to match the ai-cli library's installation path; it is currently set for a local installation in your home directory.
    # Initialize the ai-cli library
    source $HOME/share/ai-cli/ai-cli-activate-bash.sh
  • Alternatively, implement one of the following system-specific configurations.
    • Under Linux and Cygwin set the LD_PRELOAD environment variable to load the library using its full path. For example, under bash run export LD_PRELOAD=/usr/local/lib/ai_cli.so (global installation) or export LD_PRELOAD=/home/myname/lib/ai_cli.so (local installation).
    • Under macOS set the DYLD_INSERT_LIBRARIES environment variable to load the library using its full path. For example, under bash run export DYLD_INSERT_LIBRARIES=/Users/myname/lib/ai_cli.dylib. Also set the DYLD_LIBRARY_PATH environment variable to include the Homebrew library directory, e.g. export DYLD_LIBRARY_PATH=/opt/homebrew/lib:$DYLD_LIBRARY_PATH.
  • Perform one of the following.
    • Obtain your Anthropic API key or OpenAI API key and configure it in the .aicliconfig file in your home directory. This is done with a key={key} entry in the file's [anthropic] or [openai] section. In addition, add api=anthropic or api=openai in the file's [general] section. See the file ai-cli-config to understand how configuration files are structured. Anthropic currently provides free trial credits to new users. Note that OpenAI API access requires a different (usage-based) subscription from the ChatGPT one.
    • Configure a llama.cpp server and list its endpoint (e.g. endpoint=http://localhost:8080/completion) in the configuration file's [llamacpp] section. In addition, add api=llamacpp in the file's [general] section. In brief, running a llama.cpp server involves
      • compiling llama.cpp (ideally with GPU support),
      • downloading, converting, and quantizing suitable model files (use files with more than 7 billion parameters only on GPUs with sufficient memory to hold them),
      • running the server with a command such as server -m models/llama-2-13b-chat/ggml-model-q4_0.gguf -c 2048 --n-gpu-layers 100.
  • Run the interactive command-line programs, such as bash, mysql, psql, gdb, sqlite3, bc, as you normally would.
  • If the program you want to prompt in natural language isn't linked with the GNU Readline library, you can still make it work with Readline by invoking it through rlwrap. This, however, loses the program-specific context provision, because the program's name appears to the ai-cli library as rlwrap.
  • To obtain AI help, enter a natural language prompt and press ^X-a (Ctrl-X followed by a) in the (default) Emacs key binding mode or V if you have configured vi key bindings.
  • Keep in mind that by default ai-cli-lib sends previously entered commands as context to the model engine you are using. This may leak secrets that you enter, for example by setting an environment variable to contain a key or by configuring a database password. To avoid this problem, set the context setting to zero, or use the command-line program's own method for not storing an entered line. For instance, in bash you can do this by starting the line with a space character.
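The configuration entries described above can be combined into a small .aicliconfig file. The sketch below is illustrative only (the placement of the context setting and the placeholder key are assumptions); consult the ai-cli-config file for the authoritative format.

```ini
# ~/.aicliconfig — illustrative sketch; see ai-cli-config for the full format
[general]
# Select the back end: anthropic, openai, or llamacpp
api=anthropic
# Number of previous lines sent as context; 0 avoids leaking secrets
context=0

[anthropic]
# Replace with your own Anthropic API key
key=YOUR-ANTHROPIC-KEY

[llamacpp]
# Only needed when api=llamacpp
endpoint=http://localhost:8080/completion
```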

Note for macOS users

Note that macOS ships with the editline line-editing library, which is currently not compatible with the ai-cli library (it has been designed to tap into GNU Readline). However, Homebrew tools link with GNU Readline, so they can be used with the ai-cli library. To find out whether a tool you're using links with GNU Readline (libreadline) or with editline (libedit), use the which command to determine the command's full path, and then the otool command to see the libraries it is linked with. In the example below, /usr/bin/sqlite3 is linked with editline, but /opt/homebrew/opt/sqlite/bin/sqlite3 is linked with GNU Readline.

$ which sqlite3
/usr/bin/sqlite3

$ otool -L /usr/bin/sqlite3
/usr/bin/sqlite3:
        /usr/lib/libncurses.5.4.dylib (compatibility version 5.4.0, current version 5.4.0)
        /usr/lib/libedit.3.dylib (compatibility version 2.0.0, current version 3.0.0)
        /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1319.100.3)

$ otool -L /opt/homebrew/opt/sqlite/bin/sqlite3
/opt/homebrew/opt/sqlite/bin/sqlite3:
        /opt/homebrew/opt/readline/lib/libreadline.8.dylib (compatibility version 8.2.0, current version 8.2.0)
        /usr/lib/libncurses.5.4.dylib (compatibility version 5.4.0, current version 5.4.0)
        /usr/lib/libz.1.dylib (compatibility version 1.0.0, current version 1.2.11)
        /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1319.100.3)

Consequently, if you want to use the capabilities of the ai-cli library, configure your system to use the Homebrew commands in preference to the ones supplied with macOS.
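For convenience, the otool check above can be wrapped in a small shell helper. This is a sketch, not part of ai-cli: the function merely scans otool -L style output for the library names shown above, and is demonstrated here against canned input so it runs anywhere. On macOS you would feed it real output, e.g. otool -L "$(command -v sqlite3)" | classify_line_editor.

```shell
# Classify otool -L style output: does the binary link GNU Readline or editline?
classify_line_editor() {
  input=$(cat)                       # read the otool -L output once
  case "$input" in
    *libreadline*) echo readline ;;  # works with the ai-cli library
    *libedit*)     echo editline ;;  # not currently supported
    *)             echo none ;;
  esac
}

# Demonstration against canned output from the Homebrew sqlite3 example:
classify_line_editor <<'EOF'
/opt/homebrew/opt/readline/lib/libreadline.8.dylib (compatibility version 8.2.0)
EOF
# prints: readline
```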

Reference documentation

The ai-cli reference documentation is provided as Unix manual pages.

Contribute

Contributions are welcome through GitHub pull requests. Before working on something substantial, open an issue to signify your interest and coordinate with others. Particularly useful are:

  • multi-shot prompts for systems not yet supported (see the ai-cli-config file),
  • support for other large language models (start from the openai_fetch.c file),
  • support for other libraries (mainly editline),
  • ports to other platforms and distributions.

See also

Acknowledgements

  • API requests are made using libcurl.
  • The configuration file parsing is based on inih.
  • Unit testing uses CuTest.
  • JSON is parsed using Jansson.
