llm4s

Scala 3 bindings for llama.cpp

llm4s provides experimental Scala 3 bindings for llama.cpp, built with Slinc. It targets Scala 3.3.0 on JDK 17 or 19, and lets you load the llama.cpp shared library together with a supported model to run text completion and embeddings from Scala.

README:

llm4s

Experimental Scala 3 bindings for llama.cpp using Slinc.

Setup

Add llm4s to your build.sbt:

libraryDependencies += "com.donderom" %% "llm4s" % "0.11.0"
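
A minimal build.sbt sketch for a fresh project (the project name is a placeholder; the Scala version comes from the compatibility table below):

ThisBuild / scalaVersion := "3.3.0"

lazy val root = project
  .in(file("."))
  .settings(
    name := "llm4s-demo", // placeholder project name
    libraryDependencies += "com.donderom" %% "llm4s" % "0.11.0"
  )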

For JDK 17, add a .jvmopts file in the project root:

--add-modules=jdk.incubator.foreign
--enable-native-access=ALL-UNNAMED
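
If you run the program through sbt with a forked JVM, the same flags can alternatively be passed in build.sbt via javaOptions (an sbt-level alternative to .jvmopts, not taken from the README):

// Only applies when the run is forked; these are the same flags as in .jvmopts
Compile / run / fork := true
Compile / run / javaOptions ++= Seq(
  "--add-modules=jdk.incubator.foreign",
  "--enable-native-access=ALL-UNNAMED"
)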

Version compatibility:

llm4s   Scala       JDK      llama.cpp (commit hash)
0.11+   3.3.0       17, 19   229ffff (May 8, 2024)

Older versions:

llm4s   Scala       JDK      llama.cpp (commit hash)
0.10+   3.3.0       17, 19   49e7cb5 (Jul 31, 2023)
0.6+    ---         ---      49e7cb5 (Jul 31, 2023)
0.4+    ---         ---      70d26ac (Jul 23, 2023)
0.3+    ---         ---      a6803ca (Jul 14, 2023)
0.1+    3.3.0-RC3   17, 19   447ccbe (Jun 25, 2023)

Usage

import java.nio.file.Paths
import com.donderom.llm4s.*

// Path to the llama.cpp shared library
System.load("llama.cpp/libllama.so")

// Path to the model supported by llama.cpp
val model = Paths.get("models/llama-7b-v2/llama-2-7b.Q4_K_M.gguf")
val prompt = "Large Language Model is"

Completion

val llm = Llm(model)

// To print generation as it goes
llm(prompt).foreach: stream =>
  stream.foreach: token =>
    print(token)

// Or build a string
llm(prompt).foreach(stream => println(stream.mkString))

llm.close()
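
Putting the steps above together, one way to make sure the model is released even if generation fails is a plain try/finally around the same calls (a sketch using only the API shown in this README):

val llm = Llm(model)
try
  // Stream tokens as they are generated
  llm(prompt).foreach: stream =>
    stream.foreach(token => print(token))
finally
  // Always release the model
  llm.close()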

Embeddings

val llm = Llm(model)
llm.embeddings(prompt).foreach: embeddings =>
  embeddings.foreach: embd =>
    print(embd)
    print(' ')
llm.close()
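
As a follow-up sketch, two embeddings can be compared with cosine similarity. The assumptions here (not stated in the README) are that the values produced by llm.embeddings are Floats and that the result converts to a Seq via toSeq; the second prompt is just an example. The cosine function itself is plain Scala:

// Cosine similarity between two embedding vectors (no llm4s API involved)
def cosine(a: Seq[Float], b: Seq[Float]): Double =
  def norm(v: Seq[Float]): Double = math.sqrt(v.map(x => x.toDouble * x.toDouble).sum)
  val dot = a.lazyZip(b).map((x, y) => x.toDouble * y.toDouble).sum
  dot / (norm(a) * norm(b))

val llm = Llm(model)
for
  e1 <- llm.embeddings("Large Language Model is")
  e2 <- llm.embeddings("Large language models are") // example second prompt
do println(cosine(e1.toSeq, e2.toSeq))             // assumes toSeq yields Seq[Float]
llm.close()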
