
NobodyWho

Discord · Matrix · Mastodon · Godot Engine · GitHub Sponsors

NobodyWho is a plugin for the Godot game engine that lets you interact with local LLMs for interactive storytelling.

At a Glance

  • 🏃 Run LLM-driven characters locally without internet
  • ⚡ Super fast inference on GPU powered by Vulkan or Metal
  • 🔧 Easy setup - just two nodes to get started
  • 🎯 Perfect for games, interactive stories, and NPCs
  • 💻 Cross-platform: Windows, Linux, macOS

How to Install

You can install it from inside the Godot editor: In Godot 4.3+, go to AssetLib and search for "NobodyWho".

...or you can grab a specific version from our GitHub releases page. You can install these zip files by going to the "AssetLib" tab in Godot and selecting "Import".

How to Help

  • ⭐ Star the repo and spread the word about NobodyWho!
  • Join our Discord or Matrix communities
  • Found a bug? Open an issue!
  • Submit your own PR - contributions welcome
  • 💝 Become a sponsor to support development
  • Help improve docs or write tutorials

Getting Started

The plugin does not include a large language model (LLM). You need to provide your own LLM in the GGUF file format. A good place to start is something like Gemma 2 2B.

Once you have a GGUF model file, you can add a NobodyWhoModel node to your Godot scene. On this node, set the model file to the GGUF model you just downloaded.

NobodyWhoModel contains the weights of the model. The model takes up a lot of RAM and can take a little while to initialize, so if you plan on having several characters or conversations, it's a big advantage to have them all share the same NobodyWhoModel node.
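For example, several characters can all point at one model node, so the weights are only loaded once. A minimal sketch, with made-up node names (the chat nodes are described below):

extends Node

func _ready():
    # one NobodyWhoModel node loads the weights into RAM
    var shared_model = get_node("ChatModel")

    # both NobodyWhoChat characters reuse the same model node
    get_node("Wizard").model_node = shared_model
    get_node("Shopkeeper").model_node = shared_model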

Now you can add a NobodyWhoChat node to your scene. From the node inspector, set the "Model Node" field to tell this chat node where to find the NobodyWhoModel. Also in the inspector, you can provide a system prompt, which gives the LLM instructions on how to carry out the chat.

Now you can add a script to the NobodyWhoChat node to implement your chat interaction.

NobodyWhoChat uses this programming interface:

  • say(text: String): a function that can be used to send text from the user to the LLM.
  • response_updated(token: String): a signal that is emitted every time the LLM produces more text. Contains roughly one word per invocation.
  • response_finished(response: String): a signal which indicates that the LLM is done speaking.
  • start_worker(): a function that starts the LLM worker. The LLM needs a few seconds to get ready before chatting, so you may want to call this ahead of time.

Example NobodyWhoChat script

extends NobodyWhoChat

func _ready():
    # configure node
    model_node = get_node("../ChatModel")
    system_prompt = "You are an evil wizard. Always try to curse anyone who talks to you."

    # say something
    say("Hi there! Who are you?")

    # wait for the response
    var response = await response_finished
    print("Got response: " + response)

    # in this example we just use the `response_finished` signal to get the complete response
    # in real-world use you definitely want to connect `response_updated`, which gives one word at a time
    # the whole interaction feels *much* smoother if you stream the response out word-by-word
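
If you want to stream the response word-by-word, a minimal sketch could look like this, using the signals listed above (the DialogueLabel node is made up for illustration and not part of the plugin):

extends NobodyWhoChat

func _ready():
    # configure node
    model_node = get_node("../ChatModel")
    system_prompt = "You are a friendly shopkeeper."

    # warm up the worker ahead of time so the first reply isn't delayed
    start_worker()

    # stream tokens into a label as they arrive
    response_updated.connect(_on_response_updated)
    response_finished.connect(_on_response_finished)

func _on_response_updated(token: String):
    get_node("../DialogueLabel").text += token

func _on_response_finished(response: String):
    print("Full response: " + response)

When the player submits a line of dialogue, call say(player_text) and the label fills in roughly one word at a time.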

Example NobodyWhoEmbedding script

extends NobodyWhoEmbedding

func _ready():
    # configure node
    self.model_node = get_node("../EmbeddingModel")

    # generate some embeddings
    embed("The dragon is on the hill.")
    var dragon_hill_embd = await self.embedding_finished

    embed("The dragon is hungry for humans.")
    var dragon_hungry_embd = await self.embedding_finished

    embed("This doesn't matter.")
    var irrelevant_embd = await self.embedding_finished

    # test similarity:
    # two embeddings will have high similarity if they mean similar things
    var low_similarity = cosine_similarity(irrelevant_embd, dragon_hill_embd)
    var high_similarity = cosine_similarity(dragon_hill_embd, dragon_hungry_embd)
    assert(low_similarity < high_similarity)
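
As a rough sketch of how embeddings can be used in a game (the action list and helper function below are made up for illustration), you can match free-form player input against a set of known actions using the same embed and cosine_similarity calls:

extends NobodyWhoEmbedding

# hypothetical helper: pick the known action closest in meaning to the player's input
func closest_action(player_input: String) -> String:
    var actions = ["attack the dragon", "run away", "talk to the dragon"]

    embed(player_input)
    var input_embd = await self.embedding_finished

    var best_action = ""
    var best_similarity = -1.0
    for action in actions:
        embed(action)
        var action_embd = await self.embedding_finished
        var similarity = cosine_similarity(input_embd, action_embd)
        if similarity > best_similarity:
            best_similarity = similarity
            best_action = action
    return best_action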
