MachinaScript For Robots

Build LLM-powered robots in your garage with MachinaScript For Robots!

Patch 0.3 - Presenting Machina3: SEES. THINKS. ACTS.


MACHINA3 embodies a loop of perception and action that simulates the flow of human thought. Through the integration of a vision system and a serial-to-analog signal parser, MACHINA3 interprets visual data and crafts responses in near real time, enabling machines to perform complex tasks with elegance and some level of precision - a fantastic achievement for such an early stage of the technology.

  • Added support for Llama 3.2 Vision with ultra-fast Groq inference (300+ tokens per second)
  • Added support for JSON mode

See the full release here


Patch 0.2.1 - Presenting Machina2A: Autogen Self-Controlled Robots




MachinaScript For Robots (early beta)

🤘🤖🤘 Build modular AI-powered robots in your garage right now.

Discord

Intro | How MachinaScript Works | Getting Started | Installation | Community


The future is not a gift. It is an achievement.



Meet MachinaScript For Robots

MachinaScript is a dynamic set of tools and a LLM-JSON-based language designed to empower humans in the creation of their own robots.

It facilitates the animation of generative movements, the integration of personality, and the teaching of new skills with a high degree of autonomy. With MachinaScript, you can control a wide range of electronic components, including Arduinos, Raspberry Pis, servo motors, cameras, sensors, and much more.

MachinaScript's mission is to make cutting-edge intelligent robotics accessible for everyone.

Read all about it in the Medium article




Installation:

Read the user manual in the code directory here.



A New Way to Build Robots

A Simple, Modular Pipeline

  1. Input Reception: Upon receiving an input, the brain unit (a central processing unit such as a Raspberry Pi or a computer of your choice) initiates the process. For example, it might listen for a wake word, or keep reading images in real time with a multimodal LLM.

  2. Instruction Generation: A Language Model (LLM) then crafts a sequence of instructions for actions, movements and skills. These are formatted in MachinaScript, optimized for sequential execution.

  3. Instruction Parsing: The robot's brain unit interprets the generated MachinaScript instructions.

  4. Action Serialization: Instructions are relayed to the microcontroller, the component that governs the robot's physical operations such as servo motors and sensors. (A minimal brain-loop sketch of this pipeline follows below.)
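As a rough illustration of steps 2-4, here is a minimal brain-loop sketch in Python. The serial port, the generate_machinascript() placeholder, and the top-level "actions" key are illustrative assumptions, not code shipped with this repository:

```python
import json
import serial  # pyserial; assumes the Arduino listens on /dev/ttyUSB0 at 9600 baud

arduino = serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1)

def generate_machinascript(user_input: str) -> str:
    """Placeholder: call your LLM of choice here (see 'Teaching MachinaScript to LLMs')."""
    raise NotImplementedError

def run_pipeline(user_input: str) -> None:
    raw = generate_machinascript(user_input)   # 2. Instruction Generation
    script = json.loads(raw)                   # 3. Instruction Parsing
    for action in script.get("actions", []):   # 4. Action Serialization
        # one newline-terminated JSON line per action, so the Arduino sketch
        # can read and decode commands incrementally
        arduino.write((json.dumps(action) + "\n").encode())
        arduino.flush()
```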




MachinaScript LLM-JSON-Language Basics


The MachinaScript LLM-JSON-based syntax is highly modular because it is generative. It is composed of three major nested components: Actions, Movements, and Skills.

Actions, Movements, Skills

Actions: a set of instructions to be executed in a specific order. They may contain multiple movements and multiple skill usages.

Movements: address individual motors and set parameters such as target degrees and speed. These can be combined to create very personal animations.

Skills: function calling the MachinaScript way, used to operate cameras and sensors, and even to speak with text-to-speech.

As long as your brain unit code is adapted to interpret it, there is no limit to your creativity.


Below is a sketch of the complete language structure in its current latest version. Note that you can rework the entire syntax of the language structure for your needs, no strings attached. Just make sure it still works with your brain module's generating, parsing, and serializing.
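The following payload is a minimal, hypothetical example in that spirit; the field names are illustrative assumptions rather than a canonical schema, so adapt them to whatever your brain module expects:

```json
{
  "actions": [
    {
      "action": "greet_visitor",
      "movements": [
        {"motor": "motor_neck_vertical", "degrees": 120, "speed": "medium"},
        {"motor": "motor_neck_horizontal", "degrees": 90, "speed": "slow"}
      ],
      "skills": [
        {"id": "photograph"}
      ]
    }
  ]
}
```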

Teaching MachinaScript to LLMs

The project was designed to be used across the wide ecosystem of large language models, multimodal and non-multimodal, local and non-local. Note that autopilot units like Machina2 require some form of multimodality to sense the world through images and plan actions by themselves.

To instruct an LLM to speak the MachinaScript syntax, we pass a system message that looks like this:

You are a MachinaScript for Robots generator.
MachinaScript is a LLM-JSON-based format used to define robotic actions, including 
motor movements and skill usage, under specific contexts given by the user. 

Each action can involve multiple movements, motors and skills, with defined parameters 
like motor positions, speeds, and skill-specific details, like this:
(...)
Please generate a new MachinaScript using the exact given format and project specifications.

This piece of text is referred to as machinascript_language.txt and is recommended to stay unchanged.

Ideally you will only change the specs of your project.


Declaring Specs: Teaching the LLM about your unique robot design - and personality.

No two artisanal robots are the same. They are all beautifully unique.

One of the most mind-blowing things about MachinaScript is that it can embody virtually any design. You just need to tell it, in a set of specs, the robot's physical properties and limitations, as well as instructions for the behavior of the LLM. Should it be funny? Serious? What are its goals? Favorite color? The machinascript_project_specs.txt file is where you put everything related to your robot's personality.

For this to work, we append a little extra information to the system message, like this:

Project specs:
{
  "Motors": [
    {"id": "motor_neck_vertical", "range": [0, 180]},
    {"id": "motor_neck_horizontal", "range": [0, 180]}
  ],
  "Skills": [
    {"id": "photograph", "description": "Captures a photograph using an attached camera and send to a multimodal LLM."},
    {"id": "blink_led", "parameters": {"led_pin": 10, "duration": 500, "times": 3}, "description": "Blinks an LED to indicate action."}
  ],
  "Limitations": [
    {"motor": "motor_neck_vertical", "max_speed": "medium"}
    {"motor speeds": [slow, medium, high]}
  ]
  Personality: Funny, delicate
  Agency Level: high
}

Note that the JSON style here can be completely reworked into any kind of text you want. You can even describe it in a single paragraph if you feel like it. However, for the sake of human readability and developer experience, you can use this template to better mentally map your project specs. This is all in very early beta, so take it with a grain of salt.
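To make the flow concrete, here is a hedged sketch of assembling the system message from the two files described above and requesting a payload in JSON mode. The OpenAI-compatible client and model name are assumptions for illustration; swap in whatever backend you actually use. This is one way to fill in the generate_machinascript() placeholder from the pipeline sketch earlier:

```python
from pathlib import Path
from openai import OpenAI  # any OpenAI-compatible client works the same way

# machinascript_language.txt: the unchanged language definition
# machinascript_project_specs.txt: your robot's specs and personality
language_spec = Path("machinascript_language.txt").read_text()
project_specs = Path("machinascript_project_specs.txt").read_text()
system_message = language_spec + "\n\n" + project_specs

client = OpenAI()

def generate_machinascript(user_input: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        response_format={"type": "json_object"},  # JSON mode
        messages=[
            {"role": "system", "content": system_message},
            {"role": "user", "content": user_input},
        ],
    )
    return response.choices[0].message.content
```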

Finetuned Models

We are releasing a set of finetuned models for MachinaScript soon to make its generations even better. You can also finetune models for your own specific use case.

Bonus: Animated Movements and Motion Design Principles

An action can contain multiple movements executed in order to perform animations (sets of movements). It may even embody personality in the motion, as in the sketch below.
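For instance, a hypothetical "curious head tilt" could be expressed as a single action whose movements play back in sequence at different speeds (again, the field names are illustrative only):

```json
{
  "action": "curious_head_tilt",
  "movements": [
    {"motor": "motor_neck_horizontal", "degrees": 60, "speed": "slow"},
    {"motor": "motor_neck_vertical", "degrees": 150, "speed": "medium"},
    {"motor": "motor_neck_vertical", "degrees": 90, "speed": "slow"}
  ]
}
```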

Check out Disney's latest robot, which combines engineering with their team of motion designers to create a more human-friendly machine in the style of BD-1.

You can learn more about the 12 principles of animation here.




Getting Started

Step 1: Make the Robot First

  • Begin with Arduino: The easiest entry point is programming your robot with Arduino code.

    • Construct your robot and get it moving with simple programmed commands.
    • Modify the Arduino code to accept dynamic commands, similar to how a remote-controlled car operates.
  • Components: Utilize a variety of components to enhance your robot:

    • Servo motors, sensors, buttons, LEDs, and any other compatible electronics.

Step 2: Hand Over Control to the AI

  • Connect the Hardware: Link your Arduino to a computing device of your choice. This could be a Raspberry Pi, a personal computer, or even an older laptop with internet access.

  • Edit the Brain Code:

    • Map Arduino components within your code and establish their rules and functions for interaction. For instance, a servo motor might be named head_motor_vertical and programmed to move up to 180 degrees.
    • Modify the "system prompt" passed to the LLM with your defined rules and component names.

Step 3: Learning New Skills

  • Skills encompass any function callable from the LLM, ranging from complex movement sequences (e.g., making a drink, dancing) to interactive tasks like taking pictures or utilizing text-to-speech, as in the dispatch sketch below.
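A hedged sketch of how a brain unit might dispatch skills from a parsed payload; the skill ids mirror the example specs above, and the dispatch-table pattern itself is illustrative rather than code shipped with this repository:

```python
# Hypothetical skill dispatch: map skill ids from a parsed MachinaScript payload
# to plain Python functions running on the brain unit.
def photograph() -> None:
    print("capturing a frame and sending it to the multimodal LLM...")

def blink_led(led_pin: int = 10, duration: int = 500, times: int = 3) -> None:
    print(f"blinking LED on pin {led_pin}, {times}x for {duration} ms")

SKILLS = {"photograph": photograph, "blink_led": blink_led}

def run_skill(skill: dict) -> None:
    SKILLS[skill["id"]](**skill.get("parameters", {}))

run_skill({"id": "blink_led", "parameters": {"times": 2}})
```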

Here's a quick overview:

  1. Clone/Download: Clone or download this repository into a chosen directory.
  2. Edit the Brain Code: Customize the brain code's system prompt to describe your robot's capabilities.
  3. Connect Hardware: Integrate your robot's locomotion and sensory systems as previously outlined.

Community

Ready to share your projects with the world? Join our community on Discord: https://discord.gg/SQFZNkQP3x

Note from the author

MachinaScript is my gift to the maker community,
which has taught me so much about being a human.
Let the robots live forever.

Made with love for all the makers out there!
This project is and always will be free and open source for everyone.

babycommando
