supallm

AI straight into your frontend

Supallm is an open-source, no-code platform for building AI-powered flows and streaming their output in realtime into your application. Flows are composed in a visual editor, can combine multiple AI models, and are run from your frontend or backend through an isomorphic JavaScript SDK. The backend is built in Golang for performance and scalability, making the platform suitable for quick prototypes as well as enterprise use cases.

README:

Supallm


Supallm is an open-source platform that lets you build multi-model, AI-powered flows with no code, then trigger them and stream their output in realtime into your application.

Supallm is built in Golang for high performance and scalability, making it well suited for prototyping as well as for real-world enterprise use cases.


🌟 Give us some love by starring this repository! 🌟


Demo

https://github.com/user-attachments/assets/d3b67c6c-4059-4f7e-b5ad-3e27c1c7c858

Main Concepts

1. Build advanced AI-based flows

Our powerful editor allows you to build and run complex AI flows in seconds. Unlike other tools, you can customize the inputs and outputs of your flows. Every output field result can be streamed in realtime to your frontend with almost no latency.


2. Test your flows in our editor

Stop writing code; test your flows right in the editor.


3. Use them in realtime in your code

We integrate with all the major authentication providers to securely run your flows either from your frontend or backend. Our simple SDK allows you to run your flows in seconds and listen to the results in realtime.


⚡ Quick Start

Option 1: One-click deployment

Deploy on Railway

Option 2: Host locally

Prerequisites:

  • Node.js 20+
  • Docker and Docker Compose
Then run:

npx install-supallm@latest

Et voilà! The CLI will walk you through the installation process.

Once it finishes, you will have a docker-compose.yml and a .env file that you can customize, although this is not required.
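If the stack is not already running after the CLI completes (the CLI may start it for you; treat this as an assumption about your setup), a standard Docker Compose setup is brought up from the same directory with:

docker compose up -d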

🐳 Customize your installation

We recommend using the CLI (above) to install Supallm. The CLI downloads the required files and helps you set up the initial required variables.

Once installed, you can customize your docker-compose.yml and your environment variables.

If you can't use the CLI for any reason, simply open an issue.

The CLI guides you through configuring the required variables. If you want to go further, open the .env file and review the variable definitions you can customize.

👨‍💻 Integrate in your application

Once you've built your flow, you can use our isomorphic JavaScript SDK to run it from your code.

1. Install the package using npm or yarn.

npm i supallm

2. Run your flow with realtime updates:

import { initSupallm } from 'supallm';

const supallm = initSupallm({
    secretKey: 'your-api-key',
    projectId: 'your-project-id',
});

const sub = await supallm.runFlow({
    flowId: 'your-flow-id',
    inputs: {
        yourCustomInput: 'What is the capital of France?',
    },
}).subscribe();

sub.on('flowResultStream', (data) => {
    console.log('Received realtime result chunk', data);
});

sub.on('flowEnded', (event) => {
    console.log('Flow ended with result', event.result);
});

sub.on('flowFail', (event) => {
    console.log('Flow failed with error', event.result);
});

sub.on('nodeStart', (event) => {
    console.log('Node started', event.nodeId);
});

sub.on('nodeEnd', (event) => {
    console.log('Node ended', event.nodeId);
});

sub.on('nodeFail', (event) => {
    console.log('Node failed', event.nodeId);
});
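In a chat-style UI you will typically accumulate the streamed chunks as they arrive. Here is a minimal sketch, assuming each flowResultStream payload is (or can be reduced to) a text fragment; the exact shape of data depends on the output fields you defined for your flow:

let streamedText = '';

sub.on('flowResultStream', (data) => {
    // Assumption: chunks are plain text fragments; adapt this to your flow's output shape.
    streamedText += typeof data === 'string' ? data : JSON.stringify(data);
    // Render `streamedText` into your UI here.
});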

Or you can wait for the flow to complete and get the full result:

const response = await supallm.runFlow({
    flowId: 'your-flow-id',
    inputs: {
        yourCustomInput: 'What is the capital of France?',
    },
}).wait();

// Since we believe exceptions are hell, we don't throw errors.
// Instead we return a response object with the following properties:

response.isSuccess // true if the flow ran successfully
response.result
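A minimal sketch of handling this response without try/catch, using only the properties shown above (any additional error fields on the response object are not documented here and would be an assumption):

if (response.isSuccess) {
    console.log('Flow result:', response.result);
} else {
    // No exception is thrown; handle the failure path explicitly.
    console.error('Flow did not complete successfully');
}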

⚙️ Our low-latency, high-performance and scalable stack

Unlike other tools, we're crafting Supallm with performance in mind.

  • We use Postgres as the main database.
  • Backend in Golang is stateless, horizontally scalable and highly-available.
  • Our runners pull jobs from a Redis queue and execute code in a sandboxed environment (see the illustrative sketch after this list).
  • Our frontend is built with Next.js and TypeScript.
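For readers unfamiliar with this pattern, here is an illustrative queue-worker sketch in TypeScript using ioredis. It is not Supallm's implementation (the actual runners are written in Go and sandboxed with nsjail), and the queue name and job shape below are assumptions made purely for illustration:

import Redis from 'ioredis';

const redis = new Redis();

async function workerLoop() {
    while (true) {
        // Block until a job is available on the (hypothetical) 'flow-jobs' list.
        const popped = await redis.blpop('flow-jobs', 0);
        if (!popped) continue;

        const job = JSON.parse(popped[1]);
        // In Supallm, the job would be executed inside a sandbox and its
        // result streamed back to the caller; here we only log it.
        console.log('Would execute job', job);
    }
}

workerLoop();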

🔒 Security

  • Code is compiled and run in a sandboxed environment using the great nsjail project by Google.
  • All sensitive data is encrypted and never shared in the logs.

📈 Performance

Once a flow has started, there is no overhead compared to running the same flow from your own code. The added latency for a job to be pulled from the queue, started, and have its result sent back to the database is ~50ms.

Our backend API and runners are designed to be stateless and horizontally scalable.


🌟 Give us some love by starring this repository! 🌟
