browser-ai

TypeScript library that brings in-browser AI models to the Vercel AI SDK, with seamless fallback to server-side models

Stars: 120

Browser AI is a TypeScript library that provides access to in-browser AI model providers with seamless fallback to server-side models. It ships separate packages for Chrome/Edge built-in browser AI models, open-source models via WebLLM, and 🤗 Transformers.js models. The library simplifies integrating AI models into web applications by taking care of the custom hooks, UI components, state management, and server-side fallback logic you would otherwise have to build yourself.
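
As a rough sketch of that fallback idea, an app can try an in-browser model first and fall back to a hosted model when that fails. The /api/chat route below is a hypothetical server endpoint, and the try/catch check is illustrative rather than the library's documented mechanism:

import { generateText } from "ai";
import { browserAI } from "@browser-ai/core";

async function complete(prompt: string): Promise<string> {
  try {
    // Try the in-browser model first.
    const { text } = await generateText({ model: browserAI(), prompt });
    return text;
  } catch {
    // Fall back to a hypothetical server route backed by a hosted model.
    const res = await fetch("/api/chat", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ prompt }),
    });
    return res.text();
  }
}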

README:

Browser AI model providers for Vercel AI SDK

Formerly known as @built-in-ai

TypeScript libraries that provide access to in-browser AI model providers with seamless fallback to server-side models.

Documentation

For detailed documentation, browser requirements and advanced usage, refer to the official documentation site.

Package Versions

Package                        AI SDK v5    AI SDK v6
@browser-ai/core               ✓ 1.0.0      ✓ ≥ 2.0.0
@browser-ai/transformers-js    ✓ 1.0.0      ✓ ≥ 2.0.0
@browser-ai/web-llm            ✓ 1.0.0      ✓ ≥ 2.0.0

Installation

# For Chrome/Edge built-in browser AI models
npm i @browser-ai/core

# For open-source models via WebLLM
npm i @browser-ai/web-llm

# For 🤗 Transformers.js models
npm i @browser-ai/transformers-js
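
The providers plug into the Vercel AI SDK, so the examples below also assume the ai package itself is installed:

# Vercel AI SDK core
npm i ai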

Basic Usage with Chrome/Edge AI

import { streamText } from "ai";
import { browserAI } from "@browser-ai/core";

const result = streamText({
  model: browserAI(),
  prompt: "Invent a new holiday and describe its traditions.",
});

for await (const chunk of result.textStream) {
  console.log(chunk);
}
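
Because these providers are standard AI SDK language models, a long-running local generation can also be cancelled with the SDK's abortSignal setting. A minimal sketch; the #stop button wiring is illustrative and not part of the library:

import { streamText } from "ai";
import { browserAI } from "@browser-ai/core";

// Let the user stop a long-running local generation.
const controller = new AbortController();

const result = streamText({
  model: browserAI(),
  prompt: "Write a short story about a lighthouse keeper.",
  abortSignal: controller.signal,
});

// Illustrative: wire a "Stop" button in your UI to the controller.
document.querySelector("#stop")?.addEventListener("click", () => controller.abort());

try {
  for await (const chunk of result.textStream) {
    console.log(chunk);
  }
} catch (error) {
  // Aborting typically surfaces as an error while consuming the stream.
  console.log("Generation stopped:", error);
}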

Basic Usage with WebLLM

import { streamText } from "ai";
import { webLLM } from "@browser-ai/web-llm";

const result = streamText({
  model: webLLM("Llama-3.2-3B-Instruct-q4f16_1-MLC"),
  prompt: "Invent a new holiday and describe its traditions.",
});

for await (const chunk of result.textStream) {
  console.log(chunk);
}
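
Multi-turn conversations should work the same way as with any other AI SDK provider by passing a messages array instead of a single prompt. A minimal sketch, assuming parity with other AI SDK language models:

import { streamText } from "ai";
import { webLLM } from "@browser-ai/web-llm";

// Pass the conversation history as messages instead of a single prompt.
const result = streamText({
  model: webLLM("Llama-3.2-3B-Instruct-q4f16_1-MLC"),
  messages: [
    { role: "system", content: "You are a concise assistant." },
    { role: "user", content: "What is WebLLM in one sentence?" },
  ],
});

for await (const chunk of result.textStream) {
  console.log(chunk);
}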

Basic Usage with Transformers.js

import { streamText } from "ai";
import { transformersJS } from "@browser-ai/transformers-js";

const result = streamText({
  model: transformersJS("HuggingFaceTB/SmolLM2-360M-Instruct"),
  prompt: "Invent a new holiday and describe its traditions.",
});

for await (const chunk of result.textStream) {
  console.log(chunk);
}
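
Since inference happens entirely in the browser, streamed chunks can be rendered straight into the page. A minimal sketch; the #output element is illustrative and not part of the library:

import { streamText } from "ai";
import { transformersJS } from "@browser-ai/transformers-js";

const result = streamText({
  model: transformersJS("HuggingFaceTB/SmolLM2-360M-Instruct"),
  prompt: "Invent a new holiday and describe its traditions.",
});

// Append tokens to the page as they arrive.
const output = document.querySelector("#output");
for await (const chunk of result.textStream) {
  output?.insertAdjacentText("beforeend", chunk);
}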

Example applications

Sponsors

This project is proudly sponsored by Chrome for Developers.

Contributing

Contributions are more than welcome! However, please make sure to check out the contribution guidelines before contributing.

Why?

If you've ever built apps with local language models, you're likely familiar with the challenges: creating custom hooks, UI components and state management (lots of it), while also building complex integration layers to fall back to server-side models when compatibility is an issue.

Read more about this here.

Author

2025 © Jakob Hoeg Mørk
