
OramaCore

🚧 Under active development. Do not use in production - APIs will change 🚧

OramaCore is the database you need for your AI projects, answer engines, copilots, and search.

It includes a fully-fledged full-text search engine, vector database, LLM interface, and many more utilities.

Roadmap

  • v0.1.0. ETA Jan 31st, 2025 (🚧 beta release)

    • ✅ Full-text search
    • ✅ Vector search
    • ✅ Search filters
    • ✅ Automatic embeddings generation
    • ✅ Built-in multiple LLM inference setup
    • ✅ Basic JavaScript integration
    • ✅ Disk persistence
    • 🚧 Vector compression
    • ✅ Unified configuration
    • 🚧 Dockerfile for load testing in production environment
    • 🚧 Benchmarks
  • v1.0.0. ETA Feb 28th, 2025 (🎉 production ready!)

    • 🔜 Long-term user memory
    • 🔜 Multi-node setup
    • 🔜 Content expansion APIs
    • 🔜 JavaScript API integration
    • 🔜 Production-ready build
    • 🔜 Geosearch
    • 🔜 Zero-downtime upgrades

Requirements

To run OramaCore locally, you need the following toolchains installed:

  • Python >= 3.11
  • Rust >= 1.83.0

The Rust part of OramaCore communicates with the Python side via gRPC, so you'll also need to install the protobuf compiler:

apt-get install protobuf-compiler
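Before building, you can sanity-check that the toolchains are installed and new enough. The snippet below is our own suggestion, not part of the repository; it simply prints the installed versions (compare them against the Python >= 3.11 and Rust >= 1.83.0 minimums above) and prints a hint instead of failing if a tool is missing.

```shell
# Print installed versions of the required toolchains (hypothetical helper,
# not part of the OramaCore repo). Missing tools produce a hint, not an error.
python3 --version 2>/dev/null || echo "python3 not found (need >= 3.11)"
rustc --version   2>/dev/null || echo "rustc not found (need >= 1.83.0)"
protoc --version  2>/dev/null || echo "protoc not found (apt-get install protobuf-compiler)"
```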

After that, just install the dependencies:

cargo build
cd ./src/ai_server && pip install -r requirements.txt

An NVIDIA GPU is highly recommended for running the application.

Getting Started

How to run:

RUST_LOG=trace PROTOC=/usr/bin/protoc cargo run --bin oramacore

or, for release mode:

RUST_LOG=trace PROTOC=/usr/bin/protoc cargo run --bin oramacore --release

The configuration file is config.jsonc, which ships with an example configuration.

Disk persistence

You can persist the database state to disk by running the following commands:

curl 'http://localhost:8080/v0/reader/dump_all' -X POST
curl 'http://localhost:8080/v0/writer/dump_all' -X POST

After stopping and restarting the server, your data will be loaded back into memory.

Tests

To run the tests, use:

cargo test

E2E tests

Install hurl:

cargo install hurl

Run the tests:

hurl --very-verbose --test --variables-file api-test.hurl.property api-test.hurl
hurl --very-verbose --test --variables-file api-test.hurl.property embedding-api-test.hurl

NB: the server must be running before you launch the E2E tests.
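Since both suites need a live server, one convenient pattern is a small wrapper script that starts the server in the background, runs the suites, and shuts the server down. The script below is our own sketch, not part of the repository, and only recombines the commands shown above; the script name and the sleep-based readiness wait are assumptions you may want to adjust.

```shell
# Write a hypothetical E2E wrapper script (run-e2e.sh is our own name, not
# part of the repo), then syntax-check it without executing it.
cat > run-e2e.sh <<'EOF'
#!/bin/sh
set -e
# Start the server in the background and remember its PID.
RUST_LOG=trace PROTOC=/usr/bin/protoc cargo run --bin oramacore --release &
SERVER_PID=$!
# Crude readiness wait; polling http://localhost:8080 would be more robust.
sleep 10
hurl --very-verbose --test --variables-file api-test.hurl.property api-test.hurl
hurl --very-verbose --test --variables-file api-test.hurl.property embedding-api-test.hurl
# Shut the server down once the suites finish.
kill "$SERVER_PID"
EOF
sh -n run-e2e.sh  # syntax-check only; execute with: sh run-e2e.sh
```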

License

AGPLv3
