open-ai
OpenAI PHP SDK: the most downloaded, forked, contributed-to, and community-supported PHP SDK for the OpenAI GPT-3 and DALL·E APIs, usable from Laravel, Symfony, Yii, CakePHP, or any other PHP framework. It also supports ChatGPT-like streaming. (The ChatGPT API is supported.)
Stars: 2143
Open AI is a powerful tool for artificial intelligence research and development. It provides a wide range of machine learning models and algorithms, making it easier for developers to create innovative AI applications. With Open AI, users can explore cutting-edge technologies such as natural language processing, computer vision, and reinforcement learning. The platform offers a user-friendly interface and comprehensive documentation to support users in building and deploying AI solutions. Whether you are a beginner or an experienced AI practitioner, Open AI offers the tools and resources you need to accelerate your AI projects and stay ahead in the rapidly evolving field of artificial intelligence.
README:
The ChatGPT API is currently supported; click here for the implementation instructions.
A message from the creator:
Thank you for visiting the @orhanerday/open-ai repository! If you find this repository helpful or useful, we encourage you to star it
on GitHub. Starring a repository is a way to show your support for the project; it also helps increase the visibility
of the project and lets the community know that it is valuable. Thanks again for your support, and we hope you find the
repository useful!
Orhan
| Project Name | Required PHP Version (lower is better) | Description | Type (Official / Community) | Support |
|---|---|---|---|---|
| orhanerday/open-ai | PHP 7.4+ | The most downloaded, forked, contributed-to, and community-supported PHP SDK for OpenAI GPT-3 and DALL·E. It also supports ChatGPT-like streaming. | Community | Available (community-driven Discord server or personal mail: [email protected]) |
| openai-** /c***t | PHP 8.1+ | OpenAI PHP API client. | Community | - |
A fully open-source, secure, community-maintained PHP SDK for accessing the OpenAI GPT-3 API.
For more information, you can read the Laravel News blog post.
Free support is available. Join our Discord server.
To get started with this package, you'll first want to be familiar with the OpenAI API documentation and examples. You can also get help from our Discord channel called #api-support.
- orhanerday/open-ai was added to the PHP section of OpenAI's community libraries.
- orhanerday/open-ai was featured in a PhpStorm blog post. Thanks, JetBrains!
Requires PHP 7.4+
Click here to join the Discord server
As you may know, OpenAI PHP is an open-source wrapper for the OpenAI API. We rely on the support of our community to continue developing and maintaining the project, and one way that you can help is by making a donation.
Donations allow us to cover expenses such as hosting costs (for testing), development tools, and other resources that are necessary to keep the project running smoothly. Every contribution, no matter how small, helps us to continue improving OpenAI PHP for everyone.
If you have benefited from using OpenAI PHP and would like to support its continued development, we would greatly appreciate a donation of any amount. You can make a donation through:
Thank you for considering a donation to Orhanerday/OpenAI PHP SDK. Your support is greatly appreciated and helps to ensure that the project can continue to grow and improve.
Sincerely,
Orhan Erday / Creator.
Please visit https://orhanerday.gitbook.io/openai-php-api-1/
- Chat
- [x] ChatGPT API
- Models
- [x] List models
- [x] Retrieve model
- Completions
- Edits
- [x] Create edits
- Images
- [x] Create image
- [x] Create image edit
- [x] Create image variation
- Embeddings
- Audio
- Files
- [x] List files
- [x] Upload file
- [x] Delete file
- [x] Retrieve file
- [x] Retrieve file content
- Fine-tunes
- Moderation
- Engines (deprecated)
- Assistants (beta)
- [x] Create assistant
- [x] Retrieve assistant
- [x] Modify assistant
- [x] Delete assistant
- [x] List assistants
- [x] Create assistant file
- [x] Retrieve assistant file
- [x] Delete assistant file
- [x] List assistant files
- Threads (beta)
- [x] Create thread
- [x] Retrieve thread
- [x] Modify thread
- [x] Delete thread
- Messages (beta)
- [x] Create message
- [x] Retrieve message
- [x] Modify message
- [x] List messages
- [x] Retrieve message file
- [x] List message files
- Runs (beta)
- [x] Create run
- [x] Retrieve run
- [x] Modify run
- [x] List runs
- [x] Submit tool outputs
- [x] Cancel run
- [x] Create thread and run
- [x] Retrieve run step
- [x] List run steps
You can install the package via composer:
composer require orhanerday/open-ai
Before you get started, set OPENAI_API_KEY as the environment variable name and your OpenAI key as its value, using the following commands:
Powershell
$Env:OPENAI_API_KEY = "sk-gjtv....."
Cmd
set OPENAI_API_KEY=sk-gjtv.....
Linux or macOS
export OPENAI_API_KEY=sk-gjtv.....
Having trouble setting up the environment variable? Please read the article, or check my Stack Overflow answer for the Windows® ENV setup.
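If the environment variable keeps tripping you up while experimenting locally, you can also set it from PHP itself with putenv() before constructing the client. A minimal sketch (the key value is a placeholder; hard-coding a real key is only acceptable for throwaway local tests):

```php
<?php

require __DIR__ . '/vendor/autoload.php'; // remove this line if you use a PHP framework.

use Orhanerday\OpenAi\OpenAi;

// For local experiments only: set the variable for this process, then read it back.
// Skip putenv() entirely if OPENAI_API_KEY is already exported in your shell.
putenv('OPENAI_API_KEY=sk-your-key-here');

$open_ai = new OpenAi(getenv('OPENAI_API_KEY'));
```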
Create your index.php file and paste the following code into it.
<?php
require __DIR__ . '/vendor/autoload.php'; // remove this line if you use a PHP Framework.
use Orhanerday\OpenAi\OpenAi;
$open_ai_key = getenv('OPENAI_API_KEY');
$open_ai = new OpenAi($open_ai_key);
$chat = $open_ai->chat([
'model' => 'gpt-3.5-turbo',
'messages' => [
[
"role" => "system",
"content" => "You are a helpful assistant."
],
[
"role" => "user",
"content" => "Who won the world series in 2020?"
],
[
"role" => "assistant",
"content" => "The Los Angeles Dodgers won the World Series in 2020."
],
[
"role" => "user",
"content" => "Where was it played?"
],
],
'temperature' => 1.0,
'max_tokens' => 4000,
'frequency_penalty' => 0,
'presence_penalty' => 0,
]);
var_dump($chat);
echo "<br>";
echo "<br>";
echo "<br>";
// decode response
$d = json_decode($chat);
// Get Content
echo($d->choices[0]->message->content);
Run the server with the following command
php -S localhost:8000 -t .
orhanerday/open-ai supports NVIDIA NIM. The example below uses the Mixtral model from Mistral AI. Check https://build.nvidia.com/explore/discover for more examples.
<?php
require __DIR__ . '/vendor/autoload.php'; // remove this line if you use a PHP Framework.
use Orhanerday\OpenAi\OpenAi;
$nvidia_ai_key = getenv('NVIDIA_AI_API_KEY');
error_log($nvidia_ai_key);
$open_ai = new OpenAi($nvidia_ai_key);
$open_ai->setBaseURL("https://integrate.api.nvidia.com");
$chat = $open_ai->chat([
'model' => 'mistralai/mixtral-8x7b-instruct-v0.1',
'messages' => [["role" => "user", "content" => "Write a limerick about the wonders of GPU computing."]],
'temperature' => 0.5,
'max_tokens' => 1024,
'top_p' => 1,
]);
var_dump($chat);
echo "<br>";
echo "<br>";
echo "<br>";
// decode response
$d = json_decode($chat);
// Get Content
echo ($d->choices[0]->message->content);
In the following code, $open_ai is the base variable for all OpenAI operations.
use Orhanerday\OpenAi\OpenAi;
$open_ai = new OpenAi(env('OPEN_AI_API_KEY'));
For users who belong to multiple organizations, you can pass a header to specify which organization is used for an API request. Usage from these API requests will count against the specified organization's subscription quota.
$open_ai_key = getenv('OPENAI_API_KEY');
$open_ai = new OpenAi($open_ai_key);
$open_ai->setORG("org-IKN2E1nI3kFYU8ywaqgFRKqi");
You can specify the origin URL with the setBaseURL() method;
$open_ai_key = getenv('OPENAI_API_KEY');
$open_ai = new OpenAi($open_ai_key);
$open_ai->setBaseURL("https://ai.example.com/");
You can route your API requests through a proxy server;
$open_ai->setProxy("http://127.0.0.1:1086");
$open_ai->setHeader(["Connection"=>"keep-alive"]);
You can get cURL info after the request.
$open_ai = new OpenAi($open_ai_key);
echo $open_ai->listModels(); // you should execute the request FIRST!
var_dump($open_ai->getCURLInfo()); // then you can inspect the cURL info for that request
Given a chat conversation, the model will return a chat completion response.
$complete = $open_ai->chat([
'model' => 'gpt-3.5-turbo',
'messages' => [
[
"role" => "system",
"content" => "You are a helpful assistant."
],
[
"role" => "user",
"content" => "Who won the world series in 2020?"
],
[
"role" => "assistant",
"content" => "The Los Angeles Dodgers won the World Series in 2020."
],
[
"role" => "user",
"content" => "Where was it played?"
],
],
'temperature' => 1.0,
'max_tokens' => 4000,
'frequency_penalty' => 0,
'presence_penalty' => 0,
]);
<?php
// Dummy Response For Chat API
$j = '
{
"id":"chatcmpl-*****",
"object":"chat.completion",
"created":1679748856,
"model":"gpt-3.5-turbo-0301",
"usage":{
"prompt_tokens":9,
"completion_tokens":10,
"total_tokens":19
},
"choices":[
{
"message":{
"role":"assistant",
"content":"This is a test of the AI language model."
},
"finish_reason":"length",
"index":0
}
]
}
';
// decode response
$d = json_decode($j);
// Get Content
echo($d->choices[0]->message->content);
Related: ChatGPT Clone Project
Given a prompt, the model will return one or more predicted completions, and can also return the probabilities of alternative tokens at each position.
$complete = $open_ai->completion([
'model' => 'gpt-3.5-turbo-instruct',
'prompt' => 'Hello',
'temperature' => 0.9,
'max_tokens' => 150,
'frequency_penalty' => 0,
'presence_penalty' => 0.6,
]);
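The description above also mentions token probabilities; for the completions endpoint that is controlled by the logprobs parameter (an integer, up to 5, per the OpenAI API reference). A minimal sketch, with illustrative parameter values:

```php
$complete = $open_ai->completion([
    'model' => 'gpt-3.5-turbo-instruct',
    'prompt' => 'Hello',
    'max_tokens' => 16,
    'temperature' => 0,
    'logprobs' => 3, // return log probabilities for the 3 most likely tokens at each position
]);

// When logprobs is set, each choice carries a `logprobs` object next to the generated text.
$d = json_decode($complete);
var_dump($d->choices[0]->logprobs);
```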
This feature might sound familiar from ChatGPT.
Video of demo:
ChatGPT clone is a simple web application powered by the OpenAI library and built with PHP. It allows users to chat with an AI language model that responds in real time. Chat history is saved using cookies, and the project requires an API key and SQLite3 to be enabled.
Url of The ChatGPT-Clone Repo https://github.com/orhanerday/ChatGPT
Whether to stream back partial progress. If set, tokens will be sent as data-only server-sent events as they become available, with the stream terminated by a data: [DONE] message.
$open_ai = new OpenAi(env('OPEN_AI_API_KEY'));
$opts = [
'prompt' => "Hello",
'temperature' => 0.9,
"max_tokens" => 150,
"frequency_penalty" => 0,
"presence_penalty" => 0.6,
"stream" => true,
];
header('Content-type: text/event-stream');
header('Cache-Control: no-cache');
$open_ai->completion($opts, function ($curl_info, $data) {
echo $data . "<br><br>";
echo PHP_EOL;
ob_flush();
flush();
return strlen($data);
});
Add this part inside <body>
of the HTML
<div id="divID">Hello</div>
<script>
var eventSource = new EventSource("/");
var div = document.getElementById('divID');
eventSource.onmessage = function (e) {
    if (e.data == "[DONE]")
    {
        div.innerHTML += "<br><br>Hello";
        eventSource.close(); // stop listening once the stream has finished
        return;              // "[DONE]" is not JSON, so don't try to parse it
    }
    div.innerHTML += JSON.parse(e.data).choices[0].text;
};
eventSource.onerror = function (e) {
console.log(e);
};
</script>
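Streaming is not limited to the completion endpoint. Assuming chat() accepts the same kind of streaming callback as completion() (which is what the ChatGPT-like streaming support refers to), a minimal command-line sketch looks like this:

```php
<?php

require __DIR__ . '/vendor/autoload.php';

use Orhanerday\OpenAi\OpenAi;

$open_ai = new OpenAi(getenv('OPENAI_API_KEY'));

$opts = [
    'model' => 'gpt-3.5-turbo',
    'messages' => [
        ['role' => 'user', 'content' => 'Tell me a short joke about PHP.'],
    ],
    'stream' => true,
];

// The callback receives raw server-sent-event chunks ("data: {...}") as they arrive.
$open_ai->chat($opts, function ($curl_info, $data) {
    echo $data . PHP_EOL;
    flush();

    return strlen($data); // tell cURL how many bytes were handled
});
```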
You should see a response like the one in the video;
Creates a new edit for the provided input, instruction, and parameters
$result = $open_ai->createEdit([
"model" => "text-davinci-edit-001",
"input" => "What day of the wek is it?",
"instruction" => "Fix the spelling mistakes",
]);
All DALL·E Examples available in this repo.
Given a prompt, the model will return one or more generated images as urls or base64 encoded.
Creates an image given a prompt.
$complete = $open_ai->image([
"prompt" => "A cat drinking milk",
"n" => 1,
"size" => "256x256",
"response_format" => "url",
]);
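If you would rather write the image straight to disk than fetch a URL, the endpoint can also return base64-encoded data. A minimal sketch, assuming the standard response shape where the payload sits under data[0].b64_json:

```php
$result = $open_ai->image([
    "prompt" => "A cat drinking milk",
    "n" => 1,
    "size" => "256x256",
    "response_format" => "b64_json", // ask for base64 data instead of a URL
]);

// Decode the base64 payload and save it as a PNG file.
$d = json_decode($result);
file_put_contents('cat.png', base64_decode($d->data[0]->b64_json));
```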
Creates an edited or extended image given an original image and a prompt.
You need HTML upload for image edit or variation? Please check DALL·E Examples
$otter = curl_file_create(__DIR__ . '/files/otter.png');
$mask = curl_file_create(__DIR__ . '/files/mask.jpg');
$result = $open_ai->imageEdit([
"image" => $otter,
"mask" => $mask,
"prompt" => "A cute baby sea otter wearing a beret",
"n" => 2,
"size" => "1024x1024",
]);
Creates a variation of a given image.
$otter = curl_file_create(__DIR__ . '/files/otter.png');
$result = $open_ai->createImageVariation([
"image" => $otter,
"n" => 2,
"size" => "256x256",
]);
(Deprecated)
This endpoint is deprecated and will be removed on December 3rd, 2022. OpenAI developed new methods with better performance. Learn more.
Given a query and a set of documents or labels, the model ranks each document based on its semantic similarity to the provided query.
$search = $open_ai->search([
'engine' => 'ada',
'documents' => ['White House', 'hospital', 'school'],
'query' => 'the president',
]);
Get a vector representation of a given input that can be easily consumed by machine learning models and algorithms.
Related guide: Embeddings
$result = $open_ai->embeddings([
"model" => "text-similarity-babbage-001",
"input" => "The food was delicious and the waiter..."
]);
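Once you have two embedding vectors you can compare them yourself, for example with cosine similarity. A minimal sketch in plain PHP, assuming the standard response shape where the vector sits under data[0].embedding (the second input sentence is just an illustration):

```php
// Cosine similarity between two embedding vectors (plain arrays of floats).
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;
    foreach ($a as $i => $value) {
        $dot += $value * $b[$i];
        $normA += $value * $value;
        $normB += $b[$i] * $b[$i];
    }

    return $dot / (sqrt($normA) * sqrt($normB));
}

$first = json_decode($open_ai->embeddings([
    "model" => "text-similarity-babbage-001",
    "input" => "The food was delicious and the waiter...",
]))->data[0]->embedding;

$second = json_decode($open_ai->embeddings([
    "model" => "text-similarity-babbage-001",
    "input" => "The meal was tasty and the service was friendly.",
]))->data[0]->embedding;

echo cosineSimilarity($first, $second);
```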
(Deprecated)
This endpoint is deprecated and will be removed on December 3rd, 2022. We've developed new methods with better performance. Learn more.
Given a question, a set of documents, and some examples, the API generates an answer to the question based on the information in the set of documents. This is useful for question-answering applications on sources of truth, like company documentation or a knowledge base.
$answer = $open_ai->answer([
'documents' => ['Puppy A is happy.', 'Puppy B is sad.'],
'question' => 'which puppy is happy?',
'search_model' => 'ada',
'model' => 'curie',
'examples_context' => 'In 2017, U.S. life expectancy was 78.6 years.',
'examples' => [['What is human life expectancy in the United States?', '78 years.']],
'max_tokens' => 5,
'stop' => ["\n", '<|endoftext|>'],
]);
(Deprecated)
This endpoint is deprecated and will be removed on December 3rd, 2022. OpenAI developed new methods with better performance. Learn more.
Given a query and a set of labeled examples, the model will predict the most likely label for the query. Useful as a drop-in replacement for any ML classification or text-to-label task.
$classification = $open_ai->classification([
'examples' => [
['A happy moment', 'Positive'],
['I am sad.', 'Negative'],
['I am feeling awesome', 'Positive'],
],
'labels' => ['Positive', 'Negative', 'Neutral'],
'query' => 'It is a raining day =>(',
'search_model' => 'ada',
'model' => 'curie',
]);
Given an input text, this endpoint outputs whether the model classifies it as violating OpenAI's content policy.
$flags = $open_ai->moderation([
'input' => 'I want to kill them.'
]);
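The moderation response contains a results array whose entries carry a boolean flagged field plus per-category booleans and scores (per the OpenAI API reference). A minimal sketch of checking it:

```php
$d = json_decode($flags);
$result = $d->results[0];

if ($result->flagged) {
    // Print the categories that triggered the flag (e.g. violence, hate, ...).
    foreach ($result->categories as $category => $isFlagged) {
        if ($isFlagged) {
            echo $category . PHP_EOL;
        }
    }
}
```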
Know more about Content Moderations here: OpenAI Moderations
(Deprecated)
The Engines endpoints are deprecated. Please use their replacement, Models, instead. Learn more.
Lists the currently available engines, and provides basic information about each one such as the owner and availability.
$engines = $open_ai->engines();
$result = $open_ai->tts([
"model" => "tts-1", // tts-1-hd
"input" => "I'm going to use the stones again. Hey, we'd be going in short-handed, you know",
"voice" => "alloy", // echo, fable, onyx, nova, and shimmer
]);
// Save audio file
file_put_contents('tts-result.mp3', $result);
Transcribes audio into the input language.
$c_file = curl_file_create(__DIR__ . '/files/en-marvel-endgame.m4a');
$result = $open_ai->transcribe([
"model" => "whisper-1",
"file" => $c_file,
]);
{
"text": "I'm going to use the stones again. Hey, we'd be going in short-handed, you know. Look, he's still got the stones, so... So let's get them. Use them to bring everyone back. Just like that? Yeah, just like that. Even if there's a small chance that we can undo this, I mean, we owe it to everyone who's not in this room to try. If we do this, how do we know it's going to end any differently than it did before? Because before you didn't have me. Hey, little girl, everybody in this room is about that superhero life. And if you don't mind my asking, where the hell have you been all this time? There are a lot of other planets in the universe. But unfortunately, they didn't have you guys. I like this one. Let's go get this son of a bitch."
}
Translates audio into English.
I used a Turkish voice recording for the translation, thanks to the famous science YouTuber Barış Özcan.
$c_file = curl_file_create(__DIR__ . '/files/tr-baris-ozcan-youtuber.m4a');
$result = $open_ai->translate([
"model" => "whisper-1",
"file" => $c_file,
]);
{
"text": "GPT-3. Last month, the biggest leap in the world of artificial intelligence in recent years happened silently. Maybe the biggest leap of all time. GPT-3's beta version was released by OpenAI. When you hear such a sentence, you may think, what kind of leap is this? But be sure, this is the most advanced language model with the most advanced language model with the most advanced language ability. It can answer these artificial intelligence questions, it can translate and even write poetry. Those who have gained access to the API or API of GPT-3 have already started to make very interesting experiments. Let's look at a few examples together. Let's start with an example of aphorism. This site produces beautiful words that you can tweet. Start to actually do things with your words instead of just thinking about them."
}
Need HTML upload for audio? Check this section and change the API references. Example:
...
echo $open_ai->translate(
[
"purpose" => "answers",
"file" => $c_file,
]
);
...
// OR
...
echo $open_ai->transcribe(
[
"purpose" => "answers",
"file" => $c_file,
]
);
...
Files are used to upload documents that can be used across features like Answers, Search, and Classifications
Returns a list of files that belong to the user's organization.
$files = $open_ai->listFiles();
Upload a file that contains document(s) to be used across various endpoints/features. Currently, the size of all the files uploaded by one organization can be up to 1 GB. Please contact OpenAI if you need to increase the storage limit.
$c_file = curl_file_create(__DIR__ . '/files/sample_file_1.jsonl');
$result = $open_ai->uploadFile([
"purpose" => "answers",
"file" => $c_file,
]);
<form action="index.php" method="post" enctype="multipart/form-data">
Select file to upload:
<input type="file" name="fileToUpload" id="fileToUpload">
<input type="submit" value="Upload File" name="submit">
</form>
<?php
require __DIR__ . '/vendor/autoload.php';
use Orhanerday\OpenAi\OpenAi;
if ($_SERVER['REQUEST_METHOD'] == 'POST') {
ob_clean();
$open_ai = new OpenAi(env('OPEN_AI_API_KEY'));
$tmp_file = $_FILES['fileToUpload']['tmp_name'];
$file_name = basename($_FILES['fileToUpload']['name']);
$c_file = curl_file_create($tmp_file, $_FILES['fileToUpload']['type'], $file_name);
echo "[";
echo $open_ai->uploadFile(
[
"purpose" => "answers",
"file" => $c_file,
]
);
echo ",";
echo $open_ai->listFiles();
echo "]";
}
$result = $open_ai->deleteFile('file-xxxxxxxx');
$file = $open_ai->retrieveFile('file-xxxxxxxx');
$file = $open_ai->retrieveFileContent('file-xxxxxxxx');
Manage fine-tuning jobs to tailor a model to your specific training data.
$result = $open_ai->createFineTune([
"model" => "gpt-3.5-turbo-1106",
"training_file" => "file-U3KoAAtGsjUKSPXwEUDdtw86",
]);
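The training_file id must point to a JSONL file previously uploaded with the purpose 'fine-tune'; for gpt-3.5-turbo fine-tuning each line is a chat-formatted example. A minimal sketch of preparing and uploading such a file and then starting the job (the example content and file name are illustrative):

```php
// Each training example is one JSON object per line, in chat format.
$examples = [
    ['messages' => [
        ['role' => 'system', 'content' => 'You are a helpful assistant.'],
        ['role' => 'user', 'content' => 'What is the capital of France?'],
        ['role' => 'assistant', 'content' => 'Paris.'],
    ]],
];

$lines = array_map(fn ($example) => json_encode($example), $examples);
file_put_contents(__DIR__ . '/files/training.jsonl', implode(PHP_EOL, $lines) . PHP_EOL);

// Upload it with the fine-tune purpose, then hand the returned file id to createFineTune().
$upload = json_decode($open_ai->uploadFile([
    'purpose' => 'fine-tune',
    'file' => curl_file_create(__DIR__ . '/files/training.jsonl'),
]));

$result = $open_ai->createFineTune([
    'model' => 'gpt-3.5-turbo-1106',
    'training_file' => $upload->id,
]);
```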
$fine_tunes = $open_ai->listFineTunes();
$fine_tune = $open_ai->retrieveFineTune('ft-AF1WoRqd3aJAHsqc9NY7iL8F');
$result = $open_ai->cancelFineTune('ft-AF1WoRqd3aJAHsqc9NY7iL8F');
$fine_tune_events = $open_ai->listFineTuneEvents('ft-AF1WoRqd3aJAHsqc9NY7iL8F');
$result = $open_ai->deleteFineTune('curie:ft-acmeco-2021-03-03-21-44-20');
(Deprecated)
Retrieves an engine instance, providing basic information about the engine such as the owner and availability.
$engine = $open_ai->engine('davinci');
List and describe the various models available in the API.
Lists the currently available models, and provides basic information about each one such as the owner and availability.
$result = $open_ai->listModels();
Retrieves a model instance, providing basic information about the model such as the owner and permissioning.
$result = $open_ai->retrieveModel("text-ada-001");
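Both calls return a JSON string; a minimal sketch of decoding the model list and printing each model id (the models sit under a data array in the response):

```php
$models = json_decode($open_ai->listModels());

foreach ($models->data as $model) {
    echo $model->id . PHP_EOL; // e.g. "gpt-3.5-turbo"
}
```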
The client returns the raw JSON response as a string, so any result can also be echoed directly, e.g.:
echo $search;
Allows you to build AI assistants within your own applications.
Create an assistant with a model and instructions.
$data = [
'model' => 'gpt-3.5-turbo',
'name' => 'my assistant',
'description' => 'my assistant description',
'instructions' => 'you should cordially help me',
'tools' => [],
'file_ids' => [],
];
$assistant = $open_ai->createAssistant($data);
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$assistant = $open_ai->retrieveAssistant($assistantId);
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$data = [
'name' => 'my modified assistant',
'instructions' => 'you should cordially help me again',
];
$assistant = $open_ai->modifyAssistant($assistantId, $data);
$assistantId = 'asst_DgiOnXK7nRfyvqoXWpFlwESc';
$assistant = $open_ai->deleteAssistant($assistantId);
Returns a list of assistants.
$query = ['limit' => 10];
$assistants = $open_ai->listAssistants($query);
Create an assistant file by attaching a File to an assistant.
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$fileId = 'file-jrNZZZBAPGnhYUKma7CblGoR';
$file = $open_ai->createAssistantFile($assistantId, $fileId);
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$fileId = 'file-jrNZZZBAPGnhYUKma7CblGoR';
$file = $open_ai->retrieveAssistantFile($assistantId, $fileId);
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$fileId = 'file-jrNZZZBAPGnhYUKma7CblGoR';
$file = $open_ai->deleteAssistantFile($assistantId, $fileId);
Returns a list of assistant files.
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$query = ['limit' => 10];
$files = $open_ai->listAssistantFiles($assistantId, $query);
Create threads that assistants can interact with.
$data = [
'messages' => [
[
'role' => 'user',
'content' => 'Hello, what is AI?',
'file_ids' => [],
],
],
];
$thread = $open_ai->createThread($data);
$threadId = 'thread_YKDArENVWFDO2Xz3POifFYlp';
$thread = $open_ai->retrieveThread($threadId);
$threadId = 'thread_YKDArENVWFDO2Xz3POifFYlp';
$data = [
'metadata' => ['test' => '1234abcd'],
];
$thread = $open_ai->modifyThread($threadId, $data);
$threadId = 'thread_YKDArENVWFDO2Xz3POifFYlp';
$thread = $open_ai->deleteThread($threadId);
Create messages within threads.
$threadId = 'thread_YKDArENVWFDO2Xz3POifFYlp';
$data = [
'role' => 'user',
'content' => 'How does AI work? Explain it in simple terms.',
];
$message = $open_ai->createThreadMessage($threadId, $data);
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$messageId = 'msg_d37P5XgREsm6BItOcppnBO1b';
$message = $open_ai->retrieveThreadMessage($threadId, $messageId);
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$messageId = 'msg_d37P5XgREsm6BItOcppnBO1b';
$data = [
'metadata' => ['test' => '1234abcd'],
];
$message = $open_ai->modifyThreadMessage($threadId, $messageId, $data);
Returns a list of messages for a given thread.
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$query = ['limit' => 10];
$messages = $open_ai->listThreadMessages($threadId, $query);
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$messageId = 'msg_CZ47kAGZugAfeHMX6bmJIukP';
$fileId = 'file-CRLcY63DiHphWuBrmDWZVCgA';
$file = $open_ai->retrieveMessageFile($threadId, $messageId, $fileId);
Returns a list of message files.
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$messageId = 'msg_CZ47kAGZugAfeHMX6bmJIukP';
$query = ['limit' => 10];
$files = $open_ai->listMessageFiles($threadId, $messageId, $query);
Represents an execution run on a thread.
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$data = ['assistant_id' => 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz'];
$run = $open_ai->createRun($threadId, $data);
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$run = $open_ai->retrieveRun($threadId, $runId);
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$data = [
'metadata' => ['test' => 'abcd1234'],
];
$run = $open_ai->modifyRun($threadId, $runId, $data);
Returns a list of runs belonging to a thread.
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$query = ['limit' => 10];
$runs = $open_ai->listRuns($threadId, $query);
When a run has the status: "requires_action" and required_action.type is submit_tool_outputs, this endpoint can be used to submit the outputs from the tool calls once they're all completed. All outputs must be submitted in a single request.
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$outputs = [
'tool_outputs' => [
['tool_call_id' => 'call_abc123', 'output' => '28C'],
],
];
$run = $open_ai->submitToolOutputs($threadId, $runId, $outputs);
Cancels a run that is "in_progress".
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$run = $open_ai->cancelRun($threadId, $runId);
Create a thread and run it in one request.
$data = [
'assistant_id' => 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz',
'thread' => [
'messages' => [
[
'role' => 'user',
'content' => 'Hello, what is AI?',
'file_ids' => [],
],
],
],
];
$run = $open_ai->createThreadAndRun($data);
Retrieves a step in execution of a run.
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$stepId = 'step_kwLG0vPQjqVyQHVoL7GVK3aG';
$step = $open_ai->retrieveRunStep($threadId, $runId, $stepId);
Returns a list of run steps belonging to a run.
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$query = ['limit' => 10];
$steps = $open_ai->listRunSteps($threadId, $runId, $query);
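Runs are asynchronous, so in practice you create a run, poll it until it leaves the queued/in_progress states, and then read the assistant's reply from the thread. A minimal sketch tying the calls above together, reusing the example ids (the status values come from the Assistants API):

```php
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';

$run = json_decode($open_ai->createRun($threadId, [
    'assistant_id' => 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz',
]));

// Poll until the run has finished (completed, failed, cancelled, expired, or requires_action).
do {
    sleep(1);
    $run = json_decode($open_ai->retrieveRun($threadId, $run->id));
} while (in_array($run->status, ['queued', 'in_progress'], true));

if ($run->status === 'completed') {
    // The newest message in the thread is the assistant's answer.
    $messages = json_decode($open_ai->listThreadMessages($threadId, ['limit' => 1]));
    echo $messages->data[0]->content[0]->text->value;
}
```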
To run all tests:
composer test
To run only the tests that work for most users (excluding those that require a missing folder or that hit deprecated endpoints no longer available to most users):
./vendor/bin/pest --group=working
Please see CHANGELOG for more information on what has changed recently.
Please see CONTRIBUTING for details.
Please report security vulnerabilities to [email protected]
The MIT License (MIT). Please see License File for more information.
Alternative AI tools for open-ai
Similar Open Source Tools
open-ai
Open AI is a powerful tool for artificial intelligence research and development. It provides a wide range of machine learning models and algorithms, making it easier for developers to create innovative AI applications. With Open AI, users can explore cutting-edge technologies such as natural language processing, computer vision, and reinforcement learning. The platform offers a user-friendly interface and comprehensive documentation to support users in building and deploying AI solutions. Whether you are a beginner or an experienced AI practitioner, Open AI offers the tools and resources you need to accelerate your AI projects and stay ahead in the rapidly evolving field of artificial intelligence.
Riona-AI-Agent
Riona-AI-Agent is a versatile AI chatbot designed to assist users in various tasks. It utilizes natural language processing and machine learning algorithms to understand user queries and provide accurate responses. The chatbot can be integrated into websites, applications, and messaging platforms to enhance user experience and streamline communication. With its customizable features and easy deployment, Riona-AI-Agent is suitable for businesses, developers, and individuals looking to automate customer support, provide information, and engage with users in a conversational manner.
learn-applied-generative-ai-fundamentals
This repository is part of the Certified Cloud Native Applied Generative AI Engineer program, focusing on Applied Generative AI Fundamentals. It covers prompt engineering, developing custom GPTs, and Multi AI Agent Systems. The course helps in building a strong understanding of generative AI, applying Large Language Models (LLMs) and diffusion models practically. It introduces principles of prompt engineering to work efficiently with AI, creating custom AI models and GPTs using OpenAI, Azure, and Google technologies. It also utilizes open source libraries like LangChain, CrewAI, and LangGraph to automate tasks and business processes.
AI_Spectrum
AI_Spectrum is a versatile machine learning library that provides a wide range of tools and algorithms for building and deploying AI models. It offers a user-friendly interface for data preprocessing, model training, and evaluation. With AI_Spectrum, users can easily experiment with different machine learning techniques and optimize their models for various tasks. The library is designed to be flexible and scalable, making it suitable for both beginners and experienced data scientists.
biniou
biniou is a self-hosted webui for various GenAI (generative artificial intelligence) tasks. It allows users to generate multimedia content using AI models and chatbots on their own computer, even without a dedicated GPU. The tool can work offline once deployed and required models are downloaded. It offers a wide range of features for text, image, audio, video, and 3D object generation and modification. Users can easily manage the tool through a control panel within the webui, with support for various operating systems and CUDA optimization. biniou is powered by Huggingface and Gradio, providing a cross-platform solution for AI content generation.
ai-chatbot-framework
An AI Chatbot framework built in Python. It allows users to easily create Natural Language conversational scenarios with no coding efforts. The tool continuously learns from conversations to improve its capabilities. It can be integrated with various channels like Messenger and Slack. Users can create AI-powered chatbots without expertise in artificial intelligence.
lmnr
Laminar is an all-in-one open-source platform designed for engineering AI products. It allows users to trace, evaluate, label, and analyze LLM data efficiently. The platform offers features such as automatic tracing of common AI frameworks and SDKs, local and online evaluations, simple UI for data labeling, dataset management, and scalability with gRPC communication. Laminar is built with a modern open-source stack including RabbitMQ, Postgres, Clickhouse, and Qdrant for semantic similarity search. It provides fast and beautiful dashboards for traces, evaluations, and labels, making it a comprehensive tool for AI product development.
omnichain
OmniChain is a tool for building efficient self-updating visual workflows using AI language models, enabling users to automate tasks, create chatbots, agents, and integrate with existing frameworks. It allows users to create custom workflows guided by logic processes, store and recall information, and make decisions based on that information. The tool enables users to create tireless robot employees that operate 24/7, access the underlying operating system, generate and run NodeJS code snippets, and create custom agents and logic chains. OmniChain is self-hosted, open-source, and available for commercial use under the MIT license, with no coding skills required.
Generative-AI-Indepth-Basic-to-Advance
Generative AI Indepth Basic to Advance is a repository focused on providing tutorials and resources related to generative artificial intelligence. The repository covers a wide range of topics from basic concepts to advanced techniques in the field of generative AI. Users can find detailed explanations, code examples, and practical demonstrations to help them understand and implement generative AI algorithms. The goal of this repository is to help beginners get started with generative AI and to provide valuable insights for more experienced practitioners.
MeeseeksAI
MeeseeksAI is a framework designed to orchestrate AI agents using a mermaid graph and networkx. It provides a structured approach to managing and coordinating multiple AI agents within a system. The framework allows users to define the interactions and dependencies between agents through a visual representation, making it easier to understand and modify the behavior of the AI system. By leveraging the power of networkx, MeeseeksAI enables efficient graph-based computations and optimizations, enhancing the overall performance of AI workflows. With its intuitive design and flexible architecture, MeeseeksAI simplifies the process of building and deploying complex AI systems, empowering users to create sophisticated agent interactions with ease.
exo
Run your own AI cluster at home with everyday devices. Exo is experimental software that unifies existing devices into a powerful GPU, supporting wide model compatibility, dynamic model partitioning, automatic device discovery, ChatGPT-compatible API, and device equality. It does not use a master-worker architecture, allowing devices to connect peer-to-peer. Exo supports different partitioning strategies like ring memory weighted partitioning. Installation is recommended from source. Documentation includes example usage on multiple MacOS devices and information on inference engines and networking modules. Known issues include the iOS implementation lagging behind Python.
fAIr
fAIr is an open AI-assisted mapping service developed by the Humanitarian OpenStreetMap Team (HOT) to improve mapping efficiency and accuracy for humanitarian purposes. It uses AI models, specifically computer vision techniques, to detect objects like buildings, roads, waterways, and trees from satellite and UAV imagery. The service allows OSM community members to create and train their own AI models for mapping in their region of interest and ensures models are relevant to local communities. Constant feedback loop with local communities helps eliminate model biases and improve model accuracy.
AppFlowy
AppFlowy.IO is an open-source alternative to Notion, providing users with control over their data and customizations. It aims to offer functionality, data security, and cross-platform native experience to individuals, as well as building blocks and collaboration infra services to enterprises and hackers. The tool is built with Flutter and Rust, supporting multiple platforms and emphasizing long-term maintainability. AppFlowy prioritizes data privacy, reliable native experience, and community-driven extensibility, aiming to democratize the creation of complex workplace management tools.
AI-Learning
AI-Learning is a free e-book for neural network/deep learning teaching. In the first volume, you will initially learn about neural networks, deeply understand its essence and design principles, and improve it accordingly, ultimately putting it into simple practice. The book supports bilingual practice in JS/C++, equipped with a massive interactive Geogebra mathematical animation demonstration to help you learn neural networks in a simple and profound way. Join us for discussions and suggestions for modifications.
memfree
MemFree is an open-source hybrid AI search engine that allows users to simultaneously search their personal knowledge base (bookmarks, notes, documents, etc.) and the Internet. It features a self-hosted super fast serverless vector database, local embedding and rerank service, one-click Chrome bookmarks index, and full code open source. Users can contribute by opening issues for bugs or making pull requests for new features or improvements.
trubrics-sdk
Trubrics-sdk is a software development kit designed to facilitate the integration of analytics features into applications. It provides a set of tools and functionalities that enable developers to easily incorporate analytics capabilities, such as data collection, analysis, and reporting, into their software products. The SDK streamlines the process of implementing analytics solutions, allowing developers to focus on building and enhancing their applications' functionality and user experience. By leveraging trubrics-sdk, developers can quickly and efficiently integrate robust analytics features, gaining valuable insights into user behavior and application performance.
For similar tasks
Azure-Analytics-and-AI-Engagement
The Azure-Analytics-and-AI-Engagement repository provides packaged Industry Scenario DREAM Demos with ARM templates (Containing a demo web application, Power BI reports, Synapse resources, AML Notebooks etc.) that can be deployed in a customer’s subscription using the CAPE tool within a matter of few hours. Partners can also deploy DREAM Demos in their own subscriptions using DPoC.
sorrentum
Sorrentum is an open-source project that aims to combine open-source development, startups, and brilliant students to build machine learning, AI, and Web3 / DeFi protocols geared towards finance and economics. The project provides opportunities for internships, research assistantships, and development grants, as well as the chance to work on cutting-edge problems, learn about startups, write academic papers, and get internships and full-time positions at companies working on Sorrentum applications.
tidb
TiDB is an open-source distributed SQL database that supports Hybrid Transactional and Analytical Processing (HTAP) workloads. It is MySQL compatible and features horizontal scalability, strong consistency, and high availability.
zep-python
Zep is an open-source platform for building and deploying large language model (LLM) applications. It provides a suite of tools and services that make it easy to integrate LLMs into your applications, including chat history memory, embedding, vector search, and data enrichment. Zep is designed to be scalable, reliable, and easy to use, making it a great choice for developers who want to build LLM-powered applications quickly and easily.
telemetry-airflow
This repository codifies the Airflow cluster that is deployed at workflow.telemetry.mozilla.org (behind SSO) and commonly referred to as "WTMO" or simply "Airflow". Some links relevant to users and developers of WTMO: * The `dags` directory in this repository contains some custom DAG definitions * Many of the DAGs registered with WTMO don't live in this repository, but are instead generated from ETL task definitions in bigquery-etl * The Data SRE team maintains a WTMO Developer Guide (behind SSO)
mojo
Mojo is a new programming language that bridges the gap between research and production by combining Python syntax and ecosystem with systems programming and metaprogramming features. Mojo is still young, but it is designed to become a superset of Python over time.
pandas-ai
PandasAI is a Python library that makes it easy to ask questions to your data in natural language. It helps you to explore, clean, and analyze your data using generative AI.
databend
Databend is an open-source cloud data warehouse that serves as a cost-effective alternative to Snowflake. With its focus on fast query execution and data ingestion, it's designed for complex analysis of the world's largest datasets.
For similar jobs
weave
Weave is a toolkit for developing Generative AI applications, built by Weights & Biases. With Weave, you can log and debug language model inputs, outputs, and traces; build rigorous, apples-to-apples evaluations for language model use cases; and organize all the information generated across the LLM workflow, from experimentation to evaluations to production. Weave aims to bring rigor, best-practices, and composability to the inherently experimental process of developing Generative AI software, without introducing cognitive overhead.
LLMStack
LLMStack is a no-code platform for building generative AI agents, workflows, and chatbots. It allows users to connect their own data, internal tools, and GPT-powered models without any coding experience. LLMStack can be deployed to the cloud or on-premise and can be accessed via HTTP API or triggered from Slack or Discord.
VisionCraft
The VisionCraft API is a free API for using over 100 different AI models. From images to sound.
kaito
Kaito is an operator that automates the AI/ML inference model deployment in a Kubernetes cluster. It manages large model files using container images, avoids tuning deployment parameters to fit GPU hardware by providing preset configurations, auto-provisions GPU nodes based on model requirements, and hosts large model images in the public Microsoft Container Registry (MCR) if the license allows. Using Kaito, the workflow of onboarding large AI inference models in Kubernetes is largely simplified.
PyRIT
PyRIT is an open access automation framework designed to empower security professionals and ML engineers to red team foundation models and their applications. It automates AI Red Teaming tasks to allow operators to focus on more complicated and time-consuming tasks and can also identify security harms such as misuse (e.g., malware generation, jailbreaking), and privacy harms (e.g., identity theft). The goal is to allow researchers to have a baseline of how well their model and entire inference pipeline is doing against different harm categories and to be able to compare that baseline to future iterations of their model. This allows them to have empirical data on how well their model is doing today, and detect any degradation of performance based on future improvements.
tabby
Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot. It boasts several key features: * Self-contained, with no need for a DBMS or cloud service. * OpenAPI interface, easy to integrate with existing infrastructure (e.g Cloud IDE). * Supports consumer-grade GPUs.
spear
SPEAR (Simulator for Photorealistic Embodied AI Research) is a powerful tool for training embodied agents. It features 300 unique virtual indoor environments with 2,566 unique rooms and 17,234 unique objects that can be manipulated individually. Each environment is designed by a professional artist and features detailed geometry, photorealistic materials, and a unique floor plan and object layout. SPEAR is implemented as Unreal Engine assets and provides an OpenAI Gym interface for interacting with the environments via Python.
Magick
Magick is a groundbreaking visual AIDE (Artificial Intelligence Development Environment) for no-code data pipelines and multimodal agents. Magick can connect to other services and comes with nodes and templates well-suited for intelligent agents, chatbots, complex reasoning systems and realistic characters.