
open-ai
OpenAI PHP SDK: the most downloaded, forked, contributed-to, and community-supported PHP SDK for OpenAI GPT-3 and DALL-E, usable from Laravel, Symfony, Yii, CakePHP, or any other PHP framework. It also supports ChatGPT-like streaming. (The ChatGPT API is supported.)
Stars: 2143

orhanerday/open-ai is a community-maintained PHP SDK for the OpenAI API. It wraps the chat, completions, edits, images, embeddings, audio, files, fine-tuning, moderation, and Assistants endpoints behind a single OpenAi client class, supports ChatGPT-like streaming via server-sent events, and can be used from Laravel, Symfony, Yii, CakePHP, or plain PHP. The sections below reproduce the project's README, covering installation, configuration, and usage examples for each endpoint.
README:
The ChatGPT API is currently supported; click here for the implementation instructions.
A message from the creator,
Thank you for visiting the @orhanerday/open-ai repository! If you find this repository helpful or useful, we encourage you to star it
on GitHub. Starring a repository is a way to show your support for the project. It also helps to increase the visibility
of the project and to let the community know that it is valuable. Thanks again for your support and we hope you find the
repository useful!
Orhan
Project Name | Required PHP Version (Lower is better) | Description | Type (Official / Community) | Support |
---|---|---|---|---|
orhanerday/open-ai | PHP 7.4+ | Most downloaded, forked, contributed-to, and community-supported PHP SDK for OpenAI GPT-3 and DALL-E. It also supports ChatGPT-like streaming. | Community | Available (community-driven Discord server or personal mail: [email protected]) |
openai-** /c***t | PHP 8.1+ | OpenAI PHP API client. | Community | - |
A fully open-source, secure, community-maintained PHP SDK for accessing the OpenAI GPT-3 API.
For more information, you can read the Laravel News blog post.
Free support is available; join our Discord server.
To get started with this package, you'll first want to be familiar with the OpenAI API documentation and examples. You can also get help from our Discord channel called #api-support.
- orhanerday/open-ai has been added to the community libraries PHP section.
- orhanerday/open-ai was featured in a PhpStorm blog post; thanks, JetBrains!
Requires PHP 7.4+
Click here to join the Discord server
As you may know, OpenAI PHP is an open-source wrapper for the OpenAI API. We rely on the support of our community to continue developing and maintaining the project, and one way that you can help is by making a donation.
Donations allow us to cover expenses such as hosting costs (for testing), development tools, and other resources that are necessary to keep the project running smoothly. Every contribution, no matter how small, helps us to continue improving OpenAI PHP for everyone.
If you have benefited from using OpenAI PHP and would like to support its continued development, we would greatly appreciate a donation of any amount. You can make a donation through:
Thank you for considering a donation to Orhanerday/OpenAI PHP SDK. Your support is greatly appreciated and helps to ensure that the project can continue to grow and improve.
Sincerely,
Orhan Erday / Creator.
Please visit https://orhanerday.gitbook.io/openai-php-api-1/
- Chat
- [x] ChatGPT API
- Models
- [x] List models
- [x] Retrieve model
- Completions
- Edits
- [x] Create edits
- Images
- [x] Create image
- [x] Create image edit
- [x] Create image variation
- Embeddings
- Audio
- Files
- [x] List files
- [x] Upload file
- [x] Delete file
- [x] Retrieve file
- [x] Retrieve file content
- Fine-tunes
- Moderation
- Engines (deprecated)
- Assistants (beta)
- [x] Create assistant
- [x] Retrieve assistant
- [x] Modify assistant
- [x] Delete assistant
- [x] Lists assistants
- [x] Create assistant file
- [x] Retrieve assistant file
- [x] Delete assistant file
- [x] List assistant files
- Threads (beta)
- [x] Create thread
- [x] Retrieve thread
- [x] Modify thread
- [x] Delete thread
- Messages (beta)
- [x] Create message
- [x] Retrieve message
- [x] Modify message
- [x] Lists messages
- [x] Retrieve message file
- [x] List message files
- Runs (beta)
- [x] Create run
- [x] Retrieve run
- [x] Modify run
- [x] Lists runs
- [x] Submit tool outputs
- [x] Cancel run
- [x] Create thread and run
- [x] Retrieve run step
- [x] List run steps
You can install the package via composer:
composer require orhanerday/open-ai
Before you get started, set OPENAI_API_KEY as the environment variable name and your OpenAI key as its value, using the following commands;
Powershell
$Env:OPENAI_API_KEY = "sk-gjtv....."
Cmd
set OPENAI_API_KEY=sk-gjtv.....
Linux or macOS
export OPENAI_API_KEY=sk-gjtv.....
Having issues setting up the environment variable? Please read the article, or check my Stack Overflow answer for the Windows® ENV setup.
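If you cannot export a system-wide environment variable, you can also set the key for the current PHP process only. A minimal sketch (the key value is a placeholder):

// Hypothetical fallback: set the key for this PHP process only, then read it back.
putenv('OPENAI_API_KEY=sk-gjtv.....'); // replace with your real key
echo getenv('OPENAI_API_KEY');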
Create your index.php file and paste the following code into it.
<?php
require __DIR__ . '/vendor/autoload.php'; // remove this line if you use a PHP Framework.
use Orhanerday\OpenAi\OpenAi;
$open_ai_key = getenv('OPENAI_API_KEY');
$open_ai = new OpenAi($open_ai_key);
$chat = $open_ai->chat([
'model' => 'gpt-3.5-turbo',
'messages' => [
[
"role" => "system",
"content" => "You are a helpful assistant."
],
[
"role" => "user",
"content" => "Who won the world series in 2020?"
],
[
"role" => "assistant",
"content" => "The Los Angeles Dodgers won the World Series in 2020."
],
[
"role" => "user",
"content" => "Where was it played?"
],
],
'temperature' => 1.0,
'max_tokens' => 4000,
'frequency_penalty' => 0,
'presence_penalty' => 0,
]);
var_dump($chat);
echo "<br>";
echo "<br>";
echo "<br>";
// decode response
$d = json_decode($chat);
// Get Content
echo($d->choices[0]->message->content);
Run the server with the following command
php -S localhost:8000 -t .
orhanerday/open-ai supports NVIDIA NIM. The example below uses Mixtral; check https://build.nvidia.com/explore/discover for more examples.
<?php
require __DIR__ . '/vendor/autoload.php'; // remove this line if you use a PHP Framework.
use Orhanerday\OpenAi\OpenAi;
$nvidia_ai_key = getenv('NVIDIA_AI_API_KEY');
error_log($nvidia_ai_key);
$open_ai = new OpenAi($nvidia_ai_key);
$open_ai->setBaseURL("https://integrate.api.nvidia.com");
$chat = $open_ai->chat([
'model' => 'mistralai/mixtral-8x7b-instruct-v0.1',
'messages' => [["role" => "user", "content" => "Write a limerick about the wonders of GPU computing."]],
'temperature' => 0.5,
'max_tokens' => 1024,
'top_p' => 1,
]);
var_dump($chat);
echo "<br>";
echo "<br>";
echo "<br>";
// decode response
$d = json_decode($chat);
// Get Content
echo ($d->choices[0]->message->content);
In the following code, $open_ai is the base variable for all open-ai operations.
use Orhanerday\OpenAi\OpenAi;
$open_ai = new OpenAi(env('OPEN_AI_API_KEY'));
For users who belong to multiple organizations, you can pass a header to specify which organization is used for an API request. Usage from these API requests will count against the specified organization's subscription quota.
$open_ai_key = getenv('OPENAI_API_KEY');
$open_ai = new OpenAi($open_ai_key);
$open_ai->setORG("org-IKN2E1nI3kFYU8ywaqgFRKqi");
You can specify the origin URL with the setBaseURL() method;
$open_ai_key = getenv('OPENAI_API_KEY');
$open_ai = new OpenAi($open_ai_key);
$open_ai->setBaseURL("https://ai.example.com/");
You can use a proxy server for your API requests;
$open_ai->setProxy("http://127.0.0.1:1086");
$open_ai->setHeader(["Connection"=>"keep-alive"]);
You can get cURL info after the request.
$open_ai = new OpenAi($open_ai_key);
echo $open_ai->listModels(); // you should execute a request FIRST!
var_dump($open_ai->getCURLInfo()); // THEN you can read the cURL info for that request
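For example, assuming getCURLInfo() returns the array produced by PHP's curl_getinfo() (an assumption, not stated in the docs), you could check the HTTP status of the last request with a sketch like this:

$open_ai = new OpenAi($open_ai_key);
$response = $open_ai->listModels();  // execute a request first
$info = $open_ai->getCURLInfo();     // then read the cURL metadata for that request

// Assuming the curl_getinfo() array shape, 'http_code' holds the HTTP status.
if (($info['http_code'] ?? 0) !== 200) {
    error_log('OpenAI request failed with HTTP ' . ($info['http_code'] ?? 'unknown'));
}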
Given a chat conversation, the model will return a chat completion response.
$complete = $open_ai->chat([
'model' => 'gpt-3.5-turbo',
'messages' => [
[
"role" => "system",
"content" => "You are a helpful assistant."
],
[
"role" => "user",
"content" => "Who won the world series in 2020?"
],
[
"role" => "assistant",
"content" => "The Los Angeles Dodgers won the World Series in 2020."
],
[
"role" => "user",
"content" => "Where was it played?"
],
],
'temperature' => 1.0,
'max_tokens' => 4000,
'frequency_penalty' => 0,
'presence_penalty' => 0,
]);
<?php
// Dummy Response For Chat API
$j = '
{
"id":"chatcmpl-*****",
"object":"chat.completion",
"created":1679748856,
"model":"gpt-3.5-turbo-0301",
"usage":{
"prompt_tokens":9,
"completion_tokens":10,
"total_tokens":19
},
"choices":[
{
"message":{
"role":"assistant",
"content":"This is a test of the AI language model."
},
"finish_reason":"length",
"index":0
}
]
}
';
// decode response
$d = json_decode($j);
// Get Content
echo($d->choices[0]->message->content);
Related: ChatGPT Clone Project
Given a prompt, the model will return one or more predicted completions, and can also return the probabilities of alternative tokens at each position.
$complete = $open_ai->completion([
'model' => 'gpt-3.5-turbo-instruct',
'prompt' => 'Hello',
'temperature' => 0.9,
'max_tokens' => 150,
'frequency_penalty' => 0,
'presence_penalty' => 0.6,
]);
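A minimal sketch of reading the generated text, assuming the standard Completions response shape (choices[0]->text):

// Decode the JSON string returned by completion() and print the generated text.
$d = json_decode($complete);
echo $d->choices[0]->text;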
This feature might sound familiar from ChatGPT.
Video of demo:
ChatGPT Clone is a simple web application powered by the OpenAI library and built with PHP. It allows users to chat with an AI language model that responds in real time. Chat history is saved using cookies, and the project requires an API key and the SQLite3 extension to be enabled.
URL of the ChatGPT Clone repo: https://github.com/orhanerday/ChatGPT
Whether to stream back partial progress. If set, tokens will be sent as data-only server-sent events as they become available, with the stream terminated by a data: [DONE] message.
$open_ai = new OpenAi(env('OPEN_AI_API_KEY'));
$opts = [
'prompt' => "Hello",
'temperature' => 0.9,
"max_tokens" => 150,
"frequency_penalty" => 0,
"presence_penalty" => 0.6,
"stream" => true,
];
header('Content-type: text/event-stream');
header('Cache-Control: no-cache');
$open_ai->completion($opts, function ($curl_info, $data) {
echo $data . "<br><br>";
echo PHP_EOL;
ob_flush();
flush();
return strlen($data);
});
Add this part inside the <body> of the HTML:
<div id="divID">Hello</div>
<script>
var eventSource = new EventSource("/");
var div = document.getElementById('divID');
eventSource.onmessage = function (e) {
if(e.data == "[DONE]")
{
div.innerHTML += "<br><br>Hello";
}
div.innerHTML += JSON.parse(e.data).choices[0].text;
};
eventSource.onerror = function (e) {
console.log(e);
};
</script>
You should see a response like the one in the video;
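If you prefer the Chat API for streaming, here is a minimal sketch under the assumption that chat() accepts the same ($opts, $callback) signature as completion() in your installed version (check your version before relying on this):

$open_ai = new OpenAi(getenv('OPENAI_API_KEY'));

$opts = [
    'model' => 'gpt-3.5-turbo',
    'messages' => [["role" => "user", "content" => "Hello"]],
    'stream' => true,
];

header('Content-type: text/event-stream');
header('Cache-Control: no-cache');

// Assumes chat() streams server-sent events to the callback, like completion() above.
$open_ai->chat($opts, function ($curl_info, $data) {
    echo $data;
    echo PHP_EOL;
    ob_flush();
    flush();
    return strlen($data);
});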
Creates a new edit for the provided input, instruction, and parameters
$result = $open_ai->createEdit([
"model" => "text-davinci-edit-001",
"input" => "What day of the wek is it?",
"instruction" => "Fix the spelling mistakes",
]);
All DALL·E examples are available in this repo.
Given a prompt, the model will return one or more generated images as urls or base64 encoded.
Creates an image given a prompt.
$complete = $open_ai->image([
"prompt" => "A cat drinking milk",
"n" => 1,
"size" => "256x256",
"response_format" => "url",
]);
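A minimal sketch of extracting the first image URL, assuming the standard Images API response shape (data[0]->url when response_format is "url"):

// Decode the JSON response and print the generated image URL.
$d = json_decode($complete);
echo $d->data[0]->url;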
Creates an edited or extended image given an original image and a prompt.
Need an HTML upload for image edit or variation? Please check the DALL·E Examples.
$otter = curl_file_create(__DIR__ . './files/otter.png');
$mask = curl_file_create(__DIR__ . './files/mask.jpg');
$result = $open_ai->imageEdit([
"image" => $otter,
"mask" => $mask,
"prompt" => "A cute baby sea otter wearing a beret",
"n" => 2,
"size" => "1024x1024",
]);
Creates a variation of a given image.
$otter = curl_file_create(__DIR__ . './files/otter.png');
$result = $open_ai->createImageVariation([
"image" => $otter,
"n" => 2,
"size" => "256x256",
]);
(Deprecated)
This endpoint is deprecated and will be removed on December 3rd, 2022. OpenAI developed new methods with better performance. Learn more.
Given a query and a set of documents or labels, the model ranks each document based on its semantic similarity to the provided query.
$search = $open_ai->search([
'engine' => 'ada',
'documents' => ['White House', 'hospital', 'school'],
'query' => 'the president',
]);
Get a vector representation of a given input that can be easily consumed by machine learning models and algorithms.
Related guide: Embeddings
$result = $open_ai->embeddings([
"model" => "text-similarity-babbage-001",
"input" => "The food was delicious and the waiter..."
]);
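A minimal sketch of reading the vector, assuming the standard Embeddings response shape (data[0]->embedding):

// Decode the JSON response and inspect the embedding vector (an array of floats).
$d = json_decode($result);
$vector = $d->data[0]->embedding;
echo count($vector) . ' dimensions';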
(Deprecated)
This endpoint is deprecated and will be removed on December 3rd, 2022. We've developed new methods with better performance. Learn more.
Given a question, a set of documents, and some examples, the API generates an answer to the question based on the information in the set of documents. This is useful for question-answering applications on sources of truth, like company documentation or a knowledge base.
$answer = $open_ai->answer([
'documents' => ['Puppy A is happy.', 'Puppy B is sad.'],
'question' => 'which puppy is happy?',
'search_model' => 'ada',
'model' => 'curie',
'examples_context' => 'In 2017, U.S. life expectancy was 78.6 years.',
'examples' => [['What is human life expectancy in the United States?', '78 years.']],
'max_tokens' => 5,
'stop' => ["\n", '<|endoftext|>'],
]);
(Deprecated)
This endpoint is deprecated and will be removed on December 3rd, 2022. OpenAI developed new methods with better performance. Learn more.
Given a query and a set of labeled examples, the model will predict the most likely label for the query. Useful as a drop-in replacement for any ML classification or text-to-label task.
$classification = $open_ai->classification([
'examples' => [
['A happy moment', 'Positive'],
['I am sad.', 'Negative'],
['I am feeling awesome', 'Positive'],
],
'labels' => ['Positive', 'Negative', 'Neutral'],
'query' => 'It is a raining day =>(',
'search_model' => 'ada',
'model' => 'curie',
]);
Given an input text, outputs whether the model classifies it as violating OpenAI's content policy.
$flags = $open_ai->moderation([
'input' => 'I want to kill them.'
]);
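A minimal sketch of checking the verdict, assuming the standard Moderations response shape (results[0]->flagged and results[0]->categories):

// Decode the JSON response and check whether the input was flagged.
$d = json_decode($flags);
if ($d->results[0]->flagged) {
    echo 'Content violates the policy.';
}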
Learn more about content moderation here: OpenAI Moderations.
(Deprecated)
The Engines endpoints are deprecated. Please use their replacement, Models, instead. Learn more.
Lists the currently available engines, and provides basic information about each one such as the owner and availability.
$engines = $open_ai->engines();
$result = $open_ai->tts([
"model" => "tts-1", // tts-1-hd
"input" => "I'm going to use the stones again. Hey, we'd be going in short-handed, you know",
"voice" => "alloy", // echo, fable, onyx, nova, and shimmer
]);
// Save audio file
file_put_contents('tts-result.mp3', $result);
Transcribes audio into the input language.
$c_file = curl_file_create(__DIR__ . '/files/en-marvel-endgame.m4a');
$result = $open_ai->transcribe([
"model" => "whisper-1",
"file" => $c_file,
]);
{
"text": "I'm going to use the stones again. Hey, we'd be going in short-handed, you know. Look, he's still got the stones, so... So let's get them. Use them to bring everyone back. Just like that? Yeah, just like that. Even if there's a small chance that we can undo this, I mean, we owe it to everyone who's not in this room to try. If we do this, how do we know it's going to end any differently than it did before? Because before you didn't have me. Hey, little girl, everybody in this room is about that superhero life. And if you don't mind my asking, where the hell have you been all this time? There are a lot of other planets in the universe. But unfortunately, they didn't have you guys. I like this one. Let's go get this son of a bitch."
}
Translates audio into English.
I used a Turkish voice recording for the translation, courtesy of the famous science YouTuber Barış Özcan.
$c_file = curl_file_create(__DIR__ . '/files/tr-baris-ozcan-youtuber.m4a');
$result = $open_ai->translate([
"model" => "whisper-1",
"file" => $c_file,
]);
{
"text": "GPT-3. Last month, the biggest leap in the world of artificial intelligence in recent years happened silently. Maybe the biggest leap of all time. GPT-3's beta version was released by OpenAI. When you hear such a sentence, you may think, what kind of leap is this? But be sure, this is the most advanced language model with the most advanced language model with the most advanced language ability. It can answer these artificial intelligence questions, it can translate and even write poetry. Those who have gained access to the API or API of GPT-3 have already started to make very interesting experiments. Let's look at a few examples together. Let's start with an example of aphorism. This site produces beautiful words that you can tweet. Start to actually do things with your words instead of just thinking about them."
}
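For both transcription and translation the result is a JSON string, so a minimal sketch of getting just the text (assuming the response shape shown above):

// Decode the JSON string returned by transcribe()/translate() and print the text field.
echo json_decode($result)->text;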
Need an HTML upload for audio? Check this section and change the API references. Example:
...
echo $open_ai->translate(
    [
        "model" => "whisper-1",
        "file" => $c_file,
    ]
);
...
// OR
...
echo $open_ai->transcribe(
    [
        "model" => "whisper-1",
        "file" => $c_file,
    ]
);
...
Files are used to upload documents that can be used across features like Answers, Search, and Classifications.
Returns a list of files that belong to the user's organization.
$files = $open_ai->listFiles();
Upload a file that contains document(s) to be used across various endpoints/features. Currently, the size of all the files uploaded by one organization can be up to 1 GB. Please contact OpenAI if you need to increase the storage limit.
$c_file = curl_file_create(__DIR__ . '/files/sample_file_1.jsonl');
$result = $open_ai->uploadFile([
"purpose" => "answers",
"file" => $c_file,
]);
<form action="index.php" method="post" enctype="multipart/form-data">
Select file to upload:
<input type="file" name="fileToUpload" id="fileToUpload">
<input type="submit" value="Upload File" name="submit">
</form>
<?php
require __DIR__ . '/vendor/autoload.php';
use Orhanerday\OpenAi\OpenAi;
if ($_SERVER['REQUEST_METHOD'] == 'POST') {
ob_clean();
$open_ai = new OpenAi(env('OPEN_AI_API_KEY'));
$tmp_file = $_FILES['fileToUpload']['tmp_name'];
$file_name = basename($_FILES['fileToUpload']['name']);
$c_file = curl_file_create($tmp_file, $_FILES['fileToUpload']['type'], $file_name);
echo "[";
echo $open_ai->uploadFile(
[
"purpose" => "answers",
"file" => $c_file,
]
);
echo ",";
echo $open_ai->listFiles();
echo "]";
}
$result = $open_ai->deleteFile('file-xxxxxxxx');
$file = $open_ai->retrieveFile('file-xxxxxxxx');
$file = $open_ai->retrieveFileContent('file-xxxxxxxx');
Manage fine-tuning jobs to tailor a model to your specific training data.
$result = $open_ai->createFineTune([
"model" => "gpt-3.5-turbo-1106",
"training_file" => "file-U3KoAAtGsjUKSPXwEUDdtw86",
]);
$fine_tunes = $open_ai->listFineTunes();
$fine_tune = $open_ai->retrieveFineTune('ft-AF1WoRqd3aJAHsqc9NY7iL8F');
$result = $open_ai->cancelFineTune('ft-AF1WoRqd3aJAHsqc9NY7iL8F');
$fine_tune_events = $open_ai->listFineTuneEvents('ft-AF1WoRqd3aJAHsqc9NY7iL8F');
$result = $open_ai->deleteFineTune('curie:ft-acmeco-2021-03-03-21-44-20');
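A minimal sketch of inspecting the job you just created, assuming the fine-tuning job response carries "id" and "status" fields (the id comes from your own response, not from these docs):

// Decode the createFineTune() response and report the job's current status.
$job = json_decode($result);
echo $job->id . ' => ' . $job->status;

// Later, poll the job by its id until it finishes.
$status = json_decode($open_ai->retrieveFineTune($job->id))->status;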
(Deprecated)
Retrieves an engine instance, providing basic information about the engine such as the owner and availability.
$engine = $open_ai->engine('davinci');
List and describe the various models available in the API.
Lists the currently available models, and provides basic information about each one such as the owner and availability.
$result = $open_ai->listModels();
Retrieves a model instance, providing basic information about the model such as the owner and permissioning.
$result = $open_ai->retrieveModel("text-ada-001");
echo $result;
Allows you to build AI assistants within your own applications.
Create an assistant with a model and instructions.
$data = [
'model' => 'gpt-3.5-turbo',
'name' => 'my assistant',
'description' => 'my assistant description',
'instructions' => 'you should cordially help me',
'tools' => [],
'file_ids' => [],
];
$assistant = $open_ai->createAssistant($data);
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$assistant = $open_ai->retrieveAssistant($assistantId);
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$data = [
'name' => 'my modified assistant',
'instructions' => 'you should cordially help me again',
];
$assistant = $open_ai->modifyAssistant($assistantId, $data);
$assistantId = 'asst_DgiOnXK7nRfyvqoXWpFlwESc';
$assistant = $open_ai->deleteAssistant($assistantId);
Returns a list of assistants.
$query = ['limit' => 10];
$assistants = $open_ai->listAssistants($query);
Create an assistant file by attaching a File to an assistant.
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$fileId = 'file-jrNZZZBAPGnhYUKma7CblGoR';
$file = $open_ai->createAssistantFile($assistantId, $fileId);
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$fileId = 'file-jrNZZZBAPGnhYUKma7CblGoR';
$file = $open_ai->retrieveAssistantFile($assistantId, $fileId);
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$fileId = 'file-jrNZZZBAPGnhYUKma7CblGoR';
$file = $open_ai->deleteAssistantFile($assistantId, $fileId);
Returns a list of assistant files.
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$query = ['limit' => 10];
$files = $open_ai->listAssistantFiles($assistantId, $query);
Create threads that assistants can interact with.
$data = [
'messages' => [
[
'role' => 'user',
'content' => 'Hello, what is AI?',
'file_ids' => [],
],
],
];
$thread = $open_ai->createThread($data);
$threadId = 'thread_YKDArENVWFDO2Xz3POifFYlp';
$thread = $open_ai->retrieveThread($threadId);
$threadId = 'thread_YKDArENVWFDO2Xz3POifFYlp';
$data = [
'metadata' => ['test' => '1234abcd'],
];
$thread = $open_ai->modifyThread($threadId, $data);
$threadId = 'thread_YKDArENVWFDO2Xz3POifFYlp';
$thread = $open_ai->deleteThread($threadId);
Create messages within threads.
$threadId = 'thread_YKDArENVWFDO2Xz3POifFYlp';
$data = [
'role' => 'user',
'content' => 'How does AI work? Explain it in simple terms.',
];
$message = $open_ai->createThreadMessage($threadId, $data);
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$messageId = 'msg_d37P5XgREsm6BItOcppnBO1b';
$message = $open_ai->retrieveThreadMessage($threadId, $messageId);
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$messageId = 'msg_d37P5XgREsm6BItOcppnBO1b';
$data = [
'metadata' => ['test' => '1234abcd'],
];
$message = $open_ai->modifyThreadMessage($threadId, $messageId, $data);
Returns a list of messages for a given thread.
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$query = ['limit' => 10];
$messages = $open_ai->listThreadMessages($threadId, $query);
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$messageId = 'msg_CZ47kAGZugAfeHMX6bmJIukP';
$fileId = 'file-CRLcY63DiHphWuBrmDWZVCgA';
$file = $open_ai->retrieveMessageFile($threadId, $messageId, $fileId);
Returns a list of message files.
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$messageId = 'msg_CZ47kAGZugAfeHMX6bmJIukP';
$query = ['limit' => 10];
$files = $open_ai->listMessageFiles($threadId, $messageId, $query);
Represents an execution run on a thread.
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$data = ['assistant_id' => 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz'];
$run = $open_ai->createRun($threadId, $data);
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$run = $open_ai->retrieveRun($threadId, $runId);
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$data = [
'metadata' => ['test' => 'abcd1234'],
];
$run = $open_ai->modifyRun($threadId, $runId, $data);
Returns a list of runs belonging to a thread.
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$query = ['limit' => 10];
$runs = $open_ai->listRuns($threadId, $query);
When a run has the status: "requires_action" and required_action.type is submit_tool_outputs, this endpoint can be used to submit the outputs from the tool calls once they're all completed. All outputs must be submitted in a single request.
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$outputs = [
'tool_outputs' => [
['tool_call_id' => 'call_abc123', 'output' => '28C'],
],
];
$run = $open_ai->submitToolOutputs($threadId, $runId, $outputs);
Cancels a run that is "in_progress".
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$run = $open_ai->cancelRun($threadId, $runId);
Create a thread and run it in one request.
$data = [
'assistant_id' => 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz',
'thread' => [
'messages' => [
[
'role' => 'user',
'content' => 'Hello, what is AI?',
'file_ids' => [],
],
],
],
];
$run = $open_ai->createThreadAndRun($data);
Retrieves a step in the execution of a run.
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$stepId = 'step_kwLG0vPQjqVyQHVoL7GVK3aG';
$step = $open_ai->retrieveRunStep($threadId, $runId, $stepId);
Returns a list of run steps belonging to a run.
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$query = ['limit' => 10];
$steps = $open_ai->listRunSteps($threadId, $runId, $query);
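Putting the Assistants pieces together, here is a minimal end-to-end sketch: create a thread, add a user message, start a run, poll it, then read the newest message. It assumes the standard beta response shapes (a run exposes a status, a message exposes content blocks); the assistant id is a placeholder taken from the examples above.

$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz'; // placeholder, use your own assistant id

// 1) Create a thread containing the user's question.
$thread = json_decode($open_ai->createThread([
    'messages' => [
        ['role' => 'user', 'content' => 'Hello, what is AI?'],
    ],
]));

// 2) Start a run of the assistant on that thread.
$run = json_decode($open_ai->createRun($thread->id, ['assistant_id' => $assistantId]));

// 3) Poll until the run leaves the queued/in_progress states.
do {
    sleep(1);
    $run = json_decode($open_ai->retrieveRun($thread->id, $run->id));
} while (in_array($run->status, ['queued', 'in_progress'], true));

// 4) On success, the assistant's reply is the newest message in the thread.
if ($run->status === 'completed') {
    $messages = json_decode($open_ai->listThreadMessages($thread->id, ['limit' => 10]));
    echo $messages->data[0]->content[0]->text->value;
}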
To run all tests:
composer test
To run only the tests that work for most users (excluding those that require a missing folder or that hit deprecated endpoints no longer available to most users):
./vendor/bin/pest --group=working
Please see CHANGELOG for more information on what has changed recently.
Please see CONTRIBUTING for details.
Please report security vulnerabilities to [email protected]
The MIT License (MIT). Please see License File for more information.
Alternative AI tools for open-ai
Similar Open Source Tools

GEN-AI
GEN-AI is a versatile Python library for implementing various artificial intelligence algorithms and models. It provides a wide range of tools and functionalities to support machine learning, deep learning, natural language processing, computer vision, and reinforcement learning tasks. With GEN-AI, users can easily build, train, and deploy AI models for diverse applications such as image recognition, text classification, sentiment analysis, object detection, and game playing. The library is designed to be user-friendly, efficient, and scalable, making it suitable for both beginners and experienced AI practitioners.

simple-ai
Simple AI is a lightweight Python library for implementing basic artificial intelligence algorithms. It provides easy-to-use functions and classes for tasks such as machine learning, natural language processing, and computer vision. With Simple AI, users can quickly prototype and deploy AI solutions without the complexity of larger frameworks.

spring-ai
The Spring AI project provides a Spring-friendly API and abstractions for developing AI applications. It offers a portable client API for interacting with generative AI models, enabling developers to easily swap out implementations and access various models like OpenAI, Azure OpenAI, and HuggingFace. Spring AI also supports prompt engineering, providing classes and interfaces for creating and parsing prompts, as well as incorporating proprietary data into generative AI without retraining the model. This is achieved through Retrieval Augmented Generation (RAG), which involves extracting, transforming, and loading data into a vector database for use by AI models. Spring AI's VectorStore abstraction allows for seamless transitions between different vector database implementations.

jadx-ai-mcp
JADX-AI-MCP is a plugin for the JADX decompiler that integrates with Model Context Protocol (MCP) to provide live reverse engineering support with LLMs like Claude. It allows for quick analysis, vulnerability detection, and AI code modification, all in real time. The tool combines JADX-AI-MCP and JADX MCP SERVER to analyze Android APKs effortlessly. It offers various prompts for code understanding, vulnerability detection, reverse engineering helpers, static analysis, AI code modification, and documentation. The tool is part of the Zin MCP Suite and aims to connect all android reverse engineering and APK modification tools with a single MCP server for easy reverse engineering of APK files.

dexto
Dexto is a lightweight runtime for creating and running AI agents that turn natural language into real-world actions. It serves as the missing intelligence layer for building AI applications, standalone chatbots, or as the reasoning engine inside larger products. Dexto features a powerful CLI and Web UI for running AI agents, supports multiple interfaces, allows hot-swapping of LLMs from various providers, connects to remote tool servers via the Model Context Protocol, is config-driven with version-controlled YAML, offers production-ready core features, extensibility for custom services, and enables multi-agent collaboration via MCP and A2A.

nndeploy
nndeploy is a tool that allows you to quickly build your visual AI workflow without the need for frontend technology. It provides ready-to-use algorithm nodes for non-AI programmers, including large language models, Stable Diffusion, object detection, image segmentation, etc. The workflow can be exported as a JSON configuration file, supporting Python/C++ API for direct loading and running, deployment on cloud servers, desktops, mobile devices, edge devices, and more. The framework includes mainstream high-performance inference engines and deep optimization strategies to help you transform your workflow into enterprise-level production applications.

azure-ai-docs
Azure AI Docs is a repository that provides detailed documentation and resources for developers looking to leverage Microsoft's AI services on the Azure platform. The repository covers a wide range of topics including machine learning, natural language processing, computer vision, and more. Developers can find tutorials, code samples, best practices, and guidelines to help them integrate AI capabilities into their applications seamlessly.

Disciplined-AI-Software-Development
Disciplined AI Software Development is a comprehensive repository that provides guidelines and best practices for developing AI software in a disciplined manner. It covers topics such as project organization, code structure, documentation, testing, and deployment strategies to ensure the reliability, scalability, and maintainability of AI applications. The repository aims to help developers and teams navigate the complexities of AI development by offering practical advice and examples to follow.

ai-manus
AI Manus is a general-purpose AI Agent system that supports running various tools and operations in a sandbox environment. It offers deployment with minimal dependencies, supports multiple tools like Terminal, Browser, File, Web Search, and messaging tools, allocates separate sandboxes for tasks, manages session history, supports stopping and interrupting conversations, file upload and download, and is multilingual. The system also provides user login and authentication. The project primarily relies on Docker for development and deployment, with model capability requirements and recommended Deepseek and GPT models.

pipelex
Pipelex is an open-source devtool designed to transform how users build repeatable AI workflows. It acts as a Docker or SQL for AI operations, allowing users to create modular 'pipes' using different LLMs for structured outputs. These pipes can be connected sequentially, in parallel, or conditionally to build complex knowledge transformations from reusable components. With Pipelex, users can share and scale proven methods instantly, saving time and effort in AI workflow development.

ml-retreat
ML-Retreat is a comprehensive machine learning library designed to simplify and streamline the process of building and deploying machine learning models. It provides a wide range of tools and utilities for data preprocessing, model training, evaluation, and deployment. With ML-Retreat, users can easily experiment with different algorithms, hyperparameters, and feature engineering techniques to optimize their models. The library is built with a focus on scalability, performance, and ease of use, making it suitable for both beginners and experienced machine learning practitioners.

pdr_ai_v2
pdr_ai_v2 is a Python library for implementing machine learning algorithms and models. It provides a wide range of tools and functionalities for data preprocessing, model training, evaluation, and deployment. The library is designed to be user-friendly and efficient, making it suitable for both beginners and experienced data scientists. With pdr_ai_v2, users can easily build and deploy machine learning models for various applications, such as classification, regression, clustering, and more.

ai
This repository contains a collection of AI algorithms and models for various machine learning tasks. It provides implementations of popular algorithms such as neural networks, decision trees, and support vector machines. The code is well-documented and easy to understand, making it suitable for both beginners and experienced developers. The repository also includes example datasets and tutorials to help users get started with building and training AI models. Whether you are a student learning about AI or a professional working on machine learning projects, this repository can be a valuable resource for your development journey.

ai-workshop-code
The ai-workshop-code repository contains code examples and tutorials for various artificial intelligence concepts and algorithms. It serves as a practical resource for individuals looking to learn and implement AI techniques in their projects. The repository covers a wide range of topics, including machine learning, deep learning, natural language processing, computer vision, and reinforcement learning. By exploring the code and following the tutorials, users can gain hands-on experience with AI technologies and enhance their understanding of how these algorithms work in practice.

deepteam
Deepteam is a powerful open-source tool designed for deep learning projects. It provides a user-friendly interface for training, testing, and deploying deep neural networks. With Deepteam, users can easily create and manage complex models, visualize training progress, and optimize hyperparameters. The tool supports various deep learning frameworks and allows seamless integration with popular libraries like TensorFlow and PyTorch. Whether you are a beginner or an experienced deep learning practitioner, Deepteam simplifies the development process and accelerates model deployment.
For similar tasks

Azure-Analytics-and-AI-Engagement
The Azure-Analytics-and-AI-Engagement repository provides packaged Industry Scenario DREAM Demos with ARM templates (containing a demo web application, Power BI reports, Synapse resources, AML Notebooks, etc.) that can be deployed in a customer's subscription using the CAPE tool in a matter of a few hours. Partners can also deploy DREAM Demos in their own subscriptions using DPoC.

sorrentum
Sorrentum is an open-source project that aims to combine open-source development, startups, and brilliant students to build machine learning, AI, and Web3 / DeFi protocols geared towards finance and economics. The project provides opportunities for internships, research assistantships, and development grants, as well as the chance to work on cutting-edge problems, learn about startups, write academic papers, and get internships and full-time positions at companies working on Sorrentum applications.

tidb
TiDB is an open-source distributed SQL database that supports Hybrid Transactional and Analytical Processing (HTAP) workloads. It is MySQL compatible and features horizontal scalability, strong consistency, and high availability.

zep-python
Zep is an open-source platform for building and deploying large language model (LLM) applications. It provides a suite of tools and services that make it easy to integrate LLMs into your applications, including chat history memory, embedding, vector search, and data enrichment. Zep is designed to be scalable, reliable, and easy to use, making it a great choice for developers who want to build LLM-powered applications quickly and easily.

telemetry-airflow
This repository codifies the Airflow cluster that is deployed at workflow.telemetry.mozilla.org (behind SSO) and commonly referred to as "WTMO" or simply "Airflow". Some links relevant to users and developers of WTMO: * The `dags` directory in this repository contains some custom DAG definitions * Many of the DAGs registered with WTMO don't live in this repository, but are instead generated from ETL task definitions in bigquery-etl * The Data SRE team maintains a WTMO Developer Guide (behind SSO)

mojo
Mojo is a new programming language that bridges the gap between research and production by combining Python syntax and ecosystem with systems programming and metaprogramming features. Mojo is still young, but it is designed to become a superset of Python over time.

pandas-ai
PandasAI is a Python library that makes it easy to ask questions to your data in natural language. It helps you to explore, clean, and analyze your data using generative AI.

databend
Databend is an open-source cloud data warehouse that serves as a cost-effective alternative to Snowflake. With its focus on fast query execution and data ingestion, it's designed for complex analysis of the world's largest datasets.
For similar jobs

weave
Weave is a toolkit for developing Generative AI applications, built by Weights & Biases. With Weave, you can log and debug language model inputs, outputs, and traces; build rigorous, apples-to-apples evaluations for language model use cases; and organize all the information generated across the LLM workflow, from experimentation to evaluations to production. Weave aims to bring rigor, best-practices, and composability to the inherently experimental process of developing Generative AI software, without introducing cognitive overhead.

LLMStack
LLMStack is a no-code platform for building generative AI agents, workflows, and chatbots. It allows users to connect their own data, internal tools, and GPT-powered models without any coding experience. LLMStack can be deployed to the cloud or on-premise and can be accessed via HTTP API or triggered from Slack or Discord.

VisionCraft
The VisionCraft API is a free API for using over 100 different AI models, from images to sound.

kaito
Kaito is an operator that automates the AI/ML inference model deployment in a Kubernetes cluster. It manages large model files using container images, avoids tuning deployment parameters to fit GPU hardware by providing preset configurations, auto-provisions GPU nodes based on model requirements, and hosts large model images in the public Microsoft Container Registry (MCR) if the license allows. Using Kaito, the workflow of onboarding large AI inference models in Kubernetes is largely simplified.

PyRIT
PyRIT is an open access automation framework designed to empower security professionals and ML engineers to red team foundation models and their applications. It automates AI Red Teaming tasks to allow operators to focus on more complicated and time-consuming tasks and can also identify security harms such as misuse (e.g., malware generation, jailbreaking), and privacy harms (e.g., identity theft). The goal is to allow researchers to have a baseline of how well their model and entire inference pipeline is doing against different harm categories and to be able to compare that baseline to future iterations of their model. This allows them to have empirical data on how well their model is doing today, and detect any degradation of performance based on future improvements.

tabby
Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot. It boasts several key features: * Self-contained, with no need for a DBMS or cloud service. * OpenAPI interface, easy to integrate with existing infrastructure (e.g Cloud IDE). * Supports consumer-grade GPUs.

spear
SPEAR (Simulator for Photorealistic Embodied AI Research) is a powerful tool for training embodied agents. It features 300 unique virtual indoor environments with 2,566 unique rooms and 17,234 unique objects that can be manipulated individually. Each environment is designed by a professional artist and features detailed geometry, photorealistic materials, and a unique floor plan and object layout. SPEAR is implemented as Unreal Engine assets and provides an OpenAI Gym interface for interacting with the environments via Python.

Magick
Magick is a groundbreaking visual AIDE (Artificial Intelligence Development Environment) for no-code data pipelines and multimodal agents. Magick can connect to other services and comes with nodes and templates well-suited for intelligent agents, chatbots, complex reasoning systems and realistic characters.