aiotdlib
Python asyncio Telegram client based on TDLib https://github.com/tdlib/td
Stars: 96
aiotdlib is a Python asyncio Telegram client based on TDLib. It provides automatic generation of types and functions from the tl schema, validation, good IDE type hinting, and high-level API methods that simplify working with TDLib. The package includes prebuilt TDLib binaries for macOS (arm64) and Debian Bullseye (amd64). Users can supply their own binary by passing the `library_path` argument to the `Client` class constructor; compatibility with other versions of the library is not guaranteed. The tool requires Python 3.9+, and users need to obtain their `api_id` and `api_hash` from the Telegram docs before installation and usage.
README:
aiotdlib - Python asyncio Telegram client based on TDLib
This wrapper targets TDLib v1.8.30 (958fed6e8e440afe87b57c98216a5c8d3f3caed8)
This package includes prebuilt TDLib binaries for macOS (arm64) and Debian Bullseye (amd64). You can use your own binary by passing the `library_path` argument to the `Client` class constructor. Make sure it is built from this commit; compatibility with other versions of the library is not guaranteed.
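As a rough illustration, a minimal sketch of pointing the client at a custom binary could look like the snippet below. The path is a hypothetical placeholder, and depending on the aiotdlib version the `library_path` option may be accepted on `ClientSettings` rather than directly on `Client`.
# Minimal sketch of using a custom TDLib binary (hypothetical path).
# The README states library_path is accepted by the Client constructor;
# in some versions it may instead be a ClientSettings field.
from aiotdlib import Client, ClientSettings

client = Client(
    settings=ClientSettings(
        api_id=123456,    # your api_id
        api_hash="",      # your api_hash
        phone_number="",  # your phone number
    ),
    library_path="/usr/local/lib/libtdjson.so",  # hypothetical path to your own build
)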
- All types and functions are generated automatically from tl schema
- All types and functions come with validation and good IDE type hinting (thanks to Pydantic)
- A set of high-level API methods that make working with TDLib much simpler
- Python 3.9+
- Get your api_id and api_hash. Read more in the Telegram docs
pip install aiotdlib
or if you use Poetry
poetry add aiotdlib
import asyncio
import logging

from aiotdlib import Client, ClientSettings

API_ID = 123456
API_HASH = ""
PHONE_NUMBER = ""


async def main():
    client = Client(
        settings=ClientSettings(
            api_id=API_ID,
            api_hash=API_HASH,
            phone_number=PHONE_NUMBER
        )
    )

    async with client:
        me = await client.api.get_me()
        logging.info(f"Successfully logged in as {me.model_dump_json()}")


if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO)
    asyncio.run(main())
Any parameter of the `Client` class can also be set via environment variables with the `AIOTDLIB_*` prefix.
import asyncio
import logging

from aiotdlib import Client


async def main():
    async with Client() as client:
        me = await client.api.get_me()
        logging.info(f"Successfully logged in as {me.model_dump_json()}")


if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO)
    asyncio.run(main())
and run it like this:
export AIOTDLIB_API_ID=123456
export AIOTDLIB_API_HASH=<my_api_hash>
export AIOTDLIB_BOT_TOKEN=<my_bot_token>
python main.py
import asyncio
import logging

from aiotdlib import Client, ClientSettings
from aiotdlib.api import API, BaseObject, UpdateNewMessage

API_ID = 123456
API_HASH = ""
PHONE_NUMBER = ""


async def on_update_new_message(client: Client, update: UpdateNewMessage):
    chat_id = update.message.chat_id

    # The api field of the client instance contains all TDLib functions, for example get_chat
    chat = await client.api.get_chat(chat_id)
    logging.info(f'Message received in chat {chat.title}')


async def any_event_handler(client: Client, update: BaseObject):
    logging.info(f'Event of type {update.ID} received')


async def main():
    client = Client(
        settings=ClientSettings(
            api_id=API_ID,
            api_hash=API_HASH,
            phone_number=PHONE_NUMBER
        )
    )

    # Registering an event handler for the 'updateNewMessage' event.
    # You can register many handlers for a certain event type.
    client.add_event_handler(on_update_new_message, update_type=API.Types.UPDATE_NEW_MESSAGE)

    # You can also register a handler for the special event type "*".
    # It will be called for every received event.
    client.add_event_handler(any_event_handler, update_type=API.Types.ANY)

    async with client:
        # idle() will run the client until it is stopped
        await client.idle()


if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO)
    asyncio.run(main())
import logging

from aiotdlib import Client, ClientSettings
from aiotdlib.api import UpdateNewMessage

API_ID = 123456
API_HASH = ""
BOT_TOKEN = ""

bot = Client(
    settings=ClientSettings(
        api_id=API_ID,
        api_hash=API_HASH,
        bot_token=BOT_TOKEN
    )
)


# Note: the bot_command_handler method is universal and can be used directly or as a decorator.

# Registering a handler for the '/help' command
@bot.bot_command_handler(command='help')
async def on_help_command(client: Client, update: UpdateNewMessage):
    # Each command handler registered with this method updates the update.EXTRA field
    # with command-related data: {'bot_command': 'help', 'bot_command_args': []}
    await client.send_text(update.message.chat_id, "I will help you!")


async def on_start_command(client: Client, update: UpdateNewMessage):
    # This will print "{'bot_command': 'start', 'bot_command_args': []}"
    print(update.EXTRA)
    await client.send_text(update.message.chat_id, "Have a good day! :)")


async def on_custom_command(client: Client, update: UpdateNewMessage):
    # When you send a message "/custom 1 2 3 test",
    # this will print "{'bot_command': 'custom', 'bot_command_args': ['1', '2', '3', 'test']}"
    print(update.EXTRA)


if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO)

    # Registering handlers for the '/start' and '/custom' commands directly
    bot.bot_command_handler(on_start_command, command='start')
    bot.bot_command_handler(on_custom_command, command='custom')

    bot.run()
import asyncio
import logging

from aiotdlib import Client, ClientProxySettings, ClientProxyType, ClientSettings

API_ID = 123456
API_HASH = ""
PHONE_NUMBER = ""


async def main():
    client = Client(
        settings=ClientSettings(
            api_id=API_ID,
            api_hash=API_HASH,
            phone_number=PHONE_NUMBER,
            proxy_settings=ClientProxySettings(
                host="10.0.0.1",
                port=3333,
                type=ClientProxyType.SOCKS5,
                username="aiotdlib",
                password="somepassword",
            )
        )
    )

    async with client:
        await client.idle()


if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO)
    asyncio.run(main())
import asyncio
import logging

from aiotdlib import Client, HandlerCallable, ClientSettings
from aiotdlib.api import API, BaseObject, UpdateNewMessage

API_ID = 12345
API_HASH = ""
PHONE_NUMBER = ""


async def some_pre_updates_work(event: BaseObject):
    logging.info(f"Before calling all update handlers for event {event.ID}")


async def some_post_updates_work(event: BaseObject):
    logging.info(f"After calling all update handlers for event {event.ID}")


# Note: the call_next argument is always passed as a keyword argument,
# so the parameter must be named exactly "call_next".
async def my_middleware(client: Client, event: BaseObject, /, *, call_next: HandlerCallable):
    # Middlewares are useful, for example, for opening database connections
    await some_pre_updates_work(event)

    try:
        await call_next(client, event)
    finally:
        await some_post_updates_work(event)


async def on_update_new_message(client: Client, update: UpdateNewMessage):
    logging.info('on_update_new_message handler called')


async def main():
    client = Client(
        settings=ClientSettings(
            api_id=API_ID,
            api_hash=API_HASH,
            phone_number=PHONE_NUMBER
        )
    )

    client.add_event_handler(on_update_new_message, update_type=API.Types.UPDATE_NEW_MESSAGE)

    # Registering a middleware.
    # Note that middlewares are called for EVERY event,
    # so don't use them for long-running tasks: that could be a heavy performance hit.
    # You can add as many middlewares as you want.
    # They are called in the order you added them.
    client.add_middleware(my_middleware)

    async with client:
        await client.idle()


if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO)
    asyncio.run(main())
This project is licensed under the terms of the MIT license.
Alternative AI tools for aiotdlib
Similar Open Source Tools
acte
Acte is a framework designed to build GUI-like tools for AI Agents. It aims to address the issues of cognitive load and freedom degrees when interacting with multiple APIs in complex scenarios. By providing a graphical user interface (GUI) for Agents, Acte helps reduce cognitive load and constraints interaction, similar to how humans interact with computers through GUIs. The tool offers APIs for starting new sessions, executing actions, and displaying screens, accessible via HTTP requests or the SessionManager class.
aio-pika
Aio-pika is a wrapper around aiormq for asyncio and humans. It provides a completely asynchronous API, object-oriented API, transparent auto-reconnects with complete state recovery, Python 3.7+ compatibility, transparent publisher confirms support, transactions support, and complete type-hints coverage.
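As an illustration of that asynchronous, object-oriented API, a minimal publish/consume sketch could look like the following. It assumes a RabbitMQ broker running locally with the default amqp://guest:guest@127.0.0.1/ credentials; the queue name is a placeholder.
import asyncio

import aio_pika


async def main():
    # Assumes a local RabbitMQ broker with default credentials
    connection = await aio_pika.connect_robust("amqp://guest:guest@127.0.0.1/")

    async with connection:
        channel = await connection.channel()
        queue = await channel.declare_queue("demo", durable=True)

        # Publish a message to the default exchange, routed to the queue by name
        await channel.default_exchange.publish(
            aio_pika.Message(body=b"hello"),
            routing_key=queue.name,
        )

        # Consume one message and acknowledge it
        async with queue.iterator() as messages:
            async for message in messages:
                async with message.process():
                    print(message.body)
                    break


asyncio.run(main())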
aiocryptopay
The aiocryptopay repository is an asynchronous API wrapper for interacting with the @cryptobot and @CryptoTestnetBot APIs. It provides methods for creating, getting, and deleting invoices and checks, as well as handling webhooks for invoice payments. Users can easily integrate this tool into their applications to manage cryptocurrency payments and transactions.
clarifai-python
The Clarifai Python SDK offers a comprehensive set of tools to integrate Clarifai's AI platform, leveraging computer vision capabilities like classification, detection, and segmentation, and natural language capabilities like classification, summarisation, generation, and Q&A, into your applications. With just a few lines of code, you can leverage cutting-edge artificial intelligence to unlock valuable insights from visual and textual content.
polyfire-js
Polyfire is an all-in-one managed backend for AI apps that allows users to build AI apps directly from the frontend, eliminating the need for a separate backend. It simplifies the process by providing most backend services in just a few lines of code. With Polyfire, users can easily create chatbots, transcribe audio files to text, generate simple text, create a long-term memory, and generate images with Dall-E. The tool also offers starter guides and tutorials to help users get started quickly and efficiently.
venom
Venom is a high-performance system developed in JavaScript for creating WhatsApp bots. It supports any kind of interaction, such as customer service, media sending, AI-based sentence recognition, and all types of design architecture for WhatsApp.
island-ai
island-ai is a TypeScript toolkit tailored for developers engaging with structured outputs from Large Language Models. It offers streamlined processes for handling, parsing, streaming, and leveraging AI-generated data across various applications. The toolkit includes packages like zod-stream for interfacing with LLM streams, stream-hooks for integrating streaming JSON data into React applications, and schema-stream for JSON streaming parsing based on Zod schemas. Additionally, related packages like @instructor-ai/instructor-js focus on data validation and retry mechanisms, enhancing the reliability of data processing workflows.
aioaws
Aioaws is an asyncio SDK for some AWS services, providing clean, secure, and easily debuggable access to services like S3, SES, and SNS. It is written from scratch without dependencies on boto or boto3, formatted with black, and includes complete type hints. The library supports various functionalities such as listing, deleting, and generating signed URLs for S3 files, sending emails with attachments and multipart content via SES, and receiving notifications about mail delivery from SES. It also offers AWS Signature Version 4 authentication and has minimal dependencies like aiofiles, cryptography, httpx, and pydantic.
Webscout
WebScout is a versatile tool that allows users to search for anything using Google, DuckDuckGo, and phind.com. It contains AI models, can transcribe YouTube videos, generate temporary email and phone numbers, has TTS support, webai (terminal GPT and open interpreter), and offline LLMs. It also supports features like weather forecasting, YT video downloading, temp mail and number generation, text-to-speech, advanced web searches, and more.
com.openai.unity
com.openai.unity is an OpenAI package for Unity that allows users to interact with OpenAI's API through RESTful requests. It is independently developed and not an official library affiliated with OpenAI. Users can fine-tune models, create assistants, chat completions, and more. The package requires Unity 2021.3 LTS or higher and can be installed via Unity Package Manager or Git URL. Various features like authentication, Azure OpenAI integration, model management, thread creation, chat completions, audio processing, image generation, file management, fine-tuning, batch processing, embeddings, and content moderation are available.
ai21-python
The AI21 Labs Python SDK is a comprehensive tool for interacting with the AI21 API. It provides functionalities for chat completions, conversational RAG, token counting, error handling, and support for various cloud providers like AWS, Azure, and Vertex. The SDK offers both synchronous and asynchronous usage, along with detailed examples and documentation. Users can quickly get started with the SDK to leverage AI21's powerful models for various natural language processing tasks.
funcchain
Funcchain is a Python library that allows you to easily write cognitive systems by leveraging Pydantic models as output schemas and LangChain in the backend. It provides a seamless integration of LLMs into your apps, utilizing OpenAI Functions or LlamaCpp grammars (json-schema-mode) for efficient structured output. Funcchain compiles the Funcchain syntax into LangChain runnables, enabling you to invoke, stream, or batch process your pipelines effortlessly.
crawl4ai
Crawl4AI is a powerful and free web crawling service that extracts valuable data from websites and provides LLM-friendly output formats. It supports crawling multiple URLs simultaneously, replaces media tags with ALT, and is completely free to use and open-source. Users can integrate Crawl4AI into Python projects as a library or run it as a standalone local server. The tool allows users to crawl and extract data from specified URLs using different providers and models, with options to include raw HTML content, force fresh crawls, and extract meaningful text blocks. Configuration settings can be adjusted in the `crawler/config.py` file to customize providers, API keys, chunk processing, and word thresholds. Contributions to Crawl4AI are welcome from the open-source community to enhance its value for AI enthusiasts and developers.
OpenAI-DotNet
OpenAI-DotNet is a simple C# .NET client library for OpenAI to use through their RESTful API. It is independently developed and not an official library affiliated with OpenAI. Users need an OpenAI API account to utilize this library. The library targets .NET 6.0 and above, working across various platforms like console apps, winforms, wpf, asp.net, etc., and on Windows, Linux, and Mac. It provides functionalities for authentication, interacting with models, assistants, threads, chat, audio, images, files, fine-tuning, embeddings, and moderations.
obsei
Obsei is an open-source, low-code, AI-powered automation tool that consists of an Observer to collect unstructured data from various sources, an Analyzer to analyze the collected data with various AI tasks, and an Informer to send analyzed data to various destinations. The tool is suitable for scheduled jobs or serverless applications as all Observers can store their state in databases. Obsei is still in alpha stage, so caution is advised when using it in production. The tool can be used for social listening, alerting/notification, automatic customer issue creation, extraction of deeper insights from feedback, market research, dataset creation for various AI tasks, and more, depending on your creativity.
For similar tasks
writer-framework
Writer Framework is an open-source framework for creating AI applications. It allows users to build user interfaces using a visual editor and write the backend code in Python. The framework is fast, flexible, and developer-friendly, providing separation of concerns between UI and business logic. It is reactive and state-driven, allowing for highly customizable elements without the need for CSS. Writer Framework is designed to be fast, with minimal overhead on Python code, and uses WebSockets for synchronization. It is contained in a standard Python package, supports local code editing with instant refreshes, and enables editing the UI while the app is running.
writer-framework
Writer Framework is an open-source framework for creating AI applications. It allows users to build user interfaces using a visual editor and write the backend code in Python. The framework is fast, flexible, and provides separation of concerns between UI and business logic. It is reactive and state-driven, highly customizable without requiring CSS, fast in event handling, developer-friendly with easy installation and quick start options, and contains full documentation for using its AI module and deployment options.
uAgents
uAgents is a Python library developed by Fetch.ai that allows for the creation of autonomous AI agents. These agents can perform various tasks on a schedule or take action on various events. uAgents are easy to create and manage, and they are connected to a fast-growing network of other uAgents. They are also secure, with cryptographically secured messages and wallets.
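As a rough illustration of how such an agent is declared, here is a minimal sketch following the uAgents quick-start pattern; the agent name and seed phrase are placeholders.
from uagents import Agent, Context

# Name and seed phrase are placeholders
alice = Agent(name="alice", seed="alice recovery phrase")


# Run a task on a schedule: every 2 seconds
@alice.on_interval(period=2.0)
async def say_hello(ctx: Context):
    ctx.logger.info(f"Hello, my name is {alice.name}")


if __name__ == "__main__":
    alice.run()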
venom
Venom is a high-performance system developed in JavaScript for creating WhatsApp bots. It supports any kind of interaction, such as customer service, media sending, AI-based sentence recognition, and all types of design architecture for WhatsApp.
Whatsapp-Ai-BOT
This WhatsApp AI chatbot is built using NodeJS technology and powered by OpenAI. It leverages the advanced deep learning models of ChatGPT, Playground, and DALL·E from OpenAI to provide a unique text-based and image-based conversational experience for users. The bot has two main features: ChatGPT (text) and DALL-E (Text To Image). To use these features, simply use the commands /ai, /img, and /sc respectively. The bot's code is encrypted to protect it from prying eyes, but the key to unlock the full potential of this amazing creation can be obtained by contacting the developer. The bot is free to use, but a PRIME version is available with additional features such as history mode, prime support, and customizable options.
IBRAHIM-AI-10.10
BMW MD is a simple WhatsApp user BOT created by Ibrahim Tech. It allows users to scan pairing codes or QR codes to connect to WhatsApp and deploy the bot on Heroku. The bot can be used to perform various tasks such as sending messages, receiving messages, and managing contacts. It is released under the MIT License and contributions are welcome.
slack-machine
Slack Machine is a simple, yet powerful and extendable Slack bot framework. More than just a bot, Slack Machine is a framework that helps you develop your Slack workspace into a ChatOps powerhouse. Slack Machine is built with an intuitive plugin system that lets you build bots quickly, but also allows for easy code organization.
For similar jobs
EDDI
E.D.D.I (Enhanced Dialog Driven Interface) is an enterprise-certified chatbot middleware that offers advanced prompt and conversation management for Conversational AI APIs. Developed in Java using Quarkus, it is lean, RESTful, scalable, and cloud-native. E.D.D.I is highly scalable and designed to efficiently manage conversations in AI-driven applications, with seamless API integration capabilities. Notable features include configurable NLP and Behavior rules, support for multiple chatbots running concurrently, and integration with MongoDB, OAuth 2.0, and HTML/CSS/JavaScript for UI. The project requires Java 21, Maven 3.8.4, and MongoDB >= 5.0 to run. It can be built as a Docker image and deployed using Docker or Kubernetes, with additional support for integration testing and monitoring through Prometheus and Kubernetes endpoints.
empower-functions
Empower Functions is a family of large language models (LLMs) that provide GPT-4 level capabilities for real-world 'tool using' use cases. These models offer compatibility support to be used as drop-in replacements, enabling interactions with external APIs by recognizing when a function needs to be called and generating JSON containing necessary arguments based on user inputs. This capability is crucial for building conversational agents and applications that convert natural language into API calls, facilitating tasks such as weather inquiries, data extraction, and interactions with knowledge bases. The models can handle multi-turn conversations, choose between tools or standard dialogue, ask for clarification on missing parameters, integrate responses with tool outputs in a streaming fashion, and efficiently execute multiple functions either in parallel or sequentially with dependencies.
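Since the description emphasizes drop-in compatibility for tool use, a hedged sketch of calling such a model through an OpenAI-compatible endpoint might look like the following; the base URL and model name are hypothetical placeholders, not values documented here.
from openai import OpenAI

# Hypothetical OpenAI-compatible endpoint serving an Empower Functions model;
# the base_url and model name below are placeholders, not documented values.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="empower-functions",  # placeholder model name
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# If the model decides a tool is needed, it returns the call with JSON arguments
print(response.choices[0].message.tool_calls)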
amadeus-java
Amadeus Java SDK provides a rich set of APIs for the travel industry, allowing developers to access various functionalities such as flight search, booking, airport information, and more. The SDK simplifies interaction with the Amadeus API by providing self-contained code examples and detailed documentation. Developers can easily make API calls, handle responses, and utilize features like pagination and logging. The SDK supports various endpoints for tasks like flight search, booking management, airport information retrieval, and travel analytics. It also offers functionalities for hotel search, booking, and sentiment analysis. Overall, the Amadeus Java SDK is a comprehensive tool for integrating Amadeus APIs into Java applications.
resonance
Resonance is a framework designed to facilitate interoperability and messaging between services in your infrastructure and beyond. It provides AI capabilities and takes full advantage of asynchronous PHP, built on top of Swoole. With Resonance, you can:
- Chat with Open-Source LLMs: Create prompt controllers to directly answer users' prompts. The LLM takes care of determining the user's intention, so you can focus on taking appropriate action.
- Asynchronous Where it Matters: Respond asynchronously to incoming RPC or WebSocket messages (or both combined) with little overhead. You can set up all the asynchronous features using attributes; no elaborate configuration is needed.
- Simple Things Remain Simple: Writing HTTP controllers is similar to how it's done in synchronous code. Controllers have new exciting features that take advantage of the asynchronous environment.
- Consistency is Key: You can keep the same approach to writing software no matter the size of your project. There are no growing central configuration files or service dependency registries. Every relation between code modules is local to those modules.
- Promises in PHP: Resonance provides a partial implementation of the Promise/A+ spec to handle various asynchronous tasks.
- GraphQL Out of the Box: You can build elaborate GraphQL schemas by using just PHP attributes. Resonance takes care of reusing SQL queries and optimizing resource usage. All fields can be resolved asynchronously.
aiogram_bot_template
Aiogram bot template is a boilerplate for creating Telegram bots using Aiogram framework. It provides a solid foundation for building robust and scalable bots with a focus on code organization, database integration, and localization.
pluto
Pluto is a development tool dedicated to helping developers build cloud and AI applications more conveniently, resolving issues such as the challenging deployment of AI applications and open-source models. Developers are able to write applications in familiar programming languages like Python and TypeScript, directly defining and utilizing the cloud resources necessary for the application within their code base, such as AWS SageMaker, DynamoDB, and more. Pluto automatically deduces the infrastructure resource needs of the app through static program analysis and proceeds to create these resources on the specified cloud platform, simplifying the resource creation and application deployment process.
pinecone-ts-client
The official Node.js client for Pinecone, written in TypeScript. This client library provides a high-level interface for interacting with the Pinecone vector database service. With this client, you can create and manage indexes, upsert and query vector data, and perform other operations related to vector search and retrieval. The client is designed to be easy to use and provides a consistent and idiomatic experience for Node.js developers. It supports all the features and functionality of the Pinecone API, making it a comprehensive solution for building vector-powered applications in Node.js.