
aioesphomeapi
Python Client for ESPHome native API. Used by Home Assistant.
Stars: 148

aioesphomeapi allows you to interact with devices flashed with ESPHome. ESPHome is an open-source firmware that allows you to control your devices over Wi-Fi or Ethernet. With aioesphomeapi, you can connect to your ESPHome devices, retrieve their status, and control them from your Python code.
README:
.. image:: https://github.com/esphome/aioesphomeapi/workflows/CI/badge.svg
   :target: https://github.com/esphome/aioesphomeapi?query=workflow%3ACI+branch%3Amain

.. image:: https://img.shields.io/pypi/v/aioesphomeapi.svg
   :target: https://pypi.python.org/pypi/aioesphomeapi

.. image:: https://codecov.io/gh/esphome/aioesphomeapi/branch/main/graph/badge.svg
   :target: https://app.codecov.io/gh/esphome/aioesphomeapi/tree/main

.. image:: https://img.shields.io/endpoint?url=https://codspeed.io/badge.json
   :target: https://codspeed.io/esphome/aioesphomeapi
``aioesphomeapi`` allows you to interact with devices flashed with `ESPHome <https://esphome.io/>`_.

The module is available from the `Python Package Index <https://pypi.python.org/pypi>`_.
.. code:: bash

   $ pip3 install aioesphomeapi
An optional Cython extension is available for better performance, and the module will try to build it automatically. The extension requires a C compiler and Python development headers; the module falls back to the pure Python implementation if they are unavailable. Building the extension can be forcibly disabled by setting the environment variable ``SKIP_CYTHON`` to ``1``.
It's required that you enable the `Native API <https://esphome.io/components/api.html>`_ component for the device.
.. code:: yaml

   api:
     password: 'MyPassword'
Check the output to get the local address of the device, or use the ``name:`` under ``esphome:`` from the device configuration.
.. code:: bash

   [17:56:38][C][api:095]: API Server:
   [17:56:38][C][api:096]:   Address: api_test.local:6053
The sample code below will connect to the device and retrieve details.
.. code:: python

   import asyncio

   import aioesphomeapi


   async def main():
       """Connect to an ESPHome device and get details."""

       # Establish connection
       api = aioesphomeapi.APIClient("api_test.local", 6053, "MyPassword")
       await api.connect(login=True)

       # Get API version of the device's firmware
       print(api.api_version)

       # Show device details
       device_info = await api.device_info()
       print(device_info)

       # List all entities of the device
       entities = await api.list_entities_services()
       print(entities)


   asyncio.run(main())
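``list_entities_services()`` returns the device's entities together with any user-defined services. The following is a minimal sketch of iterating over that result; it assumes, as in current releases, that the call returns an ``(entities, services)`` tuple and that the entity info objects expose ``name`` and ``key`` attributes.

.. code:: python

   import asyncio

   import aioesphomeapi


   async def main():
       """Sketch: list entity names and keys of an ESPHome device."""
       api = aioesphomeapi.APIClient("api_test.local", 6053, "MyPassword")
       await api.connect(login=True)

       # Assumed return shape: (list of entity info objects, list of user services)
       entities, services = await api.list_entities_services()
       for entity in entities:
           # ``name`` and ``key`` are assumed attributes of the entity info objects
           print(f"{entity.name}: key={entity.key}")


   asyncio.run(main())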
Subscribe to state changes of an ESPHome device.
.. code:: python

   import asyncio

   import aioesphomeapi


   async def main():
       """Connect to an ESPHome device and wait for state changes."""
       cli = aioesphomeapi.APIClient("api_test.local", 6053, "MyPassword")
       await cli.connect(login=True)

       def change_callback(state):
           """Print the state changes of the device."""
           print(state)

       # Subscribe to the state changes
       await cli.subscribe_states(change_callback)


   loop = asyncio.get_event_loop()
   try:
       asyncio.ensure_future(main())
       loop.run_forever()
   except KeyboardInterrupt:
       pass
   finally:
       loop.close()
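The entity list and the state subscription can also be combined so that state changes print with entity names instead of raw objects. The sketch below is illustrative only; it assumes the entity info objects expose ``key`` and ``name`` attributes and that each state object carries the ``key`` of its entity, as in current releases.

.. code:: python

   import asyncio

   import aioesphomeapi


   async def main():
       """Sketch: print state changes together with entity names."""
       cli = aioesphomeapi.APIClient("api_test.local", 6053, "MyPassword")
       await cli.connect(login=True)

       # Build a key -> name lookup from the entity list (attribute names assumed)
       entities, _services = await cli.list_entities_services()
       names = {entity.key: entity.name for entity in entities}

       def change_callback(state):
           """Print each state change with the entity's name, if known."""
           # ``state.key`` is assumed to identify the entity the state belongs to
           print(names.get(state.key, "unknown entity"), state)

       # Subscribe to the state changes
       await cli.subscribe_states(change_callback)


   loop = asyncio.get_event_loop()
   try:
       asyncio.ensure_future(main())
       loop.run_forever()
   except KeyboardInterrupt:
       pass
   finally:
       loop.close()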
Other examples:

- `Camera <https://gist.github.com/micw/202f9dee5c990f0b0f7e7c36b567d92b>`_
- `Async print <https://gist.github.com/fpletz/d071c72e45d17ba274fd61ca7a465033#file-esphome-print-async-py>`_
- `Simple print <https://gist.github.com/fpletz/d071c72e45d17ba274fd61ca7a465033#file-esphome-print-simple-py>`_
- `InfluxDB <https://gist.github.com/fpletz/d071c72e45d17ba274fd61ca7a465033#file-esphome-sensor-influxdb-py>`_
For development, it is recommended to use a Python virtual environment (``venv``).
.. code:: bash

   # Set up a virtualenv (optional)
   $ python3 -m venv .
   $ source bin/activate

   # Install aioesphomeapi and development dependencies
   $ pip3 install -e .
   $ pip3 install -r requirements/test.txt

   # Run linters & tests
   $ script/lint

   # Update protobuf _pb2.py definitions (requires a protobuf compiler installation)
   $ script/gen-protoc
A CLI tool is also available for watching logs:

.. code:: bash

   aioesphomeapi-logs --help
A CLI tool is also available to discover devices:

.. code:: bash

   aioesphomeapi-discover
``aioesphomeapi`` is licensed under MIT; for more details, check LICENSE.
Similar Open Source Tools


mcphost
MCPHost is a CLI host application that enables Large Language Models (LLMs) to interact with external tools through the Model Context Protocol (MCP). It acts as a host in the MCP client-server architecture, allowing language models to access external tools and data sources, maintain consistent context across interactions, and execute commands safely. The tool supports interactive conversations with Claude 3.5 Sonnet and Ollama models, multiple concurrent MCP servers, dynamic tool discovery and integration, configurable server locations and arguments, and a consistent command interface across model types.

lollms
LoLLMs Server is a text generation server based on large language models. It provides a Flask-based API for generating text using various pre-trained language models. This server is designed to be easy to install and use, allowing developers to integrate powerful text generation capabilities into their applications.

langchainrb
Langchain.rb is a Ruby library that makes it easy to build LLM-powered applications. It provides a unified interface to a variety of LLMs, vector search databases, and other tools, making it easy to build and deploy RAG (Retrieval Augmented Generation) systems and assistants. Langchain.rb is open source and available under the MIT License.

aiohttp-cors
The aiohttp_cors library provides Cross Origin Resource Sharing (CORS) support for aiohttp, an asyncio-powered asynchronous HTTP server. CORS allows overriding the Same-origin policy for specific resources, enabling web pages to access resources from different origins. The library helps configure CORS settings for resources and routes in aiohttp applications, allowing control over origins, credentials passing, headers, and preflight requests.

claim-ai-phone-bot
AI-powered call center solution with Azure and OpenAI GPT. The bot can answer calls, understand the customer's request, and provide relevant information or assistance. It can also create a todo list of tasks to complete the claim, and send a report after the call. The bot is customizable, and can be used in multiple languages.

OpenAI
OpenAI is a community-maintained Swift implementation of the OpenAI public API. OpenAI itself is a non-profit artificial intelligence research organization founded in San Francisco, California in 2015, with a stated mission of ensuring safe and responsible use of AI for civic good, economic growth, and other public benefits. The repository provides functionalities for text completions, chats, image generation, audio processing, edits, embeddings, models, moderations, utilities, and Combine extensions.

gemini-srt-translator
Gemini SRT Translator is a tool that utilizes Google Generative AI to provide accurate and efficient translations for subtitle files. Users can customize translation settings, such as model name and batch size, and list available models from the Gemini API. The tool requires a free API key from Google AI Studio for setup and offers features like translating subtitles to a specified target language and resuming partial translations. Users can further customize translation settings with optional parameters like gemini_api_key2, output_file, start_line, model_name, batch_size, and more.

aiohttp-session
aiohttp_session is a Python library that provides session management for aiohttp.web applications. It allows storing user-specific data in session objects with a dict-like interface. The library offers different session storage options, including SimpleCookieStorage for testing, EncryptedCookieStorage for secure data storage, and RedisStorage for storing data in Redis. Users can easily integrate session management into their aiohttp.web applications by registering the session middleware. The library is designed to simplify session handling and enhance the security of web applications.

aiosql
aiosql is a Python module that allows you to organize SQL statements in .sql files and load them into your Python application as methods to call. It supports various database drivers like SQLite, PostgreSQL, MySQL, MariaDB, and DuckDB. The project is an adaptation of Kris Jenkins' yesql library for the Python ecosystem, allowing users to easily reuse SQL code in SQL GUIs or CLI tools. With aiosql, you can write, version control, comment, and run SQL code using files without losing the ability to use them as you would any other SQL file. It provides support for PEP 249 and asyncio based drivers, enabling users to execute parametric SQL queries from Python methods.

aiohttp-debugtoolbar
aiohttp_debugtoolbar provides a debug toolbar for aiohttp web applications. It is a port of pyramid_debugtoolbar and offers basic functionality such as basic panels, intercepting redirects, pretty printing exceptions, an interactive python console, and showing source code. The library is still in early development stages and offers various debug panels for monitoring different aspects of the web application. It is a useful tool for developers working with aiohttp to debug and optimize their applications.

syncode
SynCode is a novel framework for the grammar-guided generation of Large Language Models (LLMs) that ensures syntactically valid output with respect to defined Context-Free Grammar (CFG) rules. It supports general-purpose programming languages like Python, Go, SQL, JSON, and more, allowing users to define custom grammars using EBNF syntax. The tool compares favorably to other constrained decoders and offers features like fast grammar-guided generation, compatibility with HuggingFace Language Models, and the ability to work with various decoding strategies.

mistral-inference
Mistral Inference repository contains minimal code to run 7B, 8x7B, and 8x22B models. It provides model download links, installation instructions, and usage guidelines for running models via CLI or Python. The repository also includes information on guardrailing, model platforms, deployment, and references. Users can interact with models through commands like mistral-demo, mistral-chat, and mistral-common. Mistral AI models support function calling and chat interactions for tasks like testing models, chatting with models, and using Codestral as a coding assistant. The repository offers detailed documentation and links to blogs for further information.

instructor
Instructor is a Python library that makes it a breeze to work with structured outputs from large language models (LLMs). Built on top of Pydantic, it provides a simple, transparent, and user-friendly API to manage validation, retries, and streaming responses. Get ready to supercharge your LLM workflows!

call-center-ai
Call Center AI is an AI-powered call center solution that leverages Azure and OpenAI GPT. It is a proof of concept demonstrating the integration of Azure Communication Services, Azure Cognitive Services, and Azure OpenAI to build an automated call center solution. The project showcases features like accessing claims on a public website, customer conversation history, language change during conversation, bot interaction via phone number, multiple voice tones, lexicon understanding, todo list creation, customizable prompts, content filtering, GPT-4 Turbo for customer requests, specific data schema for claims, documentation database access, SMS report sending, conversation resumption, and more. The system architecture includes components like RAG AI Search, SMS gateway, call gateway, moderation, Cosmos DB, event broker, GPT-4 Turbo, Redis cache, translation service, and more. The tool can be deployed remotely using GitHub Actions and locally with prerequisites like Azure environment setup, configuration file creation, and resource hosting. Advanced usage includes custom training data with AI Search, prompt customization, language customization, moderation level customization, claim data schema customization, OpenAI compatible model usage for the LLM, and Twilio integration for SMS.

MCPSharp
MCPSharp is a .NET library that helps build Model Context Protocol (MCP) servers and clients for AI assistants and models. It allows creating MCP-compliant tools, connecting to existing MCP servers, exposing .NET methods as MCP endpoints, and handling MCP protocol details seamlessly. With features like attribute-based API, JSON-RPC support, parameter validation, and type conversion, MCPSharp simplifies the development of AI capabilities in applications through standardized interfaces.
For similar tasks


Fay
Fay is an open-source digital human framework that offers different versions for various purposes. The '带货完整版' (full live-commerce version) is suitable for online and offline salespersons. The '助理完整版' (full assistant version) serves as a human-machine interactive digital assistant that can also control devices upon command. The 'agent版' (agent version) is designed to be an autonomous agent capable of making decisions and contacting its owner. The framework provides updates and improvements across its different versions, including features like emotion analysis integration, model optimizations, and compatibility enhancements. Users can access detailed documentation for each version through the provided links.

aiohomekit
aiohomekit is a Python library that implements the HomeKit protocol for controlling HomeKit accessories using asyncio. It is primarily used with Home Assistant, targeting the same versions of Python and following their code standards. The library is still under development and does not offer API guarantees yet. It aims to match the behavior of real HAP controllers, even when not strictly specified, and works around issues like JSON formatting, boolean encoding, header sensitivity, and TCP packet splitting. aiohomekit is primarily tested with Phillips Hue and Eve Extend bridges via Home Assistant, but is known to work with many more devices. It does not support BLE accessories and is intended for client-side use only.

OmniSteward
OmniSteward is an AI-powered steward system based on large language models that can interact with users through voice or text to help control smart home devices and computer programs. It supports multi-turn dialogue, tool calling for complex tasks, multiple LLM models, voice recognition, smart home control, computer program management, online information retrieval, command line operations, and file management. The system is highly extensible, allowing users to customize and share their own tools.

Jarvis
Jarvis is a powerful virtual AI assistant designed to simplify daily tasks through voice command integration. It features automation, device management, and personalized interactions, transforming technology engagement. Built using Python and AI models, it serves personal and administrative needs efficiently, making processes seamless and productive.

TuyaOpen
TuyaOpen is an open source AI+IoT development framework supporting cross-chip platforms and operating systems. It provides core functionalities for AI+IoT development, including pairing, activation, control, and upgrading. The SDK offers robust security and compliance capabilities, meeting data compliance requirements globally. TuyaOpen enables the development of AI+IoT products that can leverage the Tuya APP ecosystem and cloud services. It continues to expand with more cloud platform integration features and capabilities like voice, video, and facial recognition.

aic_pico
AIC Pico is a small and versatile tool designed for emulating various I/O protocols such as Sega AIME I/O, Bandai Namco I/O, and Spicetools CardIO. It supports card types like FeliCa, ISO/IEC 14443 Type A, and ISO/IEC 15693, allowing users to create virtual AIC from Mifare cards. The tool is open-source and easy to integrate into Raspberry Pi Pico projects. It requires skills in 3D printing and soldering tiny components. AIC Pico comes in different variants like PN532, PN5180, AIC Key, and AIC Touch, each with specific assembly instructions and components. The firmware can be updated via UF2 files and offers command line configurations for LED control, brightness adjustment, card detection, and more.

switch_AIO_LS_pack
Switch_AIO_LS_pack is a comprehensive package for setting up the SD card of the Nintendo Switch. It includes custom firmware, homebrew applications, payloads, and essential modules to enhance the console experience. The pack also contains the latest firmware and has been prepared using the Ultimate-Switch-Hack-Script project in collaboration with the user nightwolf from Logic-sunrise. It is compatible with all models of the Switch.
For similar jobs


home-llm
Home LLM is a project that provides the necessary components to control your Home Assistant installation with a completely local Large Language Model acting as a personal assistant. The goal is to provide a drop-in solution to be used as a "conversation agent" component by Home Assistant. The 2 main pieces of this solution are Home LLM and Llama Conversation. Home LLM is a fine-tuning of the Phi model series from Microsoft and the StableLM model series from StabilityAI. The model is able to control devices in the user's house as well as perform basic question and answering. The fine-tuning dataset is a custom synthetic dataset designed to teach the model function calling based on the device information in the context. Llama Conversation is a custom component that exposes the locally running LLM as a "conversation agent" in Home Assistant. This component can be interacted with in a few ways: using a chat interface, integrating with Speech-to-Text and Text-to-Speech addons, or running the oobabooga/text-generation-webui project to provide access to the LLM via an API interface.