ImTip
ImTip desktop assistant: super hotkeys, AI assistant, and universal input method status tracking prompts.
Stars: 1685
ImTip is a lightweight desktop assistant tool that provides features such as super hotkeys, input method status prompts, and a custom AI assistant. It displays concise icons at the input cursor to show various input method and keyboard statuses, and lets users customize appearance schemes. With ImTip, users can easily keep track of the input method status without cluttering the screen with the built-in status bar. The tool supports visual editing of the status prompt appearance and programmable extensions for super hotkeys. ImTip has low CPU usage and offers a customizable tracking speed to adjust CPU consumption. It supports a wide range of input methods and languages, making it a versatile tool for enhancing typing efficiency and accuracy.
README:
Click here to download ImTip - free and open source, only 799 KB. A standalone EXE with no external dependencies, compatible with XP, Vista, Win7, Win8, Win10, Win11, and all other popular desktop versions of Windows.
ImTip provides super hotkeys, input method status prompts, a custom AI assistant, and more.
The input method prompt shows two concise icons at the text cursor, so you know in advance whether you are in Chinese or English mode, which punctuation mode is active, full-width or half-width, caps lock state, the current multilingual keyboard layout, and every other status.
Appearance schemes are easy to customize; for example, a single-icon scheme looks like this:
No more typing in the wrong mode! Keep your train of thought and your typing uninterrupted, with no need to glance down at the taskbar or perform extra actions to check the input state.
- Not limited to the Chinese/English state: you watch fewer icons yet see more of the commonly used input method and keyboard states.
- Not shown only once when you switch input methods: whenever you move to a new input position, ImTip promptly reminds you of the input method status, and the display duration, trigger, and appearance are all customizable.
With ImTip you can turn off the input method's own status bar; the screen stays cleaner and you never need to look at the bottom-right corner again!
In theory every input method is supported: the built-in Microsoft Pinyin and Microsoft Wubi, Sogou, Xiaoxiao (小小输入法), Baidu, QQ, Google, Xiaohe (小鹤), and Shouxin (手心) input methods, and so on. Even the Japanese, Korean, and Spanish input methods I tested work with ImTip.
ImTip supports visual editing of the status prompt appearance:
An appearance scheme can be imported quickly by dragging it onto ImTip.exe or into the appearance settings window.
Configuration scheme code can also be copied and pasted directly via the clipboard.
ImTip provides programmable, extensible "super hotkeys". For example, press Ctrl+$ to open tools for financial uppercase numerals, uppercase dates and times, and math calculations:
The ImTip tray menu also offers shortcuts such as quickly enabling the system input method and switching Shuangpin schemes.
ImTip's CPU usage is extremely low, and you can adjust it via the "tracking detection speed" setting:
There is a tiny delay by default; this is a deliberate optimization by the program (not a passive lag). You can raise the "tracking detection speed" for a smoother feel, and the added resource usage is still negligible, as the sketch below illustrates.
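The following is a conceptual sketch only, not ImTip's actual code: it shows how a "tracking detection speed" setting can trade smoothness against CPU wake-ups. The function and parameter names are my own illustration; a shorter polling interval feels smoother but wakes the process more often, and the per-wake work stays tiny either way.

```python
# Conceptual sketch (not ImTip's implementation): poll the input position at a
# configurable interval and update the status tip only when it changes.
import time

def track_input_position(get_position, on_change, interval_ms=200):
    """Poll get_position() every interval_ms and report changes to on_change()."""
    last = None
    while True:
        pos = get_position()          # e.g. caret or mouse pointer position
        if pos != last:
            on_change(pos)            # move/redraw the status tip here
            last = pos
        time.sleep(interval_ms / 1000.0)  # smaller interval = smoother, more wake-ups
```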
Appendix: common input method shortcuts
Shift: toggle Chinese/English input;
Ctrl + . : toggle Chinese/English punctuation;
Shift + Space: toggle full-width/half-width;
Alt + Shift: switch the keyboard language
1. About the English keyboard
Some third-party input methods install a "Chinese (US) keyboard", which can cause unnecessary confusion. This keyboard is in fact deprecated on Win10; it is recommended to remove it or change it to the "English (US) keyboard". On Win7/Win10/Win11 you can fix the problem by disabling and then enabling "English keyboard" once from the ImTip tray menu.
2. Windows running with administrator privileges
ImTip starts with ordinary privileges by default; it only takes effect in other elevated windows if ImTip.exe itself is started with administrator privileges. After starting with administrator privileges, re-check "Allow launch at startup" and ImTip will then start elevated at boot (no permission prompt will appear again; note that this setting can only be unchecked while ImTip is again running with administrator privileges).
3. Window compatibility
ImTip uses several different interfaces to obtain the input position, but for the few application windows that support none of these interfaces it falls back to the mouse pointer position.
For windows that none of the above methods support, you can add the window class name under "Compatible window class names" (it can be looked up with the window spy tool included with aardio); separate multiple class names with semicolons. Compatible class names follow these rules (a parsing sketch follows the list):
- If the class name is prefixed with the # character, the window is treated as a small text input box, e.g. #EXCEL6.
- If the class name is prefixed with the @ character, the input box position is obtained through the MSAA interface; a typical example is WeChat 3.x, for which the compatible class name @WeChatMainWndForPC can be specified.
- If the class name has no # or @ prefix, the mouse pointer position is used directly to display the input status prompt.
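As a minimal sketch of the rules above (not part of ImTip, and using field names of my own), the following Python snippet splits a semicolon-separated class-name list and classifies each entry by its prefix:

```python
# Hypothetical sketch: interpret a "compatible window class names" string
# according to the prefix rules described in the list above.
def parse_compat_classes(config: str):
    """Split a semicolon-separated class-name list and classify each entry."""
    entries = []
    for raw in config.split(";"):
        name = raw.strip()
        if not name:
            continue
        if name.startswith("#"):
            # '#' prefix: the window is a small text input box.
            entries.append({"class": name[1:], "mode": "small-input-box"})
        elif name.startswith("@"):
            # '@' prefix: get the input box position via the MSAA interface.
            entries.append({"class": name[1:], "mode": "msaa"})
        else:
            # No prefix: fall back to the mouse pointer position.
            entries.append({"class": name, "mode": "mouse-pointer"})
    return entries

print(parse_compat_classes("#EXCEL6;@WeChatMainWndForPC;SomeLegacyWndClass"))
```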
WeChat 4.0 already supports ImTip out of the box and needs no configuration.
ImTip shows the input status only when an input box is detected. Even if "Show only after the input target or status changes" is unchecked, windows in which no input target can be detected still will not show the input status (unless a compatible window class name has been set for that window).
4. Input method compatibility
See: Principles and rules for detecting input method and keyboard status.
Mainstream input methods generally work with ImTip.
WeChat IME, Shouxin IME, and iFlytek IME require "quirks mode" to be checked; once "quirks mode" is enabled, other normal input methods are no longer supported.
Xiaoxiao IME requires its built-in TSF component to be registered (this is the default option). The language code it returns is affected by the system's regional format; if the regional format is not Chinese, change it to Chinese in Settings and re-register the TSF component once, after which the status will be detected correctly.
Xiaohe IME has a minor quirk: after toggling full/half width in English mode its reported status becomes inconsistent; pressing Shift once to toggle Chinese/English mode restores it. Input methods based on the Duoduo engine may have similar issues.
5. Startup parameters
ImTip.exe *.aardio loads a configuration scheme; dragging the configuration file onto ImTip.exe also works.
ImTip.exe with no parameters opens the settings window if ImTip is already running; double-clicking ImTip.exe does the same.
ImTip.exe /chat <configuration name> /q <question to send immediately> opens an AI chat assistant session window. The configuration name may be omitted, and so may the /q parameter. aardio provides the process.imTip library for launching the ImTip chat assistant conveniently; see: Super hotkeys - automatically invoking the AI chat window. A command-line sketch follows below.
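Since the process.imTip API itself is not shown here, the following is a minimal Python sketch that simply invokes ImTip.exe with the documented /chat and /q parameters; the install path and the configuration name "Translate" are placeholders, not values defined by ImTip.

```python
# Minimal sketch: launch the ImTip AI chat assistant window from another
# program using the documented command-line parameters.
import subprocess

imtip = r"C:\Tools\ImTip\ImTip.exe"   # assumed install location (placeholder)

# Open a chat session with a named configuration and send a question at once.
subprocess.Popen([imtip, "/chat", "Translate", "/q", "Hello, ImTip!"])

# /chat alone also works, since both the configuration name and /q are optional.
subprocess.Popen([imtip, "/chat"])
```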
6. Flickering prompt windows
ImTip prevents duplicate instances by default, but if you run the source code that creates the prompt window on its own inside the aardio IDE and create several input method prompt windows at the same time, the windows will conflict with each other and naturally flicker.
The animations on this page were recorded mainly with Gif123, a free, open-source, minimalist screen recorder with a download size of only 820 KB.
Alternative AI tools for ImTip
Similar Open Source Tools
fastserve-ai
FastServe-AI is a machine learning serving tool focused on GenAI & LLMs with simplicity as the top priority. It allows users to easily serve custom models by implementing the 'handle' method for 'FastServe'. The tool provides a FastAPI server for custom models and can be deployed using Lightning AI Studio. Users can install FastServe-AI via pip and run it to serve their own GPT-like LLM models in minutes.
Visionatrix
Visionatrix is a project aimed at providing easy use of ComfyUI workflows. It offers simplified setup and update processes, a minimalistic UI for daily workflow use, stable workflows with versioning and update support, scalability for multiple instances and task workers, multiple user support with integration of different user backends, LLM power for integration with Ollama/Gemini, and seamless integration as a service with backend endpoints and webhook support. The project is approaching version 1.0 release and welcomes new ideas for further implementation.
llm-x
LLM X is a ChatGPT-style UI for the niche group of folks who run Ollama (think of it as an offline ChatGPT server) locally. It supports sending and receiving images and text and works offline through PWA (Progressive Web App) standards. The project utilizes React, Typescript, Lodash, Mobx State Tree, Tailwind css, DaisyUI, NextUI, Highlight.js, React Markdown, kbar, Yet Another React Lightbox, Vite, and the Vite PWA plugin. It is inspired by the ollama-ui project and by Perplexity.ai's UI advancements in the LLM UI space. The project is still under development, but it is already a great way to get started with building your own LLM UI.
CrewAI-GUI
CrewAI-GUI is a Node-Based Frontend tool designed to revolutionize AI workflow creation. It empowers users to design complex AI agent interactions through an intuitive drag-and-drop interface, export designs to JSON for modularity and reusability, and supports both GPT-4 API and Ollama for flexible AI backend. The tool ensures cross-platform compatibility, allowing users to create AI workflows on Windows, Linux, or macOS efficiently.
chatbox
Chatbox is a desktop client for ChatGPT, Claude, and other LLMs, providing a user-friendly interface for AI copilot assistance on Windows, Mac, and Linux. It offers features like local data storage, multiple LLM provider support, image generation with Dall-E-3, enhanced prompting, keyboard shortcuts, and more. Users can collaborate, access the tool on various platforms, and enjoy multilingual support. Chatbox is constantly evolving with new features to enhance the user experience.
cog
Cog is an open-source tool that lets you package machine learning models in a standard, production-ready container. You can deploy your packaged model to your own infrastructure, or to Replicate.
llm-interface
LLM Interface is an npm module that streamlines interactions with various Large Language Model (LLM) providers in Node.js applications. It offers a unified interface for switching between providers and models, supporting 36 providers and hundreds of models. Features include chat completion, streaming, error handling, extensibility, response caching, retries, JSON output, and repair. The package relies on npm packages like axios, @google/generative-ai, dotenv, jsonrepair, and loglevel. Installation is done via npm, and usage involves sending prompts to LLM providers. Tests can be run using npm test. Contributions are welcome under the MIT License.
nyxtext
Nyxtext is a text editor built using Python, featuring Custom Tkinter with the Catppuccin color scheme and glassmorphic design. It follows a modular approach with each element organized into separate files for clarity and maintainability. NyxText is not just a text editor but also an AI-powered desktop application for creatives, developers, and students.
RisuAI
RisuAI, or Risu for short, is a cross-platform AI chatting software/web application with powerful features such as multiple API support, assets in the chat, regex functions, and much more.
ort
Ort is an unofficial ONNX Runtime 1.17 wrapper for Rust based on the now inactive onnxruntime-rs. ONNX Runtime accelerates ML inference on both CPU and GPU.
skypilot
SkyPilot is a framework for running LLMs, AI, and batch jobs on any cloud, offering maximum cost savings, highest GPU availability, and managed execution. SkyPilot abstracts away cloud infra burdens: it launches jobs and clusters on any cloud, scales out easily by queuing and running many jobs automatically, and gives easy access to object stores (S3, GCS, R2). It maximizes GPU availability by provisioning in all zones, regions, and clouds you have access to (the "Sky"), with automatic failover. SkyPilot also cuts cloud costs: Managed Spot gives 3-6x cost savings using spot VMs with auto-recovery from preemptions, the Optimizer gives 2x cost savings by auto-picking the cheapest VM/zone/region/cloud, and Autostop provides hands-free cleanup of idle clusters. SkyPilot supports your existing GPU, TPU, and CPU workloads with no code changes.
asktube
AskTube is an AI-powered YouTube video summarizer and QA assistant that utilizes Retrieval Augmented Generation (RAG) technology. It offers a comprehensive solution with Q&A functionality and aims to provide a user-friendly experience for local machine usage. The project integrates various technologies including Python, JS, Sanic, Peewee, Pytubefix, Sentence Transformers, Sqlite, Chroma, and NuxtJs/DaisyUI. AskTube supports multiple providers for analysis, AI services, and speech-to-text conversion. The tool is designed to extract data from YouTube URLs, store embedding chapter subtitles, and facilitate interactive Q&A sessions with enriched questions. It is not intended for production use but rather for end-users on their local machines.
instill-core
Instill Core is an open-source orchestrator comprising a collection of source-available projects designed to streamline every aspect of building versatile AI features with unstructured data. It includes Instill VDP (Versatile Data Pipeline) for unstructured data, AI, and pipeline orchestration, Instill Model for scalable MLOps and LLMOps for open-source or custom AI models, and Instill Artifact for unified unstructured data management. Instill Core can be used for tasks such as building, testing, and sharing pipelines, importing, serving, fine-tuning, and monitoring ML models, and transforming documents, images, audio, and video into a unified AI-ready format.
duolingo-clone
Lingo is an interactive platform for language learning that provides a modern UI/UX experience. It offers features like courses, quests, and a shop for users to engage with. The tech stack includes React JS, Next JS, Typescript, Tailwind CSS, Vercel, and Postgresql. Users can contribute to the project by submitting changes via pull requests. The platform utilizes resources from CodeWithAntonio, Kenney Assets, Freesound, Elevenlabs AI, and Flagpack. Key dependencies include @clerk/nextjs, @neondatabase/serverless, @radix-ui/react-avatar, and more. Users can follow the project creator on GitHub and Twitter, as well as subscribe to their YouTube channel for updates. To learn more about Next.js, users can refer to the Next.js documentation and interactive tutorial.
GPTSwarm
GPTSwarm is a graph-based framework for LLM-based agents that enables the creation of LLM-based agents from graphs and facilitates the customized and automatic self-organization of agent swarms with self-improvement capabilities. The library includes components for domain-specific operations, graph-related functions, LLM backend selection, memory management, and optimization algorithms to enhance agent performance and swarm efficiency. Users can quickly run predefined swarms or utilize tools like the file analyzer. GPTSwarm supports local LM inference via LM Studio, allowing users to run with a local LLM model. The framework has been accepted by ICML2024 and offers advanced features for experimentation and customization.
For similar jobs
J.A.R.V.I.S.
J.A.R.V.I.S.1.0 is an advanced virtual assistant tool designed to assist users in various tasks. It provides a wide range of functionalities including voice commands, task automation, information retrieval, and communication management. With its intuitive interface and powerful capabilities, J.A.R.V.I.S.1.0 aims to enhance productivity and streamline daily activities for users.
onyx
Onyx is an open-source Gen-AI and Enterprise Search tool that serves as an AI Assistant connected to company documents, apps, and people. It provides a chat interface, can be deployed anywhere, and offers features like user authentication, role management, chat persistence, and UI for configuring AI Assistants. Onyx acts as an Enterprise Search tool across various workplace platforms, enabling users to access team-specific knowledge and perform tasks like document search, AI answers for natural language queries, and integration with common workplace tools like Slack, Google Drive, Confluence, etc.
local-chat
LocalChat is a simple, easy-to-set-up, and open-source local AI chat tool that allows users to interact with generative language models on their own computers without transmitting data to a cloud server. It provides a chat-like interface for users to experience ChatGPT-like behavior locally, ensuring GDPR compliance and data privacy. Users can download LocalChat for macOS, Windows, or Linux to chat with open-weight generative language models.
lollms-webui
LoLLMs WebUI (Lord of Large Language Multimodal Systems: One tool to rule them all) is a user-friendly interface to access and utilize various LLM (Large Language Models) and other AI models for a wide range of tasks. With over 500 AI expert conditionings across diverse domains and more than 2500 fine tuned models over multiple domains, LoLLMs WebUI provides an immediate resource for any problem, from car repair to coding assistance, legal matters, medical diagnosis, entertainment, and more. The easy-to-use UI with light and dark mode options, integration with GitHub repository, support for different personalities, and features like thumb up/down rating, copy, edit, and remove messages, local database storage, search, export, and delete multiple discussions, make LoLLMs WebUI a powerful and versatile tool.
Azure-Analytics-and-AI-Engagement
The Azure-Analytics-and-AI-Engagement repository provides packaged Industry Scenario DREAM Demos with ARM templates (Containing a demo web application, Power BI reports, Synapse resources, AML Notebooks etc.) that can be deployed in a customer’s subscription using the CAPE tool within a matter of few hours. Partners can also deploy DREAM Demos in their own subscriptions using DPoC.
minio
MinIO is a High Performance Object Storage released under GNU Affero General Public License v3.0. It is API compatible with Amazon S3 cloud storage service. Use MinIO to build high performance infrastructure for machine learning, analytics and application data workloads.
mage-ai
Mage is an open-source data pipeline tool for transforming and integrating data. It offers an easy developer experience, engineering best practices built-in, and data as a first-class citizen. Mage makes it easy to build, preview, and launch data pipelines, and provides observability and scaling capabilities. It supports data integrations, streaming pipelines, and dbt integration.