
obsidian-pieces
Pieces' powerful extension for Obsidian-MD that allows users to access their code snippets directly within the Obsidian workspace
Stars: 148

Pieces for Developers is a closed-source Obsidian plugin designed to revolutionize coding workflows by incorporating key capabilities and favorite features directly into the Obsidian environment. The plugin, Pieces Copilot for Obsidian, enhances coding and problem-solving experiences by providing insights on code snippets, generating samples, and facilitating navigation through PRs. Users can capture, manage, share, and discover code snippets and developer materials with ease, bringing efficiency and organization to their coding experience.
README:
Pieces for Developers | Obsidian Plugin
This plugin is closed source. The Obsidian team has full access to our private codebase.
In 2022, our team embarked on a mission to transform the way you code with your personal micro-repo.
With the debut release of our Flagship Desktop App, we laid the foundation, and now our Obsidian plugin is here to revolutionize your coding workflow further by incorporating key capabilities and our users' favorite features directly into your Obsidian environment.
Pieces Copilot for Obsidian is here to elevate your coding and problem-solving experience. Interact with the Copilot directly in any Obsidian workspace. Ask questions about code or content within a file, gain insights on patterns or keywords in code snippets, request the Copilot to generate samples using an SDK you're exploring, and even use it to navigate through PRs.
As you progress through your workflow, the Pieces for Developers Obsidian plugin enables you to capture, manage, share, and discover code snippets and other developer materials, like code screenshots, with simple clicks.
Designed to be your ultimate development companion, the Pieces Obsidian plugin transforms your workflow, bringing unparalleled efficiency and organization to your coding experience.
- Recent Updates
- Getting Started
- Features
- Explore the Pieces Ecosystem
- Need Help?
- Stay Connected
- Obsidian Community Disclosure
Nov. 1, 2023
- Adds persistent Copilot conversations; you can now come back to a Copilot chat at any time after you leave it
- Overhauls the context selector to be much simpler and easier to use
- Context is also persisted per conversation, so each conversation will maintain its context indefinitely
- Snippets can now be used as context
Oct. 3, 2023
- New Quick Actions feature for easily selecting LLM runtime and file context in a Copilot conversation.
- Theme Matching to align Copilot's appearance with your environment's theme.
- Styling updates including new icons for an enhanced user experience.
Sept. 8, 2023
- Snippet list filtering based on tags, titles, language, etc., similar to the feature in Pieces Desktop App.
- Easy access to filtering options via a filter button and a user-friendly interface for setting up filters.
- Efficient narrowing down of snippet shortlist to find relevant snippets swiftly.
You must have Pieces OS installed.
Pieces OS facilitates the local operation of Pieces products on your machine and coordinates connections to Pieces extensions. The extension will not function as intended without Pieces OS active on your machine.
*Pieces OS installation comes with the Pieces for Developers Desktop App where your snippets can also be viewed and managed.
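Because the plugin depends on a locally running Pieces OS instance, a simple health check is one way for a script or plugin to confirm the service is reachable before using Pieces features. The following is a minimal sketch, not Pieces' own code: the port and health path shown are assumptions that may differ on your installation, and it uses Obsidian's `requestUrl` helper.

```typescript
import { Notice, requestUrl } from "obsidian";

// Assumed values -- Pieces OS is presumed to expose a local HTTP health
// endpoint; adjust the port and path to match your installation.
const PIECES_OS_URL = "http://localhost:1000/.well-known/health";

/** Returns true if a local Pieces OS instance responds to a health check. */
export async function isPiecesOSRunning(): Promise<boolean> {
  try {
    const response = await requestUrl({ url: PIECES_OS_URL, method: "GET" });
    return response.status === 200;
  } catch {
    // Connection refused or timed out: Pieces OS is not active on this machine.
    return false;
  }
}

// Example: warn the user inside Obsidian if Pieces OS is unreachable.
export async function warnIfPiecesOSMissing(): Promise<void> {
  if (!(await isPiecesOSRunning())) {
    new Notice("Pieces OS is not running. Please start it for the plugin to work.");
  }
}
```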
- Visit the Pieces for Developers Obsidian Plugin install page.
- Click the Install button and then Enable to activate the plugin.
Your personal copilot, powered by local or cloud-based LLMs (e.g. Llama 2, GPT-3.5, GPT-4, and PaLM 2) for maximum security and privacy, that’s contextualized by your workflow to help solve coding problems, onboard into new projects faster, and connect you with the right people.
Templates, terminal commands, useful snippets, notes... save elements of your notes in one click to reference or re-use later.
Blazing fast, powerful search of your code snippets right inside Obsidian.
Share snippets with a simple right-click action from within Obsidian (see the sketch after the list below for how a plugin can register such an action). The link ships with both the snippet and its related context - tags, descriptions, where it’s from and more!
- Runs locally on your machine with the option to connect to the Cloud for backup, sync and sharing
- Deeply embedded into Obsidian - save, search, and share your snippets entirely from the Pieces for Developers Obsidian Plugin
- Keyboard shortcuts-enabled
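For readers curious about the mechanics, the sketch below shows how an Obsidian plugin can register a right-click (context-menu) action like the share action described above, using the public `editor-menu` workspace event. It is an illustration under assumed names, not Pieces' actual implementation; the `shareSnippet` helper and menu label are hypothetical.

```typescript
import { Editor, Menu, Plugin } from "obsidian";

export default class ShareMenuPlugin extends Plugin {
  async onload() {
    // Add an entry to the editor's right-click context menu.
    this.registerEvent(
      this.app.workspace.on("editor-menu", (menu: Menu, editor: Editor) => {
        const selection = editor.getSelection();
        if (!selection) return; // only offer sharing when code is highlighted
        menu.addItem((item) =>
          item
            .setTitle("Share snippet")
            .setIcon("share")
            .onClick(() => this.shareSnippet(selection))
        );
      })
    );
  }

  private shareSnippet(code: string) {
    // Placeholder: a real plugin would create a shareable link with context here.
    console.log(`Sharing ${code.length} characters`);
  }
}
```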
Automatically discovers new highly relevant and reusable snippets from your vault for you to save.
- Automatically generate tags, titles, descriptions, and links to other similar code snippets with the simple press of a button!
- Just click on the 'P' button embedded in your code blocks, and click on the enrich icon!
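Again purely as an illustration of the mechanism (not Pieces' actual code), an Obsidian plugin can embed a button like the 'P' button inside rendered code blocks via the public `registerMarkdownPostProcessor` API; the `enrichSnippet` helper below is hypothetical.

```typescript
import { Plugin } from "obsidian";

export default class CodeBlockButtonPlugin extends Plugin {
  async onload() {
    // Runs over every rendered Markdown block and decorates its code blocks.
    this.registerMarkdownPostProcessor((el) => {
      el.querySelectorAll("pre > code").forEach((codeEl) => {
        const button = document.createElement("button");
        button.textContent = "P";
        button.addEventListener("click", () => {
          // Hypothetical enrichment step: generate tags, title, description, etc.
          this.enrichSnippet(codeEl.textContent ?? "");
        });
        codeEl.parentElement?.appendChild(button);
      });
    });
  }

  private enrichSnippet(code: string) {
    // Placeholder: a real plugin would send `code` to its enrichment backend.
    console.log(`Enriching snippet (${code.length} characters)`);
  }
}
```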
To use a Pieces command, simply highlight code in your editor or hover over a snippet in your Pieces List.
Hotkeys aren't set by default, but here are our recommended layouts.
Command | macOS | Windows |
---|---|---|
Save a snippet | CMD+Shift+P | Ctrl+Shift+P |
Share a snippet | CMD+Shift+L | Ctrl+Shift+L |
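These recommended hotkeys map onto plugin commands under the hood. If you prefer to see what such a binding looks like in code, here is a minimal sketch using the standard Obsidian plugin API; the command id and the `sendToPieces` helper are hypothetical, not Pieces' actual code. Obsidian's "Mod" modifier resolves to CMD on macOS and Ctrl on Windows, which matches the table above.

```typescript
import { Editor, Notice, Plugin } from "obsidian";

export default class PiecesStyleCommands extends Plugin {
  async onload() {
    // "Mod" resolves to CMD on macOS and Ctrl on Windows/Linux.
    this.addCommand({
      id: "save-snippet",
      name: "Save a snippet",
      hotkeys: [{ modifiers: ["Mod", "Shift"], key: "P" }],
      editorCallback: (editor: Editor) => {
        const selection = editor.getSelection();
        if (!selection) {
          new Notice("Highlight some code first.");
          return;
        }
        // Hypothetical helper: hand the highlighted code off for saving.
        this.sendToPieces(selection);
      },
    });
  }

  private sendToPieces(code: string) {
    // Placeholder for the actual save call.
    console.log(`Saving ${code.length} characters`);
  }
}
```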
Explore our suite of products designed to streamline your coding workflow across different platforms.
For detailed descriptions and features of our Obsidian plugin, visit our docs for Obsidian.
Encountered a hurdle? We've got you covered. Reach out for support:
Stay in the loop! Follow us for the latest updates, tips, and insights:
Please note that this repository is hosting a closed-source Obsidian plugin. While the source code is not publicly available, we assure you that the utmost care has been taken to ensure its quality, performance, and respect for user privacy and data security.
The Pieces For Developers Obsidian Plugin is intended to enhance the functionality and user experience of the Obsidian application, while adhering strictly to the developer policies of Obsidian, notably:
- We do not obfuscate our code to hide its purpose.
- We do not insert dynamic ads or static ads outside the Pieces For Developers Obsidian Plugin interface.
- We do not include client-side telemetry.
- We temporarily leverage ChatGPT API endpoints.
- We respect Obsidian's trademark policy.
- For any issues, feature requests, or policy violations, please feel free to open an issue in this repository. In the case of severe issues or policy violations, please also contact the Obsidian team.
To ensure the continued functionality and reliability of the Pieces For Developers Obsidian Plugin, we commit to providing updates and addressing any issues in a timely manner.
Please refer to the included LICENSE file for details on the terms of use for our plugin.
Thank you for your understanding, and we hope you enjoy using the Pieces For Developers Obsidian Plugin.
Alternative AI tools for obsidian-pieces
Similar Open Source Tools

doc2plan
doc2plan is a browser-based application that helps users create personalized learning plans by extracting content from documents. It features a Creator for manual or AI-assisted plan construction and a Viewer for interactive plan navigation. Users can extract chapters, key topics, generate quizzes, and track progress. The application includes AI-driven content extraction, quiz generation, progress tracking, plan import/export, assistant management, customizable settings, viewer chat with text-to-speech and speech-to-text support, and integration with various Retrieval-Augmented Generation (RAG) models. It aims to simplify the creation of comprehensive learning modules tailored to individual needs.

meilisearch
Meilisearch is a lightning-fast search engine that seamlessly integrates into apps, websites, and workflows. It offers features like hybrid search, search-as-you-type, typo tolerance, filtering, sorting, synonym support, geosearch, extensive language support, security management, multi-tenancy, RESTful API, AI-readiness, easy installation, deployment, and maintenance.

ZetaForge
ZetaForge is an open-source AI platform designed for rapid development of advanced AI and AGI pipelines. It allows users to assemble reusable, customizable, and containerized Blocks into highly visual AI Pipelines, enabling rapid experimentation and collaboration. With ZetaForge, users can work with AI technologies in any programming language, easily modify and update AI pipelines, dive into the code whenever needed, utilize community-driven blocks and pipelines, and share their own creations. The platform aims to accelerate the development and deployment of advanced AI solutions through its user-friendly interface and community support.

BloxAI
Blox AI is a platform that allows users to effortlessly create flowcharts and diagrams, collaborate with teams, and receive explanations from the Google Gemini model. It offers rich text editing, versatile visualizations, secure workspaces, and limited files allotment. Users can install it as an app and use it for wireframes, mind maps, and algorithms. The platform is built using Next.Js, Typescript, ShadCN UI, TailwindCSS, Convex, Kinde, EditorJS, and Excalidraw.

mem0-chrome-extension
Mem0 Chrome Extension is a tool that enhances AI interactions by providing a universal memory layer across various AI assistants. It allows users to seamlessly share context, automatically capture relevant information, and retrieve memories intelligently. The extension offers features like one-click sync with existing ChatGPT memories and a memory dashboard for easy management. Users can install the extension in Google Chrome, sign in with Google, and start using it with supported AI assistants. Mem0 is free to use with no usage limits or ads, and it prioritizes privacy and data security by sending messages to the Mem0 API for memory extraction and retrieval.

SolidGPT
SolidGPT is an AI searching assistant for developers that helps with code and workspace semantic search. It provides features such as talking to your codebase, asking questions about your codebase, semantic search and summary in Notion, and getting questions answered from your codebase and Notion without context switching. The tool ensures data safety by not collecting users' data and uses the OpenAI series model API.

commanddash
Dash AI is an open-source coding assistant for Flutter developers. It is designed to not only write code but also run and debug it, allowing it to assist beyond code completion and automate routine tasks. Dash AI is powered by Gemini, integrated with the Dart Analyzer, and specifically tailored for Flutter engineers. The vision for Dash AI is to create a single-command assistant that can automate tedious development tasks, enabling developers to focus on creativity and innovation. It aims to assist with the entire process of engineering a feature for an app, from breaking down the task into steps to generating exploratory tests and iterating on the code until the feature is complete. To achieve this vision, Dash AI is working on providing LLMs with the same access and information that human developers have, including full contextual knowledge, the latest syntax and dependencies data, and the ability to write, run, and debug code. Dash AI welcomes contributions from the community, including feature requests, issue fixes, and participation in discussions. The project is committed to building a coding assistant that empowers all Flutter developers.

kitops
KitOps is a packaging and versioning system for AI/ML projects that uses open standards so it works with the AI/ML, development, and DevOps tools you are already using. KitOps simplifies the handoffs between data scientists, application developers, and SREs working with LLMs and other AI/ML models. KitOps' ModelKits are a standards-based package for models, their dependencies, configurations, and codebases. ModelKits are portable, reproducible, and work with the tools you already use.

ProxyAI
ProxyAI is an open-source AI copilot for JetBrains, offering advanced code assistance features powered by top-tier language models. Users can customize their coding experience, receive AI-suggested code changes, autocomplete suggestions, and context-aware naming suggestions. The tool also allows users to chat with images, reference project files and folders, web docs, git history, and search the web. ProxyAI prioritizes user privacy by not collecting sensitive information and only gathering anonymous usage data with consent.

teams-ai
The Teams AI Library is a software development kit (SDK) that helps developers create bots that can interact with Teams and Microsoft 365 applications. It is built on top of the Bot Framework SDK and simplifies the process of developing bots that interact with Teams' artificial intelligence capabilities. The SDK is available for JavaScript/TypeScript, .NET, and Python.

CodeGPT
CodeGPT is an extension for JetBrains IDEs that provides access to state-of-the-art large language models (LLMs) for coding assistance. It offers a range of features to enhance the coding experience, including code completions, a ChatGPT-like interface for instant coding advice, commit message generation, reference file support, name suggestions, and offline development support. CodeGPT is designed to keep privacy in mind, ensuring that user data remains secure and private.

AppFlowy
AppFlowy.IO is an open-source alternative to Notion, providing users with control over their data and customizations. It aims to offer functionality, data security, and cross-platform native experience to individuals, as well as building blocks and collaboration infra services to enterprises and hackers. The tool is built with Flutter and Rust, supporting multiple platforms and emphasizing long-term maintainability. AppFlowy prioritizes data privacy, reliable native experience, and community-driven extensibility, aiming to democratize the creation of complex workplace management tools.

design-studio
Tiledesk Design Studio is an open-source, no-code development platform for creating chatbots and conversational apps. It offers a user-friendly, drag-and-drop interface with pre-ready actions and integrations. The platform combines the power of LLM/GPT AI with a flexible 'graph' approach for creating conversations and automations with ease. Users can automate customer conversations, prototype conversations, integrate ChatGPT, enhance user experience with multimedia, provide personalized product recommendations, set conditions, use random replies, connect to other tools like HubSpot CRM, integrate with WhatsApp, send emails, and seamlessly enhance existing setups.

ragapp
RAGapp is a tool designed for easy deployment of Agentic RAG in any enterprise. It allows users to configure and deploy RAG in their own cloud infrastructure using Docker. The tool is built using LlamaIndex and supports hosted AI models from OpenAI or Gemini, as well as local models using Ollama. RAGapp provides endpoints for Admin UI, Chat UI, and API, with the option to specify the model and Ollama host. The tool does not come with an authentication layer, requiring users to secure the '/admin' path in their cloud environment. Deployment can be done using Docker Compose with customizable model and Ollama host settings, or in Kubernetes for cloud infrastructure deployment. Development setup involves using Poetry for installation and building frontends.

lollms-webui
LoLLMs WebUI (Lord of Large Language Multimodal Systems: One tool to rule them all) is a user-friendly interface to access and utilize various LLM (Large Language Models) and other AI models for a wide range of tasks. With over 500 AI expert conditionings across diverse domains and more than 2500 fine tuned models over multiple domains, LoLLMs WebUI provides an immediate resource for any problem, from car repair to coding assistance, legal matters, medical diagnosis, entertainment, and more. The easy-to-use UI with light and dark mode options, integration with GitHub repository, support for different personalities, and features like thumb up/down rating, copy, edit, and remove messages, local database storage, search, export, and delete multiple discussions, make LoLLMs WebUI a powerful and versatile tool.
For similar tasks

cursor-agent-tracking
Cursor Agent History Tracking System is a simple tool to maintain context and track changes in conversations with Cursor when it's in AGENT mode. It ensures continuity even if the AI 'forgets' previous interactions. The system includes templates for starting chat sessions, tracking changes, and maintaining project status and goals. Users can modify the templates to suit their specific needs while following best practices for consistent formatting and documentation.
For similar jobs

kaito
Kaito is an operator that automates the AI/ML inference model deployment in a Kubernetes cluster. It manages large model files using container images, avoids tuning deployment parameters to fit GPU hardware by providing preset configurations, auto-provisions GPU nodes based on model requirements, and hosts large model images in the public Microsoft Container Registry (MCR) if the license allows. Using Kaito, the workflow of onboarding large AI inference models in Kubernetes is largely simplified.

ai-on-gke
This repository contains assets related to AI/ML workloads on Google Kubernetes Engine (GKE). Run optimized AI/ML workloads with GKE platform orchestration capabilities. A robust AI/ML platform considers the following layers: infrastructure orchestration that supports GPUs and TPUs for training and serving workloads at scale; flexible integration with distributed computing and data processing frameworks; and support for multiple teams on the same infrastructure to maximize utilization of resources.

tidb
TiDB is an open-source distributed SQL database that supports Hybrid Transactional and Analytical Processing (HTAP) workloads. It is MySQL compatible and features horizontal scalability, strong consistency, and high availability.

nvidia_gpu_exporter
Nvidia GPU exporter for Prometheus, using the `nvidia-smi` binary to gather metrics.

tracecat
Tracecat is an open-source automation platform for security teams. It's designed to be simple but powerful, with a focus on AI features and a practitioner-obsessed UI/UX. Tracecat can be used to automate a variety of tasks, including phishing email investigation, evidence collection, and remediation plan generation.

openinference
OpenInference is a set of conventions and plugins that complement OpenTelemetry to enable tracing of AI applications. It provides a way to capture and analyze the performance and behavior of AI models, including their interactions with other components of the application. OpenInference is designed to be language-agnostic and can be used with any OpenTelemetry-compatible backend. It includes a set of instrumentations for popular machine learning SDKs and frameworks, making it easy to add tracing to your AI applications.

BricksLLM
BricksLLM is a cloud native AI gateway written in Go. Currently, it provides native support for OpenAI, Anthropic, Azure OpenAI and vLLM. BricksLLM aims to provide enterprise level infrastructure that can power any LLM production use cases. Here are some use cases for BricksLLM: * Set LLM usage limits for users on different pricing tiers * Track LLM usage on a per user and per organization basis * Block or redact requests containing PIIs * Improve LLM reliability with failovers, retries and caching * Distribute API keys with rate limits and cost limits for internal development/production use cases * Distribute API keys with rate limits and cost limits for students

kong
Kong, or Kong API Gateway, is a cloud-native, platform-agnostic, scalable API Gateway distinguished for its high performance and extensibility via plugins. It also provides advanced AI capabilities with multi-LLM support. By providing functionality for proxying, routing, load balancing, health checking, authentication (and more), Kong serves as the central layer for orchestrating microservices or conventional API traffic with ease. Kong runs natively on Kubernetes thanks to its official Kubernetes Ingress Controller.