
BuildCLI
BuildCLI is a command-line interface (CLI) tool for managing and automating common tasks in Java project development.
Stars: 104

README:
(BuildCLI ASCII art banner) Built by the community, for the community
Welcome to BuildCLI - Java Project Management!
BuildCLI is a command-line interface (CLI) tool for managing and automating common tasks in Java project development. It allows you to create, compile, manage dependencies, and run Java projects directly from the terminal, simplifying the development process.
- Repository: https://github.com/BuildCLI/BuildCLI
- License: MIT
- Initialize Project: Creates the basic structure of directories and files for a Java project.
- Compile Project: Compiles the project source code using Maven.
- Add Dependency: Adds new dependencies to the pom.xml.
- Remove Dependency: Removes dependencies from the pom.xml.
- Document Code: [Beta] Generates documentation for a Java file using AI.
- Manage Configuration Profiles: Creates specific configuration files for profiles (application-dev.properties, application-test.properties, etc.).
- Run Project: Starts the project directly from the CLI using Spring Boot.
- Dockerize Project: Generates a Dockerfile for the project, allowing easy containerization.
- Build and Run Docker Container: Builds and runs the Docker container using the generated Dockerfile.
- CI/CD Integration: Automatically generates configuration files for CI/CD tools (e.g., Jenkins, GitHub Actions) and triggers pipelines based on project changes.
- Changelog Generation: Automatically generates a structured changelog by analyzing the Git commit history, making it easier to understand changes between releases.
- Script Installation: Just download the .sh or .bat file and execute it.
- On a Unix-like system (Linux, macOS), simply give execution permission to install.sh and run it:
sudo chmod +x install.sh
./install.sh
- On Windows: Run install.bat by double-clicking it or executing the following command in the Command Prompt (cmd):
install.bat
Now BuildCLI is ready to use. Test the buildcli command in the terminal.
We made a major refactor of the BuildCLI architecture. Please use the buildcli help command to see all available options. Also, refer to issue #89 and pull request #79 for more details.
Creates the basic Java project structure, including src/main/java, pom.xml, and README.md. You can specify a project name to dynamically set the package structure and project artifact.
- To initialize a project with a specific name:
buildcli project init MyProject
This will create the project structure with MyProject as the base package name, resulting in a directory like src/main/java/org/myproject.
- To initialize a project without specifying a name:
buildcli project init
This will create the project structure with buildcli as the base package name, resulting in a directory like src/main/java/org/buildcli.
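For illustration, a freshly initialized project (default name) might be laid out roughly as below; the exact set of generated files may vary between BuildCLI versions:

```text
src/main/java/org/buildcli/   # base package, derived from the project name
pom.xml                       # Maven build file
README.md                     # starter project README
```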
Compiles the Java project using Maven:
buildcli project build --compile
Adds a dependency to the project in the groupId:artifactId format. You can also specify a version using the format groupId:artifactId:version. If no version is specified, the dependency defaults to the latest available version.
- To add a dependency with the latest version:
buildcli project add dependency org.springframework:spring-core
- To add a dependency with a specified version:
buildcli p a d org.springframework:spring-core:5.3.21
After executing these commands, the dependency will be appended to your pom.xml file under the <dependencies> section.
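As a sketch, using the versioned example above, the entry appended to pom.xml would look like:

```xml
<dependencies>
  <!-- added by: buildcli p a d org.springframework:spring-core:5.3.21 -->
  <dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-core</artifactId>
    <version>5.3.21</version>
  </dependency>
</dependencies>
```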
Creates a configuration file with the specified profile, for example, application-dev.properties:
buildcli project add profile dev
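The generated file is a plain Java properties file; the keys below are hypothetical examples of what you might put in it, not values BuildCLI writes for you:

```properties
# application-dev.properties -- settings loaded when the dev profile is active
server.port=8080
logging.level.root=DEBUG
```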
Runs the Java project using Spring Boot:
buildcli project run
Automatically generates inline documentation for a Java file using AI:
# File or directory
buildcli ai code document File.java
This command sends the specified Java file to the local Ollama server, which generates documentation and comments directly within the code. The modified file with documentation will be saved back to the same location.
Sets the active environment profile, saving it to the environment.config file. The profile is referenced during project execution, ensuring that the correct configuration is loaded.
buildcli p set env dev
After running this command, the active profile is set to dev, and the environment.config file is updated accordingly.
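The exact format of environment.config is an implementation detail, but conceptually it just records the active profile, e.g. something along the lines of:

```properties
# environment.config (illustrative sketch, not the guaranteed format)
active.profile=dev
```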
With the --set-environment functionality, you can set the active environment profile. When running the project with buildcli --run, the active profile will be displayed in the terminal.
This command generates a Dockerfile for your Java project, making it easier to containerize your application.
buildcli p add dockerfile
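The generated Dockerfile's exact contents depend on your BuildCLI version; a typical minimal Dockerfile for a Maven-built jar looks roughly like this (base image tag and jar name are placeholders):

```dockerfile
FROM openjdk:17-jdk-slim
WORKDIR /app
# copy the jar produced by the Maven build
COPY target/*.jar app.jar
ENTRYPOINT ["java", "-jar", "app.jar"]
```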
This command automatically builds and runs the Docker container for you. After running the command, the Docker image will be created, and your project will run inside the container.
buildcli project run docker
Generates configuration files for CI/CD tools and prepares the project for automated pipelines. Supports Jenkins, GitLab, and GitHub Actions.
buildcli project add pipeline github
buildcli project add pipeline gitlab
buildcli project add pipeline jenkins
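As an illustration of the kind of file the github option produces (the actual generated workflow may differ), a minimal Maven build workflow for GitHub Actions looks like:

```yaml
# .github/workflows/build.yml
name: Build
on: [push, pull_request]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-java@v4
        with:
          distribution: temurin
          java-version: '17'
      - run: mvn -B package
```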
Ensure you have the Ollama server running locally, as the docs functionality relies on an AI model accessible via a local API.
You can start the Ollama server by running:
ollama run llama3.2
- Jenkins: Ensure Jenkins is installed and accessible in your environment.
- GitHub Actions: Ensure your repository is hosted on GitHub with Actions enabled.
BuildCLI now includes an automatic changelog generation feature that analyzes your Git commit history and produces a structured changelog. This helps developers and end-users easily track changes between releases.
To generate a changelog, run:
buildcli changelog [OPTIONS]
Or use the alias:
buildcli cl [OPTIONS]
- --version, -v <version>: Specify the release version for the changelog. If omitted, BuildCLI attempts to detect the latest Git tag. If no tag is found, it defaults to "Unreleased".
- --format, -f <format>: Specify the output format. Supported formats:
  - markdown (default)
  - html
  - json
- --output, -o <file>: Specify the output file name. If not provided, defaults to CHANGELOG..
- --include, -i <commit types>: Provide a comma-separated list of commit types to include (e.g., feat,fix,docs,refactor).
buildcli changelog --version v1.0.0 --format markdown --include feat,fix --output CHANGELOG.md
buildcli changelog -v v1.0.0 -f markdown -i feat,fix -o CHANGELOG.md
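With markdown output and the feat,fix filter above, the generated CHANGELOG.md would look something like the following (the entries here are invented for illustration):

```markdown
# Changelog

## v1.0.0

### Features
- feat: add changelog generation command

### Bug Fixes
- fix: resolve dependency version lookup
```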
Contributions are welcome! Feel free to open Issues and submit Pull Requests. See the CONTRIBUTING.md file for more details.
Quick steps to contribute:
- Fork the project.
- Create a branch for your changes:
git checkout -b feature/my-feature
- Commit your changes:
git commit -m "My new feature"
- Push to your branch:
git push origin feature/my-feature
- Open a Pull Request in the main repository.
This project is licensed under the MIT License - see the LICENSE file for details.
To get a deeper understanding of the BuildCLI project structure, key classes, commands, and how to contribute, check out our comprehensive guide in PROJECT_FAMILIARIZATION.md.