PromptShell

Your Smart Terminal Assistant
PromptShell is a powerful tool that translates plain English text into precise shell commands, enhancing your command-line workflow with AI-powered capabilities.
Supported Providers
Choose between local processing for privacy or cloud providers for enhanced capabilities.
Local Processing
Via Ollama integration:
- Ollama
- Python 3.9+: required for core functionality
- 4GB+ RAM: recommended for local models
- Internet connection: required for initial setup
- Local models retain 100% privacy
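Getting ready for local processing usually means installing Ollama and pulling a model. A minimal sketch for Linux, assuming Ollama's official install script and a general-purpose model such as llama3 (the model PromptShell defaults to may differ):
$ curl -fsSL https://ollama.com/install.sh | sh
$ ollama pull llama3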
Cloud Providers
Multiple API providers are supported, including Groq, OpenAI, and Google.
Installation Guide
Get started instantly with just one command - copy, paste, and you're ready to go!
Install PromptShell
$ pipx install promptshell
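If pipx isn't available on your system yet, the standard bootstrap (from pipx's own documentation) is:
$ python -m pip install --user pipx
$ python -m pipx ensurepath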
Run PromptShell
$ promptshell
Interactive Configuration Setup
$ promptshell --config
? Select operation mode:
  local (Privacy-first, needs 4GB+ RAM)
  api (Faster but requires internet)
PromptShell in Action
See how PromptShell makes your CLI tasks easier with AI-powered commands.
Natural Language Commands Simplifying Complex Workflows
Ask Questions, Run Commands Directly and Ensure Secure Execution
Seamless Integration with Git, Docker, and Developer Tools
Generate Code Snippets Using Prompts and Save Directly to Desired Locations
$ make 2 .js and 3 .txt files
? Do you want to run the command 'type nul > script1.js && type nul > script2.js && type nul > file1.txt && type nul > file2.txt && type nul > file3.txt'? Yes
Command: type nul > script1.js && type nul > script2.js && type nul > file1.txt && type nul > file2.txt && type nul > file3.txt

$ backup add .txt files to a folder named backup
? Do you want to run the command 'mkdir backup && copy *.txt backup\'? Yes
Command: mkdir backup && copy *.txt backup\
file1.txt
file2.txt
file3.txt
3 file(s) copied.
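The same ask-confirm-run flow extends to everyday developer tooling such as Git. An illustrative sketch (hypothetical prompt and response; the command PromptShell actually generates may differ):
$ show the last 5 commits on this branch
? Do you want to run the command 'git log --oneline -5'? Yes
Command: git log --oneline -5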
Features
Experience a powerful shell assistant designed to enhance your command-line workflow.
- Cross-Platform Compatibility: Works seamlessly across Windows, Linux, and macOS.
- Hybrid AI-Model Support: Supports both local (Ollama) and cloud-based (Groq, OpenAI, Google, etc.) LLMs for flexibility.
- Privacy-First Approach: Defaults to local models and runs completely offline, so that no data leaves your device unless you enable cloud APIs.
- Context-Aware Execution: Remembers command history, tracks files, and adapts suggestions accordingly.
- Secure Command Execution: Blocks dangerous commands and asks for confirmation.
- Smart Autocompletion: Provides tab completion for files and folders present in the working directory.
- Intelligent Debugging & Auto-Correction: Identifies errors, autonomously debugs issues, and suggests corrected commands.
- Proudly Open Source: Get started with our documentation.