A terminal-based, AI-powered task planning and execution assistant that uses Ollama to generate responses. It helps users refine their goals, creates detailed task plans, and guides them through task execution.
- Interactive goal refinement through clarifying questions
- AI-generated task plans with detailed steps
- Task execution tracking
- Integration with Ollama for local AI model inference
Before you begin, ensure you have met the following requirements:
- Python 3.7 or higher
- Ollama installed and running on your system
1. Clone this repository:

        git clone https://github.com/while-basic/Llama-Goals.git
        cd Llama-Goals

2. Ensure Ollama is installed and running on your system. Follow the Ollama installation guide if you haven't already set it up.

3. Start the Ollama service if it's not already running. A quick way to confirm it is reachable is sketched below.
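If you're unsure whether the service is up, a check along these lines can help. This is a minimal sketch that only assumes Ollama's default local address (`http://localhost:11434`) and its standard `/api/tags` endpoint for listing installed models:

```python
import json
import urllib.request

# Assumes Ollama is listening on its default local address.
OLLAMA_BASE_URL = "http://localhost:11434"

try:
    # /api/tags lists the models available to the local Ollama instance.
    with urllib.request.urlopen(f"{OLLAMA_BASE_URL}/api/tags", timeout=5) as response:
        models = json.loads(response.read().decode("utf-8")).get("models", [])
    print("Ollama is running. Available models:", [m["name"] for m in models])
except OSError as exc:
    print("Could not reach Ollama:", exc)
```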
1. Run the main script:

        python goal_planner.py

2. Follow the prompts to input your goal and answer clarifying questions.

3. The assistant will generate a task plan and guide you through the execution of each task.
- To change the Ollama model, modify the `model` parameter in the `send_prompt` function in `main.py`.
- The Ollama API endpoint is set to `http://localhost:11434/api/generate` by default. If your Ollama instance is running on a different address, update the `OLLAMA_API_URL` variable in `main.py`. A sketch of how these settings fit together is shown below.
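For orientation, here is a minimal sketch of what a `send_prompt`-style function could look like; the actual code in `main.py` may differ, and the default model name (`llama3`) used here is an assumption:

```python
import json
import urllib.request

# Default endpoint assumed by this sketch; mirror whatever main.py actually uses.
OLLAMA_API_URL = "http://localhost:11434/api/generate"

def send_prompt(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to the local Ollama instance and return the generated text."""
    payload = json.dumps({
        "model": model,    # swap this value to use a different Ollama model
        "prompt": prompt,
        "stream": False,   # request a single JSON response rather than a stream
    }).encode("utf-8")
    request = urllib.request.Request(
        OLLAMA_API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        body = json.loads(response.read().decode("utf-8"))
    return body["response"]
```

Setting `stream` to `False` asks Ollama for a single JSON response instead of a stream of partial chunks, which keeps the response parsing simple.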
Contributions to Llama Goals are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
- This project uses Ollama for local AI model inference.