FreeGenius AI, an advanced AI assistant that can talk and take multi-step actions. Supports numerous open-source LLMs via Llama.cpp, Ollama, or the Groq Cloud API, with optional integration with AutoGen agents, the OpenAI API, Google Gemini Pro, and unlimited plugins.
Hello, while I am not a code specialist, I am interested in implementing FreeGenius for "real world" applications. It looks really great. However, when running the script, I get stuck in an "infinite loop". Can you help?
launching FreeGenius AI ...
Checking 'freegenius' version ...
Installed version: 0.2.7
Latest version: 0.2.19
pip tool updated!
Upgrading 'freegenius' ...
Package 'freegenius' upgraded!
Restarting FreeGenius AI ...
launching FreeGenius AI ...
Checking 'freegenius' version ...
Installed version: 0.2.7
Latest version: 0.2.19
pip tool updated!
Upgrading 'freegenius' ...
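From the log, the installed version stays at 0.2.7 even after the upgrade step, so the launcher restarts and repeats the same upgrade forever. One plausible cause (an assumption, not confirmed from the source) is that pip upgrades the package in a different Python environment from the one that launches it, so the new version is never seen. A minimal sketch of a version check that detects this and stops retrying instead of looping, not FreeGenius AI's actual code:

```python
# Hypothetical sketch: stop the upgrade/restart loop when the upgrade
# does not take effect, instead of restarting indefinitely.
import importlib.metadata
import subprocess
import sys


def installed_version(package: str) -> str:
    """Return the version of 'package' visible to THIS interpreter, or ''."""
    try:
        return importlib.metadata.version(package)
    except importlib.metadata.PackageNotFoundError:
        return ""


def upgrade_once(package: str, latest: str) -> bool:
    """Attempt one upgrade; return True only if the new version took effect."""
    before = installed_version(package)
    if before == latest:
        return True  # already up to date; nothing to do
    # Use the running interpreter's pip so the upgrade lands in the
    # same environment that will import the package.
    subprocess.run(
        [sys.executable, "-m", "pip", "install", "--upgrade", package],
        check=False,
    )
    after = installed_version(package)
    # If the version is unchanged, the upgrade probably went to another
    # environment; restarting would just repeat the loop, so report failure.
    return after != before
```

If the version really is going to a different environment, running `python3 -m pip install --upgrade freegenius` with the same interpreter that launches FreeGenius AI should break the loop.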
add support for a custom LLM path to be used with llama.cpp
add support for using Ollama-hosted models with llama.cpp: models are easy to select and download with Ollama, but inference is faster with llama.cpp. We will add a UI to download LLMs via Ollama and use them with llama.cpp
add TUI dialogs to change LLM backends and models
add UI to change tool dependencies
add a tool selection threshold to allow users to select a tool from the closest-matching tools
improve tool selection based on user interactions
test and fine-tune all plugins with all supported backends ... in progress ...
add more tools to further expand FreeGenius capabilities
integrate llama.cpp and Gemini Pro with AutoGen features
add support for code auto-heal features like those implemented in LetMeDoIt AI