
Comments (17)

keriati avatar keriati commented on June 19, 2024

@danraymond at the moment ollama is missing support for model listing (at least when using the OpenAI-compatible connection).
A pull request is already open for it here: ollama/ollama#2476
If the maintainers accept this PR, ollama will also start to work with the BMO plugin over the normal API connection.
Also this fix might be relevant then for BMO: #51
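To see which listing endpoint a local Ollama build actually answers, a quick probe like the sketch below can help. `/api/tags` is Ollama's native model-listing route and `/v1/models` is the OpenAI-compatible one this comment refers to; the base URL is Ollama's default and may differ on your setup:

```shell
# Probe which model-listing endpoints a local Ollama answers.
# /api/tags is the native listing; /v1/models is the OpenAI-compatible route.
BASE="${OLLAMA_URL:-http://localhost:11434}"
RESULTS=""
for path in /api/tags /v1/models; do
  if curl -s --max-time 2 "$BASE$path" >/dev/null 2>&1; then
    RESULTS="$RESULTS $path:reachable"
  else
    RESULTS="$RESULTS $path:no-response"
  fi
done
echo "results:$RESULTS"
```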

from obsidian-bmo-chatbot.

danraymond avatar danraymond commented on June 19, 2024

Is there a way to get it to work in Obsidian until the next update?


twalderman avatar twalderman commented on June 19, 2024

I had a similar problem: I uninstalled, removed the plugin folder, then reinstalled, and it worked afterwards. I also found that if I had filled in a REST API entry (not OAI), it would make other models disappear.


longy2k avatar longy2k commented on June 19, 2024

v1.8.8

@danraymond - Let me know if keriati's PR resolves this issue.

Thanks


danraymond avatar danraymond commented on June 19, 2024

The latest update responds with "model not found" in the chat box. However, I still can't set the model in the plugin settings; is that the issue? How then do I set the model? Ollama is running happily for other plugins.


longy2k avatar longy2k commented on June 19, 2024

@danraymond

Hmm, I'm not sure if fetching via the Windows instance of Ollama is different from the Linux/macOS version.

I will set up Ollama on a Windows machine to see if I can replicate the issue you are facing.


danraymond avatar danraymond commented on June 19, 2024

Hi longy2k,
How have you progressed, please? With the latest update of the plugin, I still can't load a model when using the Ollama Windows instance.


longy2k avatar longy2k commented on June 19, 2024

Hi @danraymond,

I just did a fresh install for Obsidian/Ollama on Windows 11 and was able to get it working.

There are two potential issues that I can think of:

The first is that data.json does not have the right data structure, which may cause an error. In this case, try creating a new Obsidian vault, downloading BMO Chatbot, and inserting `http://localhost:11434` into 'OLLAMA REST API URL' after running the Ollama server. If BMO Chatbot works with the new vault, I would recommend deleting data.json from your original vault and restarting the plugin:

  1. Go to Community plugins settings and click on the folder icon (see screenshot).
  2. Close Obsidian completely.

  3. Find the bmo-chatbot folder and delete data.json.

  4. Restart Obsidian.
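Steps 2-4 can also be done from the command line. A minimal sketch, where `VAULT` is a placeholder for your vault path and the plugin folder is assumed to be named `bmo-chatbot` under Obsidian's standard `.obsidian/plugins` directory:

```shell
# Delete the plugin's settings file so it is recreated with defaults.
VAULT="$HOME/MyVault"   # hypothetical vault location; adjust to yours
SETTINGS="$VAULT/.obsidian/plugins/bmo-chatbot/data.json"
if [ -f "$SETTINGS" ]; then
  rm "$SETTINGS"        # Obsidian recreates it on the next plugin load
  echo "deleted $SETTINGS"
else
  echo "no data.json at $SETTINGS"
fi
```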

The second potential issue is that the Ollama server is not set up properly.

I know you mentioned that your Ollama server is working with other plugins but these are the only two issues I can think of at the moment. Please let me know what you have tried and maybe we could find other ways to resolve this :)


longy2k avatar longy2k commented on June 19, 2024

v2.1.0

I also ran into the same issue of not being able to select a model in the dropdown at times.

I have added a reload button next to the dropdown under 'General > Model'; let me know if this resolves your issue.

Thanks


Rvnd0m avatar Rvnd0m commented on June 19, 2024

> v2.1.0
>
> I also ran into the same issue of not being able to select a model in the dropdown at times.
>
> I have added a reload button next to the dropdown under 'General > Model', let me know if this resolves your issue.
>
> Thanks

Hi Longy2k,
First of all, thanks for your time.
I've pressed the reload button to try to show the models (currently I run Mistral, Llama3 and Llama3:70b on my Ollama), but nothing changes... there aren't any models in the dropdown list...


longy2k avatar longy2k commented on June 19, 2024

@Rvnd0m

Please update to v2.1.1 and make sure your Ollama server handles the CORS restriction. You can follow these instructions here: https://github.com/longy2k/obsidian-bmo-chatbot/wiki/How-to-setup-with-Ollama

Let me know if you got it working, thanks!


Rvnd0m avatar Rvnd0m commented on June 19, 2024

> @Rvnd0m
>
> Please update to v2.1.1 and make sure your Ollama server handles the CORS restriction. You can follow these instructions here: https://github.com/longy2k/obsidian-bmo-chatbot/wiki/How-to-setup-with-Ollama
>
> Let me know if you got it working, thanks!

Hi! I am running Ollama on PopOS (Ubuntu-based), and I read that your tutorial covers the macOS CORS setup. Is it the same for Linux? (I wouldn't want to break something...)

Regards,


longy2k avatar longy2k commented on June 19, 2024

@Rvnd0m

Option 2 might be the easiest:

  1. Open a terminal and run `OLLAMA_ORIGINS="app://obsidian.md*" OLLAMA_HOST="127.0.0.1:11435" ollama serve`. This starts a second server that bypasses the CORS restriction.
  2. Go to 'BMO Chatbot Settings > Ollama Connection > OLLAMA REST API URL' and insert the server's URL: `http://localhost:11435`.
  3. Press the reload button next to the model dropdown in 'General > Model' if necessary to reload all known models.
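On a Linux install where Ollama runs as a systemd service, there is also a persistent alternative to the one-off command: setting the origins variable through a service override. A sketch, assuming the standard `ollama.service` unit; open the override with `sudo systemctl edit ollama.service`, add the lines below, then run `sudo systemctl restart ollama`:

```ini
[Service]
Environment="OLLAMA_ORIGINS=app://obsidian.md*"
```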

If you run into port conflicts (e.g. `Address already in use`):

  1. Open a terminal and run `sudo lsof -i :PORT_NUMBER`, where `PORT_NUMBER` can be `11435` or whichever port is causing the conflict.
  2. In the output table, locate the "PID" column.
  3. Run `sudo kill -9 PID` for each PID listed in that column.
  4. Repeat the steps in option 2. (If all else fails, restart your computer and do option 2 again.)
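The port-conflict steps can be folded into one small helper. A sketch, defaulting to `11435` (the port used in option 2); `lsof -t` prints bare PIDs, which makes the kill step scriptable:

```shell
# Free a port by killing whatever is listening on it.
PORT="${PORT:-11435}"
PIDS=$(lsof -t -i :"$PORT" 2>/dev/null)
if [ -n "$PIDS" ]; then
  echo "killing process(es) on port $PORT: $PIDS"
  kill -9 $PIDS
else
  echo "nothing is listening on port $PORT"
fi
```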


Rvnd0m avatar Rvnd0m commented on June 19, 2024

Hi longy2k.
I did Option 2 and killed the PIDs, but it still doesn't show any model, despite the "models reloaded" message appearing in the corner (the dropdown stays empty). I also rebooted the machine, by the way.


longy2k avatar longy2k commented on June 19, 2024

For option 2, is the server successfully running?

Also check if you are on the latest Ollama version (v0.1.38).

If you are on the latest Ollama version, the server is running successfully, and no models are loaded, try uninstalling and reinstalling the plugin.


Rvnd0m avatar Rvnd0m commented on June 19, 2024

Hi Longy2k.

The server is running properly, as I can chat with LLMs via Open WebUI.

My version was .37, but I've just updated it to the latest (.39).

I have also uninstalled and reinstalled the plugin, but it still doesn't work...

I'm sorry for the time you are spending on this... I don't know why the models don't show in the dropdown despite the "Models reloaded" message...

Regards,


Philosophist avatar Philosophist commented on June 19, 2024

I'm having the same problem, using the WSL versions of Stable Diffusion and Ollama.

