Comments (15)

Madd0g commented on July 18, 2024

I tested the latest version (that uses /api/chat with ollama), works great so far

from obsidian-bmo-chatbot.

longy2k commented on July 18, 2024

@twalderman

I set up Ollama to stream the response by default.

I will create an option to turn off the stream so that you do not need to setup an external server.

After that, it should work for you similar to ChatCBT and Obsidian-ollama.
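For context, Ollama's API takes a `stream` flag in the request body, so a no-external-server mode would send `stream: false` and read one complete JSON reply instead of NDJSON chunks. A minimal sketch (the helper name is mine, not the plugin's; the model name is just an example of a locally pulled model):

```javascript
// Build an /api/generate request body; with stream set to false,
// Ollama answers with one complete JSON object instead of a
// sequence of newline-delimited chunks.
function buildGenerateRequest(model, prompt, stream = true) {
  return { model, prompt, stream };
}

// Non-streaming: the whole reply arrives in a single `response` field.
const nonStreamingBody = buildGenerateRequest('mistral', 'Hello', false);
```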

Thanks

Madd0g commented on July 18, 2024

@longy2k glad you saw it :)

Yes, the very last streaming message from Ollama comes with a big JSON object and is usually split into multiple chunks, so this naive approach doesn't handle it (I think there are streaming JSON libraries that simplify this, though). My code prints the error on every failure, so I see those a lot; maybe keep a try/catch in the code like I did.

On an unrelated note, another thing I hacked into my local version (which is why I'm slow at testing the new release) is templating support. I know it's a bit of an overkill feature for a tool that doesn't directly target open-source LLMs, but the tool naively prepends "user:" or "assistant:" to messages, while the LLMs actually work best with very specific templates (<|im_start|>user and such).

I think a config screen for this is overkill, but if there were a simple way to configure it (maybe in a YAML/JSON file), that would be great.

I don't think it's an absolute necessity (it mostly works without it), but for me it expands the usefulness of the tool into more "professional" use cases, and I feel it's more "correct" to use the right template.
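To illustrate the kind of template meant above, here is a minimal ChatML-style formatter. The function name is hypothetical and the exact tokens vary by model; this is a sketch, not the plugin's code:

```javascript
// Render a message list with a ChatML-style template instead of
// naively prepending "user:" / "assistant:". The <|im_start|> and
// <|im_end|> tokens are the ChatML markers mentioned above; other
// models expect different delimiters.
function formatChatML(messages) {
  return messages
    .map((m) => `<|im_start|>${m.role}\n${m.content}<|im_end|>`)
    .join('\n') + '\n<|im_start|>assistant\n';
}
```

The trailing open assistant turn is what prompts the model to continue from there, e.g. `formatChatML([{ role: 'system', content: 'Be brief.' }, { role: 'user', content: 'Hi' }])`.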

longy2k commented on July 18, 2024

I am not sure if your issue is specific to the BMO chatbot plugin or your configurations with Ollama.

Can you check if the server is running with any of these other interfaces:

If you did not receive an error and it works fine on these websites but not with BMO, I will start troubleshooting and see if I can mimic the same behavior and resolve it on my end.

EDIT: There may also be a whitespace error when you initially use it (I will resolve that in the next release); try chatting with the model a few times to see if it runs.

twalderman commented on July 18, 2024

Madd0g commented on July 18, 2024

Hmm... I had a bunch of trouble with Ollama too, and I did some digging.
I was getting JSON.parse errors; not sure if you're seeing similar errors or not, @twalderman.

In my case, a single chunk would sometimes contain more than one JSON object; I figured it must be newline-delimited JSON (NDJSON). So I changed the code to this (sorry if this isn't quite like the original; I patched the shipped bundled version locally):

// around line 4384 of main.js

const chunk = decoder.decode(value, { stream: true }) || "";
// splitting the chunk to parse JSON messages separately
const parts = chunk.split('\n');
for (const part of parts.filter(Boolean)) {
  let parsedChunk;
  try {
    parsedChunk = JSON.parse(part);
  } catch (err) {
    console.error(err);
    console.log('part', part);
    parsedChunk = {response: '{_e_}'};
  }
  const content = parsedChunk.response;
  message += content;
}

for me, it fixed the issue of blank responses.

EDIT: Actually, on longer messages (the last message, which has a long context array), objects sometimes get split across multiple chunks, so my solution does not handle that.
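For what it's worth, the split-across-chunks case can be handled by buffering: append each chunk to a buffer, parse only complete lines, and carry the trailing partial line over to the next chunk. A minimal sketch (`NdjsonBuffer` is a hypothetical helper, not from the plugin):

```javascript
// Minimal NDJSON stream buffer: collects raw chunks and yields only
// complete, parseable JSON lines; a trailing partial line stays in
// the buffer until the next chunk completes it.
class NdjsonBuffer {
  constructor() {
    this.buffer = '';
  }

  // Feed one decoded chunk; returns the parsed objects from all
  // complete lines accumulated so far.
  push(chunk) {
    this.buffer += chunk;
    const lines = this.buffer.split('\n');
    this.buffer = lines.pop(); // last element may be incomplete
    return lines.filter(Boolean).map((line) => JSON.parse(line));
  }
}
```

Feeding `'{"response":"he'` and then `'llo"}\n'` yields nothing on the first call and one object on the second, which is exactly the case the naive per-chunk split misses.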

longy2k commented on July 18, 2024

@Madd0g

I know you have resolved the JSON.parse errors on your end.

Can you check if v1.7.2 is giving you any errors (especially in a new vault)?

Thanks!

longy2k commented on July 18, 2024

@twalderman

I have updated Ollama to fetch from /api/chat instead of /api/generate, can you please let me know if Ollama is working for you now?
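For reference, the main difference between the two endpoints is the payload shape: /api/generate takes a single prompt string, while /api/chat takes a messages array. A sketch of building the /api/chat body (the helper name and model name are examples, not the plugin's code):

```javascript
// Build the request body for Ollama's /api/chat endpoint, which takes
// a messages array, versus /api/generate, which takes a single prompt.
function buildChatRequest(model, messages) {
  return {
    model,        // e.g. "mistral"; any locally pulled model
    messages,     // [{ role: "system" | "user" | "assistant", content: ... }]
    stream: true, // Ollama streams newline-delimited JSON chunks by default
  };
}

// The actual call would then look something like:
// fetch('http://localhost:11434/api/chat', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(buildChatRequest('mistral', [{ role: 'user', content: 'Hi' }])),
// });
```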

Thanks

twalderman commented on July 18, 2024

getting 404 errors

[GIN] 2023/12/11 - 15:35:06 | 404 | 6.709µs | 127.0.0.1 | POST "/api/chat"

longy2k commented on July 18, 2024

@twalderman

Is it version 0.1.14? (Type ollama -v in the terminal.)

Again, if you are not able to run the server with an alternative interface such as https://ollama.twanluttik.com/, the issue may be on your end, which makes it harder to troubleshoot.

I will poke around to see if I can find a solution for you.

twalderman commented on July 18, 2024

Perhaps it is my setup. I tried another plugin, "ChatCBT", and it works as expected on port 11434. https://ollama.twanluttik.com/ is not working either, so I will try a different machine. My M3 is on order, so I will take a look at this when it arrives. Thanks, man.

twalderman commented on July 18, 2024

https://github.com/hinterdupfinger/obsidian-ollama - this one seems to work fine. Are you on Mac or Linux?

twalderman commented on July 18, 2024

That will work fine for me. I don't know if it is related, but I came across this: langchain-ai/langchain#11544

Madd0g commented on July 18, 2024

@longy2k thanks, I'll upgrade ollama and try this new mode.

From just looking at the code: if streaming still has the same symptoms, this JSON fix (split by newline and take the first part) will miss some tokens (note how my fix loops over all split parts). But I haven't tested yet; maybe this isn't a problem anymore.

Will update after I try with the new Ollama.

longy2k commented on July 18, 2024

@Madd0g

Oh I see! I just tested it again in a new vault and some tokens are missing.

They're only missing from the first response, though, which is why I may have overlooked it.

I will make sure it loops over all split parts.

Thanks :)

EDIT: Missing tokens do occur every once in a while (not just in the initial response).
