mongodb / chatbot

MongoDB Chatbot Framework. Powered by MongoDB and Atlas Vector Search.

Home Page: https://mongodb.github.io/chatbot/

License: Apache License 2.0

JavaScript 0.94% HTML 4.98% TypeScript 91.54% CSS 0.46% Dockerfile 0.12% Shell 0.05% MDX 0.03% C 0.09% C++ 0.24% Go 0.27% Java 0.24% PHP 0.28% Python 0.15% Ruby 0.11% Rust 0.21% Scala 0.16% Swift 0.13%
azure-openai chatbot chatgpt mongodb mongodb-atlas openai vector-search rag retrieval-augmented-generation retrieval-augmented-qa

chatbot's People

Contributors

admin-token-bot, branberry, cbush, emetozar, jackyw2018, mongodben, nlarew


chatbot's Issues

Adapting UI for Light Mode Users

The current examples/quickstart package does not cater well to users whose systems prefer light mode. It defaults to a white background, which causes the rest of the text on the Chatbot page to render incorrectly.

A quick fix is to comment out the code in the ui/index.css file at line 6 and lines 56-67:

/* @media (prefers-color-scheme: light) {
  :root {
    color: #213547;
    background-color: #ffffff;
  }
  a:hover {
    color: #747bff;
  }
  button {
    background-color: #f9f9f9;
  }
} */

Add ModalView properties to set the chat icons

I know the UI is documented as "focused on internal MongoDB use cases", but it's already very customizable; the one exception is the chat icons (notably the MongoDB leaf). It would be great to have a property to override these with something else.
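A hypothetical sketch of what such an override could look like; the chatIcon and userIcon props below do not exist in mongodb-chatbot-ui today and are purely illustrative of the request:

import Chatbot, { ModalView } from "mongodb-chatbot-ui";
import { MyCompanyLogo, MyUserAvatar } from "./icons"; // hypothetical local components

function MyChatbot() {
  return (
    <Chatbot serverBaseUrl="http://localhost:3000/api/v1">
      {/* Proposed props (not currently supported): replace the MongoDB leaf
          and the default user avatar with custom elements. */}
      <ModalView chatIcon={<MyCompanyLogo />} userIcon={<MyUserAvatar />} />
    </Chatbot>
  );
}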

How do you install this?

The following packages in mongodb-chatbot-ui are not on npm:

    "@lg-chat/avatar": "^2.0.6",
    "@lg-chat/chat-disclaimer": "^2.0.0",
    "@lg-chat/chat-window": "^1.0.4",
    "@lg-chat/fixed-chat-window": "^1.1.1",
    "@lg-chat/input-bar": "^3.2.6",
    "@lg-chat/leafygreen-chat-provider": "^1.0.2",
    "@lg-chat/message": "^2.0.8",
    "@lg-chat/message-feed": "^2.0.7",
    "@lg-chat/message-prompts": "^1.0.2",
    "@lg-chat/message-rating": "^1.1.3",

This results in auth errors during npm install.

.npmrc has these commented-out lines:

//artifactory.corp.mongodb.com/artifactory/api/npm/leafygreen-ui/:_authToken=${LG_ARTIFACTORY_TOKEN}
//artifactory.corp.mongodb.com/artifactory/api/npm/leafygreen-ui/:email=${LG_ARTIFACTORY_EMAIL}
//artifactory.corp.mongodb.com/artifactory/api/npm/leafygreen-ui/:always-auth=true

//registry.npmjs.org/:_authToken=${MONGODB_EAI_NPM_TOKEN}

Is this configuration required, and if so, where do you get these tokens?

Info retrieval and LLM function calling with parameter

I successfully added function calling to my LLM (using mongodb-chatbot-server).
On the call to /conversations (and /conversations/:id/messages), I pass a user ID parameter.
I store it as customData in the conversations collection.
When the user's query is a personal request, I am able to have the LLM invoke my function.
However, I would like access to that context (the user ID) so I can query another MongoDB collection (or a SQL database).
Do you have a best-practice approach?
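For what it's worth, here is a minimal sketch of what the function handler could do, assuming the conversation document already carries customData.userId and using the plain MongoDB driver; the collection and field names are illustrative:

import { MongoClient, ObjectId } from "mongodb";

// Hypothetical helper invoked by the LLM function call. `conversationId`
// comes from the /conversations/:id/messages request context.
async function getUserContextData(client: MongoClient, conversationId: string) {
  const db = client.db("chatbot");

  // Read back the conversation the server persisted and pull the user ID
  // that was stored as customData when the conversation was created.
  const conversation = await db
    .collection("conversations")
    .findOne({ _id: new ObjectId(conversationId) });
  const userId = conversation?.customData?.userId as string | undefined;
  if (!userId) {
    return null;
  }

  // Use the user ID to query application data from another collection
  // (or any other data source, such as a SQL database).
  return db.collection("userProfiles").find({ userId }).toArray();
}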

Cannot find the LG packages

When installing the sample code, I get the following:
npm ERR! code E404
npm ERR! 404 Not Found - GET https://registry.npmjs.org/@lg-chat%2fchat-disclaimer - Not found
npm ERR! 404
npm ERR! 404 '@lg-chat/chat-disclaimer@^2.0.0' is not in this registry.
npm ERR! 404
npm ERR! 404 Note that you can also install from a
npm ERR! 404 tarball, folder, http url, or git url.

npm ERR! A complete log of this run can be found in: /Users/willesclusa/.npm/_logs/2024-02-07T03_17_36_566Z-debug-0.log

This applies to all of the @lg dependencies.

IP error when deploying mongodb-chatbot-server

When deploying mongodb-chatbot-server (onto Azure App Service), the server deploys fine, but it generates this error when calling the /conversations API:

"error": "The request has an invalid IP address: xxx.xxx.xxx.xxx:54295"

where xxx.xxx.xxx.xxx is my public IP address.
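This looks like a proxy issue rather than a bug in the chatbot code: Azure App Service's front end forwards the client address with a port appended (x.x.x.x:54295), which the server's IP validation then rejects. A hedged workaround sketch, assuming you can configure the Express app that hosts the chatbot routes:

import express from "express";

const app = express();

// Trust Azure's front-end proxy so the forwarded client address is used.
app.set("trust proxy", true);

// Strip the ":port" suffix that Azure appends to X-Forwarded-For entries
// before the chatbot server's IP validation runs (IPv4 addresses only).
app.use((req, _res, next) => {
  const forwarded = req.headers["x-forwarded-for"];
  if (typeof forwarded === "string") {
    req.headers["x-forwarded-for"] = forwarded
      .split(",")
      .map((entry) => entry.trim().replace(/:\d+$/, ""))
      .join(", ");
  }
  next();
});

// ...then mount the mongodb-chatbot-server routes on this app as usual.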

mongodb-chatbot-evaluation ready to use?

Hi,

Is mongodb-chatbot-evaluation ready to use?
I installed it and tried an out-of-the-box

tsc --module commonjs --target es2017 --outDir build src/eval/eval.config.ts

but eval.config.ts cannot import most functions from 'mongodb-chatbot-evaluation':

Module '"mongodb-chatbot-evaluation"' has no exported member 'makeGenerateConversationData'.ts(2305)

Getting access to usage tokens

We might implement a limit on the number of calls made (based on user account).
We would also like to measure and track LLM usage.
For that, I would like the usage response from the LLM call (the number of tokens used by the LLM) to be accessible in the API response, or possibly stored in the conversations collection.
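As far as I can tell the framework does not surface this today, but here is a minimal sketch of pulling token usage straight from the Azure OpenAI client and persisting it; the deployment name and the llm_usage collection are illustrative:

import { OpenAIClient } from "@azure/openai";
import { MongoClient } from "mongodb";

async function recordUsage(
  openAiClient: OpenAIClient,
  mongoClient: MongoClient,
  conversationId: string
) {
  const result = await openAiClient.getChatCompletions("my-gpt-deployment", [
    { role: "user", content: "Hello!" },
  ]);

  // The completion result carries token counts for the call.
  const usage = result.usage; // promptTokens, completionTokens, totalTokens

  // Persist the counts alongside the conversation ID so per-user limits and
  // reporting can be computed later.
  await mongoClient.db("chatbot").collection("llm_usage").insertOne({
    conversationId,
    usage,
    createdAt: new Date(),
  });

  return usage;
}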

How to pass parameters from <Chatbot>?

When using mongodb-chatbot-ui as:
<Chatbot
user={{ name: 'eric' }}
isExperimental={false}
darkMode={false}
serverBaseUrl="http://localhost:3000/api/v1"
>

How can I pass parameters (such as a userID or app parameters) to the mongodb-chatbot-server?
I am using middleware to extract parameters, like:

const conversation = await res.locals.conversations.findById( req.params.conversationId )

but req.params is always empty.
How can I add my own property?

I redefined the fetch function to add my own header, but it feels a little "hacky".
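The custom-header approach may well be the least hacky option available. One sketch of the server side, assuming you can register Express middleware before the conversations routes; the header name and res.locals property are illustrative. Note that req.params only ever holds route path parameters such as :conversationId, which is why it shows up empty.

import { Request, Response, NextFunction } from "express";

// Reads the illustrative "x-app-user-id" header set by the client's custom
// fetch and exposes it to downstream handlers via res.locals.
function extractUserContext(req: Request, res: Response, next: NextFunction) {
  const userId = req.header("x-app-user-id");
  if (userId) {
    res.locals.userId = userId;
  }
  next();
}

// Register it before the chatbot routes, e.g. app.use(extractUserContext);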

Use Claude3 LLM

I wanted to test with the Claude 3 LLM.
Out of the box, the framework didn't work; I just wanted to share my changes.

I had to change LangchainChatLlm.ts for answerQuestionAwaited, with:

...
      const prompts = ChatPromptTemplate.fromMessages(
        messages.map(m => messageBaseMessagePromptTemplateLike(m))
      )
      const chain = prompts.pipe(chatModel)
      const res = await chain.invoke({})
...

function messageBaseMessagePromptTemplateLike(
  message: OpenAiChatMessage
): BaseMessagePromptTemplateLike {
  return [message.role, message.content ?? '']
}

Instrumenting with LangSmith

Some feedback:
I am looking into LangSmith as a tool for monitoring and auditing queries.
One of the steps is to retrieve the result from openAiClient.getChatCompletions and send it to LangSmith.

In my project, I have copied all of the mongodb-chatbot-server package files so I can customize and debug as needed.
I then added code to the answerQuestionAwaited method to interact with LangSmith.

I am wondering if this could be done another way (so I could eventually just use the mongodb-chatbot-server package "as is")?
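One alternative that avoids forking the package, sketched under the assumption that the server config accepts anything implementing its ChatLlm interface: wrap the LLM you already construct in a thin decorator that forwards the call and reports the completion to LangSmith (or any other sink). The interface below is deliberately simplified for illustration.

// Simplified stand-in for the ChatLlm shape the server config expects.
interface ChatLlmLike {
  answerQuestionAwaited(args: { messages: unknown[] }): Promise<unknown>;
}

// Decorator that adds observability without touching the wrapped implementation.
function withTracing(
  inner: ChatLlmLike,
  report: (data: { messages: unknown[]; response: unknown }) => Promise<void>
): ChatLlmLike {
  return {
    async answerQuestionAwaited(args) {
      const response = await inner.answerQuestionAwaited(args);
      // Fire-and-forget so a tracing failure never breaks the chat response.
      report({ messages: args.messages, response }).catch(() => {});
      return response;
    },
  };
}

// Usage sketch: const llm = withTracing(yourExistingChatLlm, sendToLangSmith);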

`ApiConversations` service

Create an ApiConversations service that manages the interface between the server and the database to persist API conversations.

It should be similar to the docs chat ConversationsService, though it will need to take into account the OpenAI functions (and maybe system prompt changes?) that are available to subsequent messages.
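A rough sketch of what the service surface might look like; the names and shapes below are illustrative, modeled loosely on the existing ConversationsService, with room for the per-conversation OpenAI function definitions mentioned above:

interface ApiFunctionDefinition {
  name: string;
  description?: string;
  parameters: Record<string, unknown>; // JSON schema for the function arguments
}

interface ApiMessage {
  role: "system" | "user" | "assistant" | "function";
  content: string;
  createdAt: Date;
}

interface ApiConversation {
  _id: string;
  messages: ApiMessage[];
  // Functions (and any system prompt overrides) available to subsequent messages.
  availableFunctions: ApiFunctionDefinition[];
  customSystemPrompt?: string;
  createdAt: Date;
}

// Illustrative service that persists API conversations in MongoDB.
interface ApiConversationsService {
  create(initial: Omit<ApiConversation, "_id" | "createdAt">): Promise<ApiConversation>;
  addMessage(conversationId: string, message: ApiMessage): Promise<ApiConversation>;
  findById(conversationId: string): Promise<ApiConversation | null>;
}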

Is VECTOR_SEARCH_INDEX_NAME customizable?

mongodb-chatbot-server uses VECTOR_SEARCH_INDEX_NAME from the .env variables, but mongodb-rag-ingest does not.
I wanted to set up two environments, one with text-embedding-ada-002 and one with text-embedding-3-large, so I could compare their performance. I was wondering if I could use VECTOR_SEARCH_INDEX_NAME for that purpose (basically having two embedded_content collections, each with its own search index).

OPENAI_EMBEDDING_MODEL="text-embedding-3-small" not working?

My system works properly when using:
OPENAI_EMBEDDING_MODEL="text-embedding-ada-002"
for both ingest and the server.

However, when I use:
OPENAI_EMBEDDING_MODEL="text-embedding-3-small"

I always get the message associated with NO_RELEVANT_CONTENT.
The log shows:

{"level":"info","message":"No matching content found","reqId":"65cc405d974b84e18cac0060"}

Do I need to change any setting in the Config?
