A full-stack FastAPI application with llama index integrated
Introduction
A local-knowledge-base-augmented LLM designed to serve millions of users, built on top of llama index, FastAPI, and MongoDB
When a user asks a question, the bot first tries to match it against the local knowledge base and answer from there.
The local knowledge base is a CSV file of question/answer pairs, which llama index embeds (vectorizes) on first run.
If no good match is found, the bot calls OpenAI's ChatGPT API to get the answer and inserts the new question/answer
pair into the index, so next time the bot can answer a similar question from the local knowledge base.
If the question is not relevant to the topic (in our case, Golf), the bot calls OpenAI's ChatGPT API to
get the answer.
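The lookup-then-fallback flow above can be sketched in plain Python. This is an illustrative sketch, not the repo's actual code: `embed()`, `call_chatgpt()`, and the `0.8` similarity threshold are all assumed stand-ins for llama index's real embedding model, the OpenAI API call, and whatever cutoff the app uses.

```python
import math

SIMILARITY_THRESHOLD = 0.8  # assumed cutoff for a "good enough" local match

def embed(text: str) -> list[float]:
    # Toy bag-of-letters vector standing in for a real embedding model.
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def call_chatgpt(question: str) -> str:
    # Stub for the OpenAI ChatGPT API fallback.
    return f"LLM answer for: {question}"

knowledge_base = [
    {"question": "What is a birdie in golf?",
     "answer": "A score of one stroke under par on a hole."},
]

def answer(question: str) -> str:
    q_vec = embed(question)
    best = max(knowledge_base,
               key=lambda row: cosine(q_vec, embed(row["question"])),
               default=None)
    if best and cosine(q_vec, embed(best["question"])) >= SIMILARITY_THRESHOLD:
        return best["answer"]  # served from the local knowledge base
    new_answer = call_chatgpt(question)
    # Cache the new pair so a similar question hits locally next time.
    knowledge_base.append({"question": question, "answer": new_answer})
    return new_answer
```

In the real app, the vector search and insertion would go through llama index rather than a linear scan over a list; the control flow (match locally, otherwise call the API and cache the result) is the point of the sketch.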
When asking a question that is in the knowledge base
When asking a question that is not relevant to the topic
More details
The bot uses FastAPI as the web framework, llama index as the search engine, and MongoDB as the metadata storage.
During the first run, the CSV file is ingested and embedded by llama index into a vector store, and the metadata is stored in
MongoDB.
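The first-run ingestion can be sketched as follows. This is a minimal sketch under stated assumptions: the two-column CSV layout, the toy embedding, and the dict playing the role of the MongoDB metadata collection are all illustrative stand-ins, not the project's actual schema or identifiers.

```python
import csv
import io

# Sample content standing in for the repo's question/answer CSV file.
SAMPLE_CSV = """question,answer
What is par?,The expected number of strokes for a hole.
What is a caddie?,A person who carries a golfer's clubs.
"""

def toy_embed(text: str) -> list[float]:
    # Placeholder embedding; the real app delegates this to llama index.
    return [float(len(text)), float(sum(map(ord, text)) % 97)]

def ingest(csv_text: str):
    vector_store = []    # stands in for the llama index vector store
    metadata_store = {}  # stands in for the MongoDB metadata collection
    for i, row in enumerate(csv.DictReader(io.StringIO(csv_text))):
        vector_store.append({
            "id": i,
            "vector": toy_embed(row["question"]),
            "answer": row["answer"],
        })
        metadata_store[i] = {"question": row["question"], "source": "csv"}
    return vector_store, metadata_store
```

Keeping the vectors and the metadata in separate stores mirrors the split described above: llama index owns the embeddings used for similarity search, while MongoDB holds the document metadata.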