azure-openai-custom-chatgpt's Introduction

Azure OpenAI Intelligent Chatbot

An intelligent, conversational, context-aware chatbot that answers questions over your own documents.

This repository uses Meta's FAISS in-memory vector index, the LangChain Python library, and Azure OpenAI's embedding and completion services.

Prerequisites

You should have an Azure account with a provisioned Azure OpenAI resource.

Workflow

Diagram

Creating a Vector Index (dashed arrows flow)

  1. Load the data with a document loader.
  2. Parse and split the data into smaller text chunks with a text splitter.
  3. Send the chunks to an embedding model to generate embedding vectors.
  4. Store the embeddings in a vector database, which allows for efficient organization and access to vector data.
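
The indexing flow above can be sketched in plain Python. This is a toy stand-in, not the project's implementation: a word-count splitter, a hash-based "embedding", and a plain list take the place of LangChain's text splitter, Azure OpenAI's embedding model, and the FAISS index, so only the data flow is illustrated.

```python
# Toy sketch of the indexing flow (steps 1-4). Hypothetical stand-ins:
# a word-count splitter, a hash-based "embedding", and a plain list
# replace LangChain's splitter, Azure OpenAI embeddings, and FAISS.
import hashlib

def split_into_chunks(text, chunk_size=50, overlap=10):
    """Split text into overlapping word chunks (step 2)."""
    words = text.split()
    chunks, start = [], 0
    while start < len(words):
        chunks.append(" ".join(words[start:start + chunk_size]))
        start += chunk_size - overlap
    return chunks

def embed(text, dims=8):
    """Deterministic toy embedding -- stand-in for the model (step 3)."""
    digest = hashlib.sha256(text.encode()).digest()
    return [b / 255 for b in digest[:dims]]

def build_index(documents):
    """Store (vector, chunk) pairs -- the 'vector database' (step 4)."""
    store = []
    for doc in documents:                 # step 1: loaded documents
        for chunk in split_into_chunks(doc):
            store.append((embed(chunk), chunk))
    return store
```

The overlap between consecutive chunks mirrors what real text splitters do: it keeps sentences that straddle a chunk boundary retrievable from at least one chunk.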

Generating a Completion (solid arrows flow)

  1. Use the LLM to rephrase the question, adding context from the conversation.
  2. Once you have a full, standalone question, perform a semantic similarity search in the vector database.
  3. Retrieve the raw text that matches the resulting vectors (vector stores keep the original text alongside each vector).
  4. Pass that context, along with the user's question, to the LLM to generate the final answer.
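
The retrieval step (steps 2-3 above) boils down to ranking stored vectors by cosine similarity, which is what the FAISS lookup does far more efficiently. A minimal pure-Python sketch, assuming the `(vector, chunk_text)` store from the indexing sketch:

```python
# Toy sketch of the retrieval flow: embed the (already rephrased)
# question, rank stored chunks by cosine similarity, and return the
# raw text of the best matches as context for the LLM.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def similarity_search(store, query_vector, k=3):
    """store: list of (vector, chunk_text) pairs; returns top-k chunks."""
    ranked = sorted(store, key=lambda item: cosine(item[0], query_vector),
                    reverse=True)
    return [chunk for _, chunk in ranked[:k]]
```

A real index avoids the full sort by partitioning or quantizing the vector space, but the ranking criterion is the same.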

Environment

  1. Clone the repository:

    git clone https://github.com/hanit-com/azure-openai-custom-chatgpt.git
    
  2. Create a Python environment:

    python -m venv .venv
    
  3. Activate the environment:

    source .venv/bin/activate
    
  4. Install the dependencies:

    pip install -r requirements.txt
    

Configuration

  1. Create a '.env' file with the following variables. Fill in the API key and the endpoint of your Azure OpenAI resource: in the Azure portal, open the resource, go to "Keys and Endpoint", and use either "KEY 1" or "KEY 2" together with the "Endpoint" field.

    OPENAI_API_KEY=
    OPENAI_API_BASE=
    OPENAI_API_VERSION=2023-05-15
    
  2. Add your data files to the 'context_data/data' directory.

  3. Uncomment the call to insert_data_to_store() so that the first run builds the vector index. After the program has run once, comment the call out again so the index is not rebuilt.
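
For reference, loading the three variables above can be sketched with a minimal hand-rolled parser (the project may well use python-dotenv instead; this sketch only assumes simple KEY=VALUE lines with no quoting):

```python
# Minimal .env loader: reads simple KEY=VALUE lines into os.environ.
# Assumes no quoting or multi-line values; comments and blanks are
# skipped, and later assignments override earlier ones.
import os

def load_env(path=".env"):
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip()
```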

Usage

Run:

python azure_openai.py

To terminate the program, type exit at the prompt.
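
The interactive loop terminated by typing exit can be sketched as follows; `answer` is a hypothetical stand-in for the project's conversational chain call, and the inputs are passed as an iterable so the loop is testable without stdin:

```python
# Minimal prompt loop: read questions until the user types "exit".
# `answer(question, history)` is a hypothetical stand-in for the
# conversational retrieval chain; history accumulates (q, a) pairs
# so follow-up questions can be rephrased with context.
def repl(prompts, answer):
    """prompts: iterable of user inputs (stdin in the real program)."""
    history = []
    for question in prompts:
        if question.strip().lower() == "exit":
            break
        reply = answer(question, history)
        history.append((question, reply))
        print(reply)
    return history
```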

Prompt Example

(Using data from the Wikipedia page Coronation of Charles III and Camilla).

Note that questions do not have to be self-contained; the chain rephrases each one with context from the conversation before searching.

Prompt: When the coronation took place?
The coronation took place on May 6, 2023.
Prompt: Where?
The coronation took place at Westminster Abbey.
Prompt: Why there?
The coronation took place at Westminster Abbey because it is a traditional location for British coronations, with the last 40 monarchs being crowned there since William the Conqueror in 1066.

azure-openai-custom-chatgpt's People

Contributors

hanit-com

Forkers

taltaf913

azure-openai-custom-chatgpt's Issues

getting error in faiss

I am trying to run main.py and getting the following error. It would be great if you could help solve it.

(base) (Azure_openAI) l006531@C02CQ3NLMD6N azure-openai-custom-chatgpt % python main.py
Prompt: When the coronation took place?

Entering new StuffDocumentsChain chain...

Entering new LLMChain chain...
Prompt after formatting:
Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer.

Anointing

Charles removed his robe of state and was seated on the Coronation Chair.[91][92] He then was anointed with holy oil by the Archbishop of Canterbury, using the ampulla and a medieval spoon, the latter the oldest part of the coronation regalia. The anointing emphasised the spiritual role of the sovereign. It was a private part of the service; as in 1953 it was not televised, and Charles was concealed by a screen. During this the choir sang the anthem Zadok the Priest.[93]

Investment and crowning

========================================== New Doc ==========================================

The coronation elicited both celebrations and protests in the United Kingdom and the other Commonwealth realms. Celebrations in the United Kingdom included street parties, volunteering, special commemorative church services, and a concert at Windsor Castle on 7 May. The ceremony was watched by a peak television audience of 20.4 million in the United Kingdom. Surveys carried out in April 2023 suggested that the British public was ambivalent towards the event and its funding; the events in London and Windsor

========================================== New Doc ==========================================

Service and procession

Order of service for the coronation

The organisation of the coronation was the responsibility of the earl marshal, Edward Fitzalan-Howard.[8] A committee of privy counsellors arranged the event.[9][7] On 11 October 2022, the date of the coronation was announced as 6 May 2023, a choice made to ensure sufficient time to mourn the death of Queen Elizabeth II before holding the ceremony.[10][7]

========================================== New Doc ==========================================

Enthronement and homage

Charles moved to the throne (originally made for George VI in 1937) and the Archbishop of Canterbury and William, Prince of Wales, offered him their fealty.[89][79] The Archbishop of Canterbury then invited the people of the United Kingdom and the other Commonwealth realms to swear allegiance to the King, the first time this has occurred.[79][95]

Coronation of the Queen

Queen Mary's Crown (here depicted in its original form) was used to crown Queen Camilla

Question: When the coronation took place?
Helpful Answer:

Traceback (most recent call last):
  File "/Users/l006531/Azure_openAI/azure-openai-custom-chatgpt/main.py", line 31, in
    completion = run_conversational_chain(prompt, chat_history, retriever)
  File "/Users/l006531/Azure_openAI/azure-openai-custom-chatgpt/chain_utils.py", line 53, in run_conversational_chain
    response = support_qa.run({"question": question, "chat_history": history})
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chains/base.py", line 236, in run
    return self(args[0], callbacks=callbacks)[self.output_keys[0]]
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chains/base.py", line 140, in __call__
    raise e
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chains/base.py", line 134, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chains/conversational_retrieval/base.py", line 110, in _call
    answer = self.combine_docs_chain.run(
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chains/base.py", line 239, in run
    return self(kwargs, callbacks=callbacks)[self.output_keys[0]]
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chains/base.py", line 140, in __call__
    raise e
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chains/base.py", line 134, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chains/combine_documents/base.py", line 84, in _call
    output, extra_return_dict = self.combine_docs(
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chains/combine_documents/stuff.py", line 87, in combine_docs
    return self.llm_chain.predict(callbacks=callbacks, **inputs), {}
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chains/llm.py", line 213, in predict
    return self(kwargs, callbacks=callbacks)[self.output_key]
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chains/base.py", line 140, in __call__
    raise e
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chains/base.py", line 134, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chains/llm.py", line 69, in _call
    response = self.generate([inputs], run_manager=run_manager)
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chains/llm.py", line 79, in generate
    return self.llm.generate_prompt(
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chat_models/base.py", line 143, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks)
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chat_models/base.py", line 91, in generate
    raise e
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chat_models/base.py", line 83, in generate
    results = [
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chat_models/base.py", line 84, in
    self._generate(m, stop=stop, run_manager=run_manager)
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chat_models/openai.py", line 296, in _generate
    response = self.completion_with_retry(messages=message_dicts, **params)
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chat_models/openai.py", line 257, in completion_with_retry
    return _completion_with_retry(**kwargs)
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/tenacity/__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/tenacity/__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/tenacity/__init__.py", line 314, in iter
    return fut.result()
  File "/Users/l006531/.pyenv/versions/3.10.6/lib/python3.10/concurrent/futures/_base.py", line 451, in result
    return self.__get_result()
  File "/Users/l006531/.pyenv/versions/3.10.6/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/tenacity/__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/langchain/chat_models/openai.py", line 255, in _completion_with_retry
    return self.client.create(**kwargs)
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/openai/api_requestor.py", line 298, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/openai/api_requestor.py", line 700, in _interpret_response
    self._interpret_response_line(
  File "/Users/l006531/chatgpt5/lib/python3.10/site-packages/openai/api_requestor.py", line 763, in _interpret_response_line
    raise self.handle_error_response(
openai.error.InvalidRequestError: The API deployment for this resource does not exist. If you created the deployment within the last 5 minutes, please wait a moment and try again.
(base) (Azure_openAI) l006531@C02CQ3NLMD6N azure-openai-custom-chatgpt %

Recommend Projects

  • React photo React

    A declarative, efficient, and flexible JavaScript library for building user interfaces.

  • Vue.js photo Vue.js

    ๐Ÿ–– Vue.js is a progressive, incrementally-adoptable JavaScript framework for building UI on the web.

  • Typescript photo Typescript

    TypeScript is a superset of JavaScript that compiles to clean JavaScript output.

  • TensorFlow photo TensorFlow

    An Open Source Machine Learning Framework for Everyone

  • Django photo Django

    The Web framework for perfectionists with deadlines.

  • D3 photo D3

    Bring data to life with SVG, Canvas and HTML. ๐Ÿ“Š๐Ÿ“ˆ๐ŸŽ‰

Recommend Topics

  • javascript

    JavaScript (JS) is a lightweight interpreted programming language with first-class functions.

  • web

    Some thing interesting about web. New door for the world.

  • server

    A server is a program made to process requests and deliver data to clients.

  • Machine learning

    Machine learning is a way of modeling and interpreting data that allows a piece of software to respond intelligently.

  • Game

    Some thing interesting about game, make everyone happy.

Recommend Org

  • Facebook photo Facebook

    We are working to build community through open source technology. NB: members must have two-factor auth.

  • Microsoft photo Microsoft

    Open source projects and samples from Microsoft.

  • Google photo Google

    Google โค๏ธ Open Source for everyone.

  • D3 photo D3

    Data-Driven Documents codes.