
chatgpt-memory's People

Contributors

nps1ngh, shahrukhx01

chatgpt-memory's Issues

Model name change

When changing the model name to model_name: str = "gpt-4", we get an error. The change was made in chatgpt_memory/llm_client/openai/conversation/config.py.
Why are we getting this error?
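
For reference, a hypothetical sketch of the kind of change described; the class and the other field are assumptions, not the repo's actual config.py:

from dataclasses import dataclass

@dataclass
class ChatGPTConfig:            # hypothetical stand-in for the class in config.py
    model_name: str = "gpt-4"   # the change described above
    temperature: float = 0.0    # illustrative only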

Retaining conversation IDs between process executions?

Hello, I just started experimenting with this library, and from what I can tell, the conversation IDs are only valid during an active process session: if the process/application is restarted, there no longer seems to be any context when re-using any of the previous conversation IDs.

Is this expected behavior? Is there a way to make previous conversation IDs reusable across process executions?
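
A minimal sketch of what reuse across restarts could look like, assuming histories are persisted in Redis. The names redis_datastore, get_all_conversation_ids, and chatgpt_client.converse mirror the examples and tracebacks elsewhere on this page, but the exact construction and signatures are assumptions:

# Assumption: redis_datastore and chatgpt_client are built as in examples/simple_usage.py.
existing_ids = redis_datastore.get_all_conversation_ids()    # ids persisted in Redis, if any
conversation_id = existing_ids[0] if existing_ids else None  # reuse an old id instead of minting a new one

response = chatgpt_client.converse(
    message="What did we talk about before the restart?",
    conversation_id=conversation_id,  # assumption: passing None starts a fresh conversation
)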

ModuleNotFoundError: No module named 'redis'

Any thoughts on this error I'm getting? I can't seem to wrap my head around it. I set up a Redis Cloud environment, copied .env.example to .env, and updated all the values.

>  uvicorn rest_api:app --host localhost --port 8000
Traceback (most recent call last):
  File "/Users/og/Library/Python/3.9/bin/uvicorn", line 8, in <module>
    sys.exit(main())
  File "/Users/og/Library/Python/3.9/lib/python/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/Users/og/Library/Python/3.9/lib/python/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/Users/og/Library/Python/3.9/lib/python/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/og/Library/Python/3.9/lib/python/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/Users/og/Library/Python/3.9/lib/python/site-packages/uvicorn/main.py", line 403, in main
    run(
  File "/Users/og/Library/Python/3.9/lib/python/site-packages/uvicorn/main.py", line 568, in run
    server.run()
  File "/Users/og/Library/Python/3.9/lib/python/site-packages/uvicorn/server.py", line 59, in run
    return asyncio.run(self.serve(sockets=sockets))
  File "/Applications/Xcode.app/Contents/Developer/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/Applications/Xcode.app/Contents/Developer/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/asyncio/base_events.py", line 642, in run_until_complete
    return future.result()
  File "/Users/og/Library/Python/3.9/lib/python/site-packages/uvicorn/server.py", line 66, in serve
    config.load()
  File "/Users/og/Library/Python/3.9/lib/python/site-packages/uvicorn/config.py", line 471, in load
    self.loaded_app = import_from_string(self.app)
  File "/Users/og/Library/Python/3.9/lib/python/site-packages/uvicorn/importer.py", line 24, in import_from_string
    raise exc from None
  File "/Users/og/Library/Python/3.9/lib/python/site-packages/uvicorn/importer.py", line 21, in import_from_string
    module = importlib.import_module(module_str)
  File "/Applications/Xcode.app/Contents/Developer/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 850, in exec_module
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "/Users/og/src/ufr/sandbox/chatgpt-memory/rest_api.py", line 6, in <module>
    from chatgpt_memory.datastore import RedisDataStore, RedisDataStoreConfig
  File "/Users/og/src/ufr/sandbox/chatgpt-memory/chatgpt_memory/datastore/__init__.py", line 2, in <module>
    from chatgpt_memory.datastore.redis import RedisDataStore  # noqa: F401
  File "/Users/og/src/ufr/sandbox/chatgpt-memory/chatgpt_memory/datastore/redis.py", line 5, in <module>
    import redis
ModuleNotFoundError: No module named 'redis'
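
The traceback shows uvicorn running from the user-level Python 3.9 installation, so the redis package is most likely missing from that specific interpreter. A quick, hypothetical check (not part of the repo):

import importlib.util
import sys

if importlib.util.find_spec("redis") is None:
    # Install it into the same interpreter uvicorn runs under, e.g.:
    #   python -m pip install redis
    # or install the project's declared dependencies rather than individual packages.
    sys.exit(f"redis is not installed for {sys.executable}")

print(f"redis is available for {sys.executable}")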

Generates irrelevant / too much conversation?

Sometimes, when I simply type "hi" as an input, the response is a whole conversation generated between the human and the assistant. I only want the assistant's responses to be generated. Why might this be happening?
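
One common cause with completion-style models is the lack of stop sequences, so the model keeps writing both sides of the dialogue. A hedged sketch using the legacy openai client (< 1.0); the prompt format and stop strings are assumptions, not the repo's actual prompt:

import openai

openai.api_key = "sk-..."  # placeholder

prompt = "Human: hi\nAssistant:"
response = openai.Completion.create(
    model="text-davinci-003",
    prompt=prompt,
    max_tokens=256,
    stop=["\nHuman:"],  # stop before the model fabricates the next user turn
)
print(response["choices"][0]["text"])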

gpt-4 model not supported

Getting this error when I try using "gpt-4" as the model:
openai.error.InvalidRequestError: This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?
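
For context, gpt-4 is a chat model, so with the legacy openai client (< 1.0) it must be called through the chat completions endpoint rather than the completions endpoint named in the error. An illustrative sketch, not the repo's code:

import openai

openai.api_key = "sk-..."  # placeholder

# v1/completions – only for completion models such as text-davinci-003:
completion = openai.Completion.create(model="text-davinci-003", prompt="Hello")

# v1/chat/completions – required for chat models such as gpt-3.5-turbo and gpt-4:
chat = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}],
)
print(chat["choices"][0]["message"]["content"])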

update pip package

Hello,

I'm starting to use your package, but it doesn't correspond to the version on PyPI. I've cloned the repository and am modifying the different files myself for the moment; could you update the package?

https://pypi.org/project/chatgpt-memory/0.0.1/

Are you going to keep updating it, or do you not have enough volunteers for maintenance?

I'd like to know this before going any further on my own.

Thank you in advance for your answers.

add dotenv template

For ease of use, we could later provide a .env template for reference.

  • add the template file
  • read the env file into the environment if it exists (see the sketch after this list)
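
A minimal sketch of the second point, assuming python-dotenv and hypothetical key names (OPENAI_API_KEY, REDIS_HOST, REDIS_PORT, REDIS_PASSWORD) for the template:

import os
from pathlib import Path

from dotenv import load_dotenv  # provided by the python-dotenv package

env_file = Path(".env")
if env_file.exists():  # only read the env file if it exists
    load_dotenv(env_file)

OPENAI_API_KEY = os.getenv("OPENAI_API_KEY", "")
REDIS_HOST = os.getenv("REDIS_HOST", "localhost")
REDIS_PORT = int(os.getenv("REDIS_PORT", "6379"))
REDIS_PASSWORD = os.getenv("REDIS_PASSWORD", "")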

Important Question / History by user

Congratulations on the code and the project.
I have a question about keeping contexts separate per user.
I would like to know how to give each user who interacts with the application their own message history, so that one user's conversation history does not mix with another's.
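
A hedged sketch of one approach: keep one conversation_id per user and always pass that user's id back in. It assumes the client exposes converse(message, conversation_id) as in the usage examples; the exact API is an assumption:

# Assumption: chatgpt_client is built as in examples/simple_usage.py.
user_conversations: dict[str, str] = {}  # user_id -> conversation_id

def converse_for_user(user_id: str, message: str):
    response = chatgpt_client.converse(
        message=message,
        conversation_id=user_conversations.get(user_id),  # assumption: None starts a new conversation
    )
    user_conversations[user_id] = response.conversation_id  # remember it for this user's next turn
    return response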

Add detailed Docs

The following is part of the acceptance criteria:

  • add a user flow diagram
  • add example snippets
  • add a description of the usage

Do I need to initialize the Redis database first?

Traceback (most recent call last):
  File "D:\Memory\RedisMemeory\chatgpt-memory\examples\simple_usage.py", line 31, in <module>
    memory_manager = MemoryManager(datastore=redis_datastore, embed_client=embed_client, topk=1)
  File "D:\Memory\RedisMemeory\chatgpt-memory\chatgpt_memory\memory\manager.py", line 34, in __init__
    Memory(conversation_id=conversation_id) for conversation_id in datastore.get_all_conversation_ids()
  File "D:\Memory\RedisMemeory\chatgpt-memory\chatgpt_memory\datastore\redis.py", line 132, in get_all_conversation_ids
    result_documents = self.redis_connection.ft().search(query).docs
  File "E:\Coding\Miniconda3\envs\mem\lib\site-packages\redis\commands\search\commands.py", line 420, in search
    res = self.execute_command(SEARCH_CMD, *args)
  File "E:\Coding\Miniconda3\envs\mem\lib\site-packages\redis\client.py", line 1269, in execute_command
    return conn.retry.call_with_retry(
  File "E:\Coding\Miniconda3\envs\mem\lib\site-packages\redis\retry.py", line 46, in call_with_retry
    return do()
  File "E:\Coding\Miniconda3\envs\mem\lib\site-packages\redis\client.py", line 1270, in <lambda>
    lambda: self._send_command_parse_response(
  File "E:\Coding\Miniconda3\envs\mem\lib\site-packages\redis\client.py", line 1246, in _send_command_parse_response
    return self.parse_response(conn, command_name, **options)
  File "E:\Coding\Miniconda3\envs\mem\lib\site-packages\redis\client.py", line 1286, in parse_response
    response = connection.read_response()
  File "E:\Coding\Miniconda3\envs\mem\lib\site-packages\redis\connection.py", line 905, in read_response
    raise response
redis.exceptions.ResponseError: idx: no such index
How do I create the idx index?
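
For reference, a hedged sketch of creating a RediSearch index named "idx" directly with redis-py (redis-py's ft() defaults to the index name "idx"). The field schema and key prefix are illustrative, not the repo's actual schema, and this requires a Redis server with the RediSearch module (e.g. Redis Stack or Redis Cloud):

import redis
from redis.commands.search.field import TextField, VectorField
from redis.commands.search.indexDefinition import IndexDefinition, IndexType

r = redis.Redis(host="localhost", port=6379)
schema = (
    TextField("conversation_id"),
    TextField("text"),
    VectorField("embedding", "FLAT", {"TYPE": "FLOAT32", "DIM": 1536, "DISTANCE_METRIC": "COSINE"}),
)
r.ft("idx").create_index(
    schema,
    definition=IndexDefinition(prefix=["chat:"], index_type=IndexType.HASH),
)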

Also, I am using Python 3.10.12 in a conda virtual environment, PowerShell, Windows 11 with an Intel x64 CPU. Why can't I use tiktoken for OpenAI?

D:\Memory\RedisMemeory\chatgpt-memory>python .\examples\simple_usage.py
OpenAI tiktoken module is not available for Python < 3.8, Linux ARM64 and AARCH64. Falling back to GPT2TokenizerFast.
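
A quick hypothetical check, not part of the repo: if this runs in the same environment, tiktoken itself is installed and usable there, and the fallback warning is coming from the code that emitted it rather than from the environment.

import tiktoken

enc = tiktoken.get_encoding("gpt2")
print(len(enc.encode("hello world")))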
