Comments (6)
This is really more of a model-design suggestion than an Agnai issue... and, for that matter, an xkcd.com/1425 moment.
Sure, there are bots that you could tell to do this, but they aren't especially likely to do a good job, or to save that much space, and imposing harsh token constraints on how long the summary can be is more likely to make them not do a good job. With current models, you'd most likely just be encouraging repetition, as tends to happen whenever you feed an AI's output back into itself.
Getting the AI to understand the meaning of what it has in context well enough to know which details can be safely discarded (as opposed to randomly pattern-matching and hoping for the best) would represent a significant AI breakthrough.
from agnai.
Agnai already uses RAG for old messages, but it needs another iteration to provide surrounding messages to provide better context for the "relevant" messages.
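As a rough sketch of that iteration, each RAG hit could be expanded to include its chronological neighbors before being inserted into the prompt. All names here (`Message`, `withNeighbors`, the `radius` parameter) are illustrative assumptions, not Agnai's actual API.

```typescript
// Hypothetical sketch: expand each retrieved ("relevant") message with its
// neighboring messages so the model sees surrounding context.
type Message = { id: number; text: string }

function withNeighbors(history: Message[], hitIds: number[], radius = 1): Message[] {
  const keep = new Set<number>()
  for (const id of hitIds) {
    const idx = history.findIndex((m) => m.id === id)
    if (idx === -1) continue
    // Include the hit itself plus `radius` messages on each side
    const lo = Math.max(0, idx - radius)
    const hi = Math.min(history.length - 1, idx + radius)
    for (let i = lo; i <= hi; i++) keep.add(i)
  }
  // Deduplicate and return in chronological order
  return [...keep].sort((a, b) => a - b).map((i) => history[i])
}
```

Overlapping windows deduplicate naturally through the index set, so two nearby hits don't insert the same message twice.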
I'm not convinced that arbitrarily summarizing messages will provide "infinite memory". There is still limited context. And what happens when the conversation is 10x the size of the context window? As far as I understand it, this proposal still leaves massive gaps in the conversation that won't be summarized, and the summarized text will erase the language/personality/style used in the messages.
> Getting the AI to understand the meaning of what it has in context well enough to know which details can be safely discarded (as opposed to randomly pattern-matching and hoping for the best) would represent a significant AI breakthrough.

GPT-3 can do that.
Automatic summary has its place -- it can also be an embed. It has its flaws, though, in that the summaries aren't always accurate or useful in and of themselves. I'd love to see "summaries" added to Agnai eventually, automatic or button-push or otherwise. There are a couple of characters that can make summaries (see the sharing channels), and I think a good first step is a "quick memory" button, which you could combo with a summarizer character but which could also be used for lots of other purposes.
I like the idea of "summarizing" in general, but it can definitely lose a lot of information and may or may not be as useful as we might suspect.
Let's say the chat history is 10x the context size. Let's say we summarized a full context window of chat history (say, 8k tokens) and we've stored it in a "summary" embedding somewhere in the library, with chat-window demarcation to signify that it summarizes the messages between start and end timestamps.
Ignoring for a second the challenge of editing, deleting, and regenerating these messages, let's say it's just a straight 8k tokens' worth of existing, unedited chat messages. These problems aren't trivial, but the base case simplifies the discussion.
Now, if we summarized the "first" 8k context of chat history, and then another 8k of chat history accumulates, we would summarize that separately. Now we have two summary embeds, possibly 500 tokens each (or less). If we say there's a summary context limit of 2000 tokens, then when we hit that limit (i.e. 4 summaries) we re-summarize all the summaries and reduce down to a single summary. In this way, as the chat grows to 10x the context window, summaries will continue to reduce and reduce and reduce.
This is one approach.
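A minimal sketch of that rolling reduction, assuming a `summarize` callback that stands in for an actual model call (the 2000-token budget and ~500-token summaries follow the numbers above):

```typescript
// Sketch of rolling summary reduction: when accumulated summaries hit the
// token budget, collapse them into a single summary. `summarize` stands in
// for a model call; names and numbers are illustrative, not Agnai's API.
type Summary = { tokens: number; text: string }

const SUMMARY_BUDGET = 2000 // e.g. ~4 summaries of ~500 tokens each

function foldSummaries(
  summaries: Summary[],
  summarize: (parts: Summary[]) => Summary
): Summary[] {
  const total = summaries.reduce((n, s) => n + s.tokens, 0)
  // Under budget: keep the individual summaries as-is
  if (total < SUMMARY_BUDGET) return summaries
  // At or over budget: re-summarize the summaries down to one
  return [summarize(summaries)]
}
```

Each fold loses detail, which is exactly the trade-off the other comments raise: a 10x-context chat ends up represented by a summary of summaries.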
Another is through the "quick-memory add" with "summary message generations", so the user could generate a summary of "all previous chat messages", "automatically summarize the chat when the window closes", or similar. These messages could then simply be added to the chat history, optionally, or included only when the "quick memory add" button is pressed (which would add a memory entry to the current memory book, with a pop-up for editing memory settings like which memory book, keywords, weights, and priority). A quick-memory-add feature could combo nicely with a summarizer feature.
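For illustration, a quick-memory entry might carry the settings from that pop-up. These field names mirror the options described above (memory book, keywords, weights, priority) but are assumptions, not Agnai's actual schema.

```typescript
// Illustrative shape for a "quick memory add" entry; not Agnai's real schema.
interface MemoryEntry {
  bookId: string
  text: string // e.g. a generated summary of previous chat messages
  keywords: string[]
  weight: number
  priority: number
}

function quickMemoryAdd(book: MemoryEntry[], entry: MemoryEntry): MemoryEntry[] {
  // Keep highest-priority entries first so prompt assembly can truncate
  // from the tail when the memory budget is exceeded
  return [...book, entry].sort((a, b) => b.priority - a.priority)
}
```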
This request I think needs more thought, but I do think there's room here for something interesting.
As I understand it, the current take is that the juice here isn't worth the squeeze, because summarization will drop a lot of valuable information from the chat history. That's likely true, and comboing it with a memory entry that we could edit could be a good fix. That said, this is not a trivial feature; it would be a large time commitment with lots of edge cases and design considerations were it ever to get added.