Comments (10)
Make sure to mention this as the upgrade path:
cp ~/.llm/log.db "$(llm logs path)"
rm -rf ~/.llm
from llm.
Changelog now lives here: https://llm.datasette.io/en/latest/changelog.html
from llm.
- Improved display of error messages from OpenAI. #15
Continue previous chat
You can now use llm to continue a previous conversation with the OpenAI chat models (gpt-3.5-turbo and gpt-4). This will include your previous prompts and responses in the prompt sent to the API, allowing the model to continue within the same context.
Use the new -c/--continue option to continue from the previous message thread:
llm "Pretend to be a witty gerbil, say hi briefly"
Greetings, dear human! I am a clever gerbil, ready to entertain you with my quick wit and endless energy.
llm "What do you think of snacks?" -c
Oh, how I adore snacks, dear human! Crunchy carrot sticks, sweet apple slices, and chewy yogurt drops are some of my favorite treats. I could nibble on them all day long!
The -c option will continue from the most recent logged message.
To continue a different chat, pass an integer ID to the --chat option. This should be the ID of a previously logged message. You can find these IDs using the llm logs command.
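As a rough sketch of what that context assembly can look like (an illustration in Python, not llm's actual code), the earlier prompts and responses become alternating user/assistant messages ahead of the new prompt:

```python
def build_messages(history, new_prompt):
    """Assemble an OpenAI-style chat payload from earlier (prompt, response) pairs."""
    messages = []
    for prompt, response in history:
        messages.append({"role": "user", "content": prompt})
        messages.append({"role": "assistant", "content": response})
    messages.append({"role": "user", "content": new_prompt})
    return messages

# Continuing the gerbil chat above:
payload = build_messages(
    [("Pretend to be a witty gerbil, say hi briefly", "Greetings, dear human! ...")],
    "What do you think of snacks?",
)
```

Because the whole thread is resent each time, longer conversations consume more tokens per request.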
Thanks Amjith Ramanujam for contributing to this feature. #6
from llm.
Need to talk about the new location of log.db and how to upgrade to it, and mention the schema changes - and the llm logs --truncate option.
from llm.
New llm logs --truncate option (shortcut -t) which truncates the displayed prompts to make the log output easier to read. #16
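The behaviour is roughly this (a simplified sketch; the exact cut-off length is an assumption for illustration, not taken from llm's source):

```python
def truncate_prompt(text: str, max_length: int = 100) -> str:
    """Shorten a logged prompt for display, marking truncation with an ellipsis."""
    if len(text) <= max_length:
        return text
    return text[:max_length] + "..."
```

Full prompts stay intact in the database; only the displayed output is shortened.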
from llm.
Documentation now spans multiple pages and lives at https://llm.datasette.io/ #21
from llm.
New mechanism for storing API keys
API keys for language models such as those by OpenAI can now be saved using the new llm keys family of commands.
To set the default key to be used for the OpenAI APIs, run this:
llm keys set openai
Then paste in your API key.
Keys can also be passed using the new --key command line option - this can be a full key or the alias of a key that has been previously stored.
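A hypothetical sketch of how that alias lookup could work (the function name and storage shape are invented for illustration, not llm's actual code):

```python
def resolve_key(value: str, stored_keys: dict) -> str:
    """Return the saved key if value matches a stored alias, otherwise treat value as a literal key."""
    return stored_keys.get(value, value)

# An alias saved via "llm keys set openai" resolves to the stored key;
# anything else passes through unchanged.
stored = {"openai": "sk-stored-example"}
```

This pass-through design means the same --key flag accepts either form without a separate option.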
See link-to-docs for more. #13
from llm.
- Default llm chatgpt command has been renamed to llm prompt. #17
- Removed --code option in favour of the new prompt templates mechanism. #24
- Responses are now streamed by default, if the model supports streaming. The -s/--stream option has been removed. A new --no-stream option can be used to opt out of streaming. #25
- The -4/--gpt4 option has been removed in favour of -m 4 or -m gpt4, using a new mechanism that allows models to have additional short names.
from llm.
New location for the log.db database
The log.db database that stores a history of executed prompts no longer lives at ~/.llm/log.db - it can now be found in a location that better fits the host operating system, which can be seen using:
llm logs path
On macOS this is ~/Library/Application Support/io.datasette.llm/log.db.
To open that database using Datasette, run this:
datasette "$(llm logs path)"
You can upgrade your existing installation by copying your database to the new location like this:
cp ~/.llm/log.db "$(llm logs path)"
rm -rf ~/.llm # To tidy up the now obsolete directory
The database schema has changed, and will be updated automatically the first time you run the command.
That schema is included in the documentation. #35
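A minimal sketch of what an automatic schema upgrade can look like (the table and column names here are invented for illustration, not llm's actual schema):

```python
import sqlite3

def ensure_schema(conn: sqlite3.Connection) -> None:
    # Create the table on first use; a real migration would also
    # alter existing tables to match the new schema.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS log ("
        "id INTEGER PRIMARY KEY, prompt TEXT, response TEXT)"
    )

conn = sqlite3.connect(":memory:")
ensure_schema(conn)  # first run creates the table
ensure_schema(conn)  # idempotent: later runs are a no-op
```

Running the migration on every invocation, guarded by IF NOT EXISTS, is what makes the upgrade transparent to the user.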
from llm.