An LLM chat interface (chatbot) implemented in pure Rust using Hugging Face's Candle over Axum WebSockets, with an SQLite database and a Leptos (WASM) frontend packaged with Tauri!
Hi Daniel,
Thanks for referring me to your example using LLMs. I wanted to point you to loco.rs, a kind of "Rust on Rails" framework for setting up your server: https://loco.rs
From looking at your codebase, I feel it could save you quite a bit of typing, give you a lot of extras, and set you up for success as the project evolves (for example, you get an integrated database layer and JWT authentication for free).