Comments (2)
Yes, I forgot to add the init() call.
Thanks!
Hey @leaBroe,
In your code snippet for loading the adapter, you might be missing the init() call for adapters. That's why not the load_adapter() method of adapters but the one from PEFT is called.
For adapters, it is always required to first call adapters.init() to add the library's add_adapter(), load_adapter(), etc. methods. Unfortunately, the PEFT library decided to copy the names of some methods in adapters (such as add_adapter(), load_adapter()), causing confusing error messages if the PEFT library is installed and adapters.init() is not called. adapters has no dependency on the PEFT library, so it's never necessary to have it installed alongside adapters to use adapters functionality.
Hope this helps!
PS: the fix for the issue you raised a while ago is part of the v0.2.2 release :)