Comments (4)
You can try extracting the actual layers in the classifier head, constructing a Flux.Chain, and calling it with Flux.activations. Otherwise, I think a manual loop over the layers is probably the simplest.
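To illustrate the Flux.activations idea with a self-contained sketch (using a hypothetical stand-in Chain, since the actual classifier head's layers depend on the model):

```julia
using Flux

# hypothetical stand-in for a classifier head: dense layer + projection
head = Chain(Dense(768 => 768, tanh), Dense(768 => 2))

x = randn(Float32, 768)
acts = Flux.activations(head, x)  # tuple of each layer's output, in order
```

Here acts[1] is the first Dense layer's output and acts[2] the final logits, so every intermediate is available without a manual loop.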
from transformers.jl.
There is an output_hidden_states configuration option that can be set with HGFConfig:
using Transformers, Transformers.HuggingFace  # for load_config / load_model / HGFConfig
model_name = "mrm8488/distilroberta-finetuned-financial-news-sentiment-analysis"
cfg = HuggingFace.HGFConfig(load_config(model_name); output_hidden_states = true)
mod = load_model(model_name, "ForSequenceClassification"; config = cfg)
Then you can access all layer outputs with mod(a).outputs, which is an NTuple{num_layers, @NamedTuple{hidden_state::Array{Float32, 3}}}. Another similar configuration option is output_attentions, which also includes the attention scores in the named tuples in .outputs.
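As a short sketch (assuming the mod/cfg setup above and a tokenized input a, and that each element of .outputs is a named tuple with a hidden_state field), the per-layer hidden states can be collected like this:

```julia
out = mod(a)  # `a` is the encoded/tokenized input batch
hidden_states = map(o -> o.hidden_state, out.outputs)  # one Array{Float32, 3} per layer
```

Each entry has shape (hidden_dim, sequence_length, batch_size), following the configured model dimensions.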
BTW, if you don't need the sequence classification head, you can simply use load_model(model_name; config = cfg), which loads the model part without the classification layers.
Amazing, thanks very much for the quick response 👍🏽
(I won't close this since you added the tag for documentation)
Small follow-up question: is it also somehow possible to collect the outputs of each layer in the classifier head?
Edit: I realize I can break the forward pass down into layer-by-layer calls as below, but perhaps there's a more streamlined way to do this?
b = clf.layer.layers[1](b).hidden_state |>
x -> clf.layer.layers[2](x)
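One possible generalization of the chained calls above (a sketch, assuming each layer accepts the previous layer's output and that named-tuple outputs expose a hidden_state field) is a loop over the head's layers that records every intermediate:

```julia
# collect every intermediate output of the classifier head's layers
head_outs = Any[]
x = b
for layer in clf.layer.layers
    x = layer(x)
    x isa NamedTuple && (x = x.hidden_state)  # unwrap named-tuple outputs
    push!(head_outs, x)
end
```

This scales to heads with more than two layers without hard-coding each index.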