Comments (6)
The per-class weight is multiplied by the C value to obtain the penalty factor for that class. Setting the weight for label 2 to zero makes that class unweighted in the optimization problem, which means that during training, classifying samples wrongly as class 2 is not penalized. Minimizing the training objective can therefore be accomplished by assigning all samples to that class.
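The weighting rule above can be made concrete with a minimal pure-Python sketch (the helper function is illustrative, not part of libsvm's API):

```python
# Per-class penalty: the class weight w_i scales the global C, giving the
# box constraint C_i = w_i * C applied to that class's samples.
# (Illustrative helper, not a libsvm function.)
def effective_C(C, class_weights):
    return {label: w * C for label, w in class_weights.items()}

# With the default C = 1 and -w0 1 -w1 1 -w2 0 (as in the command below),
# class 2 ends up with C = 0, i.e. its error term vanishes from the objective.
print(effective_C(1.0, {0: 1.0, 1: 1.0, 2: 0.0}))
# → {0: 1.0, 1: 1.0, 2: 0.0}
```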
from libsvm.
libsvm uses 1-vs-1 for multiclass classification. For each binary subproblem, if one class has C = 0, then the whole alpha vector is zero for both classes because of the y^T alpha = 0 constraint, so libsvm fits neither class in that subproblem. We therefore cannot assume that in your case only class 2 is left unfitted. After the pairwise training there is a further combination stage, so altogether the behavior can be very unpredictable, and your situation can happen. To get into the details, check the parameters A and B used in the probability model (they are available in the model file).
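Spelled out: for each binary subproblem, the dual that libsvm solves is

```latex
\min_{\alpha}\ \tfrac{1}{2}\,\alpha^{T} Q \alpha - e^{T}\alpha
\qquad \text{s.t.}\qquad 0 \le \alpha_i \le C_{y_i},\qquad y^{T}\alpha = 0 .
```

If C_{y_i} = 0 for every sample of one class, those alpha_i are pinned to zero by their box constraints. The equality constraint y^T alpha = sum_{y_i=+1} alpha_i - sum_{y_i=-1} alpha_i = 0 then forces the remaining alpha_i, which are all nonnegative and all on one side of the sum, to vanish as well, so the subproblem's solution is alpha = 0 and neither class is fitted.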
olologin writes:
I know that it's a weird usage of class weights, but still, can it be explained somehow? Or fixed?

dataset.txt:
0 1:0 2:0 3:0
0 1:0 2:0 3:1
0 1:0 2:1 3:0
1 1:0 2:1 3:1
1 1:1 2:0 3:0
1 1:1 2:0 3:1
2 1:1 2:1 3:0
2 1:1 2:1 3:1

code:
libsvm-3.20$ ./svm-train -b 1 -w0 1 -w1 1 -w2 0 dataset.txt model
libsvm-3.20$ ./svm-predict -b 1 dataset.txt model predictions.out

It produces in predictions.out:
labels 0 1 2
2 3.31221e-14 3.30357e-14 1
2 3.63995e-14 3.24543e-14 1
2 3.36039e-14 3.30595e-14 1
2 3.77311e-14 3.12876e-14 1
2 3.86737e-14 2.78238e-14 1
2 3.82377e-14 2.50579e-14 1
2 3.84825e-14 2.96375e-14 1
2 3.84239e-14 2.58019e-14 1
Thanks for the explanation, do I need to close it? It seems that it's not an issue, at least in mathematical terms.
Yes, please.
That's the contents of the model which I used in the first post:
svm_type c_svc
kernel_type rbf
gamma 0.333333
nr_class 3
total_sv 6
rho 0.0287883 -inf -inf
label 0 1 2
probA 1.44403 0 0
probB -0.010178 -0.287682 -0.287682
nr_sv 3 3 0
SV
1 0 1:0 2:0 3:0
1 0 1:0 2:0 3:1
1 0 1:0 2:1 3:0
-1 0 1:0 2:1 3:1
-1 0 1:1 2:0 3:0
-1 0 1:1 2:0 3:1
I have a question again:
After computing the sigmoid with such probA and probB, the two classifiers involving class 2 will give 0.5714... as the probability of class 2 in every case, independently of x (because the decision value is multiplied by probA = 0). But how are those one-vs-one probabilities combined? I mean, how does the SVM implementation arrive at 1 from at least 2 × 0.5714? Does it simply sum the probabilities for each class and divide by the number of classes?
Thanks in advance.
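As a quick check of the 0.5714 figure: libsvm converts a decision value f into a probability via Platt scaling, P(y = +1 | f) = 1 / (1 + exp(A·f + B)). With probA = 0 the decision value drops out entirely, so the pairwise probability is the same constant for every input. A minimal sketch using the probB value from the model file above:

```python
import math

# Platt scaling as used by libsvm: P(y=+1 | f) = 1 / (1 + exp(A*f + B)).
# With A = 0 the decision value f has no effect on the result.
def platt(f, A, B):
    return 1.0 / (1.0 + math.exp(A * f + B))

A, B = 0.0, -0.287682          # probA, probB from the model file above
print(platt(123.0, A, B))      # ~0.5714 regardless of f
print(platt(-5.0, A, B))       # same constant
```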
Please check Sec. 8 of the libsvm paper:
http://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.pdf
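To sketch what that section describes (the second method of Wu, Lin, and Weng): given pairwise estimates r[i][j] ≈ P(y = i | y ∈ {i, j}), libsvm finds p minimizing (1/2) Σ_t Σ_{j≠t} (r[j][t]·p_t − r[t][j]·p_j)² subject to Σ p_t = 1 and p_t ≥ 0, rather than averaging. A rough pure-Python sketch that solves the equality-constrained KKT system directly, ignoring the nonnegativity constraints (libsvm uses an iterative solver instead); the r values below are illustrative, not taken from the thread:

```python
def couple(r):
    """Combine pairwise probabilities r[i][j] ~ P(y=i | y in {i,j}) into
    class probabilities p with sum(p) = 1 (sketch: solves the KKT linear
    system of the quadratic objective, ignoring p >= 0)."""
    k = len(r)
    # Q from expanding the quadratic objective:
    #   Q[t][t] = sum_{j!=t} r[j][t]^2,   Q[t][j] = -r[j][t] * r[t][j]
    Q = [[0.0] * k for _ in range(k)]
    for t in range(k):
        Q[t][t] = sum(r[j][t] ** 2 for j in range(k) if j != t)
        for j in range(k):
            if j != t:
                Q[t][j] = -r[j][t] * r[t][j]
    # KKT system for  min (1/2) p^T Q p  s.t.  e^T p = 1:
    #   [ Q   e ] [p]   [0]
    #   [ e^T 0 ] [b] = [1]
    n = k + 1
    A = [[0.0] * n for _ in range(n)]
    rhs = [0.0] * k + [1.0]
    for i in range(k):
        A[i][:k] = Q[i]
        A[i][k] = 1.0
        A[k][i] = 1.0
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda i: abs(A[i][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for row in range(col + 1, n):
            f = A[row][col] / A[col][col]
            for c in range(col, n):
                A[row][c] -= f * A[col][c]
            rhs[row] -= f * rhs[col]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (rhs[i] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x[:k]

# Illustrative pairwise table for three classes (r[i][j] + r[j][i] = 1).
r = [[0.0, 0.9, 0.3],
     [0.1, 0.0, 0.4],
     [0.7, 0.6, 0.0]]
p = couple(r)
print(p)  # class probabilities summing to 1
```

The coupling is a least-squares fit of a single probability vector to all pairwise estimates at once, which is why the combined result can be far from any naive average of the pairwise values.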