Comments (9)
Same issue. Did you guys find a solution?
Training code:
import xgboost as xgb
train_labels = train['click']
train_features = train.drop('click', axis=1)
test_labels = test['click']
test_features = test.drop('click', axis=1)
# convert to data matrix
train_matrix = xgb.DMatrix(train_features, train_labels)
test_matrix = xgb.DMatrix(test_features, test_labels)
# group by query: train_group / test_group are lists of per-query row counts
# (each must sum to the number of rows in its DMatrix); their construction
# is omitted from this snippet
train_matrix.set_group(train_group)
test_matrix.set_group(test_group)
watchlist = [(train_matrix, 'train'), (test_matrix, 'eval')]
param = {
'objective': 'rank:pairwise',
'max_depth': 10,
'eta': 0.1,
'eval_metric': ['ndcg'],
'colsample_bytree': 0.8,
'subsample': 0.8,
'tree_method': 'hist',
'nthread': 64,
'verbosity': 1
}
bst = xgb.train(param, train_matrix, 50, watchlist, early_stopping_rounds=10)
bst.save_model('xgboost.model')
Prediction code:
package main

import (
	"fmt"

	"github.com/dmitryikh/leaves"
)

func main() {
	// load the saved XGBoost model
	model, err := leaves.XGEnsembleFromFile("/Users/anuragkyal/Downloads/xgboost.model", false)
	if err != nil {
		panic(err)
	}
	fmt.Printf("Name: %s\n", model.Name())
	fmt.Printf("NFeatures: %d\n", model.NFeatures())
	fmt.Printf("NOutputGroups: %d\n", model.NOutputGroups())
test_features := []float64{0.09688013136288999, 1.0, 3.0, 2.0, 0.0, 11.0, 2.0, 0.0, 0.0, 0.0, 3.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0}
	fmt.Printf("feature size = %d\n", len(test_features))
	p := model.Predict(test_features, 50, make([]float64, 1))
	fmt.Printf("Prediction for %f\n", p)
}
Output:
Name: xgboost.gbtree
NFeatures: 175
NOutputGroups: 0
My feature size = 175
panic: runtime error: integer divide by zero
goroutine 1 [running]:
github.com/dmitryikh/leaves.(*xgEnsemble).NEstimators(...)
/Users/anuragkyal/go/src/github.com/dmitryikh/leaves/xgensemble.go:22
github.com/dmitryikh/leaves.(*xgEnsemble).adjustNEstimators(0xc000090000, 0xa, 0x13)
/Users/anuragkyal/go/src/github.com/dmitryikh/leaves/xgensemble.go:42 +0x77
github.com/dmitryikh/leaves.(*Ensemble).Predict(0xc00000c020, 0xc00007e000, 0xaf, 0xaf, 0xa, 0xc000484010, 0x1, 0x1, 0x0, 0x0)
/Users/anuragkyal/go/src/github.com/dmitryikh/leaves/leaves.go:72 +0x114
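The trace shows the panic is arithmetic: NEstimators at xgensemble.go:22 divides the stored tree count by the number of output groups, and NOutputGroups is reported as 0 for this model. A self-contained sketch of that failure mode (the `ensemble` struct and its fields below are illustrative stand-ins, not leaves' actual internals):

```go
package main

import "fmt"

// illustrative stand-in for leaves' internal ensemble bookkeeping
type ensemble struct {
	nRawTrees     int // total trees stored in the model file
	nOutputGroups int // outputs per boosting round (classes/groups)
}

// nEstimators mirrors the kind of arithmetic at xgensemble.go:22:
// trees are divided evenly across output groups.
func (e *ensemble) nEstimators() int {
	return e.nRawTrees / e.nOutputGroups // panics when nOutputGroups == 0
}

func main() {
	bad := &ensemble{nRawTrees: 50, nOutputGroups: 0}
	defer func() {
		if r := recover(); r != nil {
			// same runtime error as in the report above
			fmt.Println("recovered:", r)
		}
	}()
	fmt.Println(bad.nEstimators())
}
```

In Go, integer division by zero is a run-time panic rather than an error value, which is why the crash surfaces only at Predict time instead of when the model is loaded.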
I pinned the Python xgboost package to 0.90 instead of 1.0.0, and the error disappeared.
I also hit this error.
Also having this error.
Hi all!
I assume this is caused by an objective function (e.g. 'objective': 'rank:ndcg') that isn't supported by leaves. In any case, leaves should not panic; it should report the error at the model-loading stage.
I will try to fix it
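The load-time check described here could look like the following sketch. `modelHeader` and `validateHeader` are hypothetical names, not leaves' real API; the point is to reject a model whose output-group count is zero when it is parsed, instead of deferring to a divide-by-zero inside Predict:

```go
package main

import (
	"errors"
	"fmt"
)

// modelHeader is an illustrative parsed-model summary, not leaves' struct.
type modelHeader struct {
	objective     string
	nOutputGroups int
}

// validateHeader surfaces an unsupported model at load time, so callers
// get an error value rather than a runtime panic during prediction.
func validateHeader(h modelHeader) error {
	if h.nOutputGroups <= 0 {
		return errors.New("unsupported model: objective " + h.objective +
			" yields 0 output groups")
	}
	return nil
}

func main() {
	h := modelHeader{objective: "rank:ndcg", nOutputGroups: 0}
	if err := validateHeader(h); err != nil {
		fmt.Println("load failed:", err)
		return
	}
	fmt.Println("model ok")
}
```

With a guard like this, the ranking-objective models in this thread would fail fast with a readable message at XGEnsembleFromFile time.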
@dmitryikh got it, thanks for your response. Are any XGBoost ranking objectives supported, like rank:pairwise etc.?
It would help if the list of supported objectives were added to the README.
My objective function is binary:logistic, but I also get this error:
Name: xgboost.gbtree
NFeatures: 31
NOutputGroups: 1
panic: runtime error: integer divide by zero
Same here with XGBClassifier().save_model():
Name: xgboost.gbtree
NFeatures: 8
NOutputGroups: 0
panic: runtime error: integer divide by zero