
Comments (9)

anuragkyal commented on May 13, 2024

Same issue. Did you guys find a solution?

Training code:

import xgboost as xgb

# train / test are pandas DataFrames that contain a binary 'click' label column
train_labels = train['click']
train_features = train.drop('click', axis=1)

test_labels = test['click']
test_features = test.drop('click', axis=1)

# convert to data matrix
train_matrix = xgb.DMatrix(train_features, train_labels)
test_matrix = xgb.DMatrix(test_features, test_labels)

# group rows by query: train_group / test_group hold the per-query group sizes
train_matrix.set_group(train_group)
test_matrix.set_group(test_group)

watchlist = [(train_matrix, 'train'), (test_matrix, 'eval')]

param = {
    'objective': 'rank:pairwise',
    'max_depth': 10,
    'eta': 0.1,
    'eval_metric': ['ndcg'],
    'colsample_bytree': 0.8,
    'subsample': 0.8,
    'tree_method': 'hist',
    'nthread': 64,
    'verbosity': 1
}
bst = xgb.train(param, train_matrix, 50, watchlist, early_stopping_rounds=10)
bst.save_model('xgboost.model')

Prediction code:

package main

import (
	"fmt"

	"github.com/dmitryikh/leaves"
)

func main() {
	// loading model
	model, err := leaves.XGEnsembleFromFile("/Users/anuragkyal/Downloads/xgboost.model", false)
	if err != nil {
		panic(err)
	}
	fmt.Printf("Name: %s\n", model.Name())
	fmt.Printf("NFeatures: %d\n", model.NFeatures())
	fmt.Printf("NOutputGroups: %d\n", model.NOutputGroups())

	test_features := []float64{0.09688013136288999,  1.0,  3.0,  2.0,  0.0,  11.0,  2.0,  0.0,  0.0,  0.0,  3.0,  0.0,  1.0,  0.0,  0.0,  0.0,  1.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  1.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0,  0.0}
	fmt.Printf("feature size = %d\n", len(test_features))
	// Predict fills the provided slice and returns an error
	preds := make([]float64, 1)
	if err := model.Predict(test_features, 50, preds); err != nil {
		panic(err)
	}
	fmt.Printf("Prediction: %f\n", preds[0])
}

Output:

Name: xgboost.gbtree
NFeatures: 175
NOutputGroups: 0
My feature size = 175
panic: runtime error: integer divide by zero

goroutine 1 [running]:
github.com/dmitryikh/leaves.(*xgEnsemble).NEstimators(...)
	/Users/anuragkyal/go/src/github.com/dmitryikh/leaves/xgensemble.go:22
github.com/dmitryikh/leaves.(*xgEnsemble).adjustNEstimators(0xc000090000, 0xa, 0x13)
	/Users/anuragkyal/go/src/github.com/dmitryikh/leaves/xgensemble.go:42 +0x77
github.com/dmitryikh/leaves.(*Ensemble).Predict(0xc00000c020, 0xc00007e000, 0xaf, 0xaf, 0xa, 0xc000484010, 0x1, 0x1, 0x0, 0x0)
	/Users/anuragkyal/go/src/github.com/dmitryikh/leaves/leaves.go:72 +0x114


kasiss-liu commented on May 13, 2024

I pinned the Python xgboost module to 0.90 instead of 1.0.0, and the error disappeared.


liuguiyangnwpu commented on May 13, 2024

I also ran into this error.


yunfei86 commented on May 13, 2024

Also having this error.


dmitryikh commented on May 13, 2024

Hi all!
I assume this happens because the model uses an objective function (for example 'objective': 'rank:ndcg') that isn't supported by leaves. In any case, leaves should not panic; it should report the error at the model-loading stage.
I will try to fix it.
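
Until that lands, a minimal caller-side sanity check right after loading can catch most of the broken models reported in this thread. This is only a sketch that reuses the leaves calls already shown above (XGEnsembleFromFile, Name, NFeatures, NOutputGroups); the file path and log messages are illustrative:

package main

import (
	"log"

	"github.com/dmitryikh/leaves"
)

func main() {
	// Load the model saved by bst.save_model('xgboost.model') above.
	model, err := leaves.XGEnsembleFromFile("xgboost.model", false)
	if err != nil {
		log.Fatalf("failed to load model: %v", err)
	}

	// Models saved with an unsupported objective or xgboost version tend to
	// report 0 output groups, which later triggers the divide-by-zero panic
	// inside Predict. Refuse to use such a model instead of panicking later.
	if model.NOutputGroups() < 1 || model.NFeatures() < 1 {
		log.Fatalf("model %s looks unsupported by leaves: NOutputGroups=%d, NFeatures=%d",
			model.Name(), model.NOutputGroups(), model.NFeatures())
	}

	log.Printf("model %s loaded: NFeatures=%d, NOutputGroups=%d",
		model.Name(), model.NFeatures(), model.NOutputGroups())
}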


anuragkyal commented on May 13, 2024

@dmitryikh got it, thanks for your response. Are any of the xgboost ranking objectives, such as rank:pairwise, supported?

It would help if the list of supported objectives were added to the README.


siyi8088 commented on May 13, 2024

My objective function is binary:logistic, but I also get this error.


siyi8088 commented on May 13, 2024

Name: xgboost.gbtree
NFeatures: 31
NOutputGroups: 1
panic: runtime error: integer divide by zero
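
Since this trace shows NOutputGroups: 1 and still panics, a load-time check alone does not cover every case. As a stopgap, the Predict call can be wrapped so that a panic inside leaves becomes an ordinary error instead of crashing the process. A rough sketch, assuming only the Ensemble.Predict signature visible in the stack trace above; the package and function names here are made up for illustration:

package predictguard

import (
	"fmt"

	"github.com/dmitryikh/leaves"
)

// safePredict runs model.Predict and converts a panic inside leaves
// (such as the integer divide by zero above) into a returned error.
func safePredict(model *leaves.Ensemble, fvals []float64, nEstimators int) (preds []float64, err error) {
	defer func() {
		if r := recover(); r != nil {
			preds, err = nil, fmt.Errorf("leaves panicked during prediction: %v", r)
		}
	}()

	n := model.NOutputGroups()
	if n < 1 {
		n = 1 // some unsupported models report 0 output groups
	}
	preds = make([]float64, n)
	err = model.Predict(fvals, nEstimators, preds)
	return preds, err
}

This only hides the symptom; the real fix is either the load-time validation dmitryikh mentions above or re-saving the model with xgboost 0.90 as suggested earlier in the thread.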


qjebbs commented on May 13, 2024

Same here with XGBClassifier().save_model():

Name: xgboost.gbtree
NFeatures: 8
NOutputGroups: 0
panic: runtime error: integer divide by zero
