XGBoost.jl

eXtreme Gradient Boosting Package in Julia

Abstract

This package is a Julia interface to XGBoost, which is short for eXtreme Gradient Boosting. XGBoost is an efficient and scalable implementation of the gradient boosting framework. The package includes an efficient linear model solver and tree learning algorithms. The library is parallelized with OpenMP, and it can be more than 10 times faster than some existing gradient boosting packages. It supports various objective functions, including regression, classification, and ranking. The package is also designed to be extensible, so that users can easily define their own objectives.

Features

  • Sparse feature format: allows easy handling of missing values and improves computation efficiency.
  • Advanced features such as customized loss functions and cross-validation; see the demo folder for walkthrough examples (a custom-objective sketch is given under Feature Walkthrough below).

Installation

Pkg.add("XGBoost")

or

Pkg.clone("https://github.com/dmlc/XGBoost.jl.git")
Pkg.build("XGBoost")

By default, the package builds the latest stable version of the XGBoost library. To build the latest master, set the environment variable XGBOOST_BUILD_VERSION to "master" prior to installing or building the package (e.g. ENV["XGBOOST_BUILD_VERSION"] = "master").
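
For example, to build the package against the XGBoost master branch (using only the environment variable described above):

ENV["XGBOOST_BUILD_VERSION"] = "master"   # select the master branch of the XGBoost library
Pkg.build("XGBoost")                      # rebuild the package against it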

Minimal examples

To show how XGBoost works, here is a quick example on the Mushroom dataset.

  • Prepare Data

XGBoost supports Julia Array, SparseMatrixCSC, libSVM format text, and XGBoost binary files as input. The Mushroom classification example below uses the function readlibsvm from basic_walkthrough.jl, which loads libSVM format text into a dense Julia matrix.

using XGBoost

train_X, train_Y = readlibsvm("data/agaricus.txt.train", (6513, 126))
test_X, test_Y = readlibsvm("data/agaricus.txt.test", (1611, 126))
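
For reference, the reader used above is roughly the following minimal sketch, which assumes 0-based feature indices in the libSVM file (as in the bundled agaricus data); the actual definition is in basic_walkthrough.jl.

# Sketch of a libSVM text reader: fills a dense Float32 matrix of the given shape.
function readlibsvm(fname::AbstractString, shape)
    X = zeros(Float32, shape)
    y = Float32[]
    open(fname, "r") do io
        for (i, line) in enumerate(eachline(io))
            fields = split(line)
            push!(y, parse(Float32, fields[1]))              # first token is the label
            for itm in fields[2:end]                         # remaining tokens are index:value pairs
                idx, val = split(itm, ":")
                X[i, parse(Int, idx) + 1] = parse(Float32, val)  # shift 0-based indices to 1-based columns
            end
        end
    end
    return X, y
end
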
  • Fit Model

num_round = 2
bst = xgboost(train_X, num_round, label = train_Y, eta = 1, max_depth = 2)

  • Predict

pred = predict(bst, test_X)
print("test-error=", sum((pred .> 0.5) .!= test_Y) / float(size(pred)[1]), "\n")

  • Cross-Validation

nfold = 5
param = ["max_depth" => 2,
         "eta" => 1,
         "objective" => "binary:logistic"]
metrics = ["auc"]
nfold_cv(train_X, num_round, nfold, label = train_Y, param = param, metrics = metrics)

Feature Walkthrough

Check the demo folder for walkthrough examples.
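
For instance, the customized loss functions mentioned under Features are covered by the custom objective demo. The sketch below is in that spirit; it assumes the get_info, obj, and feval APIs used in the demo scripts, along with the train_X / train_Y arrays loaded in the minimal example above, so treat it as illustrative rather than authoritative.

using XGBoost

# Custom binary-logistic objective: gradient and hessian of the log loss
# with respect to the raw (margin) predictions.
function logregobj(preds, dtrain)
    labels = get_info(dtrain, "label")      # labels stored in the DMatrix
    p = 1.0 ./ (1.0 .+ exp.(-preds))        # sigmoid of the raw margin
    grad = p .- labels
    hess = p .* (1.0 .- p)
    return grad, hess
end

# Custom evaluation metric: misclassification rate on raw margins.
function evalerror(preds, dtrain)
    labels = get_info(dtrain, "label")
    return "error", sum((preds .> 0.0) .!= labels) / length(labels)
end

num_round = 2
bst = xgboost(train_X, num_round, label = train_Y, eta = 1, max_depth = 2,
              obj = logregobj, feval = evalerror)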

Model Parameter Setting

Check the XGBoost Wiki.
