Comments (13)

iqedgarmg commented on July 20, 2024

I have the same problem; I checked both models and everything looks fine. I think the problem is the last "detection" layer of the .cfg structure.

zhan-xu commented on July 20, 2024

You should use this cfg: darknet/cfg/tiny.cfg
and these weights: http://pjreddie.com/media/files/tiny.weights

victorv commented on July 20, 2024

@enderhsu - thanks. It got a bit farther now.

I am using a newer version of Caffe. I will investigate.

python create_yolo_caffemodel.py -m prototxt/yolo_tiny_train_val.prototxt -w tiny-yolo.weights -o tiny-yolo.caffemodel

[('-m', 'prototxt/yolo_tiny_train_val.prototxt'), ('-w', 'tiny-yolo.weights'), ('-o', 'tiny-yolo.caffemodel')]
model file is prototxt/yolo_tiny_train_val.prototxt
weight file is tiny-yolo.weights
output caffemodel file is tiny-yolo.caffemodel
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0415 19:32:59.547441 1958055936 upgrade_proto.cpp:67] Attempting to upgrade input file specified using deprecated input fields: prototxt/yolo_tiny_train_val.prototxt
I0415 19:32:59.547847 1958055936 upgrade_proto.cpp:70] Successfully upgraded file specified using deprecated input fields.
W0415 19:32:59.547869 1958055936 upgrade_proto.cpp:72] Note that future Caffe releases will only support input layers and not input fields.
I0415 19:32:59.547874 1958055936 upgrade_proto.cpp:77] Attempting to upgrade batch norm layers using deprecated params: prototxt/yolo_tiny_train_val.prototxt
I0415 19:32:59.547881 1958055936 upgrade_proto.cpp:80] Successfully upgraded batch norm layers using deprecated params.
I0415 19:33:00.901940 1958055936 net.cpp:298] The NetState phase (1) differed from the phase (0) specified by a rule in layer bn1
I0415 19:33:00.901985 1958055936 net.cpp:298] The NetState phase (1) differed from the phase (0) specified by a rule in layer bn2
I0415 19:33:00.901998 1958055936 net.cpp:298] The NetState phase (1) differed from the phase (0) specified by a rule in layer bn3
I0415 19:33:00.902009 1958055936 net.cpp:298] The NetState phase (1) differed from the phase (0) specified by a rule in layer bn4
I0415 19:33:00.902020 1958055936 net.cpp:298] The NetState phase (1) differed from the phase (0) specified by a rule in layer bn5
I0415 19:33:00.902030 1958055936 net.cpp:298] The NetState phase (1) differed from the phase (0) specified by a rule in layer bn6
I0415 19:33:00.902040 1958055936 net.cpp:298] The NetState phase (1) differed from the phase (0) specified by a rule in layer bn7
I0415 19:33:00.902050 1958055936 net.cpp:298] The NetState phase (1) differed from the phase (0) specified by a rule in layer bn8
I0415 19:33:00.902062 1958055936 net.cpp:55] Initializing net from parameters:
name: "YOLONet"
state {
phase: TEST
level: 0
}
layer {
name: "input"
type: "Input"
top: "data"
input_param {
shape {
dim: 1
dim: 3
dim: 448
dim: 448
}
}
}
layer {
name: "conv1"
type: "Convolution"
bottom: "data"
top: "conv1"
param {
lr_mult: 1
decay_mult: 1
}
convolution_param {
num_output: 16
bias_term: false
pad: 1
kernel_size: 3
weight_filler {
type: "xavier"
}
}
}
layer {
name: "bn1"
type: "BatchNorm"
bottom: "conv1"
top: "bn1"
include {
phase: TEST
}
batch_norm_param {
use_global_stats: true
}
}
layer {
name: "scale1"
type: "Scale"
bottom: "bn1"
top: "scale1"
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
scale_param {
bias_term: true
}
}
layer {
name: "relu1"
type: "ReLU"
bottom: "scale1"
top: "scale1"
relu_param {
negative_slope: 0.1
}
}
layer {
name: "pool1"
type: "Pooling"
bottom: "scale1"
top: "pool1"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "pool1"
top: "conv2"
param {
lr_mult: 1
decay_mult: 1
}
convolution_param {
num_output: 32
bias_term: false
pad: 1
kernel_size: 3
weight_filler {
type: "xavier"
}
}
}
layer {
name: "bn2"
type: "BatchNorm"
bottom: "conv2"
top: "bn2"
include {
phase: TEST
}
batch_norm_param {
use_global_stats: true
}
}
layer {
name: "scale2"
type: "Scale"
bottom: "bn2"
top: "scale2"
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
scale_param {
bias_term: true
}
}
layer {
name: "relu2"
type: "ReLU"
bottom: "scale2"
top: "scale2"
relu_param {
negative_slope: 0.1
}
}
layer {
name: "pool2"
type: "Pooling"
bottom: "scale2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv3"
type: "Convolution"
bottom: "pool2"
top: "conv3"
param {
lr_mult: 1
decay_mult: 1
}
convolution_param {
num_output: 64
bias_term: false
pad: 1
kernel_size: 3
weight_filler {
type: "xavier"
}
}
}
layer {
name: "bn3"
type: "BatchNorm"
bottom: "conv3"
top: "bn3"
include {
phase: TEST
}
batch_norm_param {
use_global_stats: true
}
}
layer {
name: "scale3"
type: "Scale"
bottom: "bn3"
top: "scale3"
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
scale_param {
bias_term: true
}
}
layer {
name: "relu3"
type: "ReLU"
bottom: "scale3"
top: "scale3"
relu_param {
negative_slope: 0.1
}
}
layer {
name: "pool3"
type: "Pooling"
bottom: "scale3"
top: "pool3"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv4"
type: "Convolution"
bottom: "pool3"
top: "conv4"
param {
lr_mult: 1
decay_mult: 1
}
convolution_param {
num_output: 128
bias_term: false
pad: 1
kernel_size: 3
weight_filler {
type: "xavier"
}
}
}
layer {
name: "bn4"
type: "BatchNorm"
bottom: "conv4"
top: "bn4"
include {
phase: TEST
}
batch_norm_param {
use_global_stats: true
}
}
layer {
name: "scale4"
type: "Scale"
bottom: "bn4"
top: "scale4"
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
scale_param {
bias_term: true
}
}
layer {
name: "relu4"
type: "ReLU"
bottom: "scale4"
top: "scale4"
relu_param {
negative_slope: 0.1
}
}
layer {
name: "pool4"
type: "Pooling"
bottom: "scale4"
top: "pool4"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv5"
type: "Convolution"
bottom: "pool4"
top: "conv5"
param {
lr_mult: 1
decay_mult: 1
}
convolution_param {
num_output: 256
bias_term: false
pad: 1
kernel_size: 3
weight_filler {
type: "xavier"
}
}
}
layer {
name: "bn5"
type: "BatchNorm"
bottom: "conv5"
top: "bn5"
include {
phase: TEST
}
batch_norm_param {
use_global_stats: true
}
}
layer {
name: "scale5"
type: "Scale"
bottom: "bn5"
top: "scale5"
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
scale_param {
bias_term: true
}
}
layer {
name: "relu5"
type: "ReLU"
bottom: "scale5"
top: "scale5"
relu_param {
negative_slope: 0.1
}
}
layer {
name: "pool5"
type: "Pooling"
bottom: "scale5"
top: "pool5"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv6"
type: "Convolution"
bottom: "pool5"
top: "conv6"
param {
lr_mult: 1
decay_mult: 1
}
convolution_param {
num_output: 512
bias_term: false
pad: 1
kernel_size: 3
weight_filler {
type: "xavier"
}
}
}
layer {
name: "bn6"
type: "BatchNorm"
bottom: "conv6"
top: "bn6"
include {
phase: TEST
}
batch_norm_param {
use_global_stats: true
}
}
layer {
name: "scale6"
type: "Scale"
bottom: "bn6"
top: "scale6"
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
scale_param {
bias_term: true
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "scale6"
top: "scale6"
relu_param {
negative_slope: 0.1
}
}
layer {
name: "pool6"
type: "Pooling"
bottom: "scale6"
top: "pool6"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv7"
type: "Convolution"
bottom: "pool6"
top: "conv7"
param {
lr_mult: 1
decay_mult: 1
}
convolution_param {
num_output: 1024
bias_term: false
pad: 1
kernel_size: 3
weight_filler {
type: "xavier"
}
}
}
layer {
name: "bn7"
type: "BatchNorm"
bottom: "conv7"
top: "bn7"
include {
phase: TEST
}
batch_norm_param {
use_global_stats: true
}
}
layer {
name: "scale7"
type: "Scale"
bottom: "bn7"
top: "scale7"
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
scale_param {
bias_term: true
}
}
layer {
name: "relu7"
type: "ReLU"
bottom: "scale7"
top: "scale7"
relu_param {
negative_slope: 0.1
}
}
layer {
name: "conv8_y"
type: "Convolution"
bottom: "scale7"
top: "conv8"
param {
lr_mult: 1
decay_mult: 1
}
convolution_param {
num_output: 256
bias_term: false
pad: 1
kernel_size: 3
weight_filler {
type: "xavier"
}
}
}
layer {
name: "bn8"
type: "BatchNorm"
bottom: "conv8"
top: "bn8"
include {
phase: TEST
}
batch_norm_param {
use_global_stats: true
}
}
layer {
name: "scale8"
type: "Scale"
bottom: "bn8"
top: "scale8"
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
scale_param {
bias_term: true
}
}
layer {
name: "relu8"
type: "ReLU"
bottom: "scale8"
top: "scale8"
relu_param {
negative_slope: 0.1
}
}
layer {
name: "fc9"
type: "InnerProduct"
bottom: "scale8"
top: "result"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
inner_product_param {
num_output: 1470
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0
}
}
}
I0415 19:33:00.903556 1958055936 layer_factory.hpp:77] Creating layer input
I0415 19:33:00.903575 1958055936 net.cpp:88] Creating Layer input
I0415 19:33:00.903583 1958055936 net.cpp:384] input -> data
I0415 19:33:00.903609 1958055936 net.cpp:126] Setting up input
I0415 19:33:00.903614 1958055936 net.cpp:133] Top shape: 1 3 448 448 (602112)
I0415 19:33:00.903620 1958055936 net.cpp:141] Memory required for data: 2408448
I0415 19:33:00.903697 1958055936 layer_factory.hpp:77] Creating layer conv1
I0415 19:33:00.903712 1958055936 net.cpp:88] Creating Layer conv1
I0415 19:33:00.903717 1958055936 net.cpp:410] conv1 <- data
I0415 19:33:00.903728 1958055936 net.cpp:384] conv1 -> conv1
I0415 19:33:01.658994 1958055936 net.cpp:126] Setting up conv1
I0415 19:33:01.659021 1958055936 net.cpp:133] Top shape: 1 16 448 448 (3211264)
I0415 19:33:01.659029 1958055936 net.cpp:141] Memory required for data: 15253504
I0415 19:33:01.659036 1958055936 layer_factory.hpp:77] Creating layer bn1
I0415 19:33:01.659045 1958055936 net.cpp:88] Creating Layer bn1
I0415 19:33:01.659050 1958055936 net.cpp:410] bn1 <- conv1
I0415 19:33:01.659055 1958055936 net.cpp:384] bn1 -> bn1
I0415 19:33:01.659178 1958055936 net.cpp:126] Setting up bn1
I0415 19:33:01.659184 1958055936 net.cpp:133] Top shape: 1 16 448 448 (3211264)
I0415 19:33:01.659189 1958055936 net.cpp:141] Memory required for data: 28098560
I0415 19:33:01.659196 1958055936 layer_factory.hpp:77] Creating layer scale1
I0415 19:33:01.659202 1958055936 net.cpp:88] Creating Layer scale1
I0415 19:33:01.659206 1958055936 net.cpp:410] scale1 <- bn1
I0415 19:33:01.659210 1958055936 net.cpp:384] scale1 -> scale1
I0415 19:33:01.659221 1958055936 layer_factory.hpp:77] Creating layer scale1
I0415 19:33:01.659952 1958055936 net.cpp:126] Setting up scale1
I0415 19:33:01.659960 1958055936 net.cpp:133] Top shape: 1 16 448 448 (3211264)
I0415 19:33:01.659976 1958055936 net.cpp:141] Memory required for data: 40943616
I0415 19:33:01.659981 1958055936 layer_factory.hpp:77] Creating layer relu1
I0415 19:33:01.659989 1958055936 net.cpp:88] Creating Layer relu1
I0415 19:33:01.659993 1958055936 net.cpp:410] relu1 <- scale1
I0415 19:33:01.659997 1958055936 net.cpp:371] relu1 -> scale1 (in-place)
I0415 19:33:01.660151 1958055936 net.cpp:126] Setting up relu1
I0415 19:33:01.660158 1958055936 net.cpp:133] Top shape: 1 16 448 448 (3211264)
I0415 19:33:01.660174 1958055936 net.cpp:141] Memory required for data: 53788672
I0415 19:33:01.660177 1958055936 layer_factory.hpp:77] Creating layer pool1
I0415 19:33:01.660184 1958055936 net.cpp:88] Creating Layer pool1
I0415 19:33:01.660188 1958055936 net.cpp:410] pool1 <- scale1
I0415 19:33:01.660192 1958055936 net.cpp:384] pool1 -> pool1
I0415 19:33:01.660202 1958055936 net.cpp:126] Setting up pool1
I0415 19:33:01.660207 1958055936 net.cpp:133] Top shape: 1 16 224 224 (802816)
I0415 19:33:01.660210 1958055936 net.cpp:141] Memory required for data: 56999936
I0415 19:33:01.660213 1958055936 layer_factory.hpp:77] Creating layer conv2
I0415 19:33:01.660219 1958055936 net.cpp:88] Creating Layer conv2
I0415 19:33:01.660223 1958055936 net.cpp:410] conv2 <- pool1
I0415 19:33:01.660238 1958055936 net.cpp:384] conv2 -> conv2
I0415 19:33:01.660789 1958055936 net.cpp:126] Setting up conv2
I0415 19:33:01.660799 1958055936 net.cpp:133] Top shape: 1 32 224 224 (1605632)
I0415 19:33:01.660815 1958055936 net.cpp:141] Memory required for data: 63422464
I0415 19:33:01.660820 1958055936 layer_factory.hpp:77] Creating layer bn2
I0415 19:33:01.660826 1958055936 net.cpp:88] Creating Layer bn2
I0415 19:33:01.660830 1958055936 net.cpp:410] bn2 <- conv2
I0415 19:33:01.660835 1958055936 net.cpp:384] bn2 -> bn2
I0415 19:33:01.660881 1958055936 net.cpp:126] Setting up bn2
I0415 19:33:01.660887 1958055936 net.cpp:133] Top shape: 1 32 224 224 (1605632)
I0415 19:33:01.660890 1958055936 net.cpp:141] Memory required for data: 69844992
I0415 19:33:01.660897 1958055936 layer_factory.hpp:77] Creating layer scale2
I0415 19:33:01.660903 1958055936 net.cpp:88] Creating Layer scale2
I0415 19:33:01.660907 1958055936 net.cpp:410] scale2 <- bn2
I0415 19:33:01.660910 1958055936 net.cpp:384] scale2 -> scale2
I0415 19:33:01.660918 1958055936 layer_factory.hpp:77] Creating layer scale2
I0415 19:33:01.661159 1958055936 net.cpp:126] Setting up scale2
I0415 19:33:01.661165 1958055936 net.cpp:133] Top shape: 1 32 224 224 (1605632)
I0415 19:33:01.661180 1958055936 net.cpp:141] Memory required for data: 76267520
I0415 19:33:01.661185 1958055936 layer_factory.hpp:77] Creating layer relu2
I0415 19:33:01.661190 1958055936 net.cpp:88] Creating Layer relu2
I0415 19:33:01.661192 1958055936 net.cpp:410] relu2 <- scale2
I0415 19:33:01.661196 1958055936 net.cpp:371] relu2 -> scale2 (in-place)
I0415 19:33:01.661334 1958055936 net.cpp:126] Setting up relu2
I0415 19:33:01.661340 1958055936 net.cpp:133] Top shape: 1 32 224 224 (1605632)
I0415 19:33:01.661356 1958055936 net.cpp:141] Memory required for data: 82690048
I0415 19:33:01.661360 1958055936 layer_factory.hpp:77] Creating layer pool2
I0415 19:33:01.661365 1958055936 net.cpp:88] Creating Layer pool2
I0415 19:33:01.661368 1958055936 net.cpp:410] pool2 <- scale2
I0415 19:33:01.661372 1958055936 net.cpp:384] pool2 -> pool2
I0415 19:33:01.661381 1958055936 net.cpp:126] Setting up pool2
I0415 19:33:01.661384 1958055936 net.cpp:133] Top shape: 1 32 112 112 (401408)
I0415 19:33:01.661388 1958055936 net.cpp:141] Memory required for data: 84295680
I0415 19:33:01.661392 1958055936 layer_factory.hpp:77] Creating layer conv3
I0415 19:33:01.661397 1958055936 net.cpp:88] Creating Layer conv3
I0415 19:33:01.661401 1958055936 net.cpp:410] conv3 <- pool2
I0415 19:33:01.661415 1958055936 net.cpp:384] conv3 -> conv3
I0415 19:33:01.662611 1958055936 net.cpp:126] Setting up conv3
I0415 19:33:01.662623 1958055936 net.cpp:133] Top shape: 1 64 112 112 (802816)
I0415 19:33:01.662642 1958055936 net.cpp:141] Memory required for data: 87506944
I0415 19:33:01.662657 1958055936 layer_factory.hpp:77] Creating layer bn3
I0415 19:33:01.662662 1958055936 net.cpp:88] Creating Layer bn3
I0415 19:33:01.662665 1958055936 net.cpp:410] bn3 <- conv3
I0415 19:33:01.662670 1958055936 net.cpp:384] bn3 -> bn3
I0415 19:33:01.662716 1958055936 net.cpp:126] Setting up bn3
I0415 19:33:01.662721 1958055936 net.cpp:133] Top shape: 1 64 112 112 (802816)
I0415 19:33:01.662726 1958055936 net.cpp:141] Memory required for data: 90718208
I0415 19:33:01.662731 1958055936 layer_factory.hpp:77] Creating layer scale3
I0415 19:33:01.662737 1958055936 net.cpp:88] Creating Layer scale3
I0415 19:33:01.662741 1958055936 net.cpp:410] scale3 <- bn3
I0415 19:33:01.662745 1958055936 net.cpp:384] scale3 -> scale3
I0415 19:33:01.662752 1958055936 layer_factory.hpp:77] Creating layer scale3
I0415 19:33:01.662834 1958055936 net.cpp:126] Setting up scale3
I0415 19:33:01.662838 1958055936 net.cpp:133] Top shape: 1 64 112 112 (802816)
I0415 19:33:01.662842 1958055936 net.cpp:141] Memory required for data: 93929472
I0415 19:33:01.662849 1958055936 layer_factory.hpp:77] Creating layer relu3
I0415 19:33:01.662864 1958055936 net.cpp:88] Creating Layer relu3
I0415 19:33:01.662868 1958055936 net.cpp:410] relu3 <- scale3
I0415 19:33:01.662873 1958055936 net.cpp:371] relu3 -> scale3 (in-place)
I0415 19:33:01.663158 1958055936 net.cpp:126] Setting up relu3
I0415 19:33:01.663168 1958055936 net.cpp:133] Top shape: 1 64 112 112 (802816)
I0415 19:33:01.663184 1958055936 net.cpp:141] Memory required for data: 97140736
I0415 19:33:01.663188 1958055936 layer_factory.hpp:77] Creating layer pool3
I0415 19:33:01.663193 1958055936 net.cpp:88] Creating Layer pool3
I0415 19:33:01.663197 1958055936 net.cpp:410] pool3 <- scale3
I0415 19:33:01.663202 1958055936 net.cpp:384] pool3 -> pool3
I0415 19:33:01.663208 1958055936 net.cpp:126] Setting up pool3
I0415 19:33:01.663211 1958055936 net.cpp:133] Top shape: 1 64 56 56 (200704)
I0415 19:33:01.663216 1958055936 net.cpp:141] Memory required for data: 97943552
I0415 19:33:01.663219 1958055936 layer_factory.hpp:77] Creating layer conv4
I0415 19:33:01.663228 1958055936 net.cpp:88] Creating Layer conv4
I0415 19:33:01.663230 1958055936 net.cpp:410] conv4 <- pool3
I0415 19:33:01.663234 1958055936 net.cpp:384] conv4 -> conv4
I0415 19:33:01.664613 1958055936 net.cpp:126] Setting up conv4
I0415 19:33:01.664626 1958055936 net.cpp:133] Top shape: 1 128 56 56 (401408)
I0415 19:33:01.664631 1958055936 net.cpp:141] Memory required for data: 99549184
I0415 19:33:01.664638 1958055936 layer_factory.hpp:77] Creating layer bn4
I0415 19:33:01.664643 1958055936 net.cpp:88] Creating Layer bn4
I0415 19:33:01.664646 1958055936 net.cpp:410] bn4 <- conv4
I0415 19:33:01.664651 1958055936 net.cpp:384] bn4 -> bn4
I0415 19:33:01.664669 1958055936 net.cpp:126] Setting up bn4
I0415 19:33:01.664672 1958055936 net.cpp:133] Top shape: 1 128 56 56 (401408)
I0415 19:33:01.664676 1958055936 net.cpp:141] Memory required for data: 101154816
I0415 19:33:01.664681 1958055936 layer_factory.hpp:77] Creating layer scale4
I0415 19:33:01.664707 1958055936 net.cpp:88] Creating Layer scale4
I0415 19:33:01.664711 1958055936 net.cpp:410] scale4 <- bn4
I0415 19:33:01.664716 1958055936 net.cpp:384] scale4 -> scale4
I0415 19:33:01.664723 1958055936 layer_factory.hpp:77] Creating layer scale4
I0415 19:33:01.664752 1958055936 net.cpp:126] Setting up scale4
I0415 19:33:01.664755 1958055936 net.cpp:133] Top shape: 1 128 56 56 (401408)
I0415 19:33:01.664759 1958055936 net.cpp:141] Memory required for data: 102760448
I0415 19:33:01.664764 1958055936 layer_factory.hpp:77] Creating layer relu4
I0415 19:33:01.664769 1958055936 net.cpp:88] Creating Layer relu4
I0415 19:33:01.664772 1958055936 net.cpp:410] relu4 <- scale4
I0415 19:33:01.664786 1958055936 net.cpp:371] relu4 -> scale4 (in-place)
I0415 19:33:01.664937 1958055936 net.cpp:126] Setting up relu4
I0415 19:33:01.664945 1958055936 net.cpp:133] Top shape: 1 128 56 56 (401408)
I0415 19:33:01.664952 1958055936 net.cpp:141] Memory required for data: 104366080
I0415 19:33:01.664955 1958055936 layer_factory.hpp:77] Creating layer pool4
I0415 19:33:01.664960 1958055936 net.cpp:88] Creating Layer pool4
I0415 19:33:01.664964 1958055936 net.cpp:410] pool4 <- scale4
I0415 19:33:01.664968 1958055936 net.cpp:384] pool4 -> pool4
I0415 19:33:01.664976 1958055936 net.cpp:126] Setting up pool4
I0415 19:33:01.664980 1958055936 net.cpp:133] Top shape: 1 128 28 28 (100352)
I0415 19:33:01.664985 1958055936 net.cpp:141] Memory required for data: 104767488
I0415 19:33:01.664988 1958055936 layer_factory.hpp:77] Creating layer conv5
I0415 19:33:01.664995 1958055936 net.cpp:88] Creating Layer conv5
I0415 19:33:01.664999 1958055936 net.cpp:410] conv5 <- pool4
I0415 19:33:01.665004 1958055936 net.cpp:384] conv5 -> conv5
I0415 19:33:01.667851 1958055936 net.cpp:126] Setting up conv5
I0415 19:33:01.667862 1958055936 net.cpp:133] Top shape: 1 256 28 28 (200704)
I0415 19:33:01.667878 1958055936 net.cpp:141] Memory required for data: 105570304
I0415 19:33:01.667883 1958055936 layer_factory.hpp:77] Creating layer bn5
I0415 19:33:01.667889 1958055936 net.cpp:88] Creating Layer bn5
I0415 19:33:01.667893 1958055936 net.cpp:410] bn5 <- conv5
I0415 19:33:01.667907 1958055936 net.cpp:384] bn5 -> bn5
I0415 19:33:01.667917 1958055936 net.cpp:126] Setting up bn5
I0415 19:33:01.667920 1958055936 net.cpp:133] Top shape: 1 256 28 28 (200704)
I0415 19:33:01.667925 1958055936 net.cpp:141] Memory required for data: 106373120
I0415 19:33:01.667930 1958055936 layer_factory.hpp:77] Creating layer scale5
I0415 19:33:01.667934 1958055936 net.cpp:88] Creating Layer scale5
I0415 19:33:01.667938 1958055936 net.cpp:410] scale5 <- bn5
I0415 19:33:01.667943 1958055936 net.cpp:384] scale5 -> scale5
I0415 19:33:01.667959 1958055936 layer_factory.hpp:77] Creating layer scale5
I0415 19:33:01.667969 1958055936 net.cpp:126] Setting up scale5
I0415 19:33:01.667973 1958055936 net.cpp:133] Top shape: 1 256 28 28 (200704)
I0415 19:33:01.667976 1958055936 net.cpp:141] Memory required for data: 107175936
I0415 19:33:01.667981 1958055936 layer_factory.hpp:77] Creating layer relu5
I0415 19:33:01.667985 1958055936 net.cpp:88] Creating Layer relu5
I0415 19:33:01.667999 1958055936 net.cpp:410] relu5 <- scale5
I0415 19:33:01.668002 1958055936 net.cpp:371] relu5 -> scale5 (in-place)
I0415 19:33:01.668153 1958055936 net.cpp:126] Setting up relu5
I0415 19:33:01.668160 1958055936 net.cpp:133] Top shape: 1 256 28 28 (200704)
I0415 19:33:01.668175 1958055936 net.cpp:141] Memory required for data: 107978752
I0415 19:33:01.668179 1958055936 layer_factory.hpp:77] Creating layer pool5
I0415 19:33:01.668184 1958055936 net.cpp:88] Creating Layer pool5
I0415 19:33:01.668187 1958055936 net.cpp:410] pool5 <- scale5
I0415 19:33:01.668192 1958055936 net.cpp:384] pool5 -> pool5
I0415 19:33:01.668198 1958055936 net.cpp:126] Setting up pool5
I0415 19:33:01.668201 1958055936 net.cpp:133] Top shape: 1 256 14 14 (50176)
I0415 19:33:01.668215 1958055936 net.cpp:141] Memory required for data: 108179456
I0415 19:33:01.668218 1958055936 layer_factory.hpp:77] Creating layer conv6
I0415 19:33:01.668242 1958055936 net.cpp:88] Creating Layer conv6
I0415 19:33:01.668246 1958055936 net.cpp:410] conv6 <- pool5
I0415 19:33:01.668259 1958055936 net.cpp:384] conv6 -> conv6
I0415 19:33:01.678371 1958055936 net.cpp:126] Setting up conv6
I0415 19:33:01.678395 1958055936 net.cpp:133] Top shape: 1 512 14 14 (100352)
I0415 19:33:01.678401 1958055936 net.cpp:141] Memory required for data: 108580864
I0415 19:33:01.678407 1958055936 layer_factory.hpp:77] Creating layer bn6
I0415 19:33:01.678416 1958055936 net.cpp:88] Creating Layer bn6
I0415 19:33:01.678421 1958055936 net.cpp:410] bn6 <- conv6
I0415 19:33:01.678426 1958055936 net.cpp:384] bn6 -> bn6
I0415 19:33:01.678447 1958055936 net.cpp:126] Setting up bn6
I0415 19:33:01.678452 1958055936 net.cpp:133] Top shape: 1 512 14 14 (100352)
I0415 19:33:01.678457 1958055936 net.cpp:141] Memory required for data: 108982272
I0415 19:33:01.678464 1958055936 layer_factory.hpp:77] Creating layer scale6
I0415 19:33:01.678470 1958055936 net.cpp:88] Creating Layer scale6
I0415 19:33:01.678473 1958055936 net.cpp:410] scale6 <- bn6
I0415 19:33:01.678488 1958055936 net.cpp:384] scale6 -> scale6
I0415 19:33:01.678498 1958055936 layer_factory.hpp:77] Creating layer scale6
I0415 19:33:01.678509 1958055936 net.cpp:126] Setting up scale6
I0415 19:33:01.678513 1958055936 net.cpp:133] Top shape: 1 512 14 14 (100352)
I0415 19:33:01.678516 1958055936 net.cpp:141] Memory required for data: 109383680
I0415 19:33:01.678521 1958055936 layer_factory.hpp:77] Creating layer relu6
I0415 19:33:01.678526 1958055936 net.cpp:88] Creating Layer relu6
I0415 19:33:01.678529 1958055936 net.cpp:410] relu6 <- scale6
I0415 19:33:01.678544 1958055936 net.cpp:371] relu6 -> scale6 (in-place)
I0415 19:33:01.678782 1958055936 net.cpp:126] Setting up relu6
I0415 19:33:01.678791 1958055936 net.cpp:133] Top shape: 1 512 14 14 (100352)
I0415 19:33:01.678797 1958055936 net.cpp:141] Memory required for data: 109785088
I0415 19:33:01.678800 1958055936 layer_factory.hpp:77] Creating layer pool6
I0415 19:33:01.678807 1958055936 net.cpp:88] Creating Layer pool6
I0415 19:33:01.678809 1958055936 net.cpp:410] pool6 <- scale6
I0415 19:33:01.678814 1958055936 net.cpp:384] pool6 -> pool6
I0415 19:33:01.678822 1958055936 net.cpp:126] Setting up pool6
I0415 19:33:01.678825 1958055936 net.cpp:133] Top shape: 1 512 7 7 (25088)
I0415 19:33:01.678829 1958055936 net.cpp:141] Memory required for data: 109885440
I0415 19:33:01.678833 1958055936 layer_factory.hpp:77] Creating layer conv7
I0415 19:33:01.678839 1958055936 net.cpp:88] Creating Layer conv7
I0415 19:33:01.678843 1958055936 net.cpp:410] conv7 <- pool6
I0415 19:33:01.678848 1958055936 net.cpp:384] conv7 -> conv7
I0415 19:33:01.718405 1958055936 net.cpp:126] Setting up conv7
I0415 19:33:01.718430 1958055936 net.cpp:133] Top shape: 1 1024 7 7 (50176)
I0415 19:33:01.718437 1958055936 net.cpp:141] Memory required for data: 110086144
I0415 19:33:01.718443 1958055936 layer_factory.hpp:77] Creating layer bn7
I0415 19:33:01.718454 1958055936 net.cpp:88] Creating Layer bn7
I0415 19:33:01.718459 1958055936 net.cpp:410] bn7 <- conv7
I0415 19:33:01.718466 1958055936 net.cpp:384] bn7 -> bn7
I0415 19:33:01.718484 1958055936 net.cpp:126] Setting up bn7
I0415 19:33:01.718488 1958055936 net.cpp:133] Top shape: 1 1024 7 7 (50176)
I0415 19:33:01.718492 1958055936 net.cpp:141] Memory required for data: 110286848
I0415 19:33:01.718498 1958055936 layer_factory.hpp:77] Creating layer scale7
I0415 19:33:01.718503 1958055936 net.cpp:88] Creating Layer scale7
I0415 19:33:01.718507 1958055936 net.cpp:410] scale7 <- bn7
I0415 19:33:01.718511 1958055936 net.cpp:384] scale7 -> scale7
I0415 19:33:01.718538 1958055936 layer_factory.hpp:77] Creating layer scale7
I0415 19:33:01.718549 1958055936 net.cpp:126] Setting up scale7
I0415 19:33:01.718552 1958055936 net.cpp:133] Top shape: 1 1024 7 7 (50176)
I0415 19:33:01.718556 1958055936 net.cpp:141] Memory required for data: 110487552
I0415 19:33:01.718560 1958055936 layer_factory.hpp:77] Creating layer relu7
I0415 19:33:01.718575 1958055936 net.cpp:88] Creating Layer relu7
I0415 19:33:01.718578 1958055936 net.cpp:410] relu7 <- scale7
I0415 19:33:01.718582 1958055936 net.cpp:371] relu7 -> scale7 (in-place)
I0415 19:33:01.718709 1958055936 net.cpp:126] Setting up relu7
I0415 19:33:01.718725 1958055936 net.cpp:133] Top shape: 1 1024 7 7 (50176)
I0415 19:33:01.718730 1958055936 net.cpp:141] Memory required for data: 110688256
I0415 19:33:01.718734 1958055936 layer_factory.hpp:77] Creating layer conv8_y
I0415 19:33:01.718750 1958055936 net.cpp:88] Creating Layer conv8_y
I0415 19:33:01.718755 1958055936 net.cpp:410] conv8_y <- scale7
I0415 19:33:01.718760 1958055936 net.cpp:384] conv8_y -> conv8
I0415 19:33:01.745636 1958055936 net.cpp:126] Setting up conv8_y
I0415 19:33:01.745661 1958055936 net.cpp:133] Top shape: 1 256 7 7 (12544)
I0415 19:33:01.745667 1958055936 net.cpp:141] Memory required for data: 110738432
I0415 19:33:01.745673 1958055936 layer_factory.hpp:77] Creating layer bn8
I0415 19:33:01.745682 1958055936 net.cpp:88] Creating Layer bn8
I0415 19:33:01.745687 1958055936 net.cpp:410] bn8 <- conv8
I0415 19:33:01.745692 1958055936 net.cpp:384] bn8 -> bn8
I0415 19:33:01.745709 1958055936 net.cpp:126] Setting up bn8
I0415 19:33:01.745713 1958055936 net.cpp:133] Top shape: 1 256 7 7 (12544)
I0415 19:33:01.745718 1958055936 net.cpp:141] Memory required for data: 110788608
I0415 19:33:01.745723 1958055936 layer_factory.hpp:77] Creating layer scale8
I0415 19:33:01.745729 1958055936 net.cpp:88] Creating Layer scale8
I0415 19:33:01.745733 1958055936 net.cpp:410] scale8 <- bn8
I0415 19:33:01.745738 1958055936 net.cpp:384] scale8 -> scale8
I0415 19:33:01.745760 1958055936 layer_factory.hpp:77] Creating layer scale8
I0415 19:33:01.745770 1958055936 net.cpp:126] Setting up scale8
I0415 19:33:01.745775 1958055936 net.cpp:133] Top shape: 1 256 7 7 (12544)
I0415 19:33:01.745779 1958055936 net.cpp:141] Memory required for data: 110838784
I0415 19:33:01.745784 1958055936 layer_factory.hpp:77] Creating layer relu8
I0415 19:33:01.745789 1958055936 net.cpp:88] Creating Layer relu8
I0415 19:33:01.745791 1958055936 net.cpp:410] relu8 <- scale8
I0415 19:33:01.745800 1958055936 net.cpp:371] relu8 -> scale8 (in-place)
I0415 19:33:01.747014 1958055936 net.cpp:126] Setting up relu8
I0415 19:33:01.747023 1958055936 net.cpp:133] Top shape: 1 256 7 7 (12544)
I0415 19:33:01.747027 1958055936 net.cpp:141] Memory required for data: 110888960
I0415 19:33:01.747031 1958055936 layer_factory.hpp:77] Creating layer fc9
I0415 19:33:01.747045 1958055936 net.cpp:88] Creating Layer fc9
I0415 19:33:01.747048 1958055936 net.cpp:410] fc9 <- scale8
I0415 19:33:01.747063 1958055936 net.cpp:384] fc9 -> result
I0415 19:33:01.995153 1958055936 net.cpp:126] Setting up fc9
I0415 19:33:01.995194 1958055936 net.cpp:133] Top shape: 1 1470 (1470)
I0415 19:33:01.995201 1958055936 net.cpp:141] Memory required for data: 110894840
I0415 19:33:01.995210 1958055936 net.cpp:204] fc9 does not need backward computation.
I0415 19:33:01.995215 1958055936 net.cpp:204] relu8 does not need backward computation.
I0415 19:33:01.995219 1958055936 net.cpp:204] scale8 does not need backward computation.
I0415 19:33:01.995224 1958055936 net.cpp:204] bn8 does not need backward computation.
I0415 19:33:01.995229 1958055936 net.cpp:204] conv8_y does not need backward computation.
I0415 19:33:01.995234 1958055936 net.cpp:204] relu7 does not need backward computation.
I0415 19:33:01.995237 1958055936 net.cpp:204] scale7 does not need backward computation.
I0415 19:33:01.995241 1958055936 net.cpp:204] bn7 does not need backward computation.
I0415 19:33:01.995245 1958055936 net.cpp:204] conv7 does not need backward computation.
I0415 19:33:01.995249 1958055936 net.cpp:204] pool6 does not need backward computation.
I0415 19:33:01.995254 1958055936 net.cpp:204] relu6 does not need backward computation.
I0415 19:33:01.995256 1958055936 net.cpp:204] scale6 does not need backward computation.
I0415 19:33:01.995260 1958055936 net.cpp:204] bn6 does not need backward computation.
I0415 19:33:01.995263 1958055936 net.cpp:204] conv6 does not need backward computation.
I0415 19:33:01.995267 1958055936 net.cpp:204] pool5 does not need backward computation.
I0415 19:33:01.995271 1958055936 net.cpp:204] relu5 does not need backward computation.
I0415 19:33:01.995275 1958055936 net.cpp:204] scale5 does not need backward computation.
I0415 19:33:01.995280 1958055936 net.cpp:204] bn5 does not need backward computation.
I0415 19:33:01.995282 1958055936 net.cpp:204] conv5 does not need backward computation.
I0415 19:33:01.995286 1958055936 net.cpp:204] pool4 does not need backward computation.
I0415 19:33:01.995290 1958055936 net.cpp:204] relu4 does not need backward computation.
I0415 19:33:01.995295 1958055936 net.cpp:204] scale4 does not need backward computation.
I0415 19:33:01.995297 1958055936 net.cpp:204] bn4 does not need backward computation.
I0415 19:33:01.995301 1958055936 net.cpp:204] conv4 does not need backward computation.
I0415 19:33:01.995306 1958055936 net.cpp:204] pool3 does not need backward computation.
I0415 19:33:01.995309 1958055936 net.cpp:204] relu3 does not need backward computation.
I0415 19:33:01.995313 1958055936 net.cpp:204] scale3 does not need backward computation.
I0415 19:33:01.995316 1958055936 net.cpp:204] bn3 does not need backward computation.
I0415 19:33:01.995321 1958055936 net.cpp:204] conv3 does not need backward computation.
I0415 19:33:01.995324 1958055936 net.cpp:204] pool2 does not need backward computation.
I0415 19:33:01.995327 1958055936 net.cpp:204] relu2 does not need backward computation.
I0415 19:33:01.995332 1958055936 net.cpp:204] scale2 does not need backward computation.
I0415 19:33:01.995335 1958055936 net.cpp:204] bn2 does not need backward computation.
I0415 19:33:01.995338 1958055936 net.cpp:204] conv2 does not need backward computation.
I0415 19:33:01.995342 1958055936 net.cpp:204] pool1 does not need backward computation.
I0415 19:33:01.995347 1958055936 net.cpp:204] relu1 does not need backward computation.
I0415 19:33:01.995349 1958055936 net.cpp:204] scale1 does not need backward computation.
I0415 19:33:01.995353 1958055936 net.cpp:204] bn1 does not need backward computation.
I0415 19:33:01.995357 1958055936 net.cpp:204] conv1 does not need backward computation.
I0415 19:33:01.995362 1958055936 net.cpp:204] input does not need backward computation.
I0415 19:33:01.995364 1958055936 net.cpp:246] This network produces output result
I0415 19:33:01.995378 1958055936 net.cpp:259] Network initialization done.
False
(16175385,)
conv1(conv)
bn1(batchnorm)
scale1(scale)
conv2(conv)
bn2(batchnorm)
scale2(scale)
conv3(conv)
bn3(batchnorm)
scale3(scale)
conv4(conv)
bn4(batchnorm)
scale4(scale)
conv5(conv)
bn5(batchnorm)
scale5(scale)
conv6(conv)
bn6(batchnorm)
scale6(scale)
conv7(conv)
bn7(batchnorm)
scale7(scale)
conv8_y(conv)
bn8(batchnorm)
scale8(scale)
fc9(fc)
Traceback (most recent call last):
  File "create_yolo_caffemodel.py", line 113, in <module>
    main(sys.argv[1:])
  File "create_yolo_caffemodel.py", line 92, in main
    net.params[pr][0].data[...] = np.reshape(netWeights[count:count+weightSize], dims)
  File "/usr/local/lib/python2.7/site-packages/numpy/core/fromnumeric.py", line 232, in reshape
    return _wrapfunc(a, 'reshape', newshape, order=order)
  File "/usr/local/lib/python2.7/site-packages/numpy/core/fromnumeric.py", line 57, in _wrapfunc
    return getattr(obj, method)(*args, **kwds)
ValueError: cannot reshape array of size 7515115 into shape (1470,12544)
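
The reshape failure means the floats remaining in the .weights file don't match what the prototxt's fc9 layer expects (1470 × 12544 = 18,439,680), i.e. the prototxt and the weights file describe different networks. A minimal sanity-check sketch, assuming pycaffe is importable and an older darknet file with a 4-value header (the header length varies by darknet version; file names are illustrative):

import caffe
import numpy as np

# Count the parameters the prototxt expects...
net = caffe.Net('prototxt/yolo_tiny_train_val.prototxt', caffe.TEST)
expected = sum(blob.data.size for blobs in net.params.values() for blob in blobs)

# ...and the floats actually present in the darknet weights file.
raw = np.fromfile('tiny-yolo.weights', dtype=np.float32)
available = raw.size - 4  # skip the (assumed) 4-value header

# If these numbers differ, the prototxt/weights pair is mismatched.
print('expected %d, available %d' % (expected, available))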

xiaoyuch24 commented on July 20, 2024

@victorv @enderhsu I got the same problem as victorv. I have downloaded "tiny.cfg", "tiny.weights", "yolo.weights", and "yolov1.weights". For each weights file, I used this command

python create_yolo_caffemodel.py -m "prototxt_file" -w "weights_file" -o tiny-yolo.caffemodel

to try all the prototxt files in the prototxt folder, but none of them works.

@enderhsu could you please tell us which specific prototxt file in the prototxt folder you used with the tiny.weights file?

I also tried "python create_yolo_prototxt.py tiny.cfg yolo_tiny.prototxt" to build a new prototxt file. Then I used the new yolo_tiny.prototxt file to run

"python create_yolo_caffemodel.py -m yolo_tiny.prototxt -w tiny.weights -o yolo.caffemodel"

I got the error:

[('-m', 'yolo_tiny.prototxt'), ('-w', 'tiny.weights'), ('-o', 'yolo.caffemodel')]
model file is yolo_tiny.prototxt
weight file is tiny.weights
output caffemodel file is yolo.caffemodel
[libprotobuf ERROR google/protobuf/text_format.cc:245] Error parsing text-format caffe.NetParameter: 629:3: Invalid value for boolean field "global_pooling". Value: "True".
WARNING: Logging before InitGoogleLogging() is written to STDERR
F0415 22:59:29.137689 12044 upgrade_proto.cpp:88] Check failed: ReadProtoFromTextFile(param_file, param) Failed to parse NetParameter file: yolo_tiny.prototxt
*** Check failure stack trace: ***
Aborted (core dumped)

zhan-xu commented on July 20, 2024

@xiaoyuchhusky I ran into this error too. I manually changed the value "True" to "true" for global_pooling in the generated prototxt. You probably also need to change "Relu" to "ReLU" in the same file.
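
For reference, those manual edits can be scripted; a minimal post-processing sketch, assuming the Python-style boolean and the "Relu" type name are the only offending strings (file names are illustrative):

# Patch the generated prototxt instead of hand-editing it.
with open('yolo_tiny.prototxt') as f:
    text = f.read()

# protobuf text format wants lowercase booleans and Caffe's exact type name.
text = text.replace('global_pooling: True', 'global_pooling: true')
text = text.replace('type: "Relu"', 'type: "ReLU"')

with open('yolo_tiny_fixed.prototxt', 'w') as f:
    f.write(text)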

xiaoyuch24 commented on July 20, 2024

@enderhsu Thank you so much for the quick reply. Everything is working well now. Have you tried feeding in a video file to do object detection?

victorv commented on July 20, 2024

@enderhsu @xiaoyuchhusky confirmed that it works only with the generated prototxt.

zhan-xu commented on July 20, 2024

@xiaoyuchhusky Not yet. Will do in a few days.

zhan-xu commented on July 20, 2024

@xiaoyuchhusky It seems it's not correct: for some 1×1 conv layers, the generated prototxt adds 1 pixel of padding, which increases the size of the feature maps. The overall architecture is not consistent with the original Darknet.

yyytq commented on July 20, 2024

@xiaoyuchhusky @enderhsu I used the .weights and .cfg files you mentioned, but I still get this error:

Traceback (most recent call last):
  File "create_yolo_caffemodel.py", line 113, in <module>
    main(sys.argv[1:])
  File "create_yolo_caffemodel.py", line 78, in main
    net.params[pr][0].data[...] = np.reshape(netWeights[count:count+weightSize], dims)
  File "/usr/local/lib/python2.7/dist-packages/numpy/core/fromnumeric.py", line 232, in reshape
    return _wrapfunc(a, 'reshape', newshape, order=order)
  File "/usr/local/lib/python2.7/dist-packages/numpy/core/fromnumeric.py", line 57, in _wrapfunc
    return getattr(obj, method)(*args, **kwds)
ValueError: cannot reshape array of size 158400 into shape (512,64,3,3)

Do you have any suggestions about this?

deepdad commented on July 20, 2024

python D:\caffe-yolo\yolo_main.py -m D:\caffe-yolo\created_yolo.caffemodel -w D:\caffe-yolo\tiny-yolo.weights -i C:\4.jpg
[('-m', 'D:\caffe-yolo\created_yolo.caffemodel'), ('-w', 'D:\caffe-yolo\tiny-yolo.weights'), ('-i', 'C:\4.jpg')]
model file is " D:\caffe-yolo\created_yolo.caffemodel
weight file is " D:\caffe-yolo\tiny-yolo.weights
image file is " C:\4.jpg
WARNING: Logging before InitGoogleLogging() is written to STDERR
W0527 08:16:09.356567 5128 _caffe.cpp:175] DEPRECATION WARNING - deprecated use of Python interface
W0527 08:16:09.356567 5128 _caffe.cpp:176] Use this instead (with the named "weights" parameter):
W0527 08:16:09.356567 5128 _caffe.cpp:178] Net('D:\caffe-yolo\created_yolo.caffemodel', 1, weights='D:\caffe-yolo\tiny-yolo.weights')
[libprotobuf ERROR C:\Users\guillaume\work\caffe-builder\build_v140_x64\packages\protobuf\protobuf_download-prefix\src\protobuf_download\src\google\protobuf\text_format.cc:298] Error parsing text-format caffe.NetParameter: 2:1: Invalid control characters encountered in text.
[libprotobuf ERROR C:\Users\guillaume\work\caffe-builder\build_v140_x64\packages\protobuf\protobuf_download-prefix\src\protobuf_download\src\google\protobuf\text_format.cc:298] Error parsing text-format caffe.NetParameter: 2:6: Interpreting non ascii codepoint 162.
[libprotobuf ERROR C:\Users\guillaume\work\caffe-builder\build_v140_x64\packages\protobuf\protobuf_download-prefix\src\protobuf_download\src\google\protobuf\text_format.cc:298] Error parsing text-format caffe.NetParameter: 2:6: Message type "caffe.NetParameter" has no field named "tiny".
F0527 08:16:09.356567 5128 upgrade_proto.cpp:88] Check failed: ReadProtoFromTextFile(param_file, param) Failed to parse NetParameter file: D:\caffe-yolo\created_yolo.caffemodel
*** Check failure stack trace: ***

Putting
net = caffe.Net(model_filename, caffe.TEST, weights=weight_filename)
on line 137 of yolo_main.py results in a different error
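
For what it's worth, the protobuf errors above show Caffe trying to parse the binary created_yolo.caffemodel as a text NetParameter, which suggests the prototxt and caffemodel arguments got mixed up: -m should point at the text .prototxt, not the .caffemodel. A minimal sketch of the standard pycaffe loading pattern (file names are illustrative, not the repo's):

import caffe

# First argument: the text .prototxt describing the architecture.
# Second argument: the binary .caffemodel holding the trained weights.
net = caffe.Net('yolo_tiny_deploy.prototxt', 'created_yolo.caffemodel', caffe.TEST)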

ashwinnair14 commented on July 20, 2024

Hello,

Did anyone solve this issue?
I am not able to run Tiny YOLO, and when converting I get errors similar to those stated above.

TapSumBong commented on July 20, 2024

@enderhsu Hi. I was able to convert to a prototxt using tiny.cfg. I would like to train my own model; how do I configure the number of classes to train in tiny.cfg?
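
In the standard YOLOv1 formulation (assumed here), the detection output is S×S×(B·5+C), so changing the class count means updating both the classes field in the cfg and the size of the final connected layer. A quick sketch of the arithmetic:

# YOLO v1 output size: S*S*(B*5 + C).
# tiny.cfg uses a 7x7 grid, 2 boxes per cell, 20 classes -> 1470,
# which matches fc9's num_output in the prototxt dump above.
def yolo_v1_output(S=7, B=2, C=20):
    return S * S * (B * 5 + C)

print(yolo_v1_output())     # 1470
print(yolo_v1_output(C=3))  # 637 for a 3-class model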
