OS: Ubuntu 16.04
MXNet: 0.11.1
OpenCV: 3.3
CUDA: 8.0
cuDNN: 6.0
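For completeness, the Python-side versions above can be confirmed with a quick check (a convenience sketch, not part of the original report; `cv2` is assumed to be the OpenCV build in use):

```python
import mxnet as mx
import cv2

print(mx.__version__)   # expected: 0.11.1
print(cv2.__version__)  # expected: 3.3.x
```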
Running the unit tests for the custom `SoftmaxFocalOutput` operator fails with:

```
EEEE
======================================================================
ERROR: test_basicSoftMax (__main__.SoftmaxFocalTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "softmaxfocaltest.py", line 16, in test_basicSoftMax
    sym = mx.symbol.SoftmaxFocalOutput(data=self.var_input_data,labels=self.var_label,alphas=(0.25,1,1,1),gamma=2)
  File "<string>", line 47, in SoftmaxFocalOutput
  File "/home/mxnet/python/mxnet/_ctypes/symbol.py", line 127, in _symbol_creator
    ctypes.byref(sym_handle)))
  File "/home/mxnet/python/mxnet/base.py", line 143, in check_call
    raise MXNetError(py_str(_LIB.MXGetLastError()))
MXNetError: Cannot find argument 'data', Possible Arguments:
grad_scale : float, optional, default=1
    Scale the gradient by a float factor
ignore_label : float, optional, default=-1
    the labels with value equals to ignore_label
    will be ignored during backward (only works if use_ignore is set to be true).
multi_output : boolean, optional, default=False
    If set to true, softmax will applied on axis 1
use_ignore : boolean, optional, default=False
    If set to true, the ignore_label value will not contribute to the backward gradient
preserve_shape : boolean, optional, default=False
    If true, softmax will applied on the last axis
normalization : {'batch', 'null', 'valid'},optional, default='null'
    Normalize the gradient
out_grad : boolean, optional, default=False
    Apply weighting from output gradient
gamma : int, optional, default='2'
    gamma value in focal softmax
alphas : tuple of <float>, optional, default=(0.25,)
    alpha value in focal softmax,alpha can be set for each class
, in operator SoftmaxFocalOutput(name="", alphas="(0.25, 1, 1, 1)", gamma="2", data="")
======================================================================
ERROR: test_softamxIgnorLabel (__main__.SoftmaxFocalTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "softmaxfocaltest.py", line 27, in test_softamxIgnorLabel
    sym = mx.symbol.SoftmaxFocalOutput(data=self.var_input_data, label=self.var_label,alphas=(0.25,1,1,1),gamma=2,use_ignore=True,ignore_label =1)
  File "<string>", line 47, in SoftmaxFocalOutput
  File "/home/mxnet/python/mxnet/_ctypes/symbol.py", line 127, in _symbol_creator
    ctypes.byref(sym_handle)))
  File "/home/mxnet/python/mxnet/base.py", line 143, in check_call
    raise MXNetError(py_str(_LIB.MXGetLastError()))
MXNetError: Cannot find argument 'label', Possible Arguments:
grad_scale : float, optional, default=1
    Scale the gradient by a float factor
ignore_label : float, optional, default=-1
    the labels with value equals to ignore_label
    will be ignored during backward (only works if use_ignore is set to be true).
multi_output : boolean, optional, default=False
    If set to true, softmax will applied on axis 1
use_ignore : boolean, optional, default=False
    If set to true, the ignore_label value will not contribute to the backward gradient
preserve_shape : boolean, optional, default=False
    If true, softmax will applied on the last axis
normalization : {'batch', 'null', 'valid'},optional, default='null'
    Normalize the gradient
out_grad : boolean, optional, default=False
    Apply weighting from output gradient
gamma : int, optional, default='2'
    gamma value in focal softmax
alphas : tuple of <float>, optional, default=(0.25,)
    alpha value in focal softmax,alpha can be set for each class
, in operator SoftmaxFocalOutput(name="", alphas="(0.25, 1, 1, 1)", gamma="2", label="", data="", ignore_label="1", use_ignore="True")
======================================================================
ERROR: test_basicSoftMax (__main__.SoftmaxMultiLabelFocalTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "softmaxfocaltest.py", line 69, in test_basicSoftMax
    gamma=2,multi_output=True)
  File "<string>", line 47, in SoftmaxFocalOutput
  File "/home/mxnet/python/mxnet/_ctypes/symbol.py", line 127, in _symbol_creator
    ctypes.byref(sym_handle)))
  File "/home/mxnet/python/mxnet/base.py", line 143, in check_call
    raise MXNetError(py_str(_LIB.MXGetLastError()))
MXNetError: Cannot find argument 'data', Possible Arguments:
grad_scale : float, optional, default=1
    Scale the gradient by a float factor
ignore_label : float, optional, default=-1
    the labels with value equals to ignore_label
    will be ignored during backward (only works if use_ignore is set to be true).
multi_output : boolean, optional, default=False
    If set to true, softmax will applied on axis 1
use_ignore : boolean, optional, default=False
    If set to true, the ignore_label value will not contribute to the backward gradient
preserve_shape : boolean, optional, default=False
    If true, softmax will applied on the last axis
normalization : {'batch', 'null', 'valid'},optional, default='null'
    Normalize the gradient
out_grad : boolean, optional, default=False
    Apply weighting from output gradient
gamma : int, optional, default='2'
    gamma value in focal softmax
alphas : tuple of <float>, optional, default=(0.25,)
    alpha value in focal softmax,alpha can be set for each class
, in operator SoftmaxFocalOutput(name="", data="", label="", alphas="(0.25, 1, 1, 1, 1, 1, 1, 1)", multi_output="True", gamma="2")
======================================================================
ERROR: test_cpuSoftamxIgnorLabel (__main__.SoftmaxMultiLabelFocalTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "softmaxfocaltest.py", line 80, in test_cpuSoftamxIgnorLabel
    gamma=2, use_ignore=True, ignore_label=1,multi_output=True)
  File "<string>", line 47, in SoftmaxFocalOutput
  File "/home/mxnet/python/mxnet/_ctypes/symbol.py", line 127, in _symbol_creator
    ctypes.byref(sym_handle)))
  File "/home/mxnet/python/mxnet/base.py", line 143, in check_call
    raise MXNetError(py_str(_LIB.MXGetLastError()))
MXNetError: Cannot find argument 'label', Possible Arguments:
grad_scale : float, optional, default=1
    Scale the gradient by a float factor
ignore_label : float, optional, default=-1
    the labels with value equals to ignore_label
    will be ignored during backward (only works if use_ignore is set to be true).
multi_output : boolean, optional, default=False
    If set to true, softmax will applied on axis 1
use_ignore : boolean, optional, default=False
    If set to true, the ignore_label value will not contribute to the backward gradient
preserve_shape : boolean, optional, default=False
    If true, softmax will applied on the last axis
normalization : {'batch', 'null', 'valid'},optional, default='null'
    Normalize the gradient
out_grad : boolean, optional, default=False
    Apply weighting from output gradient
gamma : int, optional, default='2'
    gamma value in focal softmax
alphas : tuple of <float>, optional, default=(0.25,)
    alpha value in focal softmax,alpha can be set for each class
, in operator SoftmaxFocalOutput(name="", gamma="2", use_ignore="True", label="", data="", ignore_label="1", alphas="(0.25, 1, 1, 1)", multi_output="True")
----------------------------------------------------------------------
Ran 4 tests in 0.002s

FAILED (errors=4)
```
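All four failures have the same shape: the `SoftmaxFocalOutput` wrapper rejects the `data`/`label` keyword arguments, and the argument list it echoes back contains only the operator's scalar parameters. A minimal sketch of the first failing call, reconstructed from the traceback (the symbol setup is an assumption, since the test's `setUp` is not shown in the log):

```python
import mxnet as mx

# Hypothetical setup: plain symbolic variables, as a typical MXNet unit
# test might create them in setUp(). The real setUp is not shown above,
# so the names here are assumptions.
var_input_data = mx.symbol.Variable('data')
var_label = mx.symbol.Variable('label')

# The call from test_basicSoftMax (softmaxfocaltest.py, line 16). In the
# report above this line fails with:
#   MXNetError: Cannot find argument 'data'
sym = mx.symbol.SoftmaxFocalOutput(
    data=var_input_data,
    labels=var_label,  # note: this test passes 'labels'; the others pass 'label'
    alphas=(0.25, 1, 1, 1),
    gamma=2)
```

One detail worth checking: in the echoed operator calls the inputs appear as empty string parameters (`data=""`, `label=""`) rather than as symbol inputs. In MXNet 0.11 the generated wrapper forwards any keyword value that is not a `Symbol` to the backend as a string parameter, and the backend then raises `Cannot find argument` for names it does not recognize, so `self.var_input_data` and `self.var_label` may not have been `Symbol` objects at the point of the call.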