also's Issues

SemanticKITTI pre-training and downstream training data

Hi authors,

Thanks for your great work. I have some questions about the training data, especially for SemanticKITTI.

  1. Pre-training: I found that you use the training sequences 00 to 10, except 08, and not the validation sequence 08 or the test sequences 11 to 21. Do I understand this correctly?

  2. Downstream training data: I found that you use datasets/percentiles_split.json to load the data. How was this JSON generated? Is it random? A simple alternative would be Python slicing: for a 1% train ratio (skip ratio 100), one can simply use train_data[::100]. Why do you load the subset from a JSON file instead?
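For illustration, the slicing idea above can be sketched as follows. This is a minimal example, not the repository's actual loading code; the `pretrain_seqs` list and `slice_split` helper are my own names, and the JSON structure in the comment is an assumption:

```python
# SemanticKITTI pre-training sequences: 00-10 without the validation sequence 08
pretrain_seqs = [f"{i:02d}" for i in range(11) if i != 8]

def slice_split(train_data, ratio):
    """Deterministic subsampling by slicing, e.g. ratio=0.01 keeps every 100th frame."""
    step = int(round(1.0 / ratio))
    return train_data[::step]

frames = [f"frame_{i:06d}" for i in range(1000)]
subset = slice_split(frames, 0.01)  # 10 of 1000 frames

# A fixed JSON file (structure assumed here) instead makes the subset
# reproducible across runs and allows non-uniform, e.g. class-balanced, picks:
# import json
# subset = json.load(open("datasets/percentiles_split.json"))["0.01"]
```

A fixed file also guarantees that every experiment sees exactly the same frames, which plain slicing only does if the dataset ordering never changes.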

PointContrast for Outdoor Data

Don't we need different views of the same point cloud to train PointContrast? How did you pretrain on the nuScenes and SemanticKITTI datasets with the PointContrast approach? Did you just use the static parts of the scene?
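To sketch the idea being asked about: contrastive pretraining on a single lidar scan can synthesize the two views by augmenting the same point cloud twice, so correspondences are known by construction. A minimal NumPy illustration, not the authors' pipeline; the specific augmentations (z-rotation, jitter) are assumptions:

```python
import numpy as np

def random_view(points, rng):
    """One augmented view: random rotation about z plus small Gaussian jitter."""
    theta = rng.uniform(0.0, 2.0 * np.pi)
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return points @ rot.T + rng.normal(scale=0.01, size=points.shape)

rng = np.random.default_rng(0)
scan = rng.uniform(-10.0, 10.0, size=(2048, 3))  # toy stand-in for one lidar scan
view_a = random_view(scan, rng)
view_b = random_view(scan, rng)
# Point i in view_a matches point i in view_b: these pairs are the positives
# for a PointContrast-style loss, with no second real viewpoint required.
```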

Visualization

Are there any visualization tools for occupancy prediction that you can supply or recommend? Thanks
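One simple route, in the absence of an official recommendation, is to convert the predicted occupancy grid to occupied-voxel centers and drop them into any point-cloud viewer. A sketch, where the grid layout (D, H, W) and voxel size are assumptions:

```python
import numpy as np

def occupied_centers(occupancy, voxel_size, origin=(0.0, 0.0, 0.0)):
    """Binary occupancy grid (D, H, W) -> (N, 3) array of occupied voxel centers."""
    idx = np.argwhere(occupancy > 0)
    return (idx + 0.5) * voxel_size + np.asarray(origin)

grid = np.zeros((4, 4, 4), dtype=np.uint8)
grid[0, 0, 0] = 1
grid[1, 2, 3] = 1
centers = occupied_centers(grid, voxel_size=0.2)
# The (N, 3) result can then be visualized with e.g. open3d.geometry.PointCloud
# or a matplotlib 3-D scatter plot.
```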

SemanticKITTI - Baseline results are not reproducible

Hello and thanks for the great work!

I have now tried the framework and trained the baseline (random initialization) with 0.1% and 1% of the data on SemanticKITTI with MinkUNet. However, I get much lower values: for 0.1% of the data I get 20 mIoU (instead of the reported 30), and for 1% of the data I get 35.7 mIoU (instead of the reported 47). Do you have any idea what the issue is?

Best regards,
Christian

NuScenes Replication of Results

Thank you for publishing your work and code. I tried to replicate the semantic-segmentation results on the SemanticKITTI and nuScenes datasets using MinkUNet. For SemanticKITTI I get 62.7 mIoU (scratch), which matches perfectly, but for nuScenes I get 71.3 (scratch), which is a bit higher than expected (70.2). I only changed the train split to train and the val split to val in the original nuScenes config file. I attach the output file; maybe it can help. Thanks in advance.

output.txt

What PyTorch and CUDA versions are required to run your code? I get an error when building OpenPCDet

Hi, thanks for your great project.
I got an error when installing OpenPCDet by running

python setup.py develop

The error is shown below:

running develop
/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/command/develop.py:40: EasyInstallDeprecationWarning: easy_install command is deprecated.
!!

    ********************************************************************************
    Please avoid running ``setup.py`` and ``easy_install``.
    Instead, use pypa/build, pypa/installer or other
    standards-based tools.

    See https://github.com/pypa/setuptools/issues/917 for details.
    ********************************************************************************

!!
easy_install.initialize_options(self)
/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/_distutils/cmd.py:66: SetuptoolsDeprecationWarning: setup.py install is deprecated.
!!

    ********************************************************************************
    Please avoid running ``setup.py`` directly.
    Instead, use pypa/build, pypa/installer or other
    standards-based tools.

    See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details.
    ********************************************************************************

!!
self.initialize_options()
running egg_info
writing pcdet.egg-info/PKG-INFO
writing dependency_links to pcdet.egg-info/dependency_links.txt
writing requirements to pcdet.egg-info/requires.txt
writing top-level names to pcdet.egg-info/top_level.txt
reading manifest file 'pcdet.egg-info/SOURCES.txt'
adding license file 'LICENSE'
writing manifest file 'pcdet.egg-info/SOURCES.txt'
running build_ext
/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/torch/utils/cpp_extension.py:387: UserWarning: The detected CUDA version (11.8) has a minor version mismatch with the version that was used to compile PyTorch (11.7). Most likely this shouldn't be a problem.
warnings.warn(CUDA_MISMATCH_WARN.format(cuda_str_version, torch.version.cuda))
/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/torch/utils/cpp_extension.py:397: UserWarning: There are no g++ version bounds defined for CUDA version 11.8
warnings.warn(f'There are no {compiler_name} version bounds defined for CUDA version {cuda_str_version}')
building 'pcdet.ops.iou3d_nms.iou3d_nms_cuda' extension
Traceback (most recent call last):
File "/home/hea4sgh/ALSO/OpenPCDet/setup.py", line 34, in
setup(
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/__init__.py", line 103, in setup
return distutils.core.setup(**attrs)
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 185, in setup
return run_commands(dist)
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 201, in run_commands
dist.run_commands()
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 969, in run_commands
self.run_command(cmd)
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/dist.py", line 989, in run_command
super().run_command(command)
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
cmd_obj.run()
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/command/develop.py", line 34, in run
self.install_for_development()
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/command/develop.py", line 109, in install_for_development
self.run_command('build_ext')
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 318, in run_command
self.distribution.run_command(command)
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/dist.py", line 989, in run_command
super().run_command(command)
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
cmd_obj.run()
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/command/build_ext.py", line 88, in run
_build_ext.run(self)
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/_distutils/command/build_ext.py", line 345, in run
self.build_extensions()
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 843, in build_extensions
build_ext.build_extensions(self)
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/_distutils/command/build_ext.py", line 467, in build_extensions
self._build_extensions_serial()
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/_distutils/command/build_ext.py", line 493, in _build_extensions_serial
self.build_extension(ext)
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/command/build_ext.py", line 249, in build_extension
_build_ext.build_extension(self, ext)
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/setuptools/_distutils/command/build_ext.py", line 548, in build_extension
objects = self.compiler.compile(
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 649, in unix_wrap_ninja_compile
cuda_post_cflags = unix_cuda_flags(cuda_post_cflags)
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 548, in unix_cuda_flags
cflags + _get_cuda_arch_flags(cflags))
File "/home/hea4sgh/.conda/envs/also/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 1780, in _get_cuda_arch_flags
arch_list[-1] += '+PTX'
IndexError: list index out of range

I found some related issues.
open-mmlab/OpenPCDet#1212

I also found that it may be caused by a CUDA version issue, so: which PyTorch and CUDA versions are required to run your code?
By the way, could you please provide an easy way to install or reproduce your environment? I can only see some dependencies in your README.
Thanks in advance.
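For what it's worth, the IndexError in `_get_cuda_arch_flags` typically appears when the PyTorch extension builder cannot auto-detect a GPU architecture (so `arch_list` ends up empty). A common workaround, assuming that is the cause here, is to set `TORCH_CUDA_ARCH_LIST` explicitly before building; the value must match your GPU's compute capability (8.6 below is just an example for an RTX 30-series card):

```shell
# Pin the target architecture so _get_cuda_arch_flags does not rely on
# GPU auto-detection (pick your card's compute capability, e.g. 7.5, 8.0, 8.6).
export TORCH_CUDA_ARCH_LIST="8.6"

# Optionally confirm which toolkit the build will actually use:
# nvcc --version

# Then retry the OpenPCDet build:
# python setup.py develop
```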

Error while running a pretrained model

Hello, I am trying to use the pretrained models linked in the README. I am using the nuScenes (v1.0-mini) dataset and the MinkUNet34 pretrained model. I first run the command

python convert_models.py --downstream --ckpt path_to_downstream_checkpoint

and then the command

python eval.py --split val --config path_to_downstream_model/config.yaml --ckpt path_to_downstream_checkpoint/trained_model_XXX.ckpt

The first command works fine, but the second command gives an error:

Traceback (most recent call last):
File "/root/docker_data/ALSO2/ALSO/also_selfsup/eval.py", line 78, in
net.load_state_dict(torch.load(opts.ckpt), strict=True)
File "/root/miniconda3/envs/env2/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1671, in load_state_dict
raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for MinkUNet34:
Missing key(s) in state_dict: "stem.0.kernel", "stem.1.weight", "stem.1.bias", "stem.1.running_mean", "stem.1.running_var", "stage1.0.net.0.kernel", "stage1.0.net.1.weight", "stage1.0.net.1.bias", "stage1.0.net.1.running_mean", "stage1.0.net.1.running_var", "stage1.1.net.0.kernel", "stage1.1.net.1.weight", "stage1.1.net.1.bias", "stage1.1.net.1.running_mean", "stage1.1.net.1.running_var", "stage1.1.net.3.kernel", "stage1.1.net.4.weight", "stage1.1.net.4.bias", "stage1.1.net.4.running_mean", "stage1.1.net.4.running_var", "stage1.2.net.0.kernel", "stage1.2.net.1.weight", "stage1.2.net.1.bias", "stage1.2.net.1.running_mean", "stage1.2.net.1.running_var", "stage1.2.net.3.kernel", "stage1.2.net.4.weight", "stage1.2.net.4.bias", "stage1.2.net.4.running_mean", "stage1.2.net.4.running_var", "stage2.0.net.0.kernel", "stage2.0.net.1.weight", "stage2.0.net.1.bias", "stage2.0.net.1.running_mean", "stage2.0.net.1.running_var", "stage2.1.net.0.kernel", "stage2.1.net.1.weight", "stage2.1.net.1.bias", "stage2.1.net.1.running_mean", "stage2.1.net.1.running_var", "stage2.1.net.3.kernel", "stage2.1.net.4.weight", "stage2.1.net.4.bias", "stage2.1.net.4.running_mean", "stage2.1.net.4.running_var", "stage2.1.downsample.0.kernel", "stage2.1.downsample.1.weight", "stage2.1.downsample.1.bias", "stage2.1.downsample.1.running_mean", "stage2.1.downsample.1.running_var", "stage2.2.net.0.kernel", "stage2.2.net.1.weight", "stage2.2.net.1.bias", "stage2.2.net.1.running_mean", "stage2.2.net.1.running_var", "stage2.2.net.3.kernel", "stage2.2.net.4.weight", "stage2.2.net.4.bias", "stage2.2.net.4.running_mean", "stage2.2.net.4.running_var", "stage2.3.net.0.kernel", "stage2.3.net.1.weight", "stage2.3.net.1.bias", "stage2.3.net.1.running_mean", "stage2.3.net.1.running_var", "stage2.3.net.3.kernel", "stage2.3.net.4.weight", "stage2.3.net.4.bias", "stage2.3.net.4.running_mean", "stage2.3.net.4.running_var", "stage3.0.net.0.kernel", "stage3.0.net.1.weight", "stage3.0.net.1.bias", 
"stage3.0.net.1.running_mean", "stage3.0.net.1.running_var", "stage3.1.net.0.kernel", "stage3.1.net.1.weight", "stage3.1.net.1.bias", "stage3.1.net.1.running_mean", "stage3.1.net.1.running_var", "stage3.1.net.3.kernel", "stage3.1.net.4.weight", "stage3.1.net.4.bias", "stage3.1.net.4.running_mean", "stage3.1.net.4.running_var", "stage3.1.downsample.0.kernel", "stage3.1.downsample.1.weight", "stage3.1.downsample.1.bias", "stage3.1.downsample.1.running_mean", "stage3.1.downsample.1.running_var", "stage3.2.net.0.kernel", "stage3.2.net.1.weight", "stage3.2.net.1.bias", "stage3.2.net.1.running_mean", "stage3.2.net.1.running_var", "stage3.2.net.3.kernel", "stage3.2.net.4.weight", "stage3.2.net.4.bias", "stage3.2.net.4.running_mean", "stage3.2.net.4.running_var", "stage3.3.net.0.kernel", "stage3.3.net.1.weight", "stage3.3.net.1.bias", "stage3.3.net.1.running_mean", "stage3.3.net.1.running_var", "stage3.3.net.3.kernel", "stage3.3.net.4.weight", "stage3.3.net.4.bias", "stage3.3.net.4.running_mean", "stage3.3.net.4.running_var", "stage3.4.net.0.kernel", "stage3.4.net.1.weight", "stage3.4.net.1.bias", "stage3.4.net.1.running_mean", "stage3.4.net.1.running_var", "stage3.4.net.3.kernel", "stage3.4.net.4.weight", "stage3.4.net.4.bias", "stage3.4.net.4.running_mean", "stage3.4.net.4.running_var", "stage4.0.net.0.kernel", "stage4.0.net.1.weight", "stage4.0.net.1.bias", "stage4.0.net.1.running_mean", "stage4.0.net.1.running_var", "stage4.1.net.0.kernel", "stage4.1.net.1.weight", "stage4.1.net.1.bias", "stage4.1.net.1.running_mean", "stage4.1.net.1.running_var", "stage4.1.net.3.kernel", "stage4.1.net.4.weight", "stage4.1.net.4.bias", "stage4.1.net.4.running_mean", "stage4.1.net.4.running_var", "stage4.1.downsample.0.kernel", "stage4.1.downsample.1.weight", "stage4.1.downsample.1.bias", "stage4.1.downsample.1.running_mean", "stage4.1.downsample.1.running_var", "stage4.2.net.0.kernel", "stage4.2.net.1.weight", "stage4.2.net.1.bias", "stage4.2.net.1.running_mean", 
"stage4.2.net.1.running_var", "stage4.2.net.3.kernel", "stage4.2.net.4.weight", "stage4.2.net.4.bias", "stage4.2.net.4.running_mean", "stage4.2.net.4.running_var", "stage4.3.net.0.kernel", "stage4.3.net.1.weight", "stage4.3.net.1.bias", "stage4.3.net.1.running_mean", "stage4.3.net.1.running_var", "stage4.3.net.3.kernel", "stage4.3.net.4.weight", "stage4.3.net.4.bias", "stage4.3.net.4.running_mean", "stage4.3.net.4.running_var", "stage4.4.net.0.kernel", "stage4.4.net.1.weight", "stage4.4.net.1.bias", "stage4.4.net.1.running_mean", "stage4.4.net.1.running_var", "stage4.4.net.3.kernel", "stage4.4.net.4.weight", "stage4.4.net.4.bias", "stage4.4.net.4.running_mean", "stage4.4.net.4.running_var", "stage4.5.net.0.kernel", "stage4.5.net.1.weight", "stage4.5.net.1.bias", "stage4.5.net.1.running_mean", "stage4.5.net.1.running_var", "stage4.5.net.3.kernel", "stage4.5.net.4.weight", "stage4.5.net.4.bias", "stage4.5.net.4.running_mean", "stage4.5.net.4.running_var", "stage4.6.net.0.kernel", "stage4.6.net.1.weight", "stage4.6.net.1.bias", "stage4.6.net.1.running_mean", "stage4.6.net.1.running_var", "stage4.6.net.3.kernel", "stage4.6.net.4.weight", "stage4.6.net.4.bias", "stage4.6.net.4.running_mean", "stage4.6.net.4.running_var", "up1.0.net.0.kernel", "up1.0.net.1.weight", "up1.0.net.1.bias", "up1.0.net.1.running_mean", "up1.0.net.1.running_var", "up1.1.0.net.0.kernel", "up1.1.0.net.1.weight", "up1.1.0.net.1.bias", "up1.1.0.net.1.running_mean", "up1.1.0.net.1.running_var", "up1.1.0.net.3.kernel", "up1.1.0.net.4.weight", "up1.1.0.net.4.bias", "up1.1.0.net.4.running_mean", "up1.1.0.net.4.running_var", "up1.1.0.downsample.0.kernel", "up1.1.0.downsample.1.weight", "up1.1.0.downsample.1.bias", "up1.1.0.downsample.1.running_mean", "up1.1.0.downsample.1.running_var", "up1.1.1.net.0.kernel", "up1.1.1.net.1.weight", "up1.1.1.net.1.bias", "up1.1.1.net.1.running_mean", "up1.1.1.net.1.running_var", "up1.1.1.net.3.kernel", "up1.1.1.net.4.weight", "up1.1.1.net.4.bias", 
"up1.1.1.net.4.running_mean", "up1.1.1.net.4.running_var", "up2.0.net.0.kernel", "up2.0.net.1.weight", "up2.0.net.1.bias", "up2.0.net.1.running_mean", "up2.0.net.1.running_var", "up2.1.0.net.0.kernel", "up2.1.0.net.1.weight", "up2.1.0.net.1.bias", "up2.1.0.net.1.running_mean", "up2.1.0.net.1.running_var", "up2.1.0.net.3.kernel", "up2.1.0.net.4.weight", "up2.1.0.net.4.bias", "up2.1.0.net.4.running_mean", "up2.1.0.net.4.running_var", "up2.1.0.downsample.0.kernel", "up2.1.0.downsample.1.weight", "up2.1.0.downsample.1.bias", "up2.1.0.downsample.1.running_mean", "up2.1.0.downsample.1.running_var", "up2.1.1.net.0.kernel", "up2.1.1.net.1.weight", "up2.1.1.net.1.bias", "up2.1.1.net.1.running_mean", "up2.1.1.net.1.running_var", "up2.1.1.net.3.kernel", "up2.1.1.net.4.weight", "up2.1.1.net.4.bias", "up2.1.1.net.4.running_mean", "up2.1.1.net.4.running_var", "up3.0.net.0.kernel", "up3.0.net.1.weight", "up3.0.net.1.bias", "up3.0.net.1.running_mean", "up3.0.net.1.running_var", "up3.1.0.net.0.kernel", "up3.1.0.net.1.weight", "up3.1.0.net.1.bias", "up3.1.0.net.1.running_mean", "up3.1.0.net.1.running_var", "up3.1.0.net.3.kernel", "up3.1.0.net.4.weight", "up3.1.0.net.4.bias", "up3.1.0.net.4.running_mean", "up3.1.0.net.4.running_var", "up3.1.0.downsample.0.kernel", "up3.1.0.downsample.1.weight", "up3.1.0.downsample.1.bias", "up3.1.0.downsample.1.running_mean", "up3.1.0.downsample.1.running_var", "up3.1.1.net.0.kernel", "up3.1.1.net.1.weight", "up3.1.1.net.1.bias", "up3.1.1.net.1.running_mean", "up3.1.1.net.1.running_var", "up3.1.1.net.3.kernel", "up3.1.1.net.4.weight", "up3.1.1.net.4.bias", "up3.1.1.net.4.running_mean", "up3.1.1.net.4.running_var", "up4.0.net.0.kernel", "up4.0.net.1.weight", "up4.0.net.1.bias", "up4.0.net.1.running_mean", "up4.0.net.1.running_var", "up4.1.0.net.0.kernel", "up4.1.0.net.1.weight", "up4.1.0.net.1.bias", "up4.1.0.net.1.running_mean", "up4.1.0.net.1.running_var", "up4.1.0.net.3.kernel", "up4.1.0.net.4.weight", "up4.1.0.net.4.bias", 
"up4.1.0.net.4.running_mean", "up4.1.0.net.4.running_var", "up4.1.0.downsample.0.kernel", "up4.1.0.downsample.1.weight", "up4.1.0.downsample.1.bias", "up4.1.0.downsample.1.running_mean", "up4.1.0.downsample.1.running_var", "up4.1.1.net.0.kernel", "up4.1.1.net.1.weight", "up4.1.1.net.1.bias", "up4.1.1.net.1.running_mean", "up4.1.1.net.1.running_var", "up4.1.1.net.3.kernel", "up4.1.1.net.4.weight", "up4.1.1.net.4.bias", "up4.1.1.net.4.running_mean", "up4.1.1.net.4.running_var", "classifier.0.weight", "classifier.0.bias".
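Mismatches like this usually mean the checkpoint stores its weights under a wrapper prefix that the bare MinkUNet34 does not expect, so every key is reported missing. The `"model."` prefix below is an assumption (for example a Lightning-style wrapper); inspect the checkpoint's keys first. A minimal prefix-stripping sketch using plain dictionaries:

```python
def strip_prefix(state_dict, prefix="model."):
    """Return a copy of state_dict with `prefix` removed from matching keys."""
    return {
        (k[len(prefix):] if k.startswith(prefix) else k): v
        for k, v in state_dict.items()
    }

# Toy checkpoint mimicking a wrapped state dict
ckpt = {"model.stem.0.kernel": 1, "model.classifier.0.weight": 2}
remapped = strip_prefix(ckpt)
# With torch, one would then call
#   net.load_state_dict(remapped, strict=True)
# or pass strict=False once, to list the keys that still do not match.
```

It is also worth double-checking that convert_models.py ran on the same checkpoint you pass to eval.py, and that config.yaml describes the same architecture (MinkUNet34 here).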

Environment Issue

First of all, I really appreciate your awesome work. I ran into some problems while building the environment. Could you please specify the exact environment (package versions) you used? It would be very helpful.
Thanks,
Sincerely
