
Comments (11)

bvrabete commented on July 30, 2024

For Workload: SpecCpu-2017 ERROR: failed to solve: failed to compute cache key: failed to calculate the checksum of ref 8f74b3ec-14bb-4b3f-bf4a-2415442bc78c::s1b8k8203tvipdcv00euqyggg: "/data": not found
As the README mentions, the user needs to manually create a /data folder and export the related binaries --> https://github.com/intel-innersource/applications.benchmarking.benchmark.external-platform-hero-features/blob/23.3_external/workload/SpecCpu-2017/README.md

Sure, where can we find these binaries?

SpecCPU is a commercial benchmark, so you need to purchase a license from https://spec.org/cpu2017/
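For reference, a minimal sketch of the preparation step the README describes. The /data directory name comes from the error message above; the exact layout and the ISO filename are assumptions (the authoritative steps are in the innersource SpecCpu-2017 README), and the copy path is a placeholder:

```shell
# Sketch only: create the data directory the Dockerfile COPY step expects.
# The directory name "/data" is taken from the build error above.
mkdir -p workload/SpecCpu-2017/data

# Then place your licensed SPEC CPU 2017 binaries/image there (placeholder path):
# cp /path/to/cpu2017.iso workload/SpecCpu-2017/data/
```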

from workload-services-framework.

yikedeng commented on July 30, 2024

For Workload: SpecCpu-2017 ERROR: failed to solve: failed to compute cache key: failed to calculate the checksum of ref 8f74b3ec-14bb-4b3f-bf4a-2415442bc78c::s1b8k8203tvipdcv00euqyggg: "/data": not found
As the README mentions, the user needs to manually create a /data folder and export the related binaries --> https://github.com/intel-innersource/applications.benchmarking.benchmark.external-platform-hero-features/blob/23.3_external/workload/SpecCpu-2017/README.md

Sure, where can we find these binaries?

Sure, I will share the SpecCpu-2017 related Intel-internal binary URL with you via email.


xiuying1 commented on July 30, 2024

Can you provide information on Python-related errors for BERTLarge-PyTorch-Xeon-Public & ResNet50-PyTorch-Xeon-Public?

After contacting the dev team: PR #46 addresses this issue.


yikedeng commented on July 30, 2024

For Workload: SpecCpu-2017
ERROR: failed to solve: failed to compute cache key: failed to calculate the checksum of ref 8f74b3ec-14bb-4b3f-bf4a-2415442bc78c::s1b8k8203tvipdcv00euqyggg: "/data": not found

As the README mentions, the user needs to manually create a /data folder and export the related binaries --> https://github.com/intel-innersource/applications.benchmarking.benchmark.external-platform-hero-features/blob/23.3_external/workload/SpecCpu-2017/README.md


dihu1007 commented on July 30, 2024

For Workloads: SmartScience-YOLO-MSTCN-OpenVINO, Video-Structure, 3DHuman-Pose-Estimation
These three workloads all require the user to follow the steps in their READMEs to add some necessary files before running make; hopefully this solves the problem you mentioned.
SmartScience-YOLO-MSTCN-OpenVINO:
https://github.com/intel/workload-services-framework/blob/main/workload/SmartScience-YOLO-MSTCN-OpenVINO/README.md#preparation
Video-Structure:
https://github.com/intel/workload-services-framework/blob/main/workload/Video-Structure/README.md#preparation
3DHuman-Pose-Estimation:
https://github.com/intel/workload-services-framework/blob/main/stack/3DHuman-Pose/README.md#usage


saiprasanna7 commented on July 30, 2024

For Workload: SpecCpu-2017 ERROR: failed to solve: failed to compute cache key: failed to calculate the checksum of ref 8f74b3ec-14bb-4b3f-bf4a-2415442bc78c::s1b8k8203tvipdcv00euqyggg: "/data": not found

As the README mentions, the user needs to manually create a /data folder and export the related binaries --> https://github.com/intel-innersource/applications.benchmarking.benchmark.external-platform-hero-features/blob/23.3_external/workload/SpecCpu-2017/README.md

Sure, where can we find these binaries?


saiprasanna7 commented on July 30, 2024

For Workloads: SmartScience-YOLO-MSTCN-OpenVINO, Video-Structure, 3DHuman-Pose-Estimation
These three workloads all require the user to follow the steps in their READMEs to add some necessary files before running make; hopefully this solves the problem you mentioned.
SmartScience-YOLO-MSTCN-OpenVINO: https://github.com/intel/workload-services-framework/blob/main/workload/SmartScience-YOLO-MSTCN-OpenVINO/README.md#preparation
Video-Structure: https://github.com/intel/workload-services-framework/blob/main/workload/Video-Structure/README.md#preparation
3DHuman-Pose-Estimation: https://github.com/intel/workload-services-framework/blob/main/stack/3DHuman-Pose/README.md#usage

Can you provide information on how we can get these files in order to build the Docker image?
I see there are files attached in the README.md for 3DHuman-Pose-Estimation, but not for the other workloads.


saiprasanna7 commented on July 30, 2024

Can you provide information on Python-related errors for BERTLarge-PyTorch-Xeon-Public & ResNet50-PyTorch-Xeon-Public?


saiprasanna7 commented on July 30, 2024

Can you provide information on Python-related errors for BERTLarge-PyTorch-Xeon-Public & ResNet50-PyTorch-Xeon-Public?

After contacting the dev team: PR #46 addresses this issue.

Tried building with this patch; the BERTLarge workload is failing with the following errors:

#18 27.78 Requirement already satisfied: joblib in /root/anaconda3/lib/python3.10/site-packages (from sacremoses->transformers==3.0.2) (1.2.0)
#18 27.84 Building wheels for collected packages: tokenizers, sacremoses
#18 27.84   Building wheel for tokenizers (pyproject.toml): started
#18 28.27   Building wheel for tokenizers (pyproject.toml): finished with status 'error'
#18 28.28   error: subprocess-exited-with-error
#18 28.28
#18 28.28   × Building wheel for tokenizers (pyproject.toml) did not run successfully.
#18 28.28   │ exit code: 1
#18 28.28   ╰─> [48 lines of output]
#18 28.28       /tmp/pip-build-env-_l68llis/overlay/lib/python3.10/site-packages/setuptools/dist.py:314: InformationOnly: Normalizing '0.8.1.rc1' to '0.8.1rc1'
#18 28.28         self.metadata.version = self._normalize_version(
#18 28.28       running bdist_wheel
#18 28.28       running build
#18 28.28       running build_py
#18 28.28       creating build
#18 28.28       creating build/lib.linux-x86_64-cpython-310
#18 28.28       creating build/lib.linux-x86_64-cpython-310/tokenizers
#18 28.28       copying tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers
#18 28.28       creating build/lib.linux-x86_64-cpython-310/tokenizers/models
#18 28.28       copying tokenizers/models/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers/models
#18 28.28       creating build/lib.linux-x86_64-cpython-310/tokenizers/decoders
#18 28.28       copying tokenizers/decoders/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers/decoders
#18 28.28       creating build/lib.linux-x86_64-cpython-310/tokenizers/normalizers
#18 28.28       copying tokenizers/normalizers/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers/normalizers
#18 28.28       creating build/lib.linux-x86_64-cpython-310/tokenizers/pre_tokenizers
#18 28.28       copying tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers/pre_tokenizers
#18 28.28       creating build/lib.linux-x86_64-cpython-310/tokenizers/processors
#18 28.28       copying tokenizers/processors/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers/processors
#18 28.28       creating build/lib.linux-x86_64-cpython-310/tokenizers/trainers
#18 28.28       copying tokenizers/trainers/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers/trainers
#18 28.28       creating build/lib.linux-x86_64-cpython-310/tokenizers/implementations
#18 28.28       copying tokenizers/implementations/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers/implementations
#18 28.28       copying tokenizers/implementations/base_tokenizer.py -> build/lib.linux-x86_64-cpython-310/tokenizers/implementations
#18 28.28       copying tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-x86_64-cpython-310/tokenizers/implementations
#18 28.28       copying tokenizers/implementations/char_level_bpe.py -> build/lib.linux-x86_64-cpython-310/tokenizers/implementations
#18 28.28       copying tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-x86_64-cpython-310/tokenizers/implementations
#18 28.28       copying tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-x86_64-cpython-310/tokenizers/implementations
#18 28.28       copying tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-310/tokenizers
#18 28.28       copying tokenizers/models/__init__.pyi -> build/lib.linux-x86_64-cpython-310/tokenizers/models
#18 28.28       copying tokenizers/decoders/__init__.pyi -> build/lib.linux-x86_64-cpython-310/tokenizers/decoders
#18 28.28       copying tokenizers/normalizers/__init__.pyi -> build/lib.linux-x86_64-cpython-310/tokenizers/normalizers
#18 28.28       copying tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-310/tokenizers/pre_tokenizers
#18 28.28       copying tokenizers/processors/__init__.pyi -> build/lib.linux-x86_64-cpython-310/tokenizers/processors
#18 28.28       copying tokenizers/trainers/__init__.pyi -> build/lib.linux-x86_64-cpython-310/tokenizers/trainers
#18 28.28       running build_ext
#18 28.28       running build_rust
#18 28.28       error: can't find Rust compiler
#18 28.28
#18 28.28       If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
#18 28.28
#18 28.28       To update pip, run:
#18 28.28
#18 28.28           pip install --upgrade pip
#18 28.28
#18 28.28       and then retry package installation.
#18 28.28
#18 28.28       If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
#18 28.28       [end of output]
#18 28.28
#18 28.28   note: This error originates from a subprocess, and is likely not a problem with pip.
#18 28.29   ERROR: Failed building wheel for tokenizers
#18 28.29   Building wheel for sacremoses (setup.py): started
#18 29.37   Building wheel for sacremoses (setup.py): finished with status 'done'
#18 29.38   Created wheel for sacremoses: filename=sacremoses-0.0.53-py3-none-any.whl size=895241 sha256=a58105eaac7a12184a43fc033ef7a7510230af243983494f6ad41d52989c879d
#18 29.38   Stored in directory: /root/.cache/pip/wheels/00/24/97/a2ea5324f36bc626e1ea0267f33db6aa80d157ee977e9e42fb
#18 29.39 Successfully built sacremoses
#18 29.39 Failed to build tokenizers
#18 29.39 ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
#18 29.39
#18 29.39 [notice] A new release of pip is available: 23.1.1 -> 23.3
#18 29.39 [notice] To update, run: pip install --upgrade pip
------
process "/bin/bash -c source activate base &&     cd quickstart/language_modeling/pytorch/bert_large/inference/cpu &&     git clone https://github.com/huggingface/transformers.git &&     cd transformers &&     git checkout v3.0.2 &&     git apply ../enable_ipex_for_squad.diff &&     pip install -e ./ &&     pip install tensorboard tensorboardX" did not complete successfully: exit code: 1
workload/BERTLarge-PyTorch-Xeon-Public/CMakeFiles/build_bertlarge-pytorch-xeon-public.dir/build.make:57: recipe for target 'workload/BERTLarge-PyTorch-Xeon-Public/CMakeFiles/build_bertlarge-pytorch-xeon-public' failed
make[2]: *** [workload/BERTLarge-PyTorch-Xeon-Public/CMakeFiles/build_bertlarge-pytorch-xeon-public] Error 1
CMakeFiles/Makefile2:985: recipe for target 'workload/BERTLarge-PyTorch-Xeon-Public/CMakeFiles/build_bertlarge-pytorch-xeon-public.dir/all' failed
make[1]: *** [workload/BERTLarge-PyTorch-Xeon-Public/CMakeFiles/build_bertlarge-pytorch-xeon-public.dir/all] Error 2
Makefile:94: recipe for target 'all' failed
make: *** [all] Error 2
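The root cause in the log is `error: can't find Rust compiler`: tokenizers 0.8.1rc1 has no prebuilt wheel for Python 3.10, so pip builds it from source, which requires rustc/cargo in the image. A hedged sketch of a possible Dockerfile addition before the failing pip install step (the exact base image, build stage, and whether curl is already present are assumptions about the workload's Dockerfiles):

```dockerfile
# Sketch only: install a Rust toolchain so pip can build the tokenizers
# wheel from source. Placement relative to the existing RUN steps is assumed.
RUN curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
ENV PATH="/root/.cargo/bin:${PATH}"
```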


saiprasanna7 commented on July 30, 2024

Can you provide information on Python-related errors for BERTLarge-PyTorch-Xeon-Public & ResNet50-PyTorch-Xeon-Public?

After contacting the dev team: PR #46 addresses this issue.

Tried building with this patch; the BERTLarge workload is failing with the following errors:

[quoted build log identical to the one above: building the tokenizers wheel fails with "error: can't find Rust compiler", and make exits with Error 2]

Tried upgrading pip to 23.3, but it still fails with the same error.


ziyizhang-1 commented on July 30, 2024

Can you provide information on Python-related errors for BERTLarge-PyTorch-Xeon-Public & ResNet50-PyTorch-Xeon-Public?

After contacting the dev team: PR #46 addresses this issue.

Tried building with this patch; the BERTLarge workload is failing with the following errors:

[quoted build log identical to the one above: building the tokenizers wheel fails with "error: can't find Rust compiler", and make exits with Error 2]

Tried upgrading pip to 23.3, but it still fails with the same error.

Since you have upgraded the PyTorch base stack, you also need to bump the transformers version and the benchmark code version as well.
Consider:

  1. Switch the Intel Model Zoo branch (in Dockerfile.2.benchmark) from spr-launch-public to pytorch-r2.0-models
  2. Change the commit ID to 168256a
  3. Switch transformers from v3.0.2 to v4.18.0, and update the EVAL_SCRIPT as well (in Dockerfile.1.inference)

Those changes have already been made in the innersource repository; please refer to PR8275 and PR8417.
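The three steps above could be sketched as build-time knobs; note that the ARG names below are hypothetical (the actual Dockerfile.2.benchmark / Dockerfile.1.inference may hard-code these values instead):

```dockerfile
# Hypothetical ARG names, illustrating the version bumps described above.
ARG MODELZOO_BRANCH=pytorch-r2.0-models   # was: spr-launch-public
ARG MODELZOO_COMMIT=168256a
ARG TRANSFORMERS_VERSION=v4.18.0          # was: v3.0.2; update EVAL_SCRIPT to match
```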

