
Comments (5)

JiahangXu commented on August 18, 2024

That's right. To test model-level prediction accuracy, you should generate samples based on different network architectures. We plan to release the data generation code around May. The idea of dataset generation is to vary configurations, such as the output channel of every building block in MobileNet and the kernel size of the dwconv operation, while keeping one fixed network architecture. After generating models in .pb format, the nn-Meter IR graph can be generated with the command nn-meter get_ir --tensorflow <pb-file> [--output <output-name>], and the golden-standard latency can be obtained by profiling. If you are in a hurry, you can follow these steps to generate a benchmark dataset yourself. Sorry for the inconvenience.
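To illustrate the sampling idea described above, here is a rough sketch of mutating per-block configs around a fixed MobileNet-style architecture. The base channel list, mutation factors, and function names are hypothetical, not nn-Meter's actual generator:

```python
import random

# Hypothetical base architecture: per-block output channels of a
# MobileNetV2-style network (illustrative values, not nn-Meter's).
BASE_CHANNELS = [16, 24, 32, 64, 96, 160, 320]
KERNEL_SIZES = [3, 5, 7]  # candidate dwconv kernel sizes

def sample_config(seed=None):
    """Sample one model config by scaling channels and picking kernel sizes."""
    rng = random.Random(seed)
    return {
        "channels": [max(8, int(c * rng.choice([0.5, 0.75, 1.0, 1.25])))
                     for c in BASE_CHANNELS],
        "kernel_sizes": [rng.choice(KERNEL_SIZES) for _ in BASE_CHANNELS],
    }

# Generate a batch of sampled configs; each would then be materialized as a
# .pb model, converted to nn-Meter IR, and profiled for its golden latency.
configs = [sample_config(i) for i in range(100)]
```

Each sampled config would drive one model build, so the dataset pairs a config (architecture variant) with its profiled latency.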

from nn-meter.

JiahangXu commented on August 18, 2024

Hi, thanks for your interest in nn-Meter! We do plan to release the data generation code. Due to other features with higher priority, it is planned for release around May 2022.


XYAskWhy commented on August 18, 2024

Hi, thanks for your interest in nn-Meter! We do plan to release the data generation code. Due to other features with higher priority, it is planned for release around May 2022.

Hi, I am customizing a new predictor to try nn-Meter out. To make things easier, I only switched to another Cortex CPU and am still using the benchmark_model_cpu_v2.1 profiler you provided. Now I am at the final steps of Build Kernel Latency Predictor, and I realize that I won't be able to test the accuracy of the predictor. So I wonder how I could create my own benchmark datasets? Is that the dataset generation code you mentioned above? @JiahangXu


XYAskWhy commented on August 18, 2024

That's right. To test model-level prediction accuracy, you should generate samples based on different network architectures. We plan to release the data generation code around May. The idea of dataset generation is to vary configurations, such as the output channel of every building block in MobileNet and the kernel size of the dwconv operation, while keeping one fixed network architecture. After generating models in .pb format, the nn-Meter IR graph can be generated with the command nn-meter get_ir --tensorflow <pb-file> [--output <output-name>], and the golden-standard latency can be obtained by profiling. If you are in a hurry, you can follow these steps to generate a benchmark dataset yourself. Sorry for the inconvenience.

Hi there. I tried to profile the models in pb_models.zip with benchmark_model_cpu_v2.1 on my Cortex-A73 device, but I can't convert the .pb models to TFLite format. For example, the code I am using is:

    import tensorflow as tf

    tf_model_path = ".../mobilenetv2_0"
    tflite_model_path = ".../mobilenetv2_0.tflite"

    converter = tf.lite.TFLiteConverter.from_saved_model(tf_model_path)
    tf_lite_model = converter.convert()
    with open(tflite_model_path, 'wb') as f:
        f.write(tf_lite_model)  # write the converted model bytes, not the path

and I always get this error:

    RuntimeError: MetaGraphDef associated with tags {'serve'} could not be found in SavedModel. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: saved_model_cli available_tags: [set()]
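I suspect this error means the .pb files are frozen GraphDefs rather than SavedModels, so the SavedModel converter can't find the 'serve' tag. A sketch of the TF1 frozen-graph path I tried instead (the input/output tensor names would need to be read from the actual graph; the ones here are guesses):

```python
def convert_frozen_graph(pb_path, tflite_path, input_arrays, output_arrays):
    """Convert a frozen GraphDef .pb to .tflite via the TF1 converter."""
    import tensorflow as tf  # imported lazily; requires tensorflow installed
    converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
        pb_path, input_arrays, output_arrays)
    tflite_model = converter.convert()
    with open(tflite_path, "wb") as f:
        f.write(tflite_model)  # write the model bytes, not the path string
    return tflite_path

# Hypothetical usage; tensor names depend on how the graph was exported:
# convert_frozen_graph(".../mobilenetv2_0", ".../mobilenetv2_0.tflite",
#                      input_arrays=["input"], output_arrays=["output"])
```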
I also tried converting an ONNX model to .pb and then to TFLite, which no longer raises an error, but the NCHW-to-NHWC input format conversion introduces many Pad and Transpose ops into the output model, which significantly affects the profiled latency.
So while you are working on the dataset generation code, could you please open-source the TFLite models in advance, as you did with pb_models.zip and onnx_models.zip? Or could you provide a workable method for converting your already open-sourced pb/onnx models to TFLite?
Besides, can I use the following command to profile a whole model, exactly the same way as profiling a kernel?

    ./benchmark_model_cpu_v2.1 --kernel_path=/data/tf_benchmark/kernel.cl --num_threads=1 --num_runs=50 --warmup_runs=10 --graph=/data/tf_benchmark/mobilenetv2_0.tflite

If not, could you kindly share the correct profiling tool/method as well?
Many thanks. @JiahangXu


SakuraiYuuta commented on August 18, 2024

Has the dataset generation code been released? I couldn't find it. Also, could you please share the code for creating the NAS-Bench-201 models? I can't create them with TensorFlow.

