
edgetpu's Introduction

Coral issue tracker

This edgetpu repo is primarily our issue tracker for all types of bug reports and feature requests for Coral devices and software.

If you have an issue, please report it here.

The code that remains in this repo is legacy and might be removed in the future.

Legacy readme

The following build information is still accurate for the code in this repo, but beware that all the code in here is no longer maintained.

You should instead refer to the current Coral repos for maintained code.

Edge TPU Runtime

Run scripts/runtime/install.sh to install the Edge TPU runtime, or scripts/runtime/uninstall.sh to uninstall it.

Edge TPU Python API

  1. Run scripts/build_swig.sh to build the SWIG-based native layer for different Linux architectures. The build is Docker-based, so you need Docker installed.

  2. Run make wheel to generate the Python library wheel, and then pip3 install $(ls dist/*.whl) to install it.
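
Once the wheel is installed, the library can be exercised with a short script. The following is a minimal sketch, assuming the legacy ClassificationEngine API; both file paths are placeholders:

# Minimal usage sketch for the legacy Edge TPU Python API (assumes the wheel
# built above is installed and an Edge TPU device is attached; both file
# paths below are placeholders).
from edgetpu.classification.engine import ClassificationEngine
from PIL import Image

engine = ClassificationEngine("model_edgetpu.tflite")
image = Image.open("image.jpg")

# classify_with_image returns the top candidates as (label_id, score) pairs.
for label_id, score in engine.classify_with_image(image, top_k=3):
    print(label_id, score)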

Native C++ code

All native code is inside the src folder. You can build everything using the make command, which invokes Bazel internally.

For example, run make tests to build all C++ unit tests, or make benchmarks to build all C++ benchmarks. To get the list of all available make targets, run make help. All output goes to the out directory.

Linux

On Linux you can compile natively or cross-compile for 32-bit and 64-bit ARM CPUs.

To compile natively you need to install at least the following packages:

sudo apt-get install -y build-essential \
                        libpython3-dev \
                        libusb-1.0-0-dev

and to cross-compile:

sudo dpkg --add-architecture armhf
sudo apt-get install -y crossbuild-essential-armhf \
                        libpython3-dev:armhf \
                        libusb-1.0-0-dev:armhf

sudo dpkg --add-architecture arm64
sudo apt-get install -y crossbuild-essential-arm64 \
                        libpython3-dev:arm64 \
                        libusb-1.0-0-dev:arm64

Compilation or cross-compilation is done by setting the CPU variable for the make command:

make CPU=k8      tests  # Builds for x86_64 (default CPU value)
make CPU=armv7a  tests  # Builds for ARMv7-A, e.g. Pi 3 or Pi 4
make CPU=aarch64 tests  # Builds for ARMv8, e.g. Coral Dev Board

macOS

You need to install the following software:

  1. Xcode from https://developer.apple.com/xcode/
  2. Xcode Command Line Tools: xcode-select --install
  3. Bazel for macOS from https://github.com/bazelbuild/bazel/releases
  4. MacPorts from https://www.macports.org/install.php
  5. Ports of the Python interpreter and the numpy library: sudo port install python35 python36 python37 py35-numpy py36-numpy py37-numpy
  6. Port of libusb library: sudo port install libusb

After that, all normal make commands should work as usual. You can run make tests to compile all C++ unit tests natively on macOS.

Docker

Docker lets you avoid complicated environment setup and build Linux binaries on other operating systems:

make DOCKER_IMAGE=debian:buster DOCKER_CPUS="k8 armv7a aarch64" DOCKER_TARGETS=tests docker-build
make DOCKER_IMAGE=ubuntu:18.04  DOCKER_CPUS="k8 armv7a aarch64" DOCKER_TARGETS=tests docker-build

edgetpu's Issues

About two USB cameras in one program

Hi All:

After some trial and error, I set up two USB cameras plugged into one USB hub, with the hub plugged into the Edge TPU dev board's USB Type-A connector. The USB hub is USB 3.1 compliant.

Running my C++ test program, which uses OpenCV, produces the following messages; they seem to come from the kernel / UVC driver:

[ 9079.026742] usb 1-1.1.2: new high-speed USB device number 13 using xhci-hcd
[ 9079.236013] uvcvideo: Found UVC 1.00 device Webcam C170 (046d:082b)
[ 9079.250905] uvcvideo 1-1.1.2:1.0: Entity type for entity Processing 2 was not initialized!
[ 9079.259206] uvcvideo 1-1.1.2:1.0: Entity type for entity Extension 6 was not initialized!
[ 9079.267570] uvcvideo 1-1.1.2:1.0: Entity type for entity Camera 1 was not initialized!
[ 9079.276133] input: Webcam C170: Webcam C170 as /devices/platform/usb@38200000/38200000.dwc3/xhci-hcd.0.auto/usb1/1-1/1-1.1/1-1.1.2/1-1.1.2:1.0/input/input12
[ 9079.396470] usbcore: registered new interface driver snd-usb-audio
[ 9085.682685] usb 1-1.1.1: new high-speed USB device number 14 using xhci-hcd
[ 9085.827077] uvcvideo: Found UVC 1.00 device HD USB Camera (05a3:9520)
[ 9085.967116] uvcvideo 1-1.1.1:1.0: Entity type for entity Extension 2 was not initialized!
[ 9085.975423] uvcvideo 1-1.1.1:1.0: Entity type for entity Processing 3 was not initialized!
[ 9085.983912] uvcvideo 1-1.1.1:1.0: Entity type for entity Camera 1 was not initialized!
[ 9085.993012] input: HD USB Camera as /devices/platform/usb@38200000/38200000.dwc3/xhci-hcd.0.auto/usb1/1-1/1-1.1/1-1.1.1/1-1.1.1:1.0/input/input13
[ 9090.402334] usb 1-1.1.2: reset high-speed USB device number 13 using xhci-hcd
[ 9091.861684] usb 1-1.1.1: Not enough bandwidth for new device state.
[ 9091.868956] usb 1-1.1.1: Not enough bandwidth for altsetting 1
[ 9091.924090] usb 1-1.1.1: Not enough bandwidth for new device state.
[ 9091.930953] usb 1-1.1.1: Not enough bandwidth for altsetting 1
[ 9091.992355] usb 1-1.1.1: Not enough bandwidth for new device state.
[ 9091.999306] usb 1-1.1.1: Not enough bandwidth for altsetting 1
[ 9092.059786] usb 1-1.1.1: Not enough bandwidth for new device state.
[ 9092.066558] usb 1-1.1.1: Not enough bandwidth for altsetting 1
[ 9092.128033] usb 1-1.1.1: Not enough bandwidth for new device state.
[ 9092.134871] usb 1-1.1.1: Not enough bandwidth for altsetting 1
[ 9092.193329] usb 1-1.1.1: Not enough bandwidth for new device state.
[ 9092.199936] usb 1-1.1.1: Not enough bandwidth for altsetting 1
[ 9092.261640] usb 1-1.1.1: Not enough bandwidth for new device state.
[ 9092.268426] usb 1-1.1.1: Not enough bandwidth for altsetting 1
[ 9092.332729] usb 1-1.1.1: Not enough bandwidth for new device state.
[ 9092.339572] usb 1-1.1.1: Not enough bandwidth for altsetting 1
[ 9092.396418] usb 1-1.1.1: Not enough bandwidth for new device state.
[ 9092.403007] usb 1-1.1.1: Not enough bandwidth for altsetting 1
[ 9092.460239] usb 1-1.1.1: Not enough bandwidth for new device state.
[ 9092.466923] usb 1-1.1.1: Not enough bandwidth for altsetting 1
[ 9092.656892] usb 1-1.1.1: Not enough bandwidth for new device state.
[ 9092.663589] usb 1-1.1.1: Not enough bandwidth for altsetting 1
[ 9092.679862] usb 1-1.1.1: Not enough bandwidth for new device state.
[ 9092.686257] usb 1-1.1.1: Not enough bandwidth for altsetting 1
[ 9092.697201] usb 1-1.1.1: Not enough bandwidth for new device state.
[ 9092.704174] usb 1-1.1.1: Not enough bandwidth for altsetting 1
[ 9092.726696] usb 1-1.1.1: Not enough bandwidth for new device state.
[ 9092.733531] usb 1-1.1.1: Not enough bandwidth for altsetting 1
[ 9092.799419] usb 1-1.1.1: Not enough bandwidth for new device state.
[ 9092.805925] usb 1-1.1.1: Not enough bandwidth for altsetting 1

With one USB camera, the C++ test program works very well, but it does not work with two USB cameras.

With one camera, the settings can reach 1920x1080@15fps.

With two cameras, both at 640x480@30fps, and even downgraded to 320x240@15fps, the messages above still appear.

Where is the limitation? The Type-A USB port on the Edge TPU dev board? The USB hub?

Is there any way to solve this problem?

BR, Akio
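
One possible mitigation, assuming both cameras support MJPEG: both enumerate as USB 2.0 high-speed devices, so through the hub they share a single 480 Mb/s bus, and uncompressed YUYV streams reserve most of the isochronous bandwidth. Requesting MJPEG from each camera via OpenCV lowers the bandwidth each stream reserves. A sketch (camera indices are placeholders):

# Sketch: request MJPEG instead of raw YUYV so each camera reserves less
# isochronous bandwidth (uncompressed 640x480@30fps YUYV alone needs
# roughly 147 Mb/s per camera on a shared 480 Mb/s high-speed bus).
import cv2

def open_camera(index, width=640, height=480, fps=15):
    cap = cv2.VideoCapture(index)
    cap.set(cv2.CAP_PROP_FOURCC, cv2.VideoWriter_fourcc(*"MJPG"))
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, width)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, height)
    cap.set(cv2.CAP_PROP_FPS, fps)
    return cap

cams = [open_camera(0), open_camera(1)]
results = [cap.read() for cap in cams]  # (ok, frame) per camera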

Mismatched glibc version

Running:

make DOCKER_IMAGE=ubuntu:18.04  DOCKER_CPUS="aarch64" DOCKER_TARGETS=all docker-build

and testing one of the resulting binaries on a Coral Edge TPU dev board produces the following:

mendel@edgetpu:~$ ./classify_image
./classify_image: /lib/aarch64-linux-gnu/libm.so.6: version `GLIBC_2.27' not found (required by ./classify_image)

Mendel ships with glibc 2.24:

mendel@edgetpu:~$ ldd --version
ldd (Debian GLIBC 2.24-11+deb9u4) 2.24
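
Since the error names GLIBC_2.27 while Mendel (Debian Stretch based) ships 2.24, building inside a debian:stretch image instead of ubuntu:18.04 should avoid the mismatch. For diagnosing such cases, the GLIBC version symbols a binary references can be listed with a short script (a diagnostic sketch; it scans for version strings embedded in the binary):

# Diagnostic sketch: list the GLIBC_x.y version symbols a binary references,
# to compare against the glibc on the target (Mendel ships 2.24).
import re

def required_glibc_versions(path):
    with open(path, "rb") as f:
        data = f.read()
    versions = {m.decode() for m in re.findall(rb"GLIBC_(\d+\.\d+)", data)}
    return sorted(versions, key=lambda v: tuple(map(int, v.split("."))))

print(required_glibc_versions("./classify_image"))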

Coral pcie install (gasket-dkms) fails on ROCK PI 4

Hi,

Installing gasket-dkms fails.

Thanks for looking into it

pi@rockpi1: sudo apt install gasket-dkms fails with the following information:
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following NEW packages will be installed:
gasket-dkms
0 upgraded, 1 newly installed, 0 to remove and 0 not upgraded.
Need to get 44.7 kB of archives.
After this operation, 240 kB of additional disk space will be used.
Get:1 https://packages.cloud.google.com/apt coral-edgetpu-stable/main arm64 gasket-dkms all 1.0-10 [44.7 kB]
Fetched 44.7 kB in 1s (88.3 kB/s)
Selecting previously unselected package gasket-dkms.
(Reading database ... 96059 files and directories currently installed.)
Preparing to unpack .../gasket-dkms_1.0-10_all.deb ...
Unpacking gasket-dkms (1.0-10) ...
Setting up gasket-dkms (1.0-10) ...
Loading new gasket-1.0 DKMS files...
Building for 4.4.154-104-rockchip-g3037f71a0de7
Building initial module for 4.4.154-104-rockchip-g3037f71a0de7
Error! Bad return status for module build on kernel: 4.4.154-104-rockchip-g3037f71a0de7 (aarch64)
Consult /var/lib/dkms/gasket/1.0/build/make.log for more information.

pi@rockpi1:~$ cat /var/lib/dkms/gasket/1.0/build/make.log
DKMS make.log for gasket-1.0 for kernel 4.4.154-104-rockchip-g3037f71a0de7 (aarch64)
Sat Jan 11 20:19:56 UTC 2020
make: Entering directory '/usr/src/linux-headers-4.4.154-104-rockchip-g3037f71a0de7'
LD /var/lib/dkms/gasket/1.0/build/built-in.o
CC [M] /var/lib/dkms/gasket/1.0/build/gasket_core.o
CC [M] /var/lib/dkms/gasket/1.0/build/gasket_page_table.o
CC [M] /var/lib/dkms/gasket/1.0/build/gasket_interrupt.o
CC [M] /var/lib/dkms/gasket/1.0/build/gasket_ioctl.o
CC [M] /var/lib/dkms/gasket/1.0/build/gasket_sysfs.o
CC [M] /var/lib/dkms/gasket/1.0/build/apex_driver.o
/bin/sh: 1: ./scripts/recordmcount: Exec format error
scripts/Makefile.build:277: recipe for target '/var/lib/dkms/gasket/1.0/build/gasket_sysfs.o' failed
make[1]: *** [/var/lib/dkms/gasket/1.0/build/gasket_sysfs.o] Error 2
make[1]: *** Waiting for unfinished jobs....
/bin/sh: 1: ./scripts/recordmcount: Exec format error
scripts/Makefile.build:277: recipe for target '/var/lib/dkms/gasket/1.0/build/gasket_ioctl.o' failed
make[1]: *** [/var/lib/dkms/gasket/1.0/build/gasket_ioctl.o] Error 2
/bin/sh: 1: ./scripts/recordmcount: Exec format error
scripts/Makefile.build:277: recipe for target '/var/lib/dkms/gasket/1.0/build/gasket_interrupt.o' failed
make[1]: *** [/var/lib/dkms/gasket/1.0/build/gasket_interrupt.o] Error 2
/bin/sh: 1: ./scripts/recordmcount: Exec format error
scripts/Makefile.build:277: recipe for target '/var/lib/dkms/gasket/1.0/build/gasket_page_table.o' failed
make[1]: *** [/var/lib/dkms/gasket/1.0/build/gasket_page_table.o] Error 2
/bin/sh: 1: ./scripts/recordmcount: Exec format error
scripts/Makefile.build:277: recipe for target '/var/lib/dkms/gasket/1.0/build/gasket_core.o' failed
make[1]: *** [/var/lib/dkms/gasket/1.0/build/gasket_core.o] Error 2
/bin/sh: 1: ./scripts/recordmcount: Exec format error
scripts/Makefile.build:277: recipe for target '/var/lib/dkms/gasket/1.0/build/apex_driver.o' failed
make[1]: *** [/var/lib/dkms/gasket/1.0/build/apex_driver.o] Error 2
Makefile:1474: recipe for target 'module/var/lib/dkms/gasket/1.0/build' failed
make: *** [module/var/lib/dkms/gasket/1.0/build] Error 2
make: Leaving directory '/usr/src/linux-headers-4.4.154-104-rockchip-g3037f71a0de7'

pi@rockpi1:~$ lscpu
Architecture: aarch64
Byte Order: Little Endian
CPU(s): 6
On-line CPU(s) list: 0-5
Thread(s) per core: 1
Core(s) per socket: 3
Socket(s): 2
Vendor ID: ARM
Model: 4
Model name: Cortex-A53
Stepping: r0p4
CPU max MHz: 1800.0000
CPU min MHz: 408.0000
BogoMIPS: 48.00
Flags: fp asimd evtstrm aes pmull sha1 sha2 crc32

pi@rockpi1:~$ uname -r
4.4.154-104-rockchip-g3037f71a0de7

pi@rockpi1:~$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 18.04.3 LTS
Release: 18.04
Codename: bionic
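
The recurring "/bin/sh: 1: ./scripts/recordmcount: Exec format error" suggests, as a hypothesis, that the prebuilt helper binaries shipped in the rockchip kernel headers were compiled for a different CPU architecture than the machine running the build. Reading the ELF header shows which architecture such a helper targets; a diagnostic sketch:

# Diagnostic sketch: print the target architecture of an ELF binary by reading
# e_machine from its header ('Exec format error' typically means the binary
# was built for a different architecture than the host).
import struct

def elf_machine(path):
    with open(path, "rb") as f:
        header = f.read(20)
    if header[:4] != b"\x7fELF":
        return "not an ELF file"
    endian = "<" if header[5] == 1 else ">"  # EI_DATA: 1 = little-endian
    (machine,) = struct.unpack_from(endian + "H", header, 18)
    return {0x03: "x86", 0x28: "arm", 0x3E: "x86_64", 0xB7: "aarch64"}.get(machine, hex(machine))

print(elf_machine("/usr/src/linux-headers-4.4.154-104-rockchip-g3037f71a0de7/scripts/recordmcount"))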

Request for static libedgetpu.a

Would it be possible to remove the explicit dependency on libc++.so.1 and libc++abi.so.1? Those are old libraries from older Debian releases, which I think the official build process uses. This prevents us from using libedgetpu.so on newer AARCH64 releases like 64-bit Gentoo on RPi4 (https://github.com/sakaki-/gentoo-on-rpi-64bit).

Or alternatively please package the libc++ deps too, or better yet just open source libedgetpu.so :)

After fighting Bazel on ARM64, cross-compiling, etc., I have a simple CMake-based C++ EdgeTPU example here: https://github.com/powderluv/etdemo

tf.linalg.matmul is not converted

I have created a simple model with the following code:

import numpy as np
import tensorflow as tf

size = 1024

@tf.function(input_signature=[tf.TensorSpec([size] * 2, tf.float32)] * 2)
def bench_func(a, b):
    x = tf.linalg.matmul(a, b, transpose_b=True)  # b is not transposed, but this is a benchmark, whatever
    return tf.reduce_sum(x)

def gen_input_samples():
    i = np.identity(size, np.float32)
    yield [-i, i]
    yield [i, i]

converter = tf.lite.TFLiteConverter.from_concrete_functions([bench_func.get_concrete_function()])
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = gen_input_samples
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
tflite_model = converter.convert()
with open("model.tflite", "wb") as fout:
    fout.write(tflite_model)
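
For reference, before feeding the model to the compiler it can be sanity-checked on the CPU with the stock TFLite interpreter. A sketch (the input ordering returned by get_input_details() is an assumption):

# Sanity-check sketch: run model.tflite on the CPU with the TFLite interpreter.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
inputs = interpreter.get_input_details()
output = interpreter.get_output_details()[0]

# inference_input_type is uint8, so feed uint8 tensors of shape (1024, 1024).
a = np.random.randint(0, 256, size=(1024, 1024), dtype=np.uint8)
b = np.random.randint(0, 256, size=(1024, 1024), dtype=np.uint8)
interpreter.set_tensor(inputs[0]["index"], a)
interpreter.set_tensor(inputs[1]["index"], b)
interpreter.invoke()
print(interpreter.get_tensor(output["index"]))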

None of the ops get converted by the compiler:

edgetpu_compiler -s model.tflite

Edge TPU Compiler version 2.0.267685300

Model compiled successfully in 0 ms.

Input model: bench_model_1024.tflite
Input size: 5.37KiB
Output model: bench_model_1024_edgetpu.tflite
Output size: 5.20KiB
On-chip memory available for caching model parameters: 0.00B
On-chip memory used for caching model parameters: 0.00B
Off-chip memory used for streaming uncached model parameters: 0.00B
Number of Edge TPU subgraphs: 0
Total number of operations: 5
Operation log: bench_model_1024_edgetpu.log

Model successfully compiled but not all operations are supported by the Edge TPU. A percentage of the model will instead run on the CPU, which is slower. If possible, consider updating your model to use only operations supported by the Edge TPU. For details, visit g.co/coral/model-reqs.
Number of operations that will run on Edge TPU: 0
Number of operations that will run on CPU: 5

Operator                       Count      Status

FULLY_CONNECTED                1          Filter, bias, or other param is not constant at compile-time
SUM                            1          Operation is otherwise supported, but not mapped due to some unspecified limitation
QUANTIZE                       2          Operation is otherwise supported, but not mapped due to some unspecified limitation
DEQUANTIZE                     1          Operation is working on an unsupported data type

The most surprising to me is

FULLY_CONNECTED                1          Filter, bias, or other param is not constant at compile-time

because everything is definitely static at compile time. My expectation is that FULLY_CONNECTED, SUM, and QUANTIZE should get converted.

Model file: model.zip

Equivalent flatbuffers JSON
{
  version: 3,
  operator_codes: [
    {
      builtin_code: "FULLY_CONNECTED",
      version: 4
    },
    {
      builtin_code: "SUM",
      version: 2
    },
    {
      builtin_code: "QUANTIZE"
    },
    {
      builtin_code: "DEQUANTIZE",
      version: 2
    }
  ],
  subgraphs: [
    {
      tensors: [
        {
          shape: [
            2
          ],
          type: "INT32",
          buffer: 3,
          name: "Const",
          quantization: {
          }
        },
        {
          type: "INT8",
          buffer: 5,
          name: "Identity_int8",
          quantization: {
            min: [
              -1024.0
            ],
            max: [
              1024.0
            ],
            scale: [
              8.031373
            ],
            zero_point: [
              -1
            ]
          }
        },
        {
          shape: [
            1024,
            1024
          ],
          type: "INT8",
          buffer: 6,
          name: "MatMul",
          quantization: {
            min: [
              -1.0
            ],
            max: [
              1.0
            ],
            scale: [
              0.007843
            ],
            zero_point: [
              -1
            ]
          }
        },
        {
          shape: [
            1024
          ],
          type: "INT32",
          buffer: 1,
          name: "MatMul_bias",
          quantization: {
            scale: [
              0.000031
            ],
            zero_point: [
              0
            ]
          }
        },
        {
          shape: [
            1024,
            1024
          ],
          type: "INT8",
          buffer: 4,
          name: "a_int8",
          quantization: {
            min: [
              -1.0
            ],
            max: [
              1.0
            ],
            scale: [
              0.007843
            ],
            zero_point: [
              -1
            ]
          }
        },
        {
          shape: [
            1024,
            1024
          ],
          type: "INT8",
          buffer: 2,
          name: "b_int8",
          quantization: {
            min: [
              0.0
            ],
            max: [
              1.0
            ],
            scale: [
              0.003922
            ],
            zero_point: [
              -128
            ]
          }
        },
        {
          shape: [
            1024,
            1024
          ],
          name: "a"
        },
        {
          shape: [
            1024,
            1024
          ],
          name: "b"
        },
        {
          name: "Identity"
        }
      ],
      inputs: [
        6,
        7
      ],
      outputs: [
        8
      ],
      operators: [
        {
          opcode_index: 2,
          inputs: [
            7
          ],
          outputs: [
            5
          ]
        },
        {
          opcode_index: 2,
          inputs: [
            6
          ],
          outputs: [
            4
          ]
        },
        {
          inputs: [
            4,
            5,
            3
          ],
          outputs: [
            2
          ],
          builtin_options_type: "FullyConnectedOptions",
          builtin_options: {
          }
        },
        {
          opcode_index: 1,
          inputs: [
            2,
            0
          ],
          outputs: [
            1
          ],
          builtin_options_type: "ReducerOptions",
          builtin_options: {
          }
        },
        {
          opcode_index: 3,
          inputs: [
            1
          ],
          outputs: [
            8
          ]
        }
      ]
    }
  ],
  description: "TOCO Converted.",
  buffers: [
    {
    },
    {
      data: [
        0,
        0,
        0,
        ... (the remainder of this buffer is an unbroken run of zeros; the dump is truncated here in the original)
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0,
        0
      ]
    },
    {
    },
    {
      data: [
        0,
        0,
        0,
        0,
        1,
        0,
        0,
        0
      ]
    },
    {
    },
    {
    },
    {
    },
    {
      data: [
        49,
        46,
        49,
        48,
        46,
        48
      ]
    }
  ],
  metadata: [
    {
      name: "min_runtime_version",
      buffer: 7
    }
  ]
}

Benchmarking at the op level

TensorFlow Lite provides op-level benchmarking in its C++ example via the --enable_op_profiling flag.
How can this be done for models run on an Edge TPU?

ERROR running src/cpp/examples/minimal.cc

When I run src/cpp/examples/minimal.cc, I get:

ERROR: Null custom op data.
ERROR: Node number 0 (edgetpu-custom-op) failed to prepare.

Failed to allocate tensors.

How can I fix it?

How can I install the compiler?

Hey, I want to install the x86_64 compiler on Ubuntu 16.04 using the compiler files in here. How should I install it, and how do I use it?

Invoke Hangs using C API

I am working on a project, DOODS (https://github.com/snowzach/doods), which is an image detection API wrapper around TF/TFLite with Edge TPU support. I am experiencing random hangs during invoke when using the Edge TPU.

DOODS is written in Go and uses a wrapper around the TensorFlow Lite C API. Is there a good way to debug the hang? Is this the right place to open the issue?

Anchor-free models on the Edge TPU

As anchor-free one-stage object detection and keypoint regression are becoming more and more popular, can you add some kind of example of a standard way to handle these kinds of models without using any Google/Coral team "canned solution"?

Generally, these research papers prefer PyTorch code releases, so it would be really useful to know how to handle these models on Coral with an optimized pipeline.

Also, point 2 at coral-posenet mentions some optimized custom ops for heatmap handling that could be useful in general for anchor-free models.

See also:
tensorflow/hub#424
google-coral/tutorials#1
google-coral/project-posenet#16

Edge TPU M.2 not working

I followed all the instructions on the Coral TPU getting-started web page. I am using the PCIe M.2 TPU.

ValueError: Failed to load delegate from libedgetpu.so.1 while running the inference example (classify_image.py)

ls /dev/apex_0 returns nothing.

I am using a RockPi 4b (RK3399).

Any idea?
Thanks

ERROR : Deadline exceeded: USB transfer error 2 [LibUsbDataOutCallback]

I am using two Coral Edge TPUs on a Raspberry Pi 4, with two threads to drive them: on the first thread I switch between two models to detect objects and extract features, and on the second thread I switch between two models to detect and classify objects. I have consistently encountered the following error when running the code for a long period of time (a couple of hours).

F :838] transfer on tag 1 failed. Abort. Deadline exceeded: USB transfer error 2 [LibUsbDataOutCallback] Fatal Python error: Aborted

This error is encountered precisely when edgetpu.basic.basic_engine.BasicEngine.run_inference is called.

As far as I have experimented, I could not reproduce the problem in a consistent manner. It usually occurs after running for a couple of hours, but occasionally in a shorter amount of time. Moreover, I did not encounter the error when using a single Edge TPU with either thread function. Thus I suspect that this has something to do with the USB bus and its limitations in transferring data to two different Corals quickly enough.

Any ideas on causes or workarounds?
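
For what it's worth, one thing to rule out is the two threads contending for the same device. Below is a minimal sketch, assuming the legacy edgetpu Python API and two enumerated devices (model paths are placeholders), that pins each engine to its own TPU:

from edgetpu.basic import edgetpu_utils
from edgetpu.basic.basic_engine import BasicEngine

# Enumerate Edge TPUs that are not yet assigned to any engine.
paths = edgetpu_utils.ListEdgeTpuPaths(edgetpu_utils.EDGE_TPU_STATE_UNASSIGNED)

# Give each thread its own device so they never share one USB endpoint.
engine_a = BasicEngine('detect.tflite', paths[0])    # used only by thread 1
engine_b = BasicEngine('classify.tflite', paths[1])  # used only by thread 2

If the error disappears with explicit device pinning, the USB-bus theory above gains some support.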

About option "keep_aspect_ratio" in examples/object_detection.py

I followed the instructions here, and got the following result and first figure:

$ python3 examples/object_detection.py --model='test_data/ssd_mobilenet_v1_fine_tuned_edgetpu.tflite' --label='test_data/pet_labels.txt' --input='test_data/pets.jpg' --keep_aspect_ratio
-----------------------------------------
german_shorthaired
score =  0.37109375
box =  [1762.1801829338074, 838.6267263974463, 2210.349624156952, 1173.771092295647]
-----------------------------------------
shiba_inu
score =  0.32421875
box =  [1916.2815427780151, 784.8315710467953, 2195.608711242676, 1104.3338206197534]
Please check  object_detection_result.jpg


I can see that both bounding boxes are assigned to the right-hand dog.

Then I removed --keep_aspect_ratio from the command line, which gave a better result.

$ python3 examples/object_detection.py --model='test_data/ssd_mobilenet_v1_fine_tuned_edgetpu.tflite' --label='test_data/pet_labels.txt' --input='test_data/pets.jpg'
-----------------------------------------
staffordshire_bull_terrier
score =  0.37109375
box =  [712.9122591018677, 272.12397533655167, 1122.5919842720032, 674.8110462427139]
-----------------------------------------
staffordshire_bull_terrier
score =  0.35546875
box =  [1831.8864297866821, 770.1564584970474, 2190.160117149353, 1146.7327305078506]
Please check  object_detection_result.jpg


EDGETPU_COMPILER: Didn't find op for builtin opcode 'RELU' version '2'

Hi,

I'm having some trouble compiling my tflite model for the Edge TPU. I get the following error when calling edgetpu_compiler models/model.tflite:

Edge TPU Compiler version 2.0.267685300
ERROR: Didn't find op for builtin opcode 'RELU' version '2'

ERROR: Registration failed.

Invalid model: models/model.tflite
Model could not be parsed

But from https://coral.ai/docs/edgetpu/models-intro/#supported-operations I was expecting ReLU to be supported.

P.S. I've converted my TF model to a full-integer, post-training-quantized tflite model with the following code:

import os

import tensorflow as tf
import numpy as np
from PIL import Image


model = 'model'
input_size = 112

dataset = []
directory_images = './models/' + model + '/data_test'
directory_saved = './models/' + model
for img in os.listdir(directory_images):
	data = Image.open(os.path.join(directory_images,img)).resize((input_size, input_size))
	data = np.asarray(data, dtype=np.float32)[np.newaxis, :]
	dataset.append(data)

def representative_dataset_gen():
    for input_value in tuple(dataset):
        yield [input_value]

converter = tf.lite.TFLiteConverter.from_saved_model(directory_saved + "/saved")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

tflite_model = converter.convert()

name = directory_saved + "/" + model
open(name + ".tflite", "wb").write(tflite_model)

Fedora packages

Currently, only Debian packages are available. Given that quite a few data science organizations run on Fedora for the ease of using SciPy and friends, can we get Fedora packages made available?

Quantize node not being converted

Hi,

I am trying to convert a simple Keras model from TF 2.0.

I followed the "Full integer quantization of weights and activations" tutorial from the docs:

def representative_dataset_gen():
    for i in range(100):
        yield [x_train[i, None]]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
tflite_model = converter.convert()

The conversion goes well and produces the expected graph in Netron.

The edgetpu compilation is successful but states that the quantize nodes will be mapped to the CPU.

Edge TPU Compiler version 2.0.267685300
Input: /mnt/localtmp/model.tflite
Output: /mnt/localtmp/model_edgetpu.tflite

Operator                       Count      Status

SOFTMAX                        1          Mapped to Edge TPU
FULLY_CONNECTED                1          Mapped to Edge TPU
QUANTIZE                       1          Operation is otherwise supported, but not mapped due to some unspecified limitation
CONV_2D                        4          Mapped to Edge TPU
DEQUANTIZE                     1          Operation is working on an unsupported data type

However, if I try to run this model in Python, it either throws Cannot cast array data from dtype('float32') to dtype('uint8') according to the rule 'safe' for float input, or segfaults for uint8 input.

I see that all the test models have no quantize node but instead carry quantization information on their input nodes (seen in Netron): quantization: -1 ≤ 0.0078125 * (q - 128) ≤ 0.9921875.

Am I doing something wrong?
Is the edgetpu_compiler compatible with post-training quantization from TF 2.0?
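
One quick check is to inspect what the converted model actually expects at its input; a minimal sketch, assuming the tflite_runtime package and the model path from the compiler output above:

import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(model_path='/mnt/localtmp/model.tflite')
interpreter.allocate_tensors()

detail = interpreter.get_input_details()[0]
print(detail['dtype'], detail['quantization'])  # dtype plus (scale, zero_point)

# Feeding data that matches the reported dtype avoids the 'safe' cast error.
dummy = np.zeros(detail['shape'], dtype=detail['dtype'])
interpreter.set_tensor(detail['index'], dummy)
interpreter.invoke()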

Issues when running Image classification example

There are issues when running the example here
Steps to reproduce:

Step 1: Follow the steps on this page.

Step 2: Run the example code. You will get an error like the one below:

Traceback (most recent call last):
  File "classify_image.py", line 44, in <module>
    main()
  File "classify_image.py", line 32, in main
    labels = dataset_utils.read_label_file(args.label)
AttributeError: module 'edgetpu.utils.dataset_utils' has no attribute 'read_label_file'

Step 3: I checked all the functions available in the dataset_utils module and saw that it uses ReadLabelFile as its function name, not the read_label_file used in the example code.

Step 4: Update the code to use ReadLabelFile and run again. You will get another error, as below:

python3 classify_image.py --model models/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite --label models/inat_bird_labels.txt --image images/parrot.jpg

Error:

Segmentation fault (core dumped)

I debugged and saw that it errored at line 34: engine = ClassificationEngine(args.model)

Please validate. Thanks,

Sang
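
Until the rename settles, a compatibility shim along the following lines keeps the example working; a sketch, assuming the module exposes at least one of the two names:

from edgetpu.utils import dataset_utils

# Prefer the documented snake_case name; fall back to the legacy CamelCase one.
read_labels = getattr(dataset_utils, 'read_label_file', None) or dataset_utils.ReadLabelFile
labels = read_labels('models/inat_bird_labels.txt')

The segfault at ClassificationEngine(args.model) is a separate problem that the shim does not address.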

Internal compiler error. Aborting!

Hi,

I'm trying to compile my model to edgeTPU but I get the following error:

Edge TPU Compiler version 2.0.291256449

Internal compiler error. Aborting!

I've realized that the compiler is very "picky" about the model structure, and it is sometimes very difficult to figure out what's wrong.
How can I properly debug this kind of error?

The model's I/O are still in float32, but that shouldn't be an issue: I'm already using other models with the same I/O, and they compile and run. Moreover, I'm specifying the right data types during conversion to tflite:

import os

import tensorflow as tf
import numpy as np
from PIL import Image


model = 'model'
input_size = 160

dataset = []
directory_images = './models/' + model + '/data_test'
directory_saved = './models/' + model
for img in os.listdir(directory_images):
	data = Image.open(os.path.join(directory_images,img)).resize((input_size, input_size))
	data = np.asarray(data, dtype=np.float32)[np.newaxis, :]
	dataset.append((data-127.5)/128)

def representative_dataset_gen():
    for input_value in tuple(dataset):
        yield [input_value]

converter = tf.lite.TFLiteConverter.from_saved_model(directory_saved + "/saved")
#converter.allow_custom_ops=True
#converter.post_training_quantize=True
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

tflite_model = converter.convert()

name = directory_saved + "/" + model
open(name + ".tflite", "wb").write(tflite_model)

Here's my model: link

Creating our own model for weight imprinting

Hello,
I am trying to create a model for imprinted weights, taking the model through quantization-aware training and the compiler. However, regardless of whether I use tf.keras.layers.LayerNormalization, tf.math.l2_normalize, or tf.keras.backend.l2_normalize, the Edge TPU raises the Error: must have L2norm or RuntimeError: The last 5 operators should be L2Norm, Con2d, Mul, Reshape and Softmax when we run imprinting_learning.py.
Which function do we have to use to be able to run imprinting_learning.py successfully?

Thanks

About the APIs and call flow

Hi, all:

Currently the Edge TPU APIs support two approaches. One is Python, which can be found in
https://github.com/google-coral/edgetpu/tree/master/edgetpu. The other is the C++ API, which can be found in https://github.com/google-coral/edgetpu/tree/master/libedgetpu.

The first question is whether the Python interface is powered by the code in https://github.com/google-coral/edgetpu/tree/master/src/cpp or not.

The second question is whether the code in https://github.com/google-coral/edgetpu/tree/master/src/cpp relies on libedgetpu.so directly, or also needs to work together with libtensorflow-lite.a.

The paths could be shown as follows:

1st case: Python API ==> C++ API ==> libtensorflow-lite.a ==> libedgetpu.so ==> edgetpu driver

2nd case: Python API ==> C++ API ==> libtensorflow-lite.a and libedgetpu.so side by side ==> edgetpu driver

Is the following flow possible or not:

data / model ==> pre-processed by libtensorflow-lite.a ==> C++ API ==> libedgetpu.so

The major reason for asking about this usage flow is API maintenance.

Thank you.

BR, Akio

Using the edgetpu C++ API crashes

Dear Sir:

After some code experimentation and library integration, I finally got a runnable code sequence. The code uses OpenCV for video capture, format conversion, resizing, etc. The Edge TPU part uses the edgetpu C++ API found in src/cpp. The crash log was obtained by setting "edgetpu::EdgeTpuManager::GetSingleton()->SetVerbosity(10);".

The test code is in main.edgetpu_api.log; please rename it to main.edgetpu_api.cpp.
main.edgetpu_api.log

The crash log is in the following file:
edgetpu crash.log

Could anyone give me some advice? Thanks a lot.

BR, Akio

Object detection - CPU resources

Hello,

I'm performing object detection in the same way as classify_image.cc but using the detection engine and doing it in a loop for several images.
However, when I do so it consumes 100% of one of my CPU cores.
I didn't expect this since the detection is being executed on the USB accelerator. Right?

Thanks in advance.

Didn't find op for builtin opcode 'RESIZE_BILINEAR' version '3'

I was trying to compile a tflite model using the edgetpu_compiler tool, but got the following error:

Edge TPU Compiler version 2.0.267685300
ERROR: Didn't find op for builtin opcode 'RESIZE_BILINEAR' version '3'

ERROR: Registration failed.

This op is relatively new; it was fixed recently (tensorflow/tensorflow#33691), and I am using it by installing tf-nightly-gpu. How can I make the Edge TPU compiler aware of this new op? Can I compile against the latest version of TensorFlow somehow?

Running model with TimeDistributed or with batch_size>1

I am trying to deploy on the Edge TPU a network that processes input data of shape (256, 1).
In order to exploit as much of the parallelism enabled by the TPU as possible, I want to process multiple windows with one invoke.
One solution might be to use a TimeDistributed Keras layer as follows:

timedist = tf.keras.layers.TimeDistributed(conv_block, input_shape=(30, 256, 1))

This works without problems on the CPU using the TensorFlow Lite runtime. However, if I try to compile this model with the edgetpu compiler, I get the following error:

~/$ edgetpu_compiler -s conv30windows.tflite 
Edge TPU Compiler version 2.0.267685300

Internal compiler error. Aborting! 

Another solution might be to convert the model with a batch size larger than 1.

x = tf.keras.Input(shape=(256, 1), batch_size=30)
y = conv_block(x)
model = tf.keras.Model(x, y)

but when I convert it with the tflite converter, I find that the input again has shape (1, 256, 1).

Is there any way to process all 30 windows in one shot, rather than calling invoke 30 times with different inputs?
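
One workaround worth trying, at least with the plain CPU interpreter, is resizing the input tensor to the batched shape before allocation; a minimal sketch, assuming tflite_runtime and the conv30windows.tflite file mentioned above (whether the Edge TPU delegate accepts the resized shape is a separate question):

import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(model_path='conv30windows.tflite')
input_index = interpreter.get_input_details()[0]['index']

# Request a batch of 30 windows instead of the default batch of 1.
interpreter.resize_tensor_input(input_index, [30, 256, 1])
interpreter.allocate_tensors()

windows = np.zeros((30, 256, 1), dtype=np.float32)  # match dtype to get_input_details()
interpreter.set_tensor(input_index, windows)
interpreter.invoke()  # one invoke covers all 30 windows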

Potential memory corruption using C API and certain models

I have been developing DOODS (https://github.com/snowzach/doods), a Go wrapper around TensorFlow and TensorFlow Lite with Edge TPU support. It performs object detection, calling the C API via CGO.

When using models from the Coral EdgeTPU site it works great. I've run it days at a time with no issues.

I have now been trying a few other models, and I get all kinds of strange issues.
This model for the Pixel 4 from the model zoo randomly returns strange results and sometimes even segfaults when run many times in a row: https://storage.cloud.google.com/mobilenet_edgetpu/checkpoints/ssdlite_mobilenet_edgetpu_coco_quant.tar.gz

One of my users provided me with this model that supposedly works fine in python: https://github.com/snowzach/doods/files/3923689/raccoon_edgetpu.tflite.zip
I understand it works fine with the python API.

Since I pretty much just call the C API, I'm not sure how to debug.

EdgeTPU - ResizeBilinear only for small Models?

Hello,

I want to use the Google Coral Accelerator for semantic segmentation. Most networks for semantic segmentation use an encoder/decoder architecture to accomplish such tasks. My goal is to run U-Net on the Edge TPU (Coral USB Accelerator).

I've started with a very simple Keras/TensorFlow model like this:

    # imports assumed from tf.keras, added for completeness
    from tensorflow.keras.layers import (Input, Conv2D, BatchNormalization,
                                         Activation, MaxPooling2D, UpSampling2D)
    from tensorflow.keras.models import Model
    from tensorflow.keras.optimizers import Adam

    input_layer = Input(shape=(512, 512, 3))
    x = Conv2D(32, 3, padding = 'same')(input_layer)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = MaxPooling2D()(x)
    x = UpSampling2D(interpolation='bilinear')(x)

    output_layer = Conv2D(1, 1, padding = 'same', activation = 'sigmoid')(x)
    model = Model(inputs = input_layer, outputs = output_layer)
    model.compile(optimizer = Adam(lr = 0.001), loss = 'binary_crossentropy', metrics = ['accuracy'])

After converting this model to tflite (with TensorFlow 1.15.0), I tried to compile it for the Edge TPU. Unfortunately, for the ResizeBilinear operator I get this message:

Operation is otherwise supported, but not mapped due to some unspecified limitation

When I change the input shape to a smaller size, for example 128x128x3, the ResizeBilinear operator maps perfectly to the Edge TPU and I can smoothly run the compiled model.

On coral.ai it says for ResizeBilinear:

Input/output is a 3-dimensional tensor. Depending on input/output size, this operation may not be mapped to the Edge TPU to avoid loss in precision.

So my question is:
Is there a way to force mapping UpSampling2D/ResizeBilinear to the Edge TPU regardless of input/output size?

Otherwise I see no use case for the Edge TPU in semantic segmentation. (What I have seen is that DeepLabV3 seems to work according to Google's benchmark, but we get no information about exactly which project was used, so we can't use it for custom data.)

One more question: even if ResizeBilinear is not mapped to the Edge TPU, I expect the model to run on Edge TPU + CPU. But if I run the compiled tflite file, I get the error:

RuntimeError: Internal: :71 tf_lite_type != kTfLiteUInt8 (9 != 3)Node number 5 (EdgeTpuDelegateForCustomOp) failed to prepare.

I'm running it with the code example provided by Google (https://github.com/google-coral/tflite.git), under tflite/python/examples/classification/classify_image.py; I use this code for my model.

Failed to build from source on RPi 4 (Buster)

Getting this error when executing scripts/build_swig.sh:

 ---> f3460b57b9f3
Step 8/14 : RUN apt-get update && apt-get install -y   debhelper   python   python-future   python3-all   python3-numpy   python3-setuptools   python3-wheel   libpython3-dev   libpython3-dev:armhf   libpython3-dev:arm64   build-essential   crossbuild-essential-armhf   crossbuild-essential-arm64   libusb-1.0-0-dev   libusb-1.0-0-dev:arm64   libusb-1.0-0-dev:armhf   zlib1g-dev   zlib1g-dev:armhf   zlib1g-dev:arm64   pkg-config   zip   unzip   curl   wget   git
 ---> Running in 1fd692177223
Get:1 http://ports.ubuntu.com/ubuntu-ports xenial InRelease [247 kB]
... (further Get/Ign lines for arm64/armhf packages elided) ...
Err:5 http://ports.ubuntu.com/ubuntu-ports xenial/main amd64 Packages
  404  Not Found
Err:13 http://ports.ubuntu.com/ubuntu-ports xenial-updates/main amd64 Packages
  404  Not Found
Err:21 http://ports.ubuntu.com/ubuntu-ports xenial-backports/main amd64 Packages
  404  Not Found
Err:23 http://ports.ubuntu.com/ubuntu-ports xenial-security/main amd64 Packages
  404  Not Found
Fetched 28.6 MB in 3min 25s (139 kB/s)
Reading package lists...
W: Target Packages (main/binary-all/Packages) is configured multiple times in /etc/apt/sources.list:3 and /etc/apt/sources.list:50
... (similar W: lines elided) ...
E: Failed to fetch http://ports.ubuntu.com/ubuntu-ports/dists/xenial/main/binary-amd64/Packages  404  Not Found
E: Failed to fetch http://ports.ubuntu.com/ubuntu-ports/dists/xenial-updates/main/binary-amd64/Packages  404  Not Found
E: Failed to fetch http://ports.ubuntu.com/ubuntu-ports/dists/xenial-backports/main/binary-amd64/Packages  404  Not Found
E: Failed to fetch http://ports.ubuntu.com/ubuntu-ports/dists/xenial-security/main/binary-amd64/Packages  404  Not Found
E: Some index files failed to download. They have been ignored, or old ones used instead.

Any idea?

FULLY_CONNECTED op is not converted

I am converting the following file with edgetpu_compiler: model.zip
It consists of a single matrix multiplication op. The inputs and outputs are UINT8.

I get

Edge TPU Compiler version 2.0.267685300
...
Operator                       Count      Status

FULLY_CONNECTED                1          Filter, bias, or other param is not constant at compile-time

AttributeError: 'glVertexAttribPointer' object has no attribute 'baseFunction' when running edgetpu_demo --stream

After flashing the image onto the Coral Dev Board, when trying to run edgetpu_demo --stream and using my browser to view the demo, the following error is thrown:

Traceback (most recent call last):
File "/usr/lib/aarch64-linux-gnu/gstreamer-1.0/python/glbox.py", line 203, in do_gl_start
glVertexAttribPointer.baseFunction(a_position, 2, GL_FLOAT, GL_FALSE, 0, None)
AttributeError: 'glVertexAttribPointer' object has no attribute 'baseFunction'
Error: gst-library-error-quark: Subclass failed to initialize. (3): gstglbasefilter.c(402): gst_gl_base_filter_decide_allocation (): /GstPipeline:pipeline0/GstGLFilterBin:glfilterbin0/glbox+GlBox:filter

I have tried reinstalling GStreamer on the board, but the same error is still thrown. Any thoughts?

Coral install and example work, but edgetpu does not import

I've been following this guide: https://coral.withgoogle.com/docs/accelerator/get-started/

In the end I ran the example to classify an image, and it all works.

But this will not work:

root@32d1e00c6940:/coral/tflite/python/examples# python3 -c 'import edgetpu; print("OK") '
Traceback (most recent call last):
File "", line 1, in
ModuleNotFoundError: No module named 'edgetpu'

What am I missing? I need to get the edgetpu python lib installed...

thx
Sven

Internal compiler error with tf.math.multiply

I have created the following model:

import numpy as np
import tensorflow as tf

size = 1024

@tf.function(input_signature=[tf.TensorSpec([size] * 2, tf.float32)] * 2)
def bench_func(a, b):
    x = tf.math.multiply(a, b)
    return tf.reduce_sum(x)

def gen_input_samples():
    i = np.identity(size, np.float32)
    yield [-i, i]
    yield [i, i]

converter = tf.lite.TFLiteConverter.from_concrete_functions([bench_func.get_concrete_function()])
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = gen_input_samples
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
tflite_model = converter.convert()
with open("model.tflite", "wb") as fout:
    fout.write(tflite_model)

When I try to compile it, the compiler crashes:

edgetpu_compiler -s model.tflite
Edge TPU Compiler version 2.0.267685300

Internal compiler error. Aborting!

File: model.zip

USB accelerator cannot run `edgetpu` model

My USB Accelerator can no longer run an edgetpu-compiled model, while it can still run the uncompiled tflite model.

Everything is tested with the classification example from https://github.com/google-coral/tflite.

This is the error:

INFO: Initialized TensorFlow Lite runtime.
Traceback (most recent call last):
  File "classify_image.py", line 118, in <module>
    main()
  File "classify_image.py", line 96, in main
    interpreter.allocate_tensors()
  File "/home/ds017/.pyenv/versions/coral35/lib/python3.5/site-packages/tflite_runtime/interpreter.py", line 244, in allocate_tensors
    return self._interpreter.AllocateTensors()
  File "/home/ds017/.pyenv/versions/coral35/lib/python3.5/site-packages/tflite_runtime/interpreter_wrapper.py", line 114, in AllocateTensors
    return _interpreter_wrapper.InterpreterWrapper_AllocateTensors(self)
RuntimeError: Internal: Unsupported data type in custom op handler: 0Node number 1 (EdgeTpuDelegateForCustomOp) failed to prepare.

Some time ago everything was running fine. What might be wrong?
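
A minimal repro that isolates the Edge TPU delegate from the example script may help narrow this down; a sketch, assuming tflite_runtime and a placeholder model path:

import tflite_runtime.interpreter as tflite

delegate = tflite.load_delegate('libedgetpu.so.1')
interpreter = tflite.Interpreter(model_path='model_edgetpu.tflite',
                                 experimental_delegates=[delegate])
interpreter.allocate_tensors()  # the call that currently fails

If this fails for every edgetpu-compiled model, a version mismatch between the installed runtime and the compiler that produced the models is a plausible suspect.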

coral::GetInputFromImage is missing in c++ examples

Looks like edgetpu/cpp/basic/inference_utils.cc/.h were not migrated over to the new repo, so examples such as classify_image (C++) fail. Is there a new home for these utils?

classify_image.cc:27: undefined reference to `coral::GetInputFromImage(std::__cxx11::basic_string<char, std::char_traits, std::allocator > const&, std::array<int, 3ul> const&)'

ReadLabelFile is missing a 'return'

In the latest release, ReadLabelFile() in dataset_utils.py was deprecated. I suspect the function is now supposed to call read_label_file(), but the return statement is missing, so it always returns None. This breaks existing code.

Offending code line
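
For reference, the suspected fix is a one-liner; a sketch of what the deprecated wrapper in dataset_utils.py presumably needs to look like (based on the description above, not on the actual source):

def ReadLabelFile(file_path):
  # Deprecated alias: it must return the new function's result,
  # otherwise every caller silently gets None.
  return read_label_file(file_path)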

minimal.cc build error

I was trying to run the minimal.cc file to test the C++ code. I'm using Ubuntu 16.04 on an x86-64 machine. I've built the tree following the Mendel getting-started guide, and I've installed the Edge TPU runtime library.

When I was trying to build the minimal.cc file (bazel 0.27.2), I got errors:

ERROR: /home/jedichen/edgetpu/src/cpp/examples/BUILD:47:1: Linking of rule '//src/cpp/examples:minimal' failed (Exit 1) gcc failed: error executing command /usr/bin/gcc @bazel-out/k8-fastbuild/bin/src/cpp/examples/minimal-2.params

Use --sandbox_debug to see verbose messages from the sandbox
bazel-out/k8-fastbuild/bin/src/cpp/examples/_objs/minimal/minimal.pic.o:minimal.cc:function main: error: undefined reference to 'edgetpu::EdgeTpuManager::GetSingleton()'
bazel-out/k8-fastbuild/bin/src/cpp/examples/_objs/model_utils/model_utils.pic.o:model_utils.cc:function coral::BuildEdgeTpuInterpreter(tflite::FlatBufferModel const&, edgetpu::EdgeTpuContext*): error: undefined reference to 'edgetpu::RegisterCustomOp()'
collect2: error: ld returned 1 exit status
Target //src/cpp/examples:minimal failed to build

I'm not sure how to solve this. Isn't edgetpu.h the only needed file to make this run? Thank you for any help.

p.s.: I didn't find the packages/edgetpu folder. Is it supposed to be downloaded during the Mendel setup process?

Failing to connect M.2 to Jetson Nano DevKit

Hi,

I am having issues connecting the M.2 accelerator to the Jetson Nano DevKit.
I followed all the instructions; here is the output of some commands:

lspci
0:02.0 PCI bridge: NVIDIA Corporation Device 0faf (rev a1)
01:00.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168/8411 PCI Express Gigabit Ethernet Controller (rev 15)

lsusb (not really relevant, but hey, more info is better right?)
Bus 002 Device 002: ID 0bda:0411 Realtek Semiconductor Corp.
Bus 002 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 001 Device 004: ID 046a:000c Cherry GmbH
Bus 001 Device 003: ID 046d:c31c Logitech, Inc. Keyboard K120
Bus 001 Device 002: ID 0bda:5411 Realtek Semiconductor Corp.
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub

lsmod
Module Size Used by
bnep 16562 2
fuse 103841 2
overlay 48691 0
zram 26166 4
spidev 13282 0
nvgpu 1575721 18
bluedroid_pm 13912 0
ip_tables 19441 0
x_tables 28951 1 ip_tables

modinfo gasket
filename: /lib/modules/4.9.140-tegra/updates/dkms/gasket.ko
author: Rob Springer [email protected]
license: GPL v2
version: 1.1.3
description: Google Gasket driver framework
srcversion: 069B6D0F6AE12073F4EAF5D
depends:
vermagic: 4.9.140-tegra SMP preempt mod_unload modversions aarch64
parm: dma_bit_mask:int

modinfo apex
filename: /lib/modules/4.9.140-tegra/updates/dkms/apex.ko
author: John Joseph [email protected]
license: GPL v2
version: 1.1
description: Google Apex driver
srcversion: 508A8A34D57322CEA287D17
alias: pci:v00001AC1d0000089Asvsdbcsci*
depends: gasket
vermagic: 4.9.140-tegra SMP preempt mod_unload modversions aarch64
parm: allow_power_save:int
parm: allow_sw_clock_gating:int
parm: allow_hw_clock_gating:int
parm: bypass_top_level:int
parm: trip_point0_temp:int
parm: trip_point1_temp:int
parm: trip_point2_temp:int
parm: hw_temp_warn1:int
parm: hw_temp_warn2:int
parm: hw_temp_warn1_en:bool
parm: hw_temp_warn2_en:bool
parm: temp_poll_interval:int

dpkg -l | grep gasket
ii gasket-dkms 1.0-10 all DKMS source for the gasket driver

uname -a
Linux jetson-desktop 4.9.140-tegra #1 SMP PREEMPT Mon Dec 9 22:47:42 PST 2019 aarch64 aarch64 aarch64 GNU/Linux

dmesg | grep pci
[ 0.967687] tegra-xusb-padctl 7009f000.xusb_padctl: dev = phy-pcie.3, lane = pcie-0, function = pcie-x1
[ 0.967783] tegra-xusb-padctl 7009f000.xusb_padctl: dev = phy-pcie.4, lane = pcie-1, function = pcie-x4
[ 0.967870] tegra-xusb-padctl 7009f000.xusb_padctl: dev = phy-pcie.5, lane = pcie-2, function = pcie-x4
[ 0.967967] tegra-xusb-padctl 7009f000.xusb_padctl: dev = phy-pcie.6, lane = pcie-3, function = pcie-x4
[ 0.968054] tegra-xusb-padctl 7009f000.xusb_padctl: dev = phy-pcie.7, lane = pcie-4, function = pcie-x4
[ 0.968142] tegra-xusb-padctl 7009f000.xusb_padctl: dev = phy-pcie.8, lane = pcie-5, function = xusb
[ 0.968225] tegra-xusb-padctl 7009f000.xusb_padctl: dev = phy-pcie.9, lane = pcie-6, function = xusb
[ 0.977971] tegra-pcie 1003000.pcie: 4x1, 1x1 configuration
[ 0.979254] tegra-pcie 1003000.pcie: PCIE: Enable power rails
[ 0.979641] tegra-pcie 1003000.pcie: probing port 0, using 4 lanes
[ 0.983636] tegra-pcie 1003000.pcie: probing port 1, using 1 lanes
[ 1.071883] ehci-pci: EHCI PCI platform driver
[ 1.071939] ohci-pci: OHCI PCI platform driver
[ 1.406312] tegra-pcie 1003000.pcie: link 0 down, retrying
[ 1.818326] tegra-pcie 1003000.pcie: link 0 down, retrying
[ 2.231469] tegra-pcie 1003000.pcie: link 0 down, retrying
[ 2.233537] tegra-pcie 1003000.pcie: link 0 down, ignoring
[ 2.337986] tegra-pcie 1003000.pcie: PCI host bridge to bus 0000:00
[ 2.337992] pci_bus 0000:00: root bus resource [io 0x0000-0xffff]
[ 2.337995] pci_bus 0000:00: root bus resource [mem 0x13000000-0x1fffffff]
[ 2.337999] pci_bus 0000:00: root bus resource [mem 0x20000000-0x3fffffff pref]
[ 2.338003] pci_bus 0000:00: root bus resource [bus 00-ff]
[ 2.338023] pci 0000:00:02.0: [10de:0faf] type 01 class 0x060400
[ 2.338085] pci 0000:00:02.0: PME# supported from D0 D1 D2 D3hot D3cold
[ 2.338227] pci 0000:00:02.0: bridge configuration invalid ([bus 00-00]), reconfiguring
[ 2.338328] pci 0000:01:00.0: [10ec:8168] type 00 class 0x020000
[ 2.338351] pci 0000:01:00.0: reg 0x10: [io 0x0000-0x00ff]
[ 2.338381] pci 0000:01:00.0: reg 0x18: [mem 0x00000000-0x00000fff 64bit]
[ 2.338401] pci 0000:01:00.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit]
[ 2.338532] pci 0000:01:00.0: supports D1 D2
[ 2.338535] pci 0000:01:00.0: PME# supported from D0 D1 D2 D3hot D3cold
[ 2.349910] pci_bus 0000:01: busn_res: [bus 01-ff] end is updated to 01
[ 2.349936] pci 0000:00:02.0: BAR 14: assigned [mem 0x13000000-0x130fffff]
[ 2.349940] pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
[ 2.349946] pci 0000:01:00.0: BAR 4: assigned [mem 0x13000000-0x13003fff 64bit]
[ 2.349963] pci 0000:01:00.0: BAR 2: assigned [mem 0x13004000-0x13004fff 64bit]
[ 2.349979] pci 0000:01:00.0: BAR 0: assigned [io 0x1000-0x10ff]
[ 2.349987] pci 0000:00:02.0: PCI bridge to [bus 01]
[ 2.349990] pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
[ 2.349996] pci 0000:00:02.0: bridge window [mem 0x13000000-0x130fffff]
[ 2.350212] pcieport 0000:00:02.0: Signaling PME through PCIe PME interrupt
[ 2.350215] pci 0000:01:00.0: Signaling PME through PCIe PME interrupt
[ 2.350220] pcie_pme 0000:00:02.0:pcie001: service driver pcie_pme loaded
[ 2.350299] aer 0000:00:02.0:pcie002: service driver aer loaded

dmesg | grep gasket yields nothing...

Does anyone have any idea how I could resolve this? It seems the Edge TPU is not even powered up...

Edge TPU cannot do constant multiplication

Consider a very simple model: output = 0.1 * input. After building the model in tf.keras -> converting and quantizing with the tflite converter -> compiling with edgetpu_compiler, I found that the last step fails.

It turns out that tf_op_layer_Mul is not supported by edgetpu_compiler; the op is invoked when broadcasting is needed.
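
For reference, the failing pattern can be reproduced with a model this small; a sketch in tf.keras (the input shape is an arbitrary assumption), where the scalar multiply becomes a tf_op_layer_Mul node after conversion:

import tensorflow as tf

inputs = tf.keras.Input(shape=(8,))
outputs = 0.1 * inputs  # scalar broadcast -> tf_op_layer_Mul in the tflite graph
model = tf.keras.Model(inputs, outputs)
model.summary()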

Please see the following quantized model files for references:

test_linear_mul_layer.zip

"For example, a fully-connected or softmax layer with 2D output" as the status

Similarly to #25 I engineered another artificial file: bench_model_16.zip

It is a matrix multiplication with the second matrix constant. I hoped that it would successfully compile with edgetpu_compiler, but no, bad luck again.

Number of operations that will run on Edge TPU: 0
Number of operations that will run on CPU: 1

Operator                       Count      Status

FULLY_CONNECTED                1          For example, a fully-connected or softmax layer with 2D output

Now, I accept that the file may be crazy, wrong, etc., but the status message is really puzzling.

Decreased accuracy on usb accelerator when using an AutoML Vision trained model

Hello,

I am using the Coral usb accelerator for object detection. I trained a model on https://console.cloud.google.com/vision, optimized for latency, and successfully compiled the model with the edgetpu-compiler.

When I use the original (not compiled) .tflite, the objects are detected (~90% confidence) on each frame of the video.
But when I use the compiled .tflite, only background is detected (~60% confidence), on each frame.

Is such a decrease in accuracy expected?

I'm using the same script for both .tflite files (https://github.com/PINTO0309/TPU-MobilenetSSD/blob/master/MobileNet-SSD-TPU-sync.py), so I guess it is not a preprocessing issue.

Here is the output of the compilation:

Operator                       Count      Status

MEAN                           8          Mapped to Edge TPU
LOGISTIC                       8          Mapped to Edge TPU
LOGISTIC                       9          More than one subgraph is not supported
MAX_POOL_2D                    6          Mapped to Edge TPU
MAX_POOL_2D                    9          More than one subgraph is not supported
REDUCE_MAX                     8          Operation is otherwise supported, but not mapped due to some unspecified limitation
MUL                            8          Mapped to Edge TPU
MUL                            8          More than one subgraph is not supported
PACK                           24         Tensor has unsupported rank (up to 3 innermost dimensions mapped)
RESHAPE                        32         More than one subgraph is not supported
RESHAPE                        12         Tensor has unsupported rank (up to 3 innermost dimensions mapped)
ADD                            10         Mapped to Edge TPU
ADD                            21         More than one subgraph is not supported
CUSTOM                         1          Operation is working on an unsupported data type
CONCATENATION                  2          More than one subgraph is not supported
CONV_2D                        61         Mapped to Edge TPU
CONV_2D                        25         More than one subgraph is not supported
DEPTHWISE_CONV_2D              17         Mapped to Edge TPU
DEPTHWISE_CONV_2D              25         More than one subgraph is not supported

Regarding Boot2Qt

Has anyone tried to use the Qt framework on the Edge dev board? Is it possible to boot using its Boot2Qt device creation framework? I tried to find suitable documentation for using it with U-Boot, but I cannot find any useful resources online.

Internal compiler error. Aborting!

Please make sure that this is a bug. As per our GitHub Policy, we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub. tag:bug_template

System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow): yes
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 18.04
  • Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device: -
  • TensorFlow installed from (source or binary): Docker image latest-gpu-py3
  • TensorFlow version (use command below): 1.15.0
  • Python version: 3.6
  • Bazel version (if compiling from source): -
  • GCC/Compiler version (if compiling from source): -
  • CUDA/cuDNN version: 10.2
  • GPU model and memory: TITAN Xp / 12 GB

I have created a fully-quantized TF Lite model from a saved model. But when trying to compile it with edgetpu_compiler, I get an error:

(venv) jiatian@desktop:~/tmp$ edgetpu_compiler saved_model_quant.tflite 
Edge TPU Compiler version 2.0.267685300

Internal compiler error. Aborting! 

Also tried the latest compiler, got same error:

Edge TPU Compiler version 2.0.291256449

Internal compiler error. Aborting! 

Below is how I quantized the tflite file from the saved model:

    def generate_datasets(self):
        num_calibration_steps = 10
        for _ in range(num_calibration_steps):
            input = np.random.random_sample((1, self.img_height, self.img_width, 3))
            yield [np.array(input, dtype='float32')]

    def convert_savedModel_quant_tflite(self, savedModel_dir):
        converter = tf.lite.TFLiteConverter.from_saved_model(savedModel_dir)
        converter.optimizations = [tf.lite.Optimize.DEFAULT]
        converter.representative_dataset = self.generate_datasets
        converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
        converter.inference_input_type = tf.float32
        converter.inference_output_type = tf.float32
        tflite_quant_model = converter.convert()
        save_tflite_path = savedModel_dir + '/saved_model_quant.tflite'
        open(save_tflite_path, "wb").write(tflite_quant_model)

The tflite file is linked (14.5MB):

ERROR std::out_of_range: vector when compiling working tflite model

I am trying to convert a working tflite model with edgetpu_compiler version 2.0.267685300. The compilation terminates unsuccessfully with some console output:

Edge TPU Compiler version 2.0.267685300
terminating with uncaught exception of type std::out_of_range: vector

Internal compiler error. Aborting!

The tflite model was created using the TensorFlow nightly Docker image. The code to reproduce the error can be found in my repo (test code). Just execute the commands in order:

python keras_model.py
python keras_conversion.py

I'm afraid I cannot debug this, as I am unable to find the source code for the compiler. If I can contribute anything to resolve this issue, let me know.
