Comments (6)
Hi @MuhammadBilal848 ,
TF 2.16 uses Keras 3 by default, and in Keras 3 saving to the TF SavedModel format via model.save() is no longer supported. Please refer to the migration guide for more details.
My thinking is that this error is probably due to multiple issues. I can offer a few ideas, and anyone is free to correct me if I'm wrong.
First, I would check whether your TensorFlow build matches the compiler versions you're using; the log itself suggests rebuilding TensorFlow with the appropriate compiler flags.
Second, I would double-check that the versions involved are actually compatible. That could also be causing issues.
So I trained another model just to check and used model.export("FOLDER_NAME") instead of model.save("model.h5").
Got this as output, and the folder is saved with assets, variables, a .pb file, and a fingerprint:
I loaded the model using tf.keras.layers.TFSMLayer("FOLDER_NAME", call_endpoint="serving_default").
It worked. @SuryanarayanaY Thank you 🖤
Also, could you tell me how I can convert the .pb model to .h5? I want to convert the model to tflite.
Got:
To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
W0000 00:00:1714042377.283703 4504 tf_tfl_flatbuffer_helpers.cc:390] Ignored output_format.
W0000 00:00:1714042377.284215 4504 tf_tfl_flatbuffer_helpers.cc:393] Ignored drop_control_dependency.
2024-04-25 15:52:57.285827: I tensorflow/cc/saved_model/reader.cc:83] Reading SavedModel from: C:\Users\Bilal\AppData\Local\Temp\tmphhpbtz_x
2024-04-25 15:52:57.287069: I tensorflow/cc/saved_model/reader.cc:51] Reading meta graph with tags { serve }
2024-04-25 15:52:57.287242: I tensorflow/cc/saved_model/reader.cc:146] Reading SavedModel debug info (if present) from: C:\Users\Bilal\AppData\Local\Temp\tmphhpbtz_x
2024-04-25 15:52:57.300472: I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:388] MLIR V1 optimization pass is not enabled
2024-04-25 15:52:57.306249: I tensorflow/cc/saved_model/loader.cc:234] Restoring SavedModel bundle.
2024-04-25 15:52:57.378809: I tensorflow/cc/saved_model/loader.cc:218] Running initialization op on SavedModel bundle at path: C:\Users\Bilal\AppData\Local\Temp\tmphhpbtz_x
2024-04-25 15:52:57.391327: I tensorflow/cc/saved_model/loader.cc:317] SavedModel load for tags { serve }; Status: success: OK. Took 105497 microseconds.
2024-04-25 15:52:57.412351: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:268] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
Traceback (most recent call last):
File "F:\Projects\Trigger Word Detection\converter.py", line 10, in <module>
tflite_model = converter.convert()
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\lite.py", line 1175, in wrapper
return self._convert_and_export_metrics(convert_func, *args, **kwargs)
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\lite.py", line 1129, in _convert_and_export_metrics
result = convert_func(self, *args, **kwargs)
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\lite.py", line 1636, in convert
saved_model_convert_result = self._convert_as_saved_model()
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\lite.py", line 1617, in _convert_as_saved_model
return super(TFLiteKerasModelConverterV2, self).convert(
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\lite.py", line 1407, in convert
result = _convert_graphdef(
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\convert_phase.py", line 212, in wrapper
raise converter_error from None # Re-throws the exception.
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\convert_phase.py", line 205, in wrapper
return func(*args, **kwargs)
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\convert.py", line 995, in convert_graphdef
data = convert(
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\convert.py", line 367, in convert
raise converter_error
tensorflow.lite.python.convert_phase.ConverterError: Could not translate MLIR to FlatBuffer.
I tried this method and it worked.