Comments (3)
I forgot a return in the optimizer... It should have been fixed in 0.9.0.
from keras-radam.
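A minimal, dependency-free sketch of the failure mode described above, assuming the missing return was in one of the subclassed per-variable update methods (the class and method names here are illustrative, not the actual keras_radam source):

```python
# Illustrative sketch only -- not the real keras_radam code. It shows how an
# optimizer method that builds its update op but forgets its `return`
# hands None back to the caller that collects update ops.

class BuggyOptimizer:
    def _apply_dense(self, grad, var):
        update_op = ("assign_sub", var, grad)  # stand-in for a real graph op
        # BUG: no `return update_op` -- the method implicitly returns None.


class FixedOptimizer:
    def _apply_dense(self, grad, var):
        update_op = ("assign_sub", var, grad)
        return update_op  # the kind of one-line fix shipped in 0.9.0


def collect_update_ops(optimizer, grads_and_vars):
    """Mirror of the loop in tf.train.Optimizer.apply_gradients that
    gathers one update op per (gradient, variable) pair."""
    update_ops = [optimizer._apply_dense(g, v) for g, v in grads_and_vars]
    for op in update_ops:
        if op is None:
            raise TypeError(
                "Can not convert a NoneType into a Tensor or Operation.")
    return update_ops
```

With BuggyOptimizer the collection step raises the same TypeError reported in the traceback below; with FixedOptimizer it succeeds.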
Added in 0.8.0:
from keras_radam.training import RAdamOptimizer
RAdamOptimizer(learning_rate=1e-3)
Thanks for your change. However, I still get errors when trying to use it with the minimize method:
TypeError Traceback (most recent call last)
in <module>()
61 #optimizer = tf.train.AdamOptimizer(1e-3)
62 #optimizer = tf.contrib.estimator.clip_gradients_by_norm(optimizer,clip_norm=5.0)
---> 63 training_op = optimizer.minimize(loss)
~/ML_tensorflow/lib/python3.6/site-packages/tensorflow/python/training/optimizer.py in minimize(self, loss, global_step, var_list, gate_gradients, aggregation_method, colocate_gradients_with_ops, name, grad_loss)
411
412 return self.apply_gradients(grads_and_vars, global_step=global_step,
--> 413 name=name)
414
415 def compute_gradients(self, loss, var_list=None,
~/ML_tensorflow/lib/python3.6/site-packages/tensorflow/python/training/optimizer.py in apply_gradients(self, grads_and_vars, global_step, name)
614 update_ops.append(processor.update_op(self, grad))
615 if global_step is None:
--> 616 apply_updates = self._finish(update_ops, name)
617 else:
618 with ops.control_dependencies([self._finish(update_ops, "update")]):
~/ML_tensorflow/lib/python3.6/site-packages/keras_radam/training.py in _finish(self, update_ops, name_scope)
254
255 def _finish(self, update_ops, name_scope):
--> 256 with ops.control_dependencies(update_ops):
257 step, beta1_power, beta2_power = self._get_beta_accumulators()
258 with ops.colocate_with(beta1_power):
~/ML_tensorflow/lib/python3.6/site-packages/tensorflow/python/framework/ops.py in control_dependencies(control_inputs)
5424 return NullContextmanager()
5425 else:
-> 5426 return get_default_graph().control_dependencies(control_inputs)
5427
5428
~/ML_tensorflow/lib/python3.6/site-packages/tensorflow/python/framework/ops.py in control_dependencies(self, control_inputs)
4865 (hasattr(c, "_handle") and hasattr(c, "op"))):
4866 c = c.op
-> 4867 c = self.as_graph_element(c)
4868 if isinstance(c, Tensor):
4869 c = c.op
~/ML_tensorflow/lib/python3.6/site-packages/tensorflow/python/framework/ops.py in as_graph_element(self, obj, allow_tensor, allow_operation)
3794
3795 with self._lock:
-> 3796 return self._as_graph_element_locked(obj, allow_tensor, allow_operation)
3797
3798 def _as_graph_element_locked(self, obj, allow_tensor, allow_operation):
~/ML_tensorflow/lib/python3.6/site-packages/tensorflow/python/framework/ops.py in _as_graph_element_locked(self, obj, allow_tensor, allow_operation)
3883 # We give up!
3884 raise TypeError("Can not convert a %s into a %s." %
-> 3885 (type(obj).__name__, types_str))
3886
3887 def get_operations(self):
TypeError: Can not convert a NoneType into a Tensor or Operation.
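Reading the frames bottom-up explains the error: one of the collected update_ops is None (the missing return), and TensorFlow's control_dependencies tries to convert every entry into a graph element via as_graph_element. A rough, simplified mimic of that conversion step (not the actual TensorFlow source; FakeOp is a hypothetical stand-in):

```python
# Rough mimic of Graph.as_graph_element's type check -- not the actual
# TensorFlow source. Anything that is not a graph element (including the
# None left behind by a method that forgot its `return`) is rejected with
# the TypeError seen at the bottom of the traceback above.

class FakeOp:
    """Hypothetical stand-in for tf.Operation."""
    def __init__(self, name):
        self.name = name


def as_graph_element(obj):
    if isinstance(obj, FakeOp):
        return obj
    raise TypeError("Can not convert a %s into a %s."
                    % (type(obj).__name__, "Tensor or Operation"))


def control_dependencies(control_inputs):
    # Every control input must resolve to a graph element; a stray None
    # in the update_ops list fails exactly here.
    return [as_graph_element(c) for c in control_inputs]
```

Calling control_dependencies([FakeOp("radam/update"), None]) reproduces the exact message "Can not convert a NoneType into a Tensor or Operation." Upgrading to a keras-radam release with the return restored (0.9.0, per the first comment) removes the stray None.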
Related Issues (20)
- not compatible with tf 1.8, AttributeError: 'RAdamOptimizer' object has no attribute '_call_if_callable' HOT 6
- The keras backend support HOT 4
- ModuleNotFoundError: No module named 'tensorflow.python.keras.optimizer_v2' HOT 1
- epsilon not compatible with Adam HOT 1
- Could not interpret optimizer identifier HOT 3
- Issue about the dtype HOT 3
- Ranger Optimizer extension HOT 2
- amsgrad parameter HOT 3
- Maintenance HOT 1
- Generate First Release HOT 1
- TF.keras
- Very slow implementation HOT 1
- TypeError: Unexpected keyword argument passed to optimizer: lr HOT 1
- AttributeError: 'TFOptimizer' object has no attribute 'learning_rate' HOT 7
- Any other parameter needed when warmup? HOT 1
- Warmup causes NAN HOT 1
- ValueError: An operation has `None` for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval. HOT 1
- How can i use radam in tf1.4? HOT 1
- No module name math_ops HOT 2
- ModuleNotFoundError: No module named 'keras.legacy' HOT 1