Comments (17)
> Feel free to reopen inquiries about specific components.

If you close this one because it is general, then I will need to post to every specific ticket or PR that is going to create any sort of potential duplication with Addons, like #289.
I honestly avoided doing this on every single ticket in these months because I was thinking that we could sit down and calmly assess the situation, as we have already "duplicated" or superseded many components here since February.
But we have not collected any specific position after 2 months, and it's a bit of a waste of resources.
from keras-cv.
/cc @seanpmorgan @yarri-oss @theadactyl
Sure, I will do a pass over the components and see what we are interested in taking.
We also need to evaluate whether we take ownership of the `tf.image`
namespace and handle its deprecation in TensorFlow:
https://www.tensorflow.org/api_docs/python/tf/image
See more (/cc @qlzh727 ):
#122 (comment)
Checked with @fchollet offline. It seems that we don't have an active owner for the `tf.image` API, but at the same time, we can't easily remove the TF API due to the backward-compatibility contract. Deprecating certain duplicated `tf.image` APIs could be an approach, until keras-cv has enough API/feature coverage.
Yes I think any public TF API removal requires a deprecation window:
https://github.com/tensorflow/community/blob/master/governance/api-reviews.md
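As a concrete illustration of what a deprecation window implies: the old symbol keeps working but emits a warning pointing at its replacement. This is a minimal, stdlib-only sketch of such a shim (TensorFlow uses its own internal deprecation decorators; the replacement name below is only an assumed example):

```python
import functools
import warnings


def deprecated(replacement: str):
    """Mark a function as deprecated, pointing users at a replacement.

    Illustrative sketch only: TensorFlow's real deprecation machinery
    lives in internal utilities, not in a public decorator like this.
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(
                f"{func.__name__} is deprecated; use {replacement} instead.",
                DeprecationWarning,
                stacklevel=2,
            )
            return func(*args, **kwargs)
        return wrapper
    return decorator


# Hypothetical replacement target; verify the actual KerasCV symbol.
@deprecated("keras_cv.layers.Grayscale")
def rgb_to_grayscale(pixel):
    # Toy stand-in for an image op: luma-weighted average of an RGB triple.
    r, g, b = pixel
    return 0.299 * r + 0.587 * g + 0.114 * b
```

During the window, callers see a `DeprecationWarning` but their code keeps running, which is exactly the backward-compatibility contract discussed above.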
Is there any reason to leave this open @bhack? I'd like to begin triaging issues and closing more thoroughly. Feel free to reopen inquiries about specific components.
Okay, we can reopen this then and discuss.
@bhack for organizational purposes, can we keep discussions regarding duplication to this issue? I don't want to confuse users and have them close issues that they should not be closing. Re: the focal loss discussion here:
@sebastian-sz Honestly, the scope of #74 was to do this in an orderly manner, but it seems that, in the end, this kind of thing recurrently never gets managed in a clean way.
What do you have in mind regarding a clean and orderly way to manage migration? As it stands now, we are opening PRs here, holding API design reviews, and pulling ideas from various reference implementations. To me, this is organized, orderly, and most importantly delivers the best end user API.
Please raise any issues with this process here.
I have a quite clear view on how to do this, but I've reached the limit of my OSS/GitHub visibility.
I personally hoped not to make TF-Addons slowly obsolete, PR after PR, in this repository, out of respect for the many years invested by its community of volunteers.
I opened this issue so we could sit down and identify how many TF-Addons components we want to replicate/refactor here, and eventually put TF-Addons into a sort of maintenance mode; but with a day-by-day (or PR-by-PR) approach we are not able to do this.
I hope you can follow up on this internally with the other TF members I've already mentioned in this ticket.
Thanks
Hi @bhack - I've done some reading and discussion, and have the following response to the issue about duplication of components.
The role of TF-Addons doesn't change -- a community-contributed set of extensions to TF and Keras APIs that are too niche to be included in core TF or core Keras (including KerasCV and KerasNLP).
It has always been the case that TF-Addons components are expected to potentially migrate to core if there's interest. For instance, that's what we did with the AdamW optimizer recently (although we reimplemented it rather than reusing the TF-Addons version). KerasCV and KerasNLP are basically the same as core Keras, just domain-specific, so the same expectations apply. We will migrate or reimplement TF-Addons components there if there's a need.
However this will always be limited to a minority of TF-Addons components, and it does not affect TF-Addons' purpose and positioning. There is no risk of TF-Addons being "taken over" or "made irrelevant". There will always be a need for a repository of extensions that are too niche for the core API.
Rest assured, TF-Addons is still VERY relevant and important! KerasCV and KerasNLP are not a replacement for TF-Addons. They are not addons for TF; they are the same as the core Keras API, just more domain specific.
Finally, it is normal and expected that TF-Addons components will end up in the core APIs. This should be considered a success for the contributor, as their component is highly impactful and this means there is a clear need for the component!
Please let me know if there are any more questions. I want to make sure to effectively communicate with all parties involved.
> The role of TF-Addons doesn't change -- a community-contributed set of extensions to TF and Keras APIs that are too niche to be included in core TF or core Keras (including KerasCV and KerasNLP).
This has historically been very confusing, and over the years we have plenty of examples of components being duplicated, in some cases just a few weeks after the component was reviewed and contributed to Addons.
Also, we have almost the same paper-citation threshold in Keras-CV, Keras-NLP, and TF Addons, so I have strong doubts about the "too niche" claim, as I don't think many people are interested in maintaining "too niche" contributions. For those, it is better to maintain the components in your own repository, since code ownership doesn't scale.
> that's what we did with the AdamW optimizer recently (although we reimplemented it rather than reusing the TF-Addons version).
I needed to personally route the contributor from TF Addons to Keras for this (see tensorflow/addons#2681 (comment)).
> However this will always be limited to a minority of TF-Addons components, and it does not affect TF-Addons' purpose and positioning.
I don't have the time now to compile again the list of components that I would need to subtract from TF Addons and that are already here in master, but it is certainly already quite long. Probably with a slightly different API or with other implementation details, but that wasn't the point here.
> Finally, it is normal and expected that TF-Addons components will end up in the core APIs. This should be considered a success for the contributor, as their component is highly impactful and this means there is a clear need for the component!
Almost all the cases are not about porting but about duplication, as often happens in the TF ecosystem. See the history of a process that has never worked:
Hi Stefano,
Our key point here is that TF Addons serves a valuable purpose in the ecosystem, and the purpose of KerasCV is not to be a replacement for TF Addons' CV tooling. We hope that folks will contribute to both TF Addons and KerasCV. We value all contributions.
It is inevitable that there will be some duplication of functionality. That's fine, and we can live with it. In any large ecosystem, there will always be some level of duplication.
If a component gets adopted by core APIs, that doesn't mean the TF Addons version stops being useful or should be removed. For instance, there are many workflows today that use `AdamW` from TF Addons. It will probably take ~2 years for most users to transition to the newly introduced core version.
It just means that we think there should be a core version of the component. And that judgment call is not affected by whether something is already in TF Addons or not.
For instance, we added `AdamW` to Keras because we found that lots of people were using it (e.g. on keras.io). Critically, it is the availability of `AdamW` in TF Addons that helped point out the need. And further, when we added `AdamW`, we learned from the API and implementation details of the TF Addons version. Overall, TF Addons played an important role, and our end state is one that is nicer for every Keras user (just import `keras.optimizers.AdamW`).
> Almost all the cases are not about porting but about duplication, as often happens in the TF ecosystem. See the history of a process that has never worked:
There is no functional difference between migrating a component and reimplementing it. What matters is the role the component plays in a workflow and how people use it, not the specific code used to implement it. The point of "staging" a component in a package that has low backwards-compatibility requirements is not to try out the code, it's to try out the API, and to hedge for whether the component will end up being broadly useful or not. The point is to learn something, which we do, thanks to TF Addons. The fact that we don't migrate components via copy/pasting is not important.
I honestly don't think that cherry-picking arguments to support a thesis that doesn't match what happens every day in the TF/Keras repositories and GitHub orgs will help us solve real problems that nobody wants to acknowledge.
I think we are not aligned, as we have two different points of view:
competition between internal teams is natural, and I presume also encouraged; but from the external point of view, as a community, we would like to cooperate within the ecosystem, and we have no interest in competing inside the same ecosystem. The TF ecosystem does not have such a large pool of external contributors that it can afford redundancies.
I am closing this ticket because, given the approach to this discussion, I believe everyone should think about the fate of their own project independently.
Honestly, I have no specific criticism of Keras-CV or Keras-NLP, provided these projects will really be community driven and will improve the alignment between the internal team members and contributors, including in code ownership. I have already collaborated a bit in Keras-CV, but I will not personally contribute individual components until the TF-Addons status is clear.
@seanpmorgan the current TFAddons SIG lead will evaluate what to do in coordination with the TF SIGs program managers.
P.S. I don't know why you focus on copy-pasting components. My point over these years was never about copy-pasting components, but about the API and contributor routing.
Also, my point was always about not wasting the very limited community resources and not confusing contributors.
> There is no functional difference between migrating a component and reimplementing it.
Does this issue need to be reopened now that TFA is deprecated? tensorflow/addons#2807
Is there a systematic mapping from `tfa.image` to KerasCV functions (including which ones will not be included or are in other libraries)?
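As far as I know there is no official one-to-one mapping, but a community-maintained migration table could be as simple as a dictionary keyed by the old symbol. The replacement names below are illustrative, unverified assumptions and should be checked against the current KerasCV/Keras docs:

```python
# Illustrative (unverified) migration table from tfa.image symbols to
# suggested replacements; None marks ops with no known direct equivalent.
TFA_IMAGE_MIGRATION = {
    "tfa.image.cutout": "keras_cv.layers.RandomCutout",    # assumed name
    "tfa.image.equalize": "keras_cv.layers.Equalization",  # assumed name
    "tfa.image.rotate": "keras.layers.RandomRotation",     # assumed: core Keras
    "tfa.image.dense_image_warp": None,                    # no known equivalent
}


def suggest_replacement(symbol: str) -> str:
    """Return a human-readable migration hint for a tfa.image symbol."""
    if symbol not in TFA_IMAGE_MIGRATION:
        return f"{symbol}: not tracked in this table"
    target = TFA_IMAGE_MIGRATION[symbol]
    if target is None:
        return f"{symbol}: no direct replacement known"
    return f"{symbol}: use {target}"
```

A table like this could live in a migration guide and be extended as gaps are identified, which is essentially what the question above is asking for.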