Comments (9)
Personally, I'm less worried about something like codspeed.io going away. It might be helpful, but it's not critical: if it disappears, we're just back where we are now, without regular benchmarks. That's different from e.g. GitHub itself; while we could move to Bitbucket or GitLab, that would be a lot more disruptive. So the question is how much effort it would take to set up. If someone can set it up in 4 h and the service later starts charging, we only lose 4 h of work. If it's a lot more effort to set up and maintain, then we should be more careful in vetting.
from astropy-benchmarks.
https://news.ycombinator.com/item?id=36682012
https://blog.pydantic.dev/blog/2022/07/10/pydantic-v2-plan/#performance
CodSpeed was one of the tools used by Pydantic v2 to achieve their 17x speedup.
from astropy-benchmarks.
> you lost me at "free for open source"
Like many of GH's tools? Or RTD? Or Pre-commit.ci? 😁 I think most of our CI is in the category "free for open source".
from astropy-benchmarks.
> you lost me at "free for open source"
> Like many of GH's tools? Or RTD? Or Pre-commit.ci? 😁 I think most of our CI is in the category "free for open source".
Yes, but also like Travis CI 😜
I see @Cadair's point; maybe I'm not quite so negative about it, but "free for open source" is not really something that screams trustworthiness these days.
from astropy-benchmarks.
I'm just not sure where we would go from there. We should definitely vet our tools. But if we fundamentally don't trust anything in industry, then we shouldn't use GH Actions under the free-for-open-source tier (we do), RTD under the free-for-open-source tier, Pre-commit under the same, encourage new users to use GH virtual development environments, CircleCI, etc. Our actual inert code is one of the few things that isn't on a free-for-open-source tier. And CodSpeed is kind of like Codecov, in that it is a user-friendly layer on top of open-source tooling like pytest-benchmark.
Now, maybe we vet it and don't like it, but that hasn't happened yet.
from astropy-benchmarks.
We actually pay for Read the Docs because we think they need the money, plus we wanted something from them, but I now forget what. Details are here: astropy/astropy-project#105
from astropy-benchmarks.
https://docs.codspeed.io/#how-long-does-it-take-to-install
If you're already benchmarking your codebase, you can plug your existing benchmarks into CodSpeed in less than 5 minutes, since CodSpeed's benchmark API is compatible with the most popular benchmarking frameworks (pytest-benchmark, bencher, criterion.rs, vitest, tinybench, benchmark.js).
from astropy-benchmarks.
You lost me at "free for open source". I will generally push back against anything which isn't actually open source, especially if it's a hosted service at the whims of VC (I am assuming).
from astropy-benchmarks.
I don't think this replaces the part where we run the benchmark for every commit on main, etc.? It looks like it only does PR continuous integration. I would like a more technical blog post on the pros and cons, which projects have used this service to date, and what lessons were learned.
from astropy-benchmarks.
Related Issues (20)
- Move benchmark codes to astropy core lib and only keep results here HOT 5
- MNT: Rename default branch from master to main HOT 1
- Refactor benchmarking process to run relative benchmark for PR HOT 2
- STScI site with updated benchmarks HOT 1
- List all the packages and their versions in benchmark log
- How to see which code is covered or not by benchmarks? HOT 3
- How we do benchmarking HOT 11
- Profile/document the details of the performance enhancement from astropy#7324
- Modeling benchmarks failing HOT 4
- Add benchmarks for FITS Header parsing
- Adding coordinates benchmarks: search_around_sky and separation
- Do we need oneesk benchmark with Numpy 1.16? HOT 1
- Benchmarks stopped running? HOT 1
- Some table memory benchmarks failing HOT 1
- Add nddata benchmarks
- Add a benchmark for io.ascii with wide files that have thousands of columns
- bug in benchmark - but would fix break history? HOT 5
- Add time benchmarks
- Add benchmarks for table creation from lists