softdevteam / lua_benchmarking
Lua benchmark suite
License: Other
I think we need to provide a simple way for people who don't have Krun to run benchmarks quickly, using whatever Lua VM they have knocking around, to encourage people to contribute to this repo. At its most basic it should just be something like:
$ lua quick_run.lua
Running bench1: .........................
Mean: 2.2s +/- 0.1s
Running bench2: .........................
Mean: 0.8s +/- 0.05s
What we need to do is get a reasonable trade-off between ease of use and not being completely statistically misleading. My suggestion is that we run each benchmark for 30 in-process iterations (printing a "." to stdout as each is run, so the user knows something is happening), and afterwards print a mean and a 95% confidence interval. quick_run.lua could take an optional integer parameter which varies the number of in-process iterations (so quick_run.lua 10 would run 10 in-process iterations).
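A minimal sketch of what quick_run.lua could look like, assuming each benchmark exposes a run_iter() entry point and lives under a benchmarks/ directory; both of those names are assumptions, not the suite's actual layout:

```lua
local benchmarks = { "bench1", "bench2" }       -- assumed benchmark list
local iters = tonumber(arg and arg[1]) or 30    -- optional in-process iteration count

for _, name in ipairs(benchmarks) do
    io.write("Running ", name, ": ")
    -- Assumed layout: each benchmark file returns a table with run_iter().
    local bench = dofile("benchmarks/" .. name .. "/" .. name .. ".lua")
    local times = {}
    for i = 1, iters do
        local start = os.clock()
        bench.run_iter(i)
        times[i] = os.clock() - start
        io.write(".")
        io.flush()
    end
    io.write("\n")

    -- Mean and 95% confidence interval via the normal approximation.
    local sum = 0
    for _, t in ipairs(times) do sum = sum + t end
    local mean = sum / iters
    local sq = 0
    for _, t in ipairs(times) do sq = sq + (t - mean) ^ 2 end
    local sd = math.sqrt(sq / (iters - 1))
    local ci = 1.96 * sd / math.sqrt(iters)
    print(string.format("Mean: %.2fs +/- %.2fs", mean, ci))
end
```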
We would of course need to put a warning in the README along the lines of "this isn't statistically reliable, but it is better than nothing." I can handle that part if you want me to.
Some benchmarks might need to build external libraries or do other work in order to run. We should allow benchmarks to have a build.sh script in their directory which is automatically run (the first time the benchmark is run? or when the top-level build is invoked?) to set things up. We may want to pass a specific work dir to this script so that benchmarks can avoid polluting global directories?
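A hedged sketch of how a runner might invoke such a per-benchmark build.sh, passing it a work dir so it does not pollute global directories; the marker file and directory layout are assumptions:

```lua
local function maybe_build(bench_dir, work_dir)
    local script = bench_dir .. "/build.sh"
    local f = io.open(script, "r")
    if f == nil then return end                    -- no build step for this benchmark
    f:close()

    local marker = work_dir .. "/.built"
    local done = io.open(marker, "r")
    if done ~= nil then done:close(); return end   -- already built on a previous run

    os.execute("mkdir -p '" .. work_dir .. "'")
    -- os.execute returns 0 on success in Lua 5.1 and true in 5.2+.
    local ok = os.execute("sh '" .. script .. "' '" .. work_dir .. "'")
    if ok == 0 or ok == true then
        local m = assert(io.open(marker, "w"))
        m:close()
    end
end
```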
simplerunner.lua currently accepts a number as a final argument. This is ambiguous: we can't, for example, have a benchmark called "30". This argument should be moved to a command-line switch -n. This will also make it easier to allow multiple benchmarks to be specified on the command line.
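A rough sketch of the proposed command-line handling: a -n switch for the iteration count, with every other argument treated as a benchmark name. The flag spelling comes from the text above; the default and error handling are assumptions.

```lua
local iters = 30
local benchmarks = {}
local i = 1
while i <= #arg do
    if arg[i] == "-n" then
        iters = assert(tonumber(arg[i + 1]), "-n expects an integer")
        i = i + 2
    else
        benchmarks[#benchmarks + 1] = arg[i]   -- everything else is a benchmark name
        i = i + 1
    end
end
-- e.g. `lua simplerunner.lua -n 10 bench1 bench2` runs both benchmarks for
-- 10 in-process iterations each, and a benchmark called "30" is no longer ambiguous.
```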
At the moment anything which a Lua benchmark wants to import ends up in lualibs. For modules which are shared between benchmarks, this is fine, but it's going to get very messy very quickly for anything which isn't shared. Let's move non-shared modules into the benchmark directories that use them, and have the benchmark update its global search path appropriately.
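A small sketch of a benchmark extending the module search path so that non-shared modules can live next to it rather than in lualibs; the directory and module names are illustrative only:

```lua
local bench_dir = "benchmarks/bench1"
-- Prepend the benchmark's own directory to the search path, then require
-- a non-shared helper module that now lives alongside the benchmark.
package.path = bench_dir .. "/?.lua;" .. package.path
local helper = require("helper")
```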