Comments (11)
JIT compilation and generating compilation output that can be reused in another session are fundamentally different. JIT compilation can avoid some indirection since it can assume the locations of procedures will not change. Whereas compilation that can be reloaded requires some indirection.
Julia's compilation modes can be affected by the options below, but they are not meant for the end user. The closest end-user implementation to what you want is pkgimages, which use the options below internally.
I'm unclear if you have tried pkgimages and PrecompileTools.jl and how they may or may not be applicable to your problem.
$ julia --help-hidden
julia [switches] -- [programfile] [args...]
Switches (a '*' marks the default value, if applicable):
--compile={yes*|no|all|min}
Enable or disable JIT compiler, or request exhaustive or minimal compilation
--output-o <name> Generate an object file (including system image data)
--output-ji <name> Generate a system image data file (.ji)
--strip-metadata Remove docstrings and source location info from system image
--strip-ir Remove IR (intermediate representation) of compiled functions
--output-unopt-bc <name> Generate unoptimized LLVM bitcode (.bc)
--output-bc <name> Generate LLVM bitcode (.bc)
--output-asm <name> Generate an assembly file (.s)
--output-incremental={yes|no*}
Generate an incremental output file (rather than complete)
--trace-compile={stderr,name}
Print precompile statements for methods compiled during execution or save to a path
--image-codegen Force generate code in imaging mode
--permalloc-pkgimg={yes|no*} Copy the data section of package images into memory
from julia.
Would be also amazing if those precompiled modules could be saved as shared libraries locally to the files/scripts/dev packages as Manifest.toml
currently does. So whenever someone starts a script or developing a package those precompiled modules would be picked up automatically.
from julia.
This is kind of already here, I think. It's possible to track all compilation, saving precompile calls to a file with --trace-compile=file_name. After that, just put the precompile statements into a package and use the package from your startup.jl. Perhaps some of this should be further automated?
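To make that workflow concrete, here is a hedged sketch. The package name MyStartup, the dependency SomePackage, and the file name traced.jl are all made up for illustration: first run a representative session with `julia --trace-compile=traced.jl yourscript.jl`, then wrap the resulting statements in a package you load from startup.jl:

```julia
# Hypothetical startup package; traced.jl is the file produced by
# `julia --trace-compile=traced.jl yourscript.jl` in an earlier session.
module MyStartup

using SomePackage  # placeholder: load whatever packages the traced statements refer to

# Running the precompile(...) statements while this package itself is being
# precompiled bakes the compiled methods into its cache (pkgimage), so later
# sessions that load MyStartup pick them up.
include("traced.jl")

end
```

The traced statements must only reference packages that are loaded above the include, otherwise they will fail to resolve during precompilation.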
from julia.
@nsajko
That sounds like a lot of work for a regular user, especially for those who are not computer scientists or have a poor understanding of how compilers work. Additionally, this setup might not work well with Revise.
Perhaps some of this should be further automated?
That's how I read the initial proposal, yes. It would be nice to automate all of this, making the precompile statements run behind the scenes and enabling the ability to dump and reload the precompiled modules between Julia sessions.
from julia.
This is kind of already here, I think. It's possible to track all compilation, saving precompile calls to a file with --trace-compile=file_name. After that just put the precompile statements into a package and use the package from your startup.jl. Perhaps some of this should be further automated?
Thanks, @nsajko, indeed I am currently following a similar, but even more complicated manual workflow (due to supporting both PackageCompiler and a "development build", and needing to investigate which precompiles were triggered by which module). I was thinking towards a better solution and this proposal is what I came up with from a user perspective. I would probably even set an alias on my system to have this parameter activated by default, because that's the behavior I want nearly all the time.
I also assume that we have most of the major building blocks already. As far as I know, only the "which module was responsible for this runtime precompile" part is missing a user interface, and this sounds like low-hanging fruit. I guess it's only a matter of putting the blocks together, but I do not know about the implementation details of these blocks. To me that sounds like the best of both worlds (classical AoT-compiled languages and interpreted/JIT/JAoT-compiled languages).
from julia.
Would be also amazing if those precompiled modules could be saved as shared libraries locally to the files/scripts/dev packages as Manifest.toml currently does. So whenever someone starts a script or developing a package those precompiled modules would be picked up automatically.
I agree that this would be cool when it works, but would it work so often? At least the native code would fail whenever someone has a different computer architecture / instruction set. Optimizations are also sometimes very CPU-specific (e.g. AVX-512). So this, although interesting, opens up a whole lot of additional questions. Therefore, I propose to have a separate issue for that question, if you want to follow up on this proposal.
from julia.
but would it work so often?
I think so. In my opinion, it would cover the vast majority of cases since most people use the same laptop or computer for months/years. For example, if I worked on experiments yesterday, I could restart the Julia session today with all the functions precompiled from yesterday. That's the use case for 99% of users. Currently, most of the cache is lost between sessions (although packages do precompile some stuff, the majority of session-specific compilations are wiped). What I was trying to propose is basically equivalent to not closing a Julia terminal session over night and keeping it alive for days. Would be nice to just dump all the available precompiled cache into a binary file and restart the Julia session with this binary file later on, such that it feels like you never actually closed your terminal.
You are correct that this wouldn't work on a different computer architecture. However, I believe your original proposal would face the same issue, as precompiled modules wouldn't work between different computer architectures anyway. I may have misunderstood your proposal, though.
from julia.
Would be nice to just dump all the available precompiled cache into a binary file and restart the Julia session with this binary file later on, such that it feels like you never actually closed your terminal.
The use case of the same computer should already be covered in my proposal. I thought you wanted to extend it, but I think we have the same use case in our mind.
from julia.
Relaying the thought from the related Discourse thread, have you considered using PrecompileTools.jl with a Startup package?
https://julialang.github.io/PrecompileTools.jl/stable/#Tutorial:-local-%22Startup%22-packages
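For reference, the Startup-package pattern from that tutorial looks roughly like this sketch. DataFrames is only a stand-in for whatever packages and calls you actually use:

```julia
# A local "Startup" package using PrecompileTools.jl; DataFrames is an
# illustrative dependency, not part of the original proposal.
module Startup

using PrecompileTools
using DataFrames

@compile_workload begin
    # Calls made here are compiled while Startup itself precompiles,
    # so their native code lands in Startup's pkgimage cache.
    df = DataFrame(a = 1:3, b = [0.1, 0.2, 0.3])
    sum(df.b)
end

end
```

Loading Startup from startup.jl then gives you the cached compilation in every session, at the cost of maintaining the workload by hand.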
from julia.
JIT compilation can avoid some indirection since it can assume the locations of procedures will not change. Whereas compilation that can be reloaded requires some indirection.
Thanks for the explanations. So if I understood correctly, then the feature request really is "if the flag is provided, generate relocatable JIT code which is then saved to the image file if it is not in there already".
The idea is to make use of pkgimages to basically achieve a similar, but faster and more comfortable effect than using PrecompileTools.jl.
I am not sure which of these command line arguments might support the described use case, so I tested them separately and commented them so that we can see whether my understanding is correct. Probably I need at least a combination of them, but I am unclear which one.
--compile={yes*|no|all|min} Seems to be orthogonal
--output-o <name> Results in "ERROR: File "boot.jl" not found"
--output-ji <name> Results in "ERROR: File "boot.jl" not found"
--strip-metadata Seems to be orthogonal
--strip-ir Seems to be orthogonal
--output-unopt-bc <name> Seems to be orthogonal
--output-bc <name> Seems to be orthogonal
--output-asm <name> Seems to be orthogonal
--output-incremental={yes|no*} This works, but I am not sure what it does without any other options; it does not seem to save runtime precompiles per se.
--trace-compile={stderr,name} This might be a building block, but as long as it does not output the calling module, I am not sure how much it will help.
--image-codegen I am not sure what it does without any other options, but it does not seem to save runtime precompiles per se.
--permalloc-pkgimg={yes|no*} This sounds unrelated
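For context on the --trace-compile building block mentioned above: it writes out plain precompile statements, one per compiled method signature, roughly of the following shape. The exact output depends on the Julia version and on what the session actually compiled; the signatures below are made-up examples:

```julia
# Illustrative contents of a file produced via --trace-compile; each line
# asks Julia to compile a concrete method signature ahead of use.
precompile(Tuple{typeof(Base.sum), Vector{Float64}})
precompile(Tuple{typeof(Base.sort!), Vector{Int64}})
```

Note that these statements record only the function and argument types, not which module triggered the compilation, which is the missing piece discussed in this thread.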
I have the feeling that I did not state very well what I want to have as a solution. I tried to improve the wording, but please give me hints on what would help you to help me. :-)
from julia.
Relaying the thought from the related Discourse thread, have you considered using PrecompileTools.jl with a Startup package?
https://julialang.github.io/PrecompileTools.jl/stable/#Tutorial:-local-%22Startup%22-packages
Thanks for the hint, I guess you are referring to this thread. Ideally I do not want to have to set up nor to maintain anything, as this should really support a development workflow where things change. If a function gets precompiled outside of a module precompilation run, it should just be saved by Julia for next time, without any function-specific configuration and without an additional run (as with a workload with PrecompileTools.jl).
from julia.