Comments (8)

tarsa commented on June 7, 2024

I've not done any substantial metaprogramming, but most probably (to keep things reasonable), if the changed literals or source-code positions are used in macros (or any other sort of metaprogramming), especially in conditions, then the compiler would need to fall back to the regular compilation process.
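
To make the concern concrete, here is a minimal Scala 3 sketch. It is purely illustrative and not from the issue: the names MetaSensitive, describe and logHere are invented, and it assumes Li Haoyi's sourcecode library as a dependency. The generated code of the inline method depends on a literal at its call site, and the logging helper captures the call-site line number, so a changed literal or a re-formatted file can change the compiled output rather than just an embedded constant.

    // Hypothetical illustration (not from the issue): code whose compiled output
    // depends on a literal argument and on the call-site source position.
    import sourcecode.Line // assumed dependency: "com.lihaoyi" %%% "sourcecode"

    object MetaSensitive {
      // The branch is chosen at compile time from the literal argument: changing
      // 42 to -42 at the call site changes the generated code, not just a constant.
      inline def describe(inline n: Int): String =
        inline if n > 0 then "positive" else "non-positive"

      // The line number is captured implicitly at the call site: re-formatting
      // the file shifts this value even though no "real" code changed.
      def logHere(msg: String)(using line: Line): String =
        s"[line ${line.value}] $msg"

      def main(args: Array[String]): Unit = {
        println(describe(42))
        println(logHere("hello"))
      }
    }

In cases like these, a fast path for "only literals or formatting changed" would have to detect that the literal or position actually feeds into metaprogramming and fall back to the regular compilation process, as described above.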

sjrd commented on June 7, 2024

Also copying @gzm0's answer from #1626 (comment):

Is it feasible to add special-case compiler (and linker) support for very quick processing of the changes mentioned in the initial posts, i.e. changes to literals and source code formatting?

This should be possible, yes. Essentially, what we could do (in Scala.js linker internal parlance) is re-use the previous Analysis if none of the Infos changed.

However, I'm not sure we can do this without incurring additional cost on linking, so we need to weigh this trade-off carefully.

If you feel we should investigate this, may I suggest you open another issue to discuss this (IMHO the scope is quite different)?
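
For illustration only, here is a rough sketch of the quoted idea with invented names (CachedRun, fingerprintOf and runFullAnalysis are not the actual linker internals): keep the previous Analysis together with a fingerprint of the Infos it was computed from, and reuse it only when the fingerprints still match.

    // Hypothetical sketch, not the Scala.js linker API: reuse the previous
    // analysis result when none of the per-class infos changed since the last run.
    final case class CachedRun[Info, Analysis](
        fingerprints: Map[String, Int], // className -> digest of its Info
        analysis: Analysis
    )

    def analyzeIncrementally[Info, Analysis](
        currentInfos: Map[String, Info],
        cached: Option[CachedRun[Info, Analysis]]
    )(fingerprintOf: Info => Int,
      runFullAnalysis: Map[String, Info] => Analysis): CachedRun[Info, Analysis] = {
      val fingerprints =
        currentInfos.map { case (name, info) => name -> fingerprintOf(info) }
      cached match {
        case Some(c) if c.fingerprints == fingerprints =>
          c // nothing changed: skip the reachability analysis entirely
        case _ =>
          CachedRun(fingerprints, runFullAnalysis(currentInfos)) // regular path
      }
    }

The trade-off raised above is exactly the cost of computing and comparing those fingerprints: it is paid on every run, but only pays off on runs where nothing changed.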

sjrd commented on June 7, 2024

TBH I don't think it makes sense to implement this. The crux of the matter is:

However, I'm not sure we can do this without incurring additional cost on linking, so we need to weigh this trade-off carefully.

Do we really want to spend additional time on every linker run so that we can save time on the (likely vanishingly) small number of runs where Infos don't change at all? What good are "impressive demos" if we make the actual use cases worse?

tarsa commented on June 7, 2024

Well, it depends on the overhead (in the non-optimizable case), then. If it's non-negligible but reasonably small, then perhaps a survey would help decide whether to keep it?

Personally, I don't have a strong opinion on this optimization. I don't feel I need it myself, but I thought it could make Scala.js more attractive to programmers, especially newcomers. If you think otherwise, then this ticket could be closed.

gzm0 commented on June 7, 2024

I've been thinking about this: I'll start experimenting with eagerly re-loading all previously loaded infos in a subsequent analyzer run. This is probably a good heuristic. If this doesn't hurt performance (for all we know, it might even improve it, since it flattens the parallelism graph), we might have a case for building this.

gzm0 commented on June 7, 2024

Seems like preloading hurts performance (even if nothing changed). I suspect lock contention.

IIUC, the only additional work we introduce with the change below (in a "nothing changed" incremental compile) is the outermost loop and some additional GC pressure.

(attachments: timing plot and logger-timings.csv)

--- a/linker/shared/src/main/scala/org/scalajs/linker/analyzer/Analyzer.scala
+++ b/linker/shared/src/main/scala/org/scalajs/linker/analyzer/Analyzer.scala
@@ -53,11 +53,18 @@ final class Analyzer(config: CommonPhaseConfig, initial: Boolean,
     )
   }
 
+  private var previousAnalysis: Analysis = _
+
   def computeReachability(moduleInitializers: Seq[ModuleInitializer],
       symbolRequirements: SymbolRequirement, logger: Logger)(implicit ec: ExecutionContext): Future[Analysis] = {
 
     infoLoader.update(logger)
 
+    if (previousAnalysis != null) {
+      for (className <- previousAnalysis.classInfos.keys)
+        infoLoader.loadInfo(className)
+    }
+
     val run = new AnalyzerRun(config, initial, infoLoader)(
         adjustExecutionContextForParallelism(ec, config.parallel))
 
@@ -67,6 +74,10 @@ final class Analyzer(config: CommonPhaseConfig, initial: Boolean,
         if (failOnError && run.errors.nonEmpty)
           reportErrors(run.errors, logger)
 
+        // TODO: Make it depend on the real future.
+        // Don't store if it has errors.
+        previousAnalysis = run
+
         run
       }
       .andThen { case _ => infoLoader.cleanAfterRun() }

gzm0 commented on June 7, 2024

> d %>%
  filter(grepl('Compute reachability', op)) %>%
  group_by(variant, op) %>%
  summarise(t_ns = median(t_ns))
`summarise()` has grouped output by 'variant'. You can override using the `.groups` argument.
# A tibble: 4 × 3
# Groups:   variant [2]
  variant op                                  t_ns
  <fct>   <fct>                              <dbl>
1 main    Linker: Compute reachability  148486462.
2 main    Refiner: Compute reachability 139398791 
3 preload Linker: Compute reachability  181934764 
4 preload Refiner: Compute reachability 150051568 

Seems we're losing about 30 ms on the first pass (≈181.9 ms vs ≈148.5 ms for the linker's Compute reachability) and about 10 ms on the refiner (≈150.1 ms vs ≈139.4 ms).

gzm0 commented on June 7, 2024

I might have spoken too soon: parallelizing the pre-loading right away gives comparable performance:

    // Kick off loading of all infos seen in the previous run, in parallel.
    // (In this experiment the resulting Future is not used further.)
    if (previousAnalysis != null) {
      Future.traverse(previousAnalysis.classInfos) { case (className, prevInfo) =>
        infoLoader.loadInfo(className) match {
          case None      => Future.successful(false)    // class no longer exists
          case Some(fut) => fut.map(_ == prevInfo.data) // true if the info is unchanged
        }
      }
    }

(attachments: timing plot and logger-timings.csv)
