
mill's People

Contributors

adadima, ajrnz, alexarchambault, atty303, baccata, carlosedp, chikei, ckipp01, francisdb, ggrossetie, heksesang, idiosapps, jeantil, jkstrauss, joan38, jodersky, lefou, lihaoyi, lolgab, nafg, nightscape, nrktkt, nvander1, robby-phd, rockjam, rtimush, sake92, samvel1024, scala-steward, smarter


mill's Issues

Extract Target-selection logic out of Main and write some unit tests

https://github.com/lihaoyi/mill/blob/1382e27191e662da2be0ec904c6266c7f21bc68d/core/src/main/scala/mill/Main.scala#L25-L112
This is currently a bit messy, but the logic is complex enough (with enough error conditions) that it deserves some proper testing.

Cross-building with mill Core.cross[a].printIt works, but mill Core.cross[doesntExist].printIt fails with an ugly exception; it would be great if we could tighten up error handling and lock it down with unit tests as part of this effort.

Support long-lived worker processes

Currently, Mill builds are stateless: build results are persisted to disk, but all in-memory state lives on the call-stack of evaluate, and is discarded after every run.

There is a large class of build tooling that works better with long-lived worker processes. A Scala Zinc compile daemon that keeps the JVM hot & fast is one example, but there are countless others: JavaScript bundlers (e.g. Webpack), websocket servers for browser-reloading (e.g. Workbench), and many more.

We should come up with a way to support long-lived worker processes that can cover all these cases, keeping the workers around even as the Mill process repeatedly starts and exits.
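A minimal sketch of what such a worker store could look like, assuming entirely hypothetical names (WorkerCache, string keys such as a compiler version): workers are created lazily on first use and re-used across evaluations instead of being discarded with the call stack.

```scala
import scala.collection.mutable

// Hypothetical sketch: a cache of long-lived workers, keyed by an arbitrary
// string (e.g. the Scala version for a Zinc compile daemon). The worker is
// created on first request and handed back unchanged on every later request.
object WorkerCache {
  private val workers = mutable.Map.empty[String, AnyRef]

  def get[T <: AnyRef](key: String)(create: => T): T =
    workers.getOrElseUpdate(key, create).asInstanceOf[T]

  def close(): Unit = workers.clear()
}
```

A real version would also need to decide when workers are shut down and how they survive across separate Mill JVM invocations, which this in-memory sketch does not address.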

Parallel Task Evaluation

The current Evaluator is sequential: tasks are evaluated one by one in a linear topological order.

Because we have the dependency graph between tasks, it should be straightforward to implement a parallel evaluator that evaluates tasks N-ways parallel. This would not require any change to the user-code implementing tasks and targets.

Open questions include how such a parallel evaluator will fit into the existing codebase (we probably want to keep the sequential evaluator around as an option indefinitely) and how to make sense of the interleaved log-spam from the parallel-executing tasks (we could provide a T.ctx().println method that prefixes stdout with the name of the target being evaluated, and ask task authors to use that).
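One way such an evaluator could work is in parallel "waves": tasks whose inputs are all complete are evaluated together, then the next wave is computed, until the graph is exhausted. This is only a sketch under assumed names (evalParallel, deps, bodies); a real implementation would operate on Mill's Task graph rather than strings.

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration.Duration
import scala.concurrent.ExecutionContext.Implicits.global

// Hypothetical wave-based parallel evaluation. `deps` maps a task to its
// direct dependencies; `bodies` computes a task's result from its inputs.
def evalParallel(deps: Map[String, Seq[String]],
                 bodies: Map[String, Seq[Int] => Int]): Map[String, Int] = {
  var done = Map.empty[String, Int]
  var remaining = deps.keySet
  while (remaining.nonEmpty) {
    // all tasks whose dependencies have already been evaluated
    val ready = remaining.filter(t => deps(t).forall(done.contains)).toSeq
    require(ready.nonEmpty, "cycle detected in task graph")
    // evaluate the whole wave in parallel
    val results = Future.traverse(ready) { t =>
      Future(t -> bodies(t)(deps(t).map(done)))
    }
    done ++= Await.result(results, Duration.Inf)
    remaining --= ready
  }
  done
}
```

Wave scheduling is simpler than a fully work-stealing scheduler but can leave cores idle between waves; either way, the user-facing task code is untouched.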

Scala.js testing framework support

Tests for ScalaJSModule should be executed with a JavaScript engine. We probably don't need to support all the JavaScript engines available in SBT, but having DOM support looks like a necessity.

Write a minimal unit test for scalaplugin/

#2 will be an integration test for the module, but we really should have a truly "hello world" unit test that exercises its functionality in as little code/complexity as possible.

Once we have unit tests, we can start adding more functionality to the scalaplugin/ package, which we currently cannot do since testing it is manual and annoying: e.g. TestModule helpers which come with test pre-defined, or SbtModule helpers which come with main+test in one bundle.

Enable just-in-time compilation of compiler-bridge

The Zinc incremental compiler requires versions of org.scala-sbt::compiler-bridge compiled for exactly the scala-compiler version you are using. Currently we pre-compile a few versions of compiler-bridge in our own build and use those, but we don't pre-compile all the publicly released versions (because it would take a very long time...) and we can't have pre-compiled versions for snapshots or other odd versions of Scala that may be floating around (Typelevel Scala, release candidates, milestones, ...)

The correct thing to do is to download and compile compiler-bridge on-demand as part of a build, and cache/re-use it thereafter.
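The caching half of that is the easy part, and can be sketched with hypothetical names (bridgeJar, and a compile callback standing in for the actual fetch-and-compile step): look for the compiled jar in a cache directory keyed by the Scala version, and run the expensive compile only on a miss.

```scala
import java.nio.file.{Files, Path}

// Hypothetical sketch of compile-bridge-on-demand: the jar is compiled at
// most once per Scala version and cached on disk for every later build.
def bridgeJar(cacheDir: Path, scalaVersion: String)(compile: Path => Unit): Path = {
  val jar = cacheDir.resolve(s"compiler-bridge-$scalaVersion.jar")
  if (!Files.exists(jar)) {
    Files.createDirectories(cacheDir)
    compile(jar) // expensive: fetch bridge sources, compile against scalaVersion
  }
  jar
}
```

The hard part this sketch elides is the compile callback itself: resolving the bridge source jar from Maven Central and invoking scalac on it with the right classpath.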

ScalaTest tests do not appear to be running

sbt "~integration/testOnly -- mill.integration.JawnTests.scala2123"
Running jawn[2.12.3].Parser.test.test
CharBuilderSpec:
ChannelSpec:
JNumIndexCheck:
SyntaxCheck:

We should figure out why and get them running.

Get rid of uPickle

Ammonite uses it, and we use it too for convenience, but it's basically unmaintained and should go away.

I tried using play-json for a while but that added a non-trivial amount of startup time (200-300ms?) just from classloading Jackson, which for some reason is a lot of classfiles.

The correct thing to do is probably to strip out all the gnarly recursive-implicit-derivation part of uPickle (which isn't needed for how we are using it here), keep the implicits and the non-implicit caseR/caseW/caseRW macro, and just use that instead.

We should also push this upstream so that Ammonite uses the same stripped-down uPickle.

Figure out some formal logging policy

Our logging right now is a bit of a mess; we need to answer:

  • How do we remove the verbosity of things like Running Core.test.upstreamCompileOutput, which hardly ever do any work?

  • What goes to stdout and what goes to stderr?

  • Where does --show fit into this, since a user calling --show is likely to want to capture the result as a clean JSON blob without all the misc log spam?

Ammonite's policy is that "misc" low-priority output (compiling files, blah) goes to stderr, leaving stdout for things that a user explicitly prints, so users can foo.sc > out.txt and be sure out.txt doesn't contain any random trash.
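That stream split can be sketched in a few lines; BuildLogger and its method names are hypothetical, but the routing is the policy described above: progress chatter to stderr, explicit results (e.g. --show output) to stdout, so a shell redirect captures only the clean result.

```scala
// Hypothetical sketch of the stdout/stderr logging split: low-priority
// progress messages go to stderr, results the user asked for go to stdout.
object BuildLogger {
  def info(msg: String): Unit = Console.err.println(s"[info] $msg")
  def result(msg: String): Unit = Console.out.println(msg)
}
```

With this split, `mill --show foo > out.json` would leave out.json free of log spam, because everything except the result travels on stderr.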

Investigate Mill vs SBT performance

My own subjective feeling is that Mill is slower at the watch-edit-recompile-test workflow than SBT is. Previously this was largely due to Mill not keeping around warm copies of Scalac, but that should have been fixed already. Another point is that Mill doesn't currently run independent tasks in parallel, while SBT does.

This task is to investigate the relative performance of Mill vs SBT at the common watch-edit-recompile-test workflow, to understand what the performance differences are and exactly what is causing them. There's no fundamental reason we should be any slower, and understanding our performance is the first step in making sure we're up to par.

Create a "release" artifact that is configured to pull necessary jars from Ivy/Maven

Currently, Mill depends on a number of compiler-bridge jars which are created as part of the Mill build. Both sbt scalaplugin/test and the executable generated by sbt scalaplugin/test:assembly reference these jars, by hard-coding their absolute paths on disk. Thus the executable built by sbt scalaplugin/test:assembly would only work on the same computer you ran the SBT command on. The number of locally-built jars that Mill needs to reference programmatically is likely to grow over time (e.g. TestRunner will likely move into its own jar, along with the ScalaModule zinc integration).

Obviously, this will not do if we want to make Mill usable by anyone on the internet. What we need is a "release" build of the Mill executable that will resolve the necessary compiler-bridge (and other) jars from Maven Central (or from your on-disk publishLocal maven cache, for testing) instead of hard-coding absolute paths to recently-built jars on disk.

Idea task should not depend on compilation success

It was a common problem in the days of sbt-idea, and now we have the same issue in Mill: the idea task depends on the compilation results, so if my code is broken I cannot import the project into IDEA to fix it.

Watch mode doesn't handle new files

Currently, watch mode (mill --watch proj.compile) only reacts to changes in files that existed at startup.
At the very least it should react to file creation. I'm not sure about removal; maybe there are some inotify limitations (compared to sbt, which reacts to new files but ignores removals).

Build Gitbucket using Mill

https://github.com/gitbucket/gitbucket seems like a nice project to shoot for: an open source application (rather than a library), but a relatively simple one with a relatively simple build (vs. something like https://github.com/ornicar/lila, which has a bajillion submodules), that uses some SBT plugins but not too many.

We should write a Mill build for Gitbucket, both to stretch Mill's capabilities (and discover any missing features that we need to support) and to serve as an example build for the silent majority of Scala programmers building applications to do real work, rather than playing with open-source libraries.

Support for downloading & caching a binary as part of a build

Necessary to fully support the better-files build

Generally useful for all sorts of other things too.

We probably can just ignore the problem of expiration: once the file is downloaded, we keep it forever unless someone explicitly deletes it.

This is basically already the behavior of Targets e.g. def foo = T{...}, so we can just define a helper method:

def downloadedJar = T{ Util.download("https://github.com/williamfiset/FastJavaIO/releases/download/v1.0/fastjavaio.jar") }

Where Util.download is a Task that downloads the file to the T.ctx().dest path and returns a PathRef that we can feed into downstream Targets, e.g. ScalaModule#depClasspath.
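A minimal sketch of what such a download helper might do, using plain java.nio and a hypothetical signature (the real helper would take the task's T.ctx().dest and wrap the result in a PathRef):

```scala
import java.net.URL
import java.nio.file.{Files, Path, StandardCopyOption}

// Hypothetical sketch of Util.download: stream the URL into the task's dest
// directory and return the downloaded path. Because Targets never re-run
// unless their inputs change, the file is effectively cached forever.
def download(url: String, dest: Path): Path = {
  Files.createDirectories(dest)
  val target = dest.resolve(url.split('/').last)
  val in = new URL(url).openStream()
  try Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING)
  finally in.close()
  target
}
```

Note there is deliberately no expiration logic here, matching the "keep it forever unless someone explicitly deletes it" behavior described above.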

Allow other kinds of input tasks which recompute every time

Currently, of all Tasks, only T.source has a non-constant sideHash that forces it to re-evaluate whenever the contents of the filesystem path it represents changes. This is necessary because otherwise, a Task with no inputs will never get re-evaluated.

We should generalize this to allow other non-filesystem-related Tasks to force re-evaluation every time: use cases include Tasks that read sys.env("JAVA_HOME"), or that shell out to git to load the current commit hash. In these cases the operation is fast enough that I'm fine with never caching it.

We should provide a T.input task, used as such:

val currentGitHash = T.input{ %%("git", "rev-parse", "head").out }
val javaHome = T.input{ sys.env("JAVA_HOME") }

These T.inputs can be used in downstream Tasks and Targets. The body of the T.input will be re-evaluated every time, and if the result changes, it will bust the caches of all downstream Tasks and Targets and force them to re-evaluate too.
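The recompute-always / bust-downstream-on-change semantics can be sketched independently of Mill's actual Task types; Input and Cached here are hypothetical stand-ins:

```scala
// Hypothetical sketch of T.input semantics: the input body re-runs on every
// evaluation, and a downstream cached target recomputes only when the
// input's value actually changes.
class Input[T](body: () => T) { def read(): T = body() }

class Cached[I, O](input: Input[I])(compute: I => O) {
  private var last: Option[(I, O)] = None
  def evaluate(): O = {
    val i = input.read() // always re-evaluated, like a T.input body
    last match {
      case Some((prevIn, prevOut)) if prevIn == i => prevOut // cache hit
      case _ => val o = compute(i); last = Some((i, o)); o   // cache busted
    }
  }
}
```

In Mill proper the comparison would go through the task's sideHash rather than the raw value, but the observable behavior is the same.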

Dependency resolution doesn't fail when an artifact is not found

When a dependency is not found (for example, the version doesn't exist, or we don't have the right resolver), dependency resolution doesn't fail, and we get an error at compile time with a message like:

[error] application/src/main/scala/Main.scala:12:13: object circe is not a member of package io
[error]   import io.circe._

Commands should be given a `dest` directory

Currently T.ctx().dest only works for labelled Targets, which are publicly accessible. Tasks (which don't have a label) and overridden Targets are not given dest directories, because it's not clear what path that dest directory would have.

Currently, Commands do not have a dest directory. Given that the Commands do have a unique path by which they are run, they too should have a dest path assigned to them for them to use for scratch & output files. This would let us avoid creating temp dirs as we do here https://github.com/lihaoyi/mill/blob/bb61c05217671e80ba381b2cac869130c306baa8/scalaplugin/src/main/scala/mill/scalaplugin/ScalaModule.scala#L20

Bust caches when Mill classpath changes

Currently we bust caches every time the inputs of a Target group change: only if the inputs change do we consider the output able to change, and thus the Target group needing to recompute.

This is insufficient: if the Target group's implementation itself changes, the output could change, and the Target should recompute. There is no way to account for this at a fine-grained level, but we can be conservative: we should incorporate the classpath-signature of the Mill build into our cache keys. If the classpath of the Mill build changes, we invalidate all caches.

We already have code in Ammonite (which we already depend on for the Scala script-running capability) that flushes Ammonite's script-compile caches when the classpath changes. We can re-use the same code to flush Mill's caches.

This will flush the caches every time someone changes the build, even when it's unnecessary. Nevertheless, it's probably the right thing to do, especially considering that builds change much more rarely than the code being built/run/tested, and stale caches are a pain in the neck.
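A coarse classpath signature along these lines could hash each entry's path and last-modified time; if the signature stored alongside the previous run differs, all caches are invalidated. The function names are hypothetical:

```scala
import java.nio.file.{Files, Path}

// Hypothetical sketch: a conservative classpath signature. Any change to an
// entry's path or mtime changes the signature, invalidating all caches.
def classpathSignature(entries: Seq[Path]): Int =
  entries
    .map(p => (p.toString, Files.getLastModifiedTime(p).toMillis).hashCode)
    .hashCode

def cachesValid(previousSignature: Int, entries: Seq[Path]): Boolean =
  classpathSignature(entries) == previousSignature
```

This is deliberately coarse: it cannot tell which Target implementations actually changed, matching the conservative flush-everything approach described above.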

Only one `Task` in a group should be allowed to use the `T.ctx().dest` directory

Tasks are grouped together under their downstream Target before running, and every Task in that group has the same T.ctx().dest which is defined by the path to the downstream Target. Currently there is nothing stopping multiple Tasks from reading/writing from dest, which could allow for unintended interference between them.

We should fail with an error if more than one Task attempts to use the dest dir, for any purpose. How to provide a nice error message, given that Tasks don't need to have names, is a bit of an open question: maybe the stack traces of the call sites that use .dest are good enough?
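The enforcement itself is simple to sketch (DestGuard is a hypothetical name): the first task to ask for the directory claims it and records its stack trace, and any second claimant fails with an error pointing at the first use site.

```scala
import java.nio.file.Path

// Hypothetical sketch: allow only one claimant of a group's dest directory,
// using captured stack traces as the error message since Tasks are unnamed.
class DestGuard(path: Path) {
  private var firstUse: Option[Array[StackTraceElement]] = None
  def dest: Path = synchronized {
    firstUse match {
      case Some(trace) =>
        throw new IllegalStateException(
          s"dest $path already claimed at: ${trace.headOption.getOrElse("?")}")
      case None =>
        firstUse = Some(Thread.currentThread.getStackTrace)
        path
    }
  }
}
```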

Tighten up Ammonite script integration

Currently, the integration with Ammonite is pretty hacky and fragile. e.g. compile errors in build.sc files are treated badly:

lihaoyi mill$ git diff
diff --git a/build.sc b/build.sc
index f58c160..28d5706 100755
--- a/build.sc
+++ b/build.sc
@@ -18,7 +18,7 @@ trait MillModule extends ScalaModule{ outer =>
     def testFramework = "mill.UTestFramework"
   }
 }
-
+?
 object Core extends MillModule {
   override def compileIvyDeps = Seq(
     Dep.Java("org.scala-lang", "scala-reflect", scalaVersion())
lihaoyi mill$ scalaplugin/target/mill run Core.test
Compiling /Users/lihaoyi/Dropbox/Workspace/mill/build.sc
build.sc:21: not found: value ?
val res_4 = ?
            ^
(Failure(Compilation Failed),ArrayBuffer((/Users/lihaoyi/.ammonite/predef.sc,0), (/Users/lihaoyi/Dropbox/Workspace/mill/build.sc,1512792521000)))
Exception in thread "main" scala.NotImplementedError: an implementation is missing
	at scala.Predef$.$qmark$qmark$qmark(Predef.scala:284)
	at mill.Main.run(Main.scala:286)
	at mill.Main$.main(Main.scala:227)
	at mill.Main.main(Main.scala)

Ammonite's own CLI script-running functionality is rock solid, even in failure cases. We should make sure Mill's is of similar quality: informative error messages, no uncaught stack traces, and well-behaved in all cases including things like watch-&-rerun.

Code-generate .zipMap functions to support Applicative macro

Currently, these are defined manually, up to a maximum arity of 7. That means a T{...} call can only depend on up to 7 other tasks: an arbitrary limitation.

We should use source-code-generation to generate the zipMap functions up to the "standard" arity of 22, which is the limit on how many args a function literal can have. In doing so, we'd also need to flesh out the source-code-generation story for ScalaModule, making sure it's easy to generate and add arbitrary sources to the build (which would require changes to compileScala).
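The generator itself is mechanical string-building; this sketch (hypothetical names generateZipMaps, ZipMaps, with TT standing in for the task type constructor) shows the shape of the emitted signatures:

```scala
// Hypothetical sketch of the source generation: emit zipMapN signatures up
// to arity 22 as a string, to be written into a generated-sources directory
// that the compile step then picks up.
def generateZipMaps(maxArity: Int = 22): String = {
  val defs = for (n <- 1 to maxArity) yield {
    val tparams = (1 to n).map(i => s"A$i").mkString(", ")
    val args = (1 to n).map(i => s"a$i: TT[A$i]").mkString(", ")
    s"def zipMap$n[$tparams, R]($args)(f: ($tparams) => R): TT[R]"
  }
  defs.mkString("trait ZipMaps[TT[_]] {\n  ", "\n  ", "\n}")
}
```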

bug: console is not working

The new definition of mill.modules.Jvm.subprocess doesn't seem to work with interactive processes like the scala console. The old way of running subprocesses seems to work fine:

def subprocess(mainClass: String,
               classPath: Seq[Path],
               options: Seq[String] = Seq.empty) = {
  import ammonite.ops.ImplicitWd._
  %("java", "-cp", classPath.mkString(":"), mainClass, options)
}

If we are okay with two implementations, here is the fix

Incremental Scala compile with Zinc isn't working right

To repro:

git clean -xdf
sbt scalaplugin/test:assembly
scalaplugin/target/mill run ScalaPlugin.assembly
echo " " >> core/src/main/scala/mill/define/Graph.scala
scalaplugin/target/mill run ScalaPlugin.assembly

The last command will run into a bunch of compile errors, when it should compile successfully

Add nice prettyprints/toString for the Build REPL

Currently, the toStrings you get when inspecting tasks/modules in the REPL are pretty bare:

lihaoyi mill$ scalaplugin/target/mill --repl
Loading...
@ build
res0: build.type = build

@ build.Core
res1: Core.type = ammonite.predef.$up.build$Core$@33ce764b

@ build.Core.compile
res2: T[scalaplugin.CompilationResult] = mill.define.Persistent@ddd029e

While you can poke around with autocomplete or typeOf(build).members, it would be nice to hook into Ammonite's pretty-printer and use it to show nicer output at the REPL:

lihaoyi mill$ scalaplugin/target/mill --repl
Loading...
@ build
res0: build.type = build
Children:
- build.Core
- build.ScalaPlugin
- build.bridges

@ build.Core
res1: Core.type = build.Core
Children:
- build.Core.assembly
- build.Core.basePath
- build.Core.classpath
- build.Core.compile
- build.Core.compileDepClasspath
- build.Core.console
...

@ build.Core.compile
res2: T[scalaplugin.CompilationResult] = build.Core.compile
Inputs:
- build.Core.scalaVersion
- build.Core.allSources
- build.Core.compileDepClasspath
- build.Core.scalaCompilerClasspath
- build.Core.compilerBridgeClasspath
- build.Core.scalacOptions
- build.Core.scalacPluginClasspath
- build.Core.javacOptions
- build.Core.upstreamCompileOutput

This can be done by hooking into the Ammonite REPL pretty-printer:

repl.pprinter() = repl.pprinter().copy(additionalHandlers = ...)

We can make use of the Discovered mirror within the additionalHandlers, which should contain the information needed for labelling all the Tasks.

It might be harder to hook into toString output, since we can't make use of the Discovered mirror within toString, but maybe there's still something we can do there.

Build REPL

Since we are using Ammonite to run build.sc scripts, it shouldn't be a stretch to also use Ammonite as the build REPL: to keep Mill alive and run multiple commands with a warm JVM, and possibly to inspect and query the build programmatically.

I'd like to be able to evaluate targets in the Build REPL the same way you refer to them in build.sc files:

Core.jar()
bridges("2.10.6").compile()

Where () forces the target to be built & evaluated. This is in addition to running arbitrary Scala code.

This will take some changes to the way () is handled, so that we can use () for the applicative T{...} syntax in build files and () for "evaluate now" functionality in the REPL, but not let people use () in build files for "evaluate now" (in build files it should only be used within T{...} syntax).

Watch & re-run functionality in the REPL

You can currently use Ammonite's --watch flag to watch & re-run targets from the Bash shell. When used to run the REPL, --watch only serves to re-start the REPL if you exited it after making changes.

We should provide a watch(Core.compile) function that can be used to watch & re-run one (or more) targets from the Build REPL.

Support forked execution for ScalaModule#run and TestScalaModule#test

Currently, ScalaModule#run runs in a subprocess and TestScalaModule#test runs in-process in a Classloader. This is arbitrary and inconsistent.

We should standardize on a way to let either method run either subprocessed or in-process, and make this available to the user of Mill if they want to run any arbitrary JVM classpath as part of any task.

`mill.Module`s should have a default `basePath`

e.g. Core's basePath should default to Core/ (or maybe core/?), bridges[2.10.6] should default to bridges/2.10.6/, and so on.

This can be used to automatically have e.g. ScalaModule pick up sources in the right place, rather than having to specify the basePath manually each time. For now, Modules would be able to continue picking up sources outside the basePath, but this at least lets people provide a nice default, and later on, if we wanted, we could restrict Modules to only picking up sources within their allocated basePath.
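Deriving the default is a one-liner; this sketch (hypothetical defaultBasePath, taking cross segments like "2.10.6" as extra path components) assumes the lowercased module name is the chosen convention, which is still an open question above:

```scala
// Hypothetical sketch: derive a module's default basePath from its name and
// any cross segments, e.g. Core -> core/, bridges[2.10.6] -> bridges/2.10.6/.
def defaultBasePath(moduleName: String, crossSegments: Seq[String] = Nil): String =
  (moduleName.toLowerCase +: crossSegments).mkString("", "/", "/")
```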

Add an SbtScalaModule for easy migration from SBT project layouts

Currently, ScalaModule is a bit low-level: it defaults to your Scala code being in src/ (which differs from SBT's convention of src/main/scala/) and you need to wire up a val test = new Tests suite yourself.

While I think there's value in providing a "simpler" ScalaModule without the decades of SBT/Maven/Ivy baggage, we should also provide a SbtScalaModule to ease the conversion of projects from SBT to Mill. SbtScalaModule should only require you to define your basePath, and automatically set:

  • source code in src/main/scala/
  • resources in src/main/resource/
  • test code in src/test/scala/
  • test resources in src/test/resource/

We should also figure out a story for cross-built SbtScalaModules compiled against multiple Scala versions (with version-specific sources), as these are very common in open-source libraries.
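The layout defaults in the list above amount to four derived paths; a rough sketch, with a hypothetical SbtLayout trait standing in for the eventual SbtScalaModule:

```scala
// Hypothetical sketch of SbtScalaModule's layout defaults: given only a
// basePath, derive the four SBT-convention directories.
trait SbtLayout {
  def basePath: String
  def sources       = s"$basePath/src/main/scala"
  def resources     = s"$basePath/src/main/resource"
  def testSources   = s"$basePath/src/test/scala"
  def testResources = s"$basePath/src/test/resource"
}
```

A user migrating from SBT would then only override basePath, with everything else falling out of the convention.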

Set up CI infrastructure

Sooner or later we'll need CI for mill. We could start it sooner, to have more confidence when merging PRs.

Task Sandboxing

Bazel restricts tasks being run to their given work directory: using sandbox-exec on OS X, and LXC containers on Linux. This is extremely useful to ensure that:

  • Badly-configured builds don't "accidentally" pass due to stale state lying around the filesystem
  • Those builds don't fail mysteriously when that stale state is removed
  • The presence of stale state doesn't cause other builds to fail mysteriously

We can do the same thing, at least to a best effort: using Java SecurityManagers to limit file access in Mill JVM code, and forcing people to use a "blessed" subprocess interface that restricts subprocesses using the same OS tools that Bazel uses.

Builds becoming brittle because parts of them accidentally/implicitly depend on other parts that "have to"/"have always been" run beforehand is a common problem, and this would fix it.

Cross-compiling across minor versions of Scala is broken

Apply this patch:

lihaoyi mill$ git diff
diff --git a/scalaplugin/src/test/scala/mill/scalaplugin/AcyclicTests.scala b/scalaplugin/src/test/scala/mill/scalaplugin/AcyclicTests.scala
index 7ba8b26..89ed57c 100644
--- a/scalaplugin/src/test/scala/mill/scalaplugin/AcyclicTests.scala
+++ b/scalaplugin/src/test/scala/mill/scalaplugin/AcyclicTests.scala
@@ -9,7 +9,7 @@ import utest._
 import mill.util.JsonFormatters._
 object AcyclicBuild{
   val acyclic =
-    for(crossVersion <- Cross("2.10.6", "2.11.8", "2.12.4"))
+    for(crossVersion <- Cross("2.10.6", "2.11.8", "2.12.3"))
     yield new ScalaModule{outer =>
       def basePath = AcyclicTests.workspacePath
       def organization = "com.lihaoyi"
@@ -46,7 +46,7 @@ object AcyclicTests extends TestSuite{

     'scala210 - check("2.10.6")
     'scala211 - check("2.11.8")
-    'scala212 - check("2.12.4")
+    'scala212 - check("2.12.3")

     val allBinaryVersions = Seq("2.10", "2.11", "2.12")
     def check(scalaVersion: String) = {

Run the test:

sbt "scalaplugin/test-only -- mill.scalaplugin.AcyclicTests.scala212"

See it blow up:

------------- Running Tests mill.scalaplugin.AcyclicTests.scala212 -------------
[info] Compiling 5 Scala sources to /Users/lihaoyi/Dropbox/Workspace/mill/target/workspace/acyclic/acyclic/2.12.3/compile/classes ...
[error] ## Exception when compiling 5 sources to /Users/lihaoyi/Dropbox/Workspace/mill/target/workspace/acyclic/acyclic/2.12.3/compile/classes
[error] vtable stub
[error] java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:488)
[error] java.lang.StringBuilder.append(StringBuilder.java:166)
[error] xsbt.ClassName.$anonfun$classNameAsSeenIn$1(ClassName.scala:83)
[error] scala.reflect.internal.SymbolTable.enteringPhase(SymbolTable.scala:237)
[error] xsbt.ClassName.classNameAsSeenIn(ClassName.scala:78)
[error] xsbt.ClassName.classNameAsSeenIn$(ClassName.scala:76)
[error] xsbt.ExtractAPI.classNameAsSeenIn(ExtractAPI.scala:49)
[error] xsbt.ExtractAPI.mkClassLike(ExtractAPI.scala:637)
[error] xsbt.ExtractAPI.$anonfun$classLike$1(ExtractAPI.scala:620)
[error] scala.collection.mutable.HashMap.getOrElseUpdate(HashMap.scala:82)
[error] xsbt.ExtractAPI.classLike(ExtractAPI.scala:620)
[error] xsbt.ExtractAPI.extractAllClassesOf(ExtractAPI.scala:605)
[error] xsbt.API$TopLevelHandler.class(API.scala:65)
[error] xsbt.API$TopLevelTraverser.traverse(API.scala:73)
[error] xsbt.API$TopLevelTraverser.traverse(API.scala:69)
[error] scala.reflect.api.Trees$Traverser.$anonfun$traverseStats$2(Trees.scala:2498)
[error] scala.reflect.api.Trees$Traverser.atOwner(Trees.scala:2507)
[error] scala.reflect.api.Trees$Traverser.$anonfun$traverseStats$1(Trees.scala:2498)
[error] scala.reflect.api.Trees$Traverser.traverseStats(Trees.scala:2497)
[error] scala.reflect.internal.Trees.itraverse(Trees.scala:1337)
[error] scala.reflect.internal.Trees.itraverse$(Trees.scala:1211)
[error] scala.reflect.internal.SymbolTable.itraverse(SymbolTable.scala:16)
[error] scala.reflect.internal.SymbolTable.itraverse(SymbolTable.scala:16)
[error] scala.reflect.api.Trees$Traverser.traverse(Trees.scala:2475)
[error] xsbt.API$TopLevelTraverser.traverse(API.scala:75)
[error] xsbt.API$TopLevelTraverser.traverse(API.scala:69)
[error] scala.reflect.api.Trees$Traverser.$anonfun$traverseStats$2(Trees.scala:2498)
[error] scala.reflect.api.Trees$Traverser.atOwner(Trees.scala:2507)
[error] scala.reflect.api.Trees$Traverser.$anonfun$traverseStats$1(Trees.scala:2498)
[error] scala.reflect.api.Trees$Traverser.traverseStats(Trees.scala:2497)
[error] scala.reflect.internal.Trees.itraverse(Trees.scala:1337)
[error] scala.reflect.internal.Trees.itraverse$(Trees.scala:1211)
[error] scala.reflect.internal.SymbolTable.itraverse(SymbolTable.scala:16)
[error] scala.reflect.internal.SymbolTable.itraverse(SymbolTable.scala:16)
[error] scala.reflect.api.Trees$Traverser.traverse(Trees.scala:2475)
[error] xsbt.API$TopLevelTraverser.traverse(API.scala:75)
[error] xsbt.API$TopLevelTraverser.traverse(API.scala:69)
[error] scala.reflect.api.Trees$Traverser.apply(Trees.scala:2513)
[error] xsbt.API$ApiPhase.processScalaUnit(API.scala:43)
[error] xsbt.API$ApiPhase.processUnit(API.scala:35)
[error] xsbt.API$ApiPhase.apply(API.scala:33)
[error] scala.tools.nsc.Global$GlobalPhase.$anonfun$applyPhase$1(Global.scala:426)
[error] scala.tools.nsc.Global$GlobalPhase.applyPhase(Global.scala:419)
[error] scala.tools.nsc.Global$GlobalPhase.$anonfun$run$1(Global.scala:390)
[error] scala.tools.nsc.Global$GlobalPhase.$anonfun$run$1$adapted(Global.scala:390)
[error] scala.collection.Iterator.foreach(Iterator.scala:929)
[error] scala.collection.Iterator.foreach$(Iterator.scala:929)
[error] scala.collection.AbstractIterator.foreach(Iterator.scala:1417)
[error] scala.tools.nsc.Global$GlobalPhase.run(Global.scala:390)
[error] xsbt.API$ApiPhase.run(API.scala:27)
[error] scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1431)
[error] scala.tools.nsc.Global$Run.compileUnits(Global.scala:1416)
[error] scala.tools.nsc.Global$Run.compileSources(Global.scala:1412)
[error] scala.tools.nsc.Global$Run.compile(Global.scala:1515)
[error] xsbt.CachedCompiler0.run(CompilerInterface.scala:131)
[error] xsbt.CachedCompiler0.run(CompilerInterface.scala:106)
[error] xsbt.CompilerInterface.run(CompilerInterface.scala:32)
[error] sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error] sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] java.lang.reflect.Method.invoke(Method.java:498)
[error] sbt.internal.inc.AnalyzingCompiler.call(AnalyzingCompiler.scala:237)
[error] sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:111)
[error] sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:90)
[error] sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$3(MixedAnalyzingCompiler.scala:83)
[error] scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
[error] sbt.internal.inc.MixedAnalyzingCompiler.timed(MixedAnalyzingCompiler.scala:134)
[error] sbt.internal.inc.MixedAnalyzingCompiler.compileScala$1(MixedAnalyzingCompiler.scala:74)
[error] sbt.internal.inc.MixedAnalyzingCompiler.compile(MixedAnalyzingCompiler.scala:117)
[error] sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1(IncrementalCompilerImpl.scala:305)
[error] sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1$adapted(IncrementalCompilerImpl.scala:305)
[error] sbt.internal.inc.Incremental$.doCompile(Incremental.scala:101)
[error] sbt.internal.inc.Incremental$.$anonfun$compile$4(Incremental.scala:82)
[error] sbt.internal.inc.IncrementalCommon.recompileClasses(IncrementalCommon.scala:117)
[error] sbt.internal.inc.IncrementalCommon.cycle(IncrementalCommon.scala:64)
[error] sbt.internal.inc.Incremental$.$anonfun$compile$3(Incremental.scala:84)
[error] sbt.internal.inc.Incremental$.manageClassfiles(Incremental.scala:129)
[error] sbt.internal.inc.Incremental$.compile(Incremental.scala:75)
[error] sbt.internal.inc.IncrementalCompile$.apply(Compile.scala:70)
[error] sbt.internal.inc.IncrementalCompilerImpl.compileInternal(IncrementalCompilerImpl.scala:309)
[error] sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileIncrementally$1(IncrementalCompilerImpl.scala:267)
[error] sbt.internal.inc.IncrementalCompilerImpl.handleCompilationError(IncrementalCompilerImpl.scala:158)
[error] sbt.internal.inc.IncrementalCompilerImpl.compileIncrementally(IncrementalCompilerImpl.scala:237)
[error] sbt.internal.inc.IncrementalCompilerImpl.compile(IncrementalCompilerImpl.scala:68)
[error] mill.scalaplugin.ScalaModule$.compileScala(ScalaModule.scala:129)
[error] mill.scalaplugin.ScalaModule.$anonfun$compile$3(ScalaModule.scala:327)
[error] mill.eval.Result$.create(Result.scala:6)
[error] mill.scalaplugin.ScalaModule.$anonfun$compile$2(ScalaModule.scala:328)
[error] mill.define.Applicative$Applyer.$anonfun$zipMap$7(Applicative.scala:43)
[error] mill.define.Task$MappedDest.evaluate(Task.scala:186)
[error] mill.eval.Evaluator.$anonfun$evaluateGroup$7(Evaluator.scala:167)
[error] mill.eval.Evaluator.$anonfun$evaluateGroup$7$adapted(Evaluator.scala:156)
[error] scala.collection.Iterator.foreach(Iterator.scala:929)
[error] scala.collection.Iterator.foreach$(Iterator.scala:929)
[error] scala.collection.AbstractIterator.foreach(Iterator.scala:1417)
[error] scala.collection.IterableLike.foreach(IterableLike.scala:71)
[error] scala.collection.IterableLike.foreach$(IterableLike.scala:70)
[error] scala.collection.AbstractIterable.foreach(Iterable.scala:54)
[error] mill.eval.Evaluator.evaluateGroup(Evaluator.scala:156)
[error] mill.eval.Evaluator.evaluateGroupCached(Evaluator.scala:107)
[error] mill.eval.Evaluator.$anonfun$evaluate$2(Evaluator.scala:33)
[error] mill.eval.Evaluator.$anonfun$evaluate$2$adapted(Evaluator.scala:29)
[error] scala.collection.Iterator.foreach(Iterator.scala:929)
[error] scala.collection.Iterator.foreach$(Iterator.scala:929)
[error] scala.collection.AbstractIterator.foreach(Iterator.scala:1417)
[error] mill.eval.Evaluator.evaluate(Evaluator.scala:29)
[error] mill.scalaplugin.TestEvaluator$.eval(TestEvaluator.scala:21)
[error] mill.scalaplugin.AcyclicTests$.eval$1(AcyclicTests.scala:43)
[error] mill.scalaplugin.AcyclicTests$.check$1(AcyclicTests.scala:64)
[error] mill.scalaplugin.AcyclicTests$.$anonfun$tests$76(AcyclicTests.scala:49)
[error] utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
[error] utest.framework.TestCallTree.run(Model.scala:35)
[error] utest.framework.TestCallTree.run(Model.scala:33)
[error] utest.TestRunner$.$anonfun$runAsync$5(TestRunner.scala:74)
[error] utest.framework.Executor.utestWrap(Executor.scala:12)
[error] utest.framework.Executor.utestWrap$(Executor.scala:10)
[error] utest.TestSuite.utestWrap(TestSuite.scala:18)
[error] utest.TestRunner$.$anonfun$runAsync$4(TestRunner.scala:71)
[error] utest.framework.StackMarker$.dropOutside(StackMarker.scala:13)
[error] utest.TestRunner$.$anonfun$runAsync$2(TestRunner.scala:71)
[error] utest.TestRunner$.evaluateFutureTree(TestRunner.scala:170)
[error] utest.TestRunner$.$anonfun$evaluateFutureTree$2(TestRunner.scala:173)
[error] scala.concurrent.Future$.$anonfun$traverse$1(Future.scala:841)
[error] scala.collection.IndexedSeqOptimized.foldLeft(IndexedSeqOptimized.scala:56)
[error] scala.collection.IndexedSeqOptimized.foldLeft$(IndexedSeqOptimized.scala:64)
[error] scala.collection.mutable.ArrayBuffer.foldLeft(ArrayBuffer.scala:48)
[error] scala.concurrent.Future$.traverse(Future.scala:841)
[error] utest.TestRunner$.evaluateFutureTree(TestRunner.scala:173)
[error] utest.TestRunner$.runAsync(TestRunner.scala:98)
[error] utest.runner.BaseRunner.runSuite(BaseRunner.scala:160)
[error] utest.runner.BaseRunner.$anonfun$makeTask$1(BaseRunner.scala:171)
[error] utest.runner.Task.execute(Task.scala:20)
[error] sbt.TestRunner.runTest$1(TestFramework.scala:76)
[error] sbt.TestRunner.run(TestFramework.scala:85)
[error] sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
[error] sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
[error] sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:185)
[error] sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
[error] sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
[error] sbt.TestFunction.apply(TestFramework.scala:207)
[error] sbt.Tests$.sbt$Tests$$processRunnable$1(Tests.scala:239)
[error] sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:245)
[error] sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:245)
[error] sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
[error] sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
[error] sbt.std.Transform$$anon$4.work(System.scala:63)
[error] sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
[error] sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
[error] sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
[error] sbt.Execute.work(Execute.scala:237)
[error] sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
[error] sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
[error] sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
[error] sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
[error] java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] java.lang.Thread.run(Thread.java:748)
[error]
X mill.scalaplugin.AcyclicTests.scala212 18260ms
  scala.MatchError: Left(Exception(java.lang.IncompatibleClassChangeError: vtable stub)) (of class scala.util.Left)
    mill.scalaplugin.AcyclicTests$.check$1(AcyclicTests.scala:64)
    mill.scalaplugin.AcyclicTests$.$anonfun$tests$76(AcyclicTests.scala:49)
Tests: 1, Passed: 0, Failed: 1
[error] Failed tests:
[error] 	mill.scalaplugin.AcyclicTests
[error] (scalaplugin/test:testOnly) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 46 s, completed 29 Nov, 2017 7:53:48 PM

Write a `TestScalaModule` class to make it easier to specify test suites

Currently, specifying a test suite is kind of annoying:

object CoreTests extends ScalaModule {
  def scalaVersion = "2.12.4"
  override def projectDeps = Seq(Core)
  def basePath = pwd / 'scalaplugin
  override def sources = pwd/'core/'src/'test/'scala
  override def ivyDeps = Seq(
    Dep("com.lihaoyi", "utest", "0.6.0")
  )

  def test() = T.command{
    TestRunner.apply(
      "mill.UTestFramework",
      runDepClasspath().map(_.path) :+ compile().path,
      Seq(compile().path)
    )
  }
}

`TestScalaModule` would encapsulate all the standard boilerplate:

object CoreTests extends TestScalaModule {
  def projectDeps = Seq(Core) // scalaVersion inferred from projectDeps
  override def sources = pwd/'core/'src/'test/'scala
  override def ivyDeps = Seq(Dep("com.lihaoyi", "utest", "0.6.0"))
  def testFramework = "mill.UTestFramework"
}

Changes in build don't invalidate caches

Minimal reproduction:

mkdir mill-repr
cd mill-repr

echo 'import mill._
import mill.scalaplugin._
import ammonite.ops._

object Invalidate extends ScalaModule {
  def scalaVersion = "2.12.4"
  def basePath = pwd
  override def mainClass = Some("Main")

  override def ivyDeps = Seq(
    Dep("io.circe", "circe-core", "0.8.0"),
    Dep("io.circe", "circe-parser", "0.8.0")
  )
}' > build.sc

mkdir -p src/main/scala

echo 'object Main extends App {
  println(io.circe.parser.parse("""{"a": 2}"""))
}' > src/main/scala/Main.scala

mill run Invalidate.run

Now change the circe version to 0.9999.0 and run `mill run Invalidate.run` once again.
Expected behavior: Mill should invalidate the build caches and fail at resolution/compilation.
Actual behavior: the build runs exactly as the previous one did.

If you `rm -rf out` and run `mill run Invalidate.run` once again, you do get a compilation error.
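A minimal sketch of one possible fix, assuming the evaluator can see the build script's contents: fingerprint `build.sc` and store the hash next to the cached metadata, invalidating targets when it changes. The names here (`scriptHash`, `needsInvalidation`) are hypothetical, not Mill's actual API.

```scala
import java.security.MessageDigest

// Hypothetical helper: fingerprint the build script so cached results can be
// tied to the exact build definition that produced them.
def scriptHash(contents: String): String = {
  val digest = MessageDigest.getInstance("SHA-256")
  digest.digest(contents.getBytes("UTF-8")).map("%02x".format(_)).mkString
}

// A target's cache entry would store the hash it was built under; any edit to
// build.sc (such as bumping the circe version) then forces re-evaluation.
def needsInvalidation(cachedHash: Option[String], contents: String): Boolean =
  !cachedHash.contains(scriptHash(contents))
```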

Create simple task that downloads and caches dependencies

Currently, forcing Mill to update dependencies requires the user to execute a task named `externalCompileDepClasspath`, which is hard to remember and hard to type.

Similar to sbt's `update`, we should create a simply-named task that forces dependency resolution.
Suggestions

  • update
  • resolve
  • deps

Write unit tests for GenIdea

This is pretty fragile code that is reasonably entangled with `ScalaModule`'s internals, and it has broken in the past. We should have at least a simple unit test that runs GenIdea on a trivial build and checks the output against some hardcoded XML files, just to avoid accidental regressions and breakage.
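A sketch of what the comparison step of such a test might look like, assuming we check the generated project files against checked-in fixtures while ignoring cosmetic indentation differences; the helper names are illustrative, not existing test utilities.

```scala
// Normalize pretty-printed XML so the fixture comparison doesn't break on
// whitespace-only changes in GenIdea's output.
def normalizeXml(s: String): String =
  s.linesIterator.map(_.trim).filter(_.nonEmpty).mkString("\n")

// True when generated output matches the hardcoded expected file, modulo
// indentation and blank lines.
def sameXml(generated: String, expected: String): Boolean =
  normalizeXml(generated) == normalizeXml(expected)
```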

Log task output to files

Currently, every task's `Ctx` exposes a logger, and we also capture stdout/stderr from task body evaluation and push it through the logger. This is useful for e.g. running Mill evaluations silently in unit tests, to avoid log spam.

We should also log the output of each task to a log file. Each task `foo.bar` already has an `out/foo/bar` folder for output files and an `out/foo/bar.mill.json` to cache its return value. We can stream the logs to `out/foo/bar.log` as the task executes, so a user can easily find the logs related to each task.

This is especially important once we parallelize task execution, and the console output of Mill becomes all interleaved, and the logfiles will be the only place to find non-interleaved task output.
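One way to implement this, sketched under the assumption that the logger wraps the task's stdout/stderr streams: a simple "tee" stream that forwards every byte to both the console and the task's log file. `TeeOutputStream` is an illustrative name, not an existing Mill class.

```scala
import java.io.{OutputStream, PrintStream}

// Forwards every write to two underlying streams, e.g. the console logger
// and a FileOutputStream pointed at out/foo/bar.log.
class TeeOutputStream(a: OutputStream, b: OutputStream) extends OutputStream {
  override def write(n: Int): Unit = { a.write(n); b.write(n) }
  override def write(buf: Array[Byte], off: Int, len: Int): Unit = {
    a.write(buf, off, len); b.write(buf, off, len)
  }
  override def flush(): Unit = { a.flush(); b.flush() }
  override def close(): Unit = { a.close(); b.close() }
}
```

The evaluator could then run each task body with `System.out`/`System.err` swapped for a `PrintStream` over such a tee, so even when parallel console output is interleaved, each task's log file stays clean.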

Sonatype PublishSigned support?

One of the major things Mill is currently missing is the ability to publishSigned artifacts (along with source jars, etc.) to Maven central.

We already have the ability to create Jars, e.g. mill run ScalaPlugin.assembly. #3 will allow us to make them executable. The last step is to let us publish them to some artifact repository: Maven Central, Bintray, etc.

I'm not familiar with everything that goes into uploading a Jar to Maven Central, but good resources exist.

Basically whatever SBT-PGP is doing, we should be able to do the same

Implement `eval` function to let user build multiple targets in the REPL

Currently, you can only select one target at a time:

Core.compile()
Core.test.compile()
Core.test.test()
ScalaPlugin.assembly()

Mill's evaluate function already lets you select a Vector of Tasks you want to build. We should expose that to the user in the REPL:

val (a, b) = eval(Core.compile, Core.test.compile)
val (a, b, c) = eval(Core.compile, Core.test.compile, ScalaPlugin.assembly)

More flexible query language for running multiple tasks

Currently running multiple tasks in Mill is a bit of a pain:

mill run Core.test && mill run ScalaPlugin.test

Even though the second task will benefit from caching of the first, the tasks run sequentially. Mill's execution engine is actually happy to evaluate multiple tasks in parallel, we just need to expose it to the user.

uTest has a similar problem (users want to select multiple paths of tests to run) and allows syntax like:

testOnly -- test.examples.NestedTests.outer1.{inner1,inner2}
testOnly -- test.examples.NestedTests.{outer1.inner1,outer2.inner3}
testOnly -- {test.examples.HelloTests.test1,test.examples.NestedTests.outer2}

We could copy that, but we'd probably also want more flexibility, e.g.

mill run {Core,ScalaPlugin}.test
# Run all tasks ending in `.test`; might need to pick a different syntax for bash-compatibility
mill run *.test 
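The `{A,B}` selector syntax above could be expanded with something like the following sketch; a real implementation would need a proper parser to handle nesting, wildcards, and trailing command arguments, and `expandBraces` is a hypothetical name.

```scala
// Expand a selector like "{Core,ScalaPlugin}.test" into the concrete target
// names "Core.test" and "ScalaPlugin.test". Handles multiple brace groups by
// recursing on the expanded remainder.
def expandBraces(selector: String): Seq[String] = {
  val open = selector.indexOf('{')
  if (open == -1) Seq(selector)
  else {
    val close  = selector.indexOf('}', open)
    val prefix = selector.take(open)
    val suffix = selector.drop(close + 1)
    selector.slice(open + 1, close).split(',').toSeq
      .flatMap(alt => expandBraces(prefix + alt + suffix))
  }
}
```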

Notably, whatever syntax we choose should be able to play nicely with tasks taking command-line arguments

mill run Core.test mill.define.ApplicativeTests

And hopefully we'll be able to find something with less awkward quoting than SBT's

sbt "core/testOnly -- mill.define.ApplicativeTests"

Pass analysis of dependent projects to Zinc

Zinc needs a lookup implementation that maps directories on the classpath to an `Analysis` object. This lets Zinc know when to recompile sources that depend on other sub-projects in the build. Right now, dependent projects are always recompiled, regardless of whether anything actually changed. See the Eclipse implementation.
