
rules_scala's Introduction

higherkindness/rules_scala

higherkindness/rules_scala evolved, in part, from the need for Bazel adoption support for large, monorepo Scala projects. Bazel is wonderful because it makes use of parallelism and caching to vastly improve build times. However, to see these benefits, a project must first be broken down into tiny packages and make use of fine-grained dependencies. This is not always a realistic short-term goal for large, monorepo Scala projects.

higherkindness/rules_scala allows for the optional use of Zinc incremental compilation to provide a stepping stone for these projects as they migrate to Bazel.

higherkindness/rules_scala is written with maintainability and accessibility in mind. It aims to facilitate the transition to Bazel, and to satisfy use cases throughout the Scala ecosystem.

Principles

  1. Support the breadth of the Scala ecosystem.
  2. Follow Bazel best practices.
  3. Be accessible and maintainable.
  4. Have high-quality documentation.

If the right design principles are kept, implementing additional features should be straightforward.

Features

Usage

WORKSPACE

load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

# Load rules_scala_annex
rules_scala_annex_version = "ae99fcb08bbddfc24fef00d7b13f6c065e1df8d5"
rules_scala_annex_sha256 = "1630fc7ecc7a4ffeabcdef73c7600eab9cf3fd2377db1f69b8ce1927560211ff"
http_archive(
    name = "rules_scala_annex",
    sha256 = rules_scala_annex_sha256,
    strip_prefix = "rules_scala-{}".format(rules_scala_annex_version),
    url = "https://github.com/higherkindness/rules_scala/archive/{}.zip".format(rules_scala_annex_version),
)

rules_jvm_external_tag = "2.9"
rules_jvm_external_sha256 = "e5b97a31a3e8feed91636f42e19b11c49487b85e5de2f387c999ea14d77c7f45"
http_archive(
    name = "rules_jvm_external",
    sha256 = rules_jvm_external_sha256,
    strip_prefix = "rules_jvm_external-{}".format(rules_jvm_external_tag),
    url = "https://github.com/bazelbuild/rules_jvm_external/archive/{}.zip".format(rules_jvm_external_tag),
)

load("@rules_scala_annex//rules/scala:workspace.bzl", "scala_register_toolchains", "scala_repositories")
scala_repositories()
load("@annex//:defs.bzl", annex_pinned_maven_install = "pinned_maven_install")
annex_pinned_maven_install()
scala_register_toolchains()

load("@rules_scala_annex//rules/scalafmt:workspace.bzl", "scalafmt_default_config", "scalafmt_repositories")
scalafmt_repositories()
load("@annex_scalafmt//:defs.bzl", annex_scalafmt_pinned_maven_install = "pinned_maven_install")
annex_scalafmt_pinned_maven_install()
scalafmt_default_config()

load("@rules_scala_annex//rules/scala_proto:workspace.bzl", "scala_proto_register_toolchains", "scala_proto_repositories")
scala_proto_repositories()
load("@annex_proto//:defs.bzl", annex_proto_pinned_maven_install = "pinned_maven_install")
annex_proto_pinned_maven_install()
scala_proto_register_toolchains()

# Load bazel skylib and google protobuf
bazel_skylib_tag = "1.0.2"
bazel_skylib_sha256 = "97e70364e9249702246c0e9444bccdc4b847bed1eb03c5a3ece4f83dfe6abc44"
http_archive(
    name = "bazel_skylib",
    sha256 = bazel_skylib_sha256,
    urls = [
        "https://mirror.bazel.build/github.com/bazelbuild/bazel-skylib/releases/download/{tag}/bazel-skylib-{tag}.tar.gz".format(tag = bazel_skylib_tag),
        "https://github.com/bazelbuild/bazel-skylib/releases/download/{tag}/bazel-skylib-{tag}.tar.gz".format(tag = bazel_skylib_tag),
    ],
)

protobuf_tag = "3.10.1"
protobuf_sha256 = "678d91d8a939a1ef9cb268e1f20c14cd55e40361dc397bb5881e4e1e532679b1"
http_archive(
    name = "com_google_protobuf",
    sha256 = protobuf_sha256,
    strip_prefix = "protobuf-{}".format(protobuf_tag),
    type = "zip",
    url = "https://github.com/protocolbuffers/protobuf/archive/v{}.zip".format(protobuf_tag),
)

load("@com_google_protobuf//:protobuf_deps.bzl", "protobuf_deps")
protobuf_deps()

# Specify the scala compiler we wish to use; in this case, we'll use the default one specified in rules_scala_annex
bind(
    name = "default_scala",
    actual = "@rules_scala_annex//src/main/scala:zinc_2_12_10",
)

BUILD

load("@rules_scala_annex//rules:scala.bzl", "scala_library")

scala_library(
    name = "example",
    srcs = glob(["**/*.scala"]),
)

Further Documentation

See the contributing guidelines for help on contributing to this project.

For all rules and attributes, see the Stardoc.

rules_scala's People

Contributors

andyscott, bjchambers, borkaehw, coreywoodfield, darl, hmemcpy, jakemcc, jaredneil, jhenline-lucid, jin, jjudd, joprice, jvican, kerinin, pauldraper, srodriguezo, stephenjudkins, timothyklim, virusdave


rules_scala's Issues

Trouble starting a project

Trying to get off the ground with Bazel, but having trouble with a minimal example: when I run bazel build :scalabook, for example, I get the following error:

WARNING: /private/var/tmp/_bazel_parth.mehrotra/56e65e76c6e5acbbd30fd286bccbe6a3/external/bazel_tools/tools/jdk/BUILD:116:1: in alias rule @bazel_tools//tools/jdk:java: target '@bazel_tools//tools/jdk:java' depends on deprecated target '@local_jdk//:java': Don't depend on targets in the JDK workspace; use @bazel_tools//tools/jdk:current_java_runtime instead (see https://github.com/bazelbuild/bazel/issues/5594)
WARNING: /private/var/tmp/_bazel_parth.mehrotra/56e65e76c6e5acbbd30fd286bccbe6a3/external/rules_scala_annex/src/main/scala/higherkindness/rules_scala/common/args/BUILD:4:1: in scala_library rule @rules_scala_annex//src/main/scala/higherkindness/rules_scala/common/args:args: target '@rules_scala_annex//src/main/scala/higherkindness/rules_scala/common/args:args' depends on deprecated target '@local_jdk//:java': Don't depend on targets in the JDK workspace; use @bazel_tools//tools/jdk:current_java_runtime instead (see https://github.com/bazelbuild/bazel/issues/5594)
ERROR: /private/var/tmp/_bazel_parth.mehrotra/56e65e76c6e5acbbd30fd286bccbe6a3/external/rules_scala_annex/src/main/scala/higherkindness/rules_scala/common/args/BUILD:4:1: in scala_library rule @rules_scala_annex//src/main/scala/higherkindness/rules_scala/common/args:args:
Traceback (most recent call last):
        File "/private/var/tmp/_bazel_parth.mehrotra/56e65e76c6e5acbbd30fd286bccbe6a3/external/rules_scala_annex/src/main/scala/higherkindness/rules_scala/common/args/BUILD", line 4
                scala_library(name = 'args')
        File "/private/var/tmp/_bazel_parth.mehrotra/56e65e76c6e5acbbd30fd286bccbe6a3/external/rules_scala_annex/rules/scala.bzl", line 203, in _scala_library_implementation
                _run_phases(ctx, [("resources", _phase_resourc...)])
        File "/private/var/tmp/_bazel_parth.mehrotra/56e65e76c6e5acbbd30fd286bccbe6a3/external/rules_scala_annex/rules/private/phases/api.bzl", line 28, in _run_phases
                function(ctx, g)
        File "/private/var/tmp/_bazel_parth.mehrotra/56e65e76c6e5acbbd30fd286bccbe6a3/external/rules_scala_annex/rules/private/phases/phase_javainfo.bzl", line 25, in function
                java_common.run_ijar(ctx.actions, jar = ctx.outputs.jar, <2 more arguments>)
expected value of type 'JavaToolchainSkylarkApiProvider' for parameter 'java_toolchain', for call to method run_ijar(actions, jar, target_label = None, java_toolchain) of 'java_common'
ERROR: Analysis of target '//:scalabook' failed; build aborted: Analysis of target '@rules_scala_annex//src/main/scala/higherkindness/rules_scala/common/args:args' failed; build aborted
INFO: Elapsed time: 0.076s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded, 0 targets configured)

isolation = "process" doesn't include Predef

Tried out the new isolation features for tests. Ran into the following error with isolation = "process"

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
	at org.specs2.runner.SbtRunner.tasks(SbtRunner.scala:27)
	at annex.SubprocessTestRunner$.$anonfun$main$3(SubprocessRunner.scala:31)
	at annex.SubprocessTestRunner$.$anonfun$main$3$adapted(SubprocessRunner.scala:30)
	at annex.TestFrameworkRunner$.withRunner(TestFrameworkRunner.scala:19)
	at annex.SubprocessTestRunner$.$anonfun$main$2(SubprocessRunner.scala:30)
	at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:12)
	at annex.ClassLoader$.withContextClassLoader(ClassLoader.scala:8)
	at annex.SubprocessTestRunner$.main(SubprocessRunner.scala:30)
	at annex.SubprocessTestRunner.main(SubprocessRunner.scala)

My guess is that Predef isn't being included in the new process. Will need to investigate further.

`scala_proto_library` should respect `strip_import_prefix`

Proto targets compiled with stripped import prefixes produce empty srcjars.

Minimal build file

proto_library(
    name = "example_proto",
    srcs = ["example.proto"],
    strip_import_prefix = "/prefix",
)

scala_proto_library(
    name = "example_scala_proto",
    deps = [":example_proto"],
)

Source proto:

# /prefix/example.proto
syntax = "proto3";
package foo
message Foo {}

Output:

$ bazel build prefix:example_scala_proto
INFO: Analyzed target //prefix:example_scala_proto (1 packages loaded, 3 targets configured).
INFO: Found 1 target...
INFO: From ScalaProtoCompile prefix/tmp/example_scala_proto:
protoc-jar: protoc version: 3.8.0, detected platform: osx-x86_64 (mac os x/x86_64)
protoc-jar: embedded: bin/3.8.0/protoc-3.8.0-osx-x86_64.exe
protoc-jar: executing: [/var/folders/t9/2yf73c553136pl7hzsrk8msm0000gn/T/protocjar10477993719091343746/bin/protoc.exe, --plugin=protoc-gen-scala=/var/folders/t9/2yf73c553136pl7hzsrk8msm0000gn/T/protocbridge3494382799574754229, --scala_out=bazel-out/darwin-fastbuild/bin/prefix/tmp/example_scala_proto, bazel-out/darwin-fastbuild/bin/prefix/_virtual_imports/example_proto/example.proto]
bazel-out/darwin-fastbuild/bin/prefix/_virtual_imports/example_proto/example.proto:3:1: Expected ";".
Target //prefix:example_scala_proto up-to-date:
  bazel-bin/prefix/example_scala_proto.srcjar
INFO: Elapsed time: 0.264s, Critical Path: 0.13s
INFO: 2 processes: 1 darwin-sandbox, 1 worker.
INFO: Build completed successfully, 4 total actions

The generated file (bazel-bin/prefix/example_scala_proto.srcjar) is empty.

Passing arguments to test frameworks?

I'm trying to run my ScalaTest tests and exclude tests tagged Slow from running. I'd usually do this by passing `-l org.scalatest.tags.Slow` as an argument to ScalaTest.

Is there a way to do this? I've tried a few things that seemed promising but haven't figured it out yet.

Some of the documentation makes me think it should be doable (Supports <...> custom test framework arguments) and that I must be missing something.

Any pointers?
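One generic avenue worth trying (an assumption on my part; I have not confirmed that the annex test runner forwards these to the framework) is Bazel's standard --test_arg flag, which passes extra arguments to the test runner binary:

```sh
# --test_arg is a standard Bazel flag; whether the annex runner hands these
# through to ScalaTest is unverified. Target label is illustrative.
bazel test //path/to:my_tests --test_arg=-l --test_arg=org.scalatest.tags.Slow
```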

Rule loading currently requires scalafmt repositories

The rules currently fail to load without the following in WORKSPACE:

load("@rules_scala_annex//rules/scalafmt:workspace.bzl", "scalafmt_default_config", "scalafmt_repositories")
scalafmt_repositories()
scalafmt_default_config()

Host filesystem permissions leak into resource jars

Building the same resource jar across two different hosts currently yields different results if the hosts apply different filesystem permissions after cloning the repo. Git itself only tracks executable flags for files.
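As an illustration of why this happens (this is generic ZIP behavior, not annex code): ZIP/JAR entries store the Unix permission bits of each file in the entry's external attributes, so identical content archived under different filesystem modes yields byte-different jars unless the writer normalizes the mode:

```python
import io
import zipfile

def jar_bytes(mode):
    """Build an in-memory jar with a single entry whose Unix mode is `mode`."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        # Fixed timestamp so the mode is the only varying input.
        info = zipfile.ZipInfo("reference.conf", date_time=(1980, 1, 1, 0, 0, 0))
        info.external_attr = mode << 16  # Unix permission bits live in the high bytes
        zf.writestr(info, "key = value\n")
    return buf.getvalue()

# Same content, different filesystem modes -> byte-different archives.
assert jar_bytes(0o644) != jar_bytes(0o755)
# Normalizing the mode restores determinism.
assert jar_bytes(0o644) == jar_bytes(0o644)
```

A fix along these lines would have the jar-building action set a fixed mode on every entry rather than copying whatever the host filesystem reports.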

Better Zinc Log Format

The Zinc output log gets harder to read when there are many messages, especially ones with long file paths and long bodies.
(screenshot)

It would be good to come up with ideas for improving readability. Adding visual distinction to the file path is one option; it makes it clearer where the actual message starts.

How about making it bold?
(screenshot)

How about adding new lines?
(screenshot)

Or we could have both?
(screenshot)

Any suggestion will be appreciated. Thanks.

Scaladoc rule fails when including dependencies

When using the scaladoc rule and including a dep tag I get the following error. Stacktrace points to

classpath = depset(transitive = [dep[JavaInfo].transitive_compile_time_deps for dep in ctx.attr.deps])

Not sure if this is something to do with my environment or a known problem with the scaladoc rule, but I wanted to put this out here in case anyone else runs into a similar issue.

If more information is needed, or if this is not the preferred format for issues, please let me know; I definitely don't want to create superfluous issues if I'm just missing something obvious.

'JavaInfo' object has no attribute 'transitive_compile_time_deps'
Available attributes: annotation_processing, compilation_info, compile_jars, full_compile_jars, outputs, runtime_output_jars, source_jars, transitive_compile_time_jars, transitive_deps, transitive_exports, transitive_runtime_deps, transitive_runtime_jars, transitive_source_jars
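Judging from the attribute list in the error message, a plausible fix (an untested sketch, not a confirmed patch) is to use transitive_compile_time_jars, which does exist on JavaInfo:

```starlark
# Sketch of a possible fix: JavaInfo has no transitive_compile_time_deps
# attribute, but it does expose transitive_compile_time_jars, per the
# "Available attributes" list in the error above.
classpath = depset(transitive = [
    dep[JavaInfo].transitive_compile_time_jars
    for dep in ctx.attr.deps
])
```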

Provide default compilers for various versions of Scala

Users can currently use Scala 2.12 quite easily by adding

bind(
    name = "default_scala",
    actual = "@rules_scala_annex//src/main/scala:zinc_2_12_8",
)

to their WORKSPACE file.

The rules would be more accessible if we provided similar default targets for other versions of Scala. Currently, users that need a different version of Scala must do all the configuration/set-up themselves.

Lessen stateful Zinc overhead

Stateful incremental Zinc is disabled by default. However, there is still extra work done just because it is an option, e.g. unpacking upstream JARs.

There should be a choice between two modes (probably toolchains).

(a) Stateless. Does not unpack JARs, etc.

(b) Stateful-optional. Does unpack JARs, etc. Can use prior state if the --persistent_dir flag is specified.

These two modes would produce different outputs and so would not share cache hits between them. (Within each mode, cache hits would still be shared.)

Directory structure

Directory structure was refactored six months ago in #44, but I think there can be some improvements.

.travis.yml
LICENSE.txt
README.md
format.sh
gen-deps.sh
setup-tools.sh
test.sh
travis.sh
rules/
    WORKSPACE
    common/
        BUILD
        private/
            util.bzl
        src/
            worker/
                SimpleMain.scala
                WorkerMain.scala
    jvm/
        BUILD
        private/
            label.bzl
    scala/
        BUILD
        dependencies.yml
        compat.bzl
        scala.bzl
        workspace.bzl
        3rdparty/
            BUILD
            maven.bzl
        private/
            doc.bzl
            import.bzl
            provider.bzl
            scala.bzl
            repl.bzl
        src/
            deps/
                DepsRunner.scala
            doc/
                DocRunner.scala
            zinc/
                ZincRunner.scala
    scalac/
        BUILD
        scalac.bzl
        private/
            scalac.bzl
    scala_proto/
        BUILD
        scala_proto.bzl
        workspace.bzl
        private/
            scala_proto.bzl
        src/
            proto/
                ScalaProtoRunner.scala
    scalafmt/
        BUILD
        dependencies.yml
        scalafmt.bzl
        workspace.bzl
        3rdparty/
            BUILD
            maven.bzl
        private/
            scalafmt.bzl
            format.template.sh
        src/
            scalafmt/
                ScalafmtRunner.scala
    tools/
        bazel
        bazel.rc
        bazel-0.16.rc
        bazel-0.17.rc
        bazel-0.18.rc
tests/
    docker/
        WORKSPACE
        simple/
            BUILD
            Container.scala
            test
        tools/
    scala/
        WORKSPACE
        dependencies.yml
        3rdparty/
            BUILD
            maven.bzl
        binary/
            BUILD
            Binary.scala
            test
        tools/
    scala_proto/
    scalafmt/
        WORKSPACE
        simple/
            BUILD
            test
        tools/
http_archive(
  name = "rules_scala_annex",
  sha256 = "<hash>",
  strip_prefix = "rules_scala_annex-<commit>/rules",
  url = "https://github.com/andyscott/rules_scala_annex/archive/<commit>.zip",
)

Advantages:

  • No gen-ignores.sh hackery required
  • Tests can test separate workspace setups.
  • Tests become simple therefore a good form of examples
  • Unrelated classpaths use separate Maven deps. (Actually this already happens...scalafmt has its own Maven deps. This is just a little more consistent about it.)
  • Users don't have to store all of our tests, docs, etc. (Eh, minor.)

Disadvantages:

  • This isn't how everyone else does it; they always have a root WORKSPACE.

NoClassDefFoundError

Hi again,

I'm having some trouble running my code. The error is java.lang.NoClassDefFoundError: java/sql/Date

The Java options I'm using are the following:

default_javac_opts = [ "-target", "1.8",
                       "-encoding", "UTF-8",
                       "-source", "1.8",
                     ]

When I build, I get the following warning which I suspect may be related:
[Warn] bootstrap class path not set in conjunction with -source 8

If I remove the -source option, I still get the same error at runtime, however.

I'm currently building with openjdk version 1.8.0_222. Can I solve this using -Xbootclasspath? What should the value be?

Bazel 1.0.0 support

Recently Bazel 1.0.0 was released
https://blog.bazel.build/2019/10/10/bazel-1.0.html

An attempt to build existing code with the newest Bazel version fails with an error:

File ".../external/rules_scala_annex/rules/scala_proto/private/core.bzl", line 15, 
  in scala_proto_library_implementation
  fail(("disallowed non proto deps in %...))
disallowed non proto deps in [<target //:foo_scala_proto>]

while my BUILD file does not contain any non-proto deps:

proto_library(
    name = "foo_proto",
    srcs = ["src/main/resources/foo.proto"],
)

scala_proto_library(
    name = "foo_scala_proto",
    deps = [":foo_proto"],
)

It used to build properly with Bazel 0.29.1.

Clean annex related items from test stack traces

We should prune annex items from stack traces.

Current formatted as diff of desired change:

INFO: From Testing //src/test/scala/io/higherkindness/arktika/scalac:ArktikaPhaseChecks:
==================== Test output for //src/test/scala/io/higherkindness/arktika/scalac:ArktikaPh
aseChecks:                                                                                     
Exception in thread "main" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.j
ava:62)                                                                                        
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccess
orImpl.java:45)                                                                                
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.scalacheck.Platform$.newInstance(Platform.scala:66)
        at org.scalacheck.ScalaCheckRunner$BaseTask.<init>(ScalaCheckFramework.scala:57)
        at org.scalacheck.ScalaCheckRunner$$anon$1.<init>(ScalaCheckFramework.scala:88)
        at org.scalacheck.ScalaCheckRunner.rootTask(ScalaCheckFramework.scala:88)
        at org.scalacheck.ScalaCheckRunner.$anonfun$tasks$2(ScalaCheckFramework.scala:47)
        at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233)
        at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:32)
        at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:29)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:194)
        at scala.collection.TraversableLike.map(TraversableLike.scala:233)
        at scala.collection.TraversableLike.map$(TraversableLike.scala:226)
        at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:194)
        at org.scalacheck.ScalaCheckRunner.tasks(ScalaCheckFramework.scala:45)
-       at annex.BasicTestRunner.$anonfun$execute$2(TestFrameworkRunner.scala:13)
-       at annex.BasicTestRunner.$anonfun$execute$2$adapted(TestFrameworkRunner.scala:11)
-       at annex.TestHelper$.withRunner(Test.scala:32)
-       at annex.BasicTestRunner.$anonfun$execute$1(TestFrameworkRunner.scala:11)
-       at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:12)
-       at annex.ClassLoader$.withContextClassLoader(ClassLoader.scala:10)
-       at annex.BasicTestRunner.execute(TestFrameworkRunner.scala:11)
-       at annex.TestRunner$.$anonfun$main$10(TestRunner.scala:167)
-       at annex.TestRunner$.$anonfun$main$10$adapted(TestRunner.scala:143)
-       at scala.collection.IndexedSeqOptimized.prefixLengthImpl(IndexedSeqOptimized.scala:37)
-       at scala.collection.IndexedSeqOptimized.forall(IndexedSeqOptimized.scala:42)
-       at scala.collection.IndexedSeqOptimized.forall$(IndexedSeqOptimized.scala:42)
-       at scala.collection.mutable.ArrayBuffer.forall(ArrayBuffer.scala:48)
-       at annex.TestRunner$.main(TestRunner.scala:143)
-       at annex.TestRunner.main(TestRunner.scala)
Caused by: java.lang.NullPointerException
        at scala.tools.util.PathResolver$Calculated$.basis(PathResolver.scala:246)
        at scala.tools.util.PathResolver$Calculated$.containers$lzycompute(PathResolver.scala:25
7)                                                                                             
        at scala.tools.util.PathResolver$Calculated$.containers(PathResolver.scala:257)
        at scala.tools.util.PathResolver.containers(PathResolver.scala:273)
        at scala.tools.util.PathResolver.computeResult(PathResolver.scala:295)
        at scala.tools.util.PathResolver.result(PathResolver.scala:278)
        at scala.tools.util.PathResolver.resultAsURLs(PathResolver.scala:290)
        at scala.tools.reflect.ReflectMain$.classloaderFromSettings(ReflectMain.scala:13)
        at scala.tools.reflect.ReflectMain$.newCompiler(ReflectMain.scala:17)
        at io.higherkindness.arktika.scalac.ArktikaPhaseChecks.<init>(ArktikaPhaseChecks.scala:1
0)                                                                                             
        ... 32 more

scala_library doesn't support reference.conf

See this draft PR for an example demonstrating the issue:
#215

java_library(
    name = "plain_java",
    resources = glob(["reference.conf"]),
    resource_strip_prefix = "resources/reference_conf",
)

Produces a JAR containing the manifest and a single file reference.conf.

scala_library(
    name = "plain",
    resources = glob(["reference.conf"]),
    resource_strip_prefix = "resources/reference_conf",
    scala = "//scala:2_11",
)

Produces a JAR containing only the manifest.

It seems like the two libraries should both contain the reference.conf file.

Add toolchain to provide default options

options to provide:

  • scala versions
  • scalac options
  • scalac plugins
  • jvm options
  • potentially: worker target label -- currently it's hard coded to the one in //runner
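A minimal sketch of how such a toolchain could carry default options. This is hypothetical: the rule and attribute names are invented, and only Bazel's standard platform_common.ToolchainInfo API is relied on; the actual annex implementation may look quite different.

```starlark
# Hypothetical sketch, not the actual annex implementation.
def _scala_options_toolchain_impl(ctx):
    return [platform_common.ToolchainInfo(
        scalac_opts = ctx.attr.scalac_opts,
        scalac_plugins = ctx.attr.scalac_plugins,
        jvm_opts = ctx.attr.jvm_opts,
    )]

scala_options_toolchain = rule(
    implementation = _scala_options_toolchain_impl,
    attrs = {
        "scalac_opts": attr.string_list(doc = "Default scalac options"),
        "scalac_plugins": attr.label_list(doc = "Default scalac plugins"),
        "jvm_opts": attr.string_list(doc = "Default JVM options for the worker"),
    },
)
```

The Scala rules would then consume these defaults via ctx.toolchains, and users could override them per repository by registering their own toolchain target.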

`scala_proto_library` build doesn't fail if `protoc` errors

I noticed this while trying to get a build working in my project, so I edited my test project to reproduce it in a smaller testcase:
https://github.com/luser/bazel-scala-proto-test/tree/protoc-failure

I edited hello.proto to add an import of a proto file that does not exist. Building the :hello-proto target directly errors:

└─ $ ▶ bazel build :hello-proto
INFO: Analyzed target //:hello-proto (15 packages loaded, 561 targets configured).
INFO: Found 1 target...
ERROR: /Users/ted/bazel-scala-proto-test/BUILD:5:1: Generating Descriptor Set proto_library //:hello-proto failed (Exit 1) protoc failed: error executing command bazel-out/host/bin/external/com_google_protobuf/protoc '--descriptor_set_out=bazel-out/darwin-fastbuild/bin/hello-proto-descriptor-set.proto.bin' '-Ihello.proto=hello.proto' --direct_dependencies ... (remaining 3 argument(s) skipped)

Use --sandbox_debug to see verbose messages from the sandbox
foo.proto: File not found.
hello.proto:5:1: Import "foo.proto" was not found or had errors.
Target //:hello-proto failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 70.476s, Critical Path: 19.57s
INFO: 182 processes: 182 darwin-sandbox.
FAILED: Build did NOT complete successfully

But building :scala-proto-test (the scala_proto_library target) does not, even though protoc prints the same errors:

└─ $ ▶ bazel build :scala-proto-test
INFO: Analyzed target //:scala-proto-test (20 packages loaded, 613 targets configured).
INFO: Found 1 target...
INFO: From SkylarkAction external/rules_scala_annex/src/main/scala/higherkindness/rules_scala/common/worker/worker/classes.jar [for host]:
warning: there were two unchecked warnings; re-run with -unchecked for details
one warning found
INFO: From SkylarkAction external/rules_scala_annex/src/main/scala/compiler_bridge_2_12_8/classes.jar [for host]:
warning: there were three deprecation warnings (since 2.12.0); re-run with -deprecation for details
warning: there were three feature warnings; re-run with -feature for details
two warnings found
INFO: From ScalaProtoCompile tmp/scala_proto_test:
foo.proto: File not found.
hello.proto: Import "foo.proto" was not found or had errors.
protoc-jar: protoc version: 3.5.1, detected platform: osx-x86_64 (mac os x/x86_64)
protoc-jar: embedded: bin/3.5.1/protoc-3.5.1-osx-x86_64.exe
protoc-jar: executing: [/var/folders/4x/l8p9pv0j3m35h9vh7frs8b340000gp/T/protocjar9825690595322198977/bin/protoc.exe, --plugin=protoc-gen-scala=/var/folders/4x/l8p9pv0j3m35h9vh7frs8b340000gp/T/protocbridge2344614132569024880, --scala_out=bazel-out/darwin-fastbuild/bin/tmp/scala_proto_test, hello.proto]
Target //:scala-proto-test up-to-date:
  bazel-bin/scala-proto-test.srcjar
INFO: Elapsed time: 47.555s, Critical Path: 47.28s
INFO: 32 processes: 27 darwin-sandbox, 1 local, 4 worker.
INFO: Build completed successfully, 48 total actions

Rule for running onejar on plugins

If you attempt to use a scalac plugin such as scapegoat or silencer, you'll get the following warning from phase_classpaths:

WARNING! It is slightly inefficient to use a JVM target with dependencies directly as a scalac plugin. Please SingleJar the target before using it as a scalac plugin in order to avoid additional overhead.

[1] https://github.com/higherkindness/rules_scala/blob/master/rules/private/phases/phase_classpaths.bzl#L25

However, there is no available rule that makes it possible to "SingleJar" the target.

I wrote a quick rule that exposes the existing singlejar wrapper from rules_scala, but that feels a bit wrong since it needs to load action_singlejar from private/utils.bzl. It seems like it would make sense to add such a rule to make packaging plugins easier?

load("@rules_scala_annex//rules/common:private/utils.bzl", _action_singlejar = "action_singlejar")

def _scala_plugin_impl(ctx):
    plugin = ctx.attr.plugin
    plugin_runtime_jars = plugin[JavaInfo].transitive_runtime_jars.to_list()

    output_jar = ctx.outputs.plugin_singlejar
    _action_singlejar(
        ctx,
        inputs = plugin_runtime_jars,
        output = output_jar,
        progress_message = "singlejar scalac plugin %s" % plugin.label.name,
    )

    return struct(
        providers = [
            JavaInfo(
                output_jar = output_jar,
                compile_jar = output_jar,
            ),
        ],
    )

scala_plugin = rule(
    implementation = _scala_plugin_impl,
    attrs = {
        "plugin": attr.label(
            allow_single_file = True,
            mandatory = True,
            doc = "The Scalac plugin.",
            providers = [JavaInfo],
        ),
        "_singlejar": attr.label(
            cfg = "host",
            default = "@bazel_tools//tools/jdk:singlejar",
            executable = True,
        ),
    },
    outputs = {
        "plugin_singlejar": "%{name}_singlejar.jar",
    },
)
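For illustration, usage of the rule above in a BUILD file might look like the following. The load path and the plugin label are made up; whether scala_library accepts the result via its plugins attribute is an assumption based on the warning quoted above.

```starlark
# Hypothetical usage; labels and paths are illustrative only.
load("//tools:scala_plugin.bzl", "scala_plugin")

scala_plugin(
    name = "silencer_plugin",
    plugin = "@annex//:com_github_ghik_silencer_plugin_2_12",  # hypothetical label
)

scala_library(
    name = "lib",
    srcs = glob(["*.scala"]),
    plugins = [":silencer_plugin"],
)
```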

Mention `scala_proto_register_toolchains` in docs

Hi, thanks for this project! I was trying to get a simple example working with scala_proto_library and kept getting a no matching toolchains error from bazel. After reading the bazel docs on toolchains and this repository's WORKSPACE file I realized that I needed scala_proto_register_toolchains in my WORKSPACE and got things working. It would be great if the docs mentioned this!

scala_proto_library contains code for all dependencies

I originally noticed this when debugging why I was getting unexpected "indirect dependency" problems.

If you have two proto libraries where one depends on the other (e.g., A <- B), then the Scala library jar for B will contain all the code for the protos in A. I suspect this is because the proto_library rule produces a (concatenated) descriptor describing both the source proto and any protos in the dependency set, and scala_proto_library then generates code for all the entries (not just those listed in srcs).

See #217 for a test reproducing the issue and a proposed fix.

(I'm not sure if this is what is causing my indirect dependency error, but my theory was that even though I had a direct dependency on A, and wasn't directly using B, it was registered as "used" because it redefined the classes from A)

Windows CI tests

Without CI tests, Windows support is almost certainly broken.

I can't get scalafmt to work through rules_scala_annex

I attempted to follow the instructions in the README for running scalafmt.

On my first attempt I got these errors:

ERROR: /Users/dave.hinton/vc/bazel-scalafmt-playground/WORKSPACE:29:1: file '@rules_scala_annex//rules/scalafmt:workspace.bzl' does not contain symbol 'scalafmt_default'
ERROR: /Users/dave.hinton/vc/bazel-scalafmt-playground/WORKSPACE:30:1: name 'scalafmt_default' is not defined
ERROR: Error evaluating WORKSPACE file

After brief investigation, I guessed that I should use scalafmt_default_config instead of scalafmt_default. This seemed to be an improvement, but still gave me an error:

ERROR: Analysis of target '//:format' failed; build aborted: no such package '@scalafmt_com_geirsson_metaconfig_core_2_12//': The repository could not be resolved

I understand that rules_scala_annex is experimental and that perhaps scalafmt support is broken at the moment. On the other hand, if the problem is just that the documentation is wrong or misleading, I am happy to open a PR just as soon as someone has explained to me how to get it working :-)

Minimal workspace I was using for this test: https://gist.github.com/onzo-dave-hinton/b0b745095036077ef3ef053f3102d978 (includes commit history).

$ bazel version
Build label: 0.17.2-homebrew
Build target: bazel-out/darwin-opt/bin/src/main/java/com/google/devtools/build/lib/bazel/BazelServer_deploy.jar
Build time: Fri Sep 28 10:42:37 2018 (1538131357)
Build timestamp: 1538131357
Build timestamp as int: 1538131357
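For reference, here is the WORKSPACE shape that got past the first error, using the symbol name I guessed at (this is my reconstruction, not verified against the docs, and it evidently still leaves the repository-resolution error):

```starlark
load("@rules_scala_annex//rules/scalafmt:workspace.bzl", "scalafmt_default_config")

scalafmt_default_config()
```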

Fully document rules and attributes

Could use Skydoc, though it's currently implemented in a really fragile way.

Some documentation on all rules and attributes is needed, though.

ScalaPB worker crashing with `Stream closed` errors

Sometimes (especially when running in CircleCI), the Bazel build fails with errors like the following. Are there any known issues with the ScalaPB worker that could be causing this? Are there any fixes, or more information that would be helpful to collect?

INFO: From ScalaProtoCompile <elided>/api/tmp/state_scala_proto:
protoc-jar: protoc version: 3.8.0, detected platform: linux-x86_64 (linux/amd64)
protoc-jar: embedded: bin/3.8.0/protoc-3.8.0-linux-x86_64.exe
protoc-jar: executing: [/tmp/protocjar10815344638873208356/bin/protoc.exe, --plugin=protoc-gen-scala=/tmp/protocbridge9316711168667304922, --scala_out=grpc,flat_package:bazel-out/k8-fastbuild/bin/<elided>.proto, --proto_path=external/com_google_protobuf_wkt, --proto_path=external/com_github_scalapb, --proto_path=., <elided>proto, external/com_github_scalapb/scalapb/scalapb.proto, external/com_google_protobuf_wkt/google/protobuf/descriptor.proto]
ERROR: <elided>/BUILD:285:1: ScalaProtoCompile <elided>/value_scala_proto failed: Worker process quit or closed its stdin stream when we tried to send a WorkRequest:

---8<---8<--- Exception details ---8<---8<---
java.io.IOException: Stream closed
	at java.base/java.lang.ProcessBuilder$NullOutputStream.write(Unknown Source)
	at java.base/java.io.OutputStream.write(Unknown Source)
	at java.base/java.io.BufferedOutputStream.flushBuffer(Unknown Source)
	at java.base/java.io.BufferedOutputStream.flush(Unknown Source)
	at com.google.devtools.build.lib.worker.WorkerSpawnRunner.execInWorker(WorkerSpawnRunner.java:331)
	at com.google.devtools.build.lib.worker.WorkerSpawnRunner.actuallyExec(WorkerSpawnRunner.java:172)
	at com.google.devtools.build.lib.worker.WorkerSpawnRunner.exec(WorkerSpawnRunner.java:121)
	at com.google.devtools.build.lib.exec.SpawnRunner.execAsync(SpawnRunner.java:225)
	at com.google.devtools.build.lib.exec.AbstractSpawnStrategy.exec(AbstractSpawnStrategy.java:123)
	at com.google.devtools.build.lib.exec.AbstractSpawnStrategy.exec(AbstractSpawnStrategy.java:88)
	at com.google.devtools.build.lib.actions.SpawnActionContext.beginExecution(SpawnActionContext.java:41)
	at com.google.devtools.build.lib.exec.ProxySpawnActionContext.beginExecution(ProxySpawnActionContext.java:60)
	at com.google.devtools.build.lib.actions.SpawnContinuation$1.execute(SpawnContinuation.java:80)
	at com.google.devtools.build.lib.analysis.actions.SpawnAction$SpawnActionContinuation.execute(SpawnAction.java:1344)
	at com.google.devtools.build.lib.analysis.actions.SpawnAction.beginExecution(SpawnAction.java:314)
	at com.google.devtools.build.lib.actions.Action.execute(Action.java:123)
	at com.google.devtools.build.lib.skyframe.SkyframeActionExecutor$4.execute(SkyframeActionExecutor.java:851)
	at com.google.devtools.build.lib.skyframe.SkyframeActionExecutor$ActionRunner.continueAction(SkyframeActionExecutor.java:985)
	at com.google.devtools.build.lib.skyframe.SkyframeActionExecutor$ActionRunner.run(SkyframeActionExecutor.java:957)
	at com.google.devtools.build.lib.skyframe.ActionExecutionState.runStateMachine(ActionExecutionState.java:116)
	at com.google.devtools.build.lib.skyframe.ActionExecutionState.getResultOrDependOnFuture(ActionExecutionState.java:77)
	at com.google.devtools.build.lib.skyframe.SkyframeActionExecutor.executeAction(SkyframeActionExecutor.java:577)
	at com.google.devtools.build.lib.skyframe.ActionExecutionFunction.checkCacheAndExecuteIfNeeded(ActionExecutionFunction.java:760)
	at com.google.devtools.build.lib.skyframe.ActionExecutionFunction.compute(ActionExecutionFunction.java:275)
	at com.google.devtools.build.skyframe.AbstractParallelEvaluator$Evaluate.run(AbstractParallelEvaluator.java:451)
	at com.google.devtools.build.lib.concurrent.AbstractQueueVisitor$WrappedRunnable.run(AbstractQueueVisitor.java:399)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.base/java.lang.Thread.run(Unknown Source)
---8<---8<--- End of exception details ---8<---8<---

---8<---8<--- Start of log, file at /workspace/.bazel/bazel-workers/worker-3-ScalaProtoCompile.log ---8<---8<---
(empty)
---8<---8<--- End of log ---8<---8<---

`--strategy=SingleJar=worker` errors

In a new project, using --strategy=SingleJar=worker results in build failures:

Worker process did not return a WorkResponse

It seems there are some flag incompatibilities that will need to be documented.

Bazel 0.27 support

I'm going to tackle this next week. Adding this issue here for those who want to subscribe.

Scala library doesn't show up in IntelliJ

I was not able to get IntelliJ to recognize my Scala library version until I added the Scala library to the runtime_deps of the scala_library (adding it to deps caused buildozer to request removing it):

scala_library(
  runtime_deps = [
    "@maven//:org_scala_lang_scala_library",
  ],
  ...
)

I'm not sure whether this is expected, an issue with how the rules are defined, or something with my setup.

Bloop

There are some non-functioning files that have been earmarked as the starting point of a possible Bloop integration.

I'm personally not of the opinion that Bloop is the right abstraction here, as opposed to Zinc, but I'm not an expert, and this is the skunkworks after all, right?

Anyway, what's the game plan? Do we need to keep Bloop stubs around, or can we clear out the clutter and put it back when something actually exists? @andyscott @jvican

"Failed to save cached analysis" error

On some builds, I get an error like Failed to save cached analysis: java.nio.file.NoSuchFileException: service/.tmp. The error does not give any specific reason for the failure, so I'm not sure where to begin debugging.

Building with README example fails

Hi!

I'm just getting started with bazel and wanted to try these rules out. I have a bare minimum project with just a single main-class in it. WORKSPACE is copied straight from the README.

This is my BUILD file:

load("@rules_scala_annex//rules:scala.bzl", "scala_library")

scala_library(
  name = "test",
  srcs = ["src/main/scala/com/example/Main.scala"],
)

bazel build fails with:

ERROR: Failed to load Starlark extension '@annex//:defs.bzl'.
Cycle in the workspace file detected. This indicates that a repository is used prior to being defined.
The following chain of repository dependencies lead to the missing definition.
 - @annex
This could either mean you have to add the '@annex' repository with a statement like `http_archive` in your WORKSPACE file (note that transitive dependencies are not added automatically), or move an existing definition earlier in your WORKSPACE file.
ERROR: cycles detected during target parsing
INFO: Elapsed time: 0.061s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)

Any pointers would be appreciated! :)
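For what it's worth, that error is Bazel's generic complaint that a load() statement references a repository before any repository rule has defined it. WORKSPACE is evaluated top to bottom, so the ordering has to look roughly like this (names below are placeholders, not annex's actual API):

```starlark
# 1. Define the repository first.
http_archive(
    name = "some_repo",
    urls = ["https://example.com/some_repo.zip"],
)

# 2. Only then load from it.
load("@some_repo//:defs.bzl", "some_macro")

# 3. Macros loaded this way often define further repositories (such as
#    @annex), which again must run before anything loads from them.
some_macro()
```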

Bazel 0.19 Support

Annex warns on some things in 0.19. I'm working on fixing them, but I'm using this issue to document the things I'm running into:

...depends on deprecated target '@local_jdk//:java': Don't depend on targets in the JDK workspace; use @bazel_tools//tools/jdk:current_java_runtime instead (see https://github.com/bazelbuild/bazel/issues/5594).

Cannot configure scala_deps_used via --define

I tried configuring scala_deps_used in my .bazelrc as the docs suggest with the line:

build --define=scala_deps_used=warn

But I could not get it to respect the setting. I grepped the repo and found that configure_zinc_scala has a deps_used parameter, and using that worked as expected. Are the docs out of date, or is this define only used for the default Scala version and ignored when defining a separate one with configure_zinc_scala?
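For reference, this is roughly what worked for me (the load path and the toolchain name below are from memory and may not be exact):

```starlark
load("@rules_scala_annex//rules:scala.bzl", "configure_zinc_scala")  # assumed load path

configure_zinc_scala(
    name = "zinc_2_12",  # hypothetical toolchain name
    deps_used = "warn",
    # other required attributes elided
)
```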

Scala compilation fails when it includes a Java source with an inner class depending on a third-party dependency

Here's a failing test in a PR: #172

What appears to be going on: ClassToAPI uses Java reflection to determine the structure of generated Java sources [https://github.com/sbt/zinc/blob/develop/internal/zinc-apiinfo/src/main/scala/sbt/internal/inc/ClassToAPI.scala#L158]. However, this appears to cause the classloader to attempt to transitively load a class that, in this phase, is defined only in an ijar; it then blows up because ijar class files are apparently malformed for this purpose.

I'm not sure if this is an upstream Zinc bug that's merely being exercised here or there's something that this project can change directly. Happy to talk through possible resolutions.
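To make the failure mode concrete, here is a minimal, self-contained sketch of the reflection step. The class names are invented; in the real scenario, Dep would live in a third-party jar that is present only as an ijar on the compile classpath, and resolving it during reflection is what blows up:

```java
import java.lang.reflect.Method;

public class ReflectDemo {
    // Stand-in for a third-party class; in the failing scenario this type
    // would be available only as an ijar stub.
    static class Dep {}

    static class Outer {
        class Inner {
            public Dep get() { return null; }
        }
    }

    public static void main(String[] args) {
        // ClassToAPI walks classes roughly like this; asking for a method's
        // return type forces the classloader to resolve Dep. If Dep's class
        // file were an ijar stub, this resolution could fail.
        for (Method m : Outer.Inner.class.getDeclaredMethods()) {
            System.out.println(m.getName() + " -> " + m.getReturnType().getSimpleName());
        }
    }
}
```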
