
ddf's Issues

Does the current version support running on clusters in standalone mode?

I have tried many ways to make this work; however, it only behaves correctly in local mode.

In cluster mode, only the column-type operations work (columnNames, numColumns, etc.).
Row-type operations, such as numRows, always fail with a fatal exception or return a result of 0.

I would appreciate your help, and a user guide on how to run DDF in cluster mode would be very welcome.

Thanks a lot!

Move DDF/spark sub-project to SparkSQL

Currently it uses Spark 0.9 and Shark 0.9 internally. Shark is no longer in active development, so we'll need to move to the newest Spark release.

introduce generic R function as.ddf

It is the natural complement to as.data.frame.
It would initially have one method for data frames, but more are possible for local files, Hive tables and such.
It can be a focal point for progressively refactoring all conversion code into these two as.something functions.
The data.frame method is almost indispensable for writing quick test cases.

add more algorithms

Apologies first of all: I have no idea where else to put this issue.

By using Java reflection, ddf can easily wrap the MLlib algorithms.

I have found that the method name is restricted to MLClassMethods.DEFAULT_TRAIN_METHOD_NAME, which is defined as "train" in io.ddf.ml.MLClassMethods.

This is fine for many MLlib algorithms because they provide a "train" method.

However, there are exceptions, e.g. RandomForest; thus RF cannot simply be defined the way KMeans is, like:

public IModel decisionTree(args...) throws DDFException {
  return this.train("decisionTree", args...);
}

So I wonder whether things should be changed so that the actual training method can be chosen per MLlib algorithm.

Here is my suggestion:

(1) Extend the current algorithm training entry point with a run-method-name parameter, e.g.

current: public IModel train(String trainMethodName, Object... paramArgs) throws DDFException
modified: public IModel train(String trainMethodName, String runMethodName, Object... paramArgs) throws DDFException

(2) The API provided to users should not include the runMethodName, thus keeping the current ddf algorithm API entry points unchanged, e.g.

modified: public IModel KMeans(int numCentroids, int maxIters, int runs) throws DDFException {
  return this.train("kmeans", "train", numCentroids, maxIters, runs);
}

any help would be appreciated, thanks.

// //// ISupportML //////

/**
 * Runs a training algorithm on the entire DDF dataset.
 *
 * @param trainMethodName
 * @param args
 * @return
 * @throws DDFException
 */
@Override
public IModel train(String trainMethodName, Object... paramArgs) throws DDFException {
  /*
   * Example signatures we must support:
   *
   * Unsupervised Training
   * Kmeans.train(data: RDD[Array[Double]], k: Int, maxIterations: Int, runs: Int, initializationMode: String)
   *
   * Supervised Training
   * LogisticRegressionWithSGD.train(input: RDD[LabeledPoint], numIterations: Int, stepSize: Double,
   *     miniBatchFraction: Double, initialWeights: Array[Double])
   * SVM.train(input: RDD[LabeledPoint], numIterations: Int, stepSize: Double, regParam: Double,
   *     miniBatchFraction: Double)
   */

  // Build the argument type array
  if (paramArgs == null) paramArgs = new Object[0];

  // Locate the training method
  String mappedName = Config.getValueWithGlobalDefault(this.getEngine(), trainMethodName);
  if (!Strings.isNullOrEmpty(mappedName)) trainMethodName = mappedName;

  TrainMethod trainMethod = new TrainMethod(trainMethodName, MLClassMethods.DEFAULT_TRAIN_METHOD_NAME, paramArgs);
  if (trainMethod.getMethod() == null) {
    throw new DDFException(String.format("Cannot locate method specified by %s", trainMethodName));
  }

bin/pyddf examples/basics.py fails

The exception is: This UDAF does not support the deprecated getEvaluator() method.

System:

$ uname -a
Darwin jw-macbook-pro 14.3.0 Darwin Kernel Version 14.3.0: Thu Feb 12 18:38:33 PST 2015; root:xnu-2782.20.34~3/RELEASE_X86_64 x86_64
$ python --version
Python 2.7.9

traceback:

Traceback (most recent call last):
  File "examples/basics.py", line 25, in <module>
    ddf.getFiveNumSummary()
  File "/Users/jbw/work/CWP/DDF/python/package/ddf/DDF.py", line 69, in getFiveNumSummary
    return self._jddf.getFiveNumSummary()
  File "/Users/jbw/work/CWP/DDF/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 538, in __call__
  File "/Users/jbw/work/CWP/DDF/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py", line 300, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o5.getFiveNumSummary.
: io.ddf.exception.DDFException: Unable to get fivenum summary of the given columns from table SparkDDF_spark_8f97376f_5de4_4d31_bc8a_6a9455418742
    at io.ddf.DDF.sql2txt(DDF.java:324)
    at io.ddf.analytics.AStatisticsSupporter.getFiveNumSummary(AStatisticsSupporter.java:60)
    at io.ddf.DDF.getFiveNumSummary(DDF.java:912)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:259)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Thread.java:744)
Caused by: org.apache.hadoop.hive.ql.parse.SemanticException: This UDAF does not support the deprecated getEvaluator() method.
    at org.apache.hadoop.hive.ql.udf.generic.AbstractGenericUDAFResolver.getEvaluator(AbstractGenericUDAFResolver.java:53)
    at org.apache.spark.sql.hive.HiveGenericUdaf.objectInspector$lzycompute(hiveUdfs.scala:182)
    at org.apache.spark.sql.hive.HiveGenericUdaf.objectInspector(hiveUdfs.scala:181)
    at org.apache.spark.sql.hive.HiveGenericUdaf.dataType(hiveUdfs.scala:189)
    at org.apache.spark.sql.catalyst.expressions.Alias.toAttribute(namedExpressions.scala:94)
    at org.apache.spark.sql.catalyst.plans.logical.Aggregate$$anonfun$output$6.apply(basicOperators.scala:141)
    at org.apache.spark.sql.catalyst.plans.logical.Aggregate$$anonfun$output$6.apply(basicOperators.scala:141)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.AbstractTraversable.map(Traversable.scala:105)
    at org.apache.spark.sql.catalyst.plans.logical.Aggregate.output(basicOperators.scala:141)
    at org.apache.spark.sql.catalyst.planning.PhysicalOperation$$anonfun$unapply$1.apply(patterns.scala:61)
    at org.apache.spark.sql.catalyst.planning.PhysicalOperation$$anonfun$unapply$1.apply(patterns.scala:61)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.sql.catalyst.planning.PhysicalOperation$.unapply(patterns.scala:61)
    at org.apache.spark.sql.execution.SparkStrategies$ParquetOperations$.apply(SparkStrategies.scala:209)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner.apply(QueryPlanner.scala:59)
    at org.apache.spark.sql.SQLContext$QueryExecution.sparkPlan$lzycompute(SQLContext.scala:383)
    at org.apache.spark.sql.SQLContext$QueryExecution.sparkPlan(SQLContext.scala:381)
    at org.apache.spark.sql.SQLContext$QueryExecution.executedPlan$lzycompute(SQLContext.scala:387)
    at org.apache.spark.sql.SQLContext$QueryExecution.executedPlan(SQLContext.scala:387)
    at org.apache.spark.sql.SchemaRDD.collect(SchemaRDD.scala:454)
    at io.spark.ddf.etl.SqlHandler.sql2txt(SqlHandler.java:176)
    at io.ddf.DDFManager.sql2txt(DDFManager.java:392)
    at io.ddf.DDFManager.sql2txt(DDFManager.java:387)
    at io.ddf.DDFManager.sql2txt(DDFManager.java:382)
    at io.ddf.DDF.sql2txt(DDF.java:322)
    ... 13 more

Failure in ddf_spark tests in maven

Hi, I left a post on the Google group as well, but I thought it might be good to cross-post this here.

I've been using this for a while now, but I thought I would mention a few hiccups along the way when running it on a clean(-ish) Mac Yosemite 10.10.x install.

Setting Java 7 home (that's my own fault...)

MAVEN_OPTS - you need to set this to something sensible or your Spark DDF module will fail all its tests mysteriously.

Mac-mini:DDF $ java -version
java version "1.7.0_51"
Java(TM) SE Runtime Environment (build 1.7.0_51-b13)
Java HotSpot(TM) 64-Bit Server VM (build 24.51-b03, mixed mode)

Mac-mini:DDF $ echo $MAVEN_OPTS
-Xmx2048m -XX:MaxPermSize=756m

mvn --version
Apache Maven 3.1.1 (0728685237757ffbf44136acec0402957f723d9a; 2013-09-17 23:22:22+0800)
Maven home: /Users/mb_old/java/apache-maven-3.1.1
Java version: 1.7.0_51, vendor: Oracle Corporation
Java home: /Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "mac os x", version: "10.10.2", arch: "x86_64", family: "mac"

Pulling MASTER from GitHub, I get this error on [mvn install]:

Tests in error:
  AggregationHandlerTest.setUp:19->BaseTest.createTableAirline:31 » QueryExecution
  AggregationHandlerTest.setUp:19->BaseTest.createTableAirline:31 » QueryExecution
  BinningHandlerTest.testBinning:22->BaseTest.createTableAirline:31 » QueryExecution
  CrossValidationSuite.<init>:11->ATestSuite.createTableAirline:70 » QueryExecution
  MetricsTests.testConfusionMatrix:22 » QueryExecution FAILED: Execution Error, ...
  MLlibIntegrationSuite>ATestSuite.run:18->ATestSuite.org$scalatest$BeforeAndAfterAll$$super$run:18->FunSuite.org$scalatest$FunSuiteLike$$super$run:1559->FunSuite.runTests:1559->ATestSuite.runTest:18->ATestSuite.org$scalatest$BeforeAndAfterEach$$super$runTest:18->FunSuite.withFixture:1559->ATestSuite.createTableAirlineWithNA:86 » QueryExecution
  MLlibIntegrationSuite>ATestSuite.run:18->ATestSuite.org$scalatest$BeforeAndAfterAll$$super$run:18->FunSuite.org$scalatest$FunSuiteLike$$super$run:1559->FunSuite.runTests:1559->ATestSuite.runTest:18->ATestSuite.org$scalatest$BeforeAndAfterEach$$super$runTest:18->FunSuite.withFixture:1559 » Hive
  MLlibIntegrationSuite>ATestSuite.run:18->ATestSuite.org$scalatest$BeforeAndAfterAll$$super$run:18->FunSuite.org$scalatest$FunSuiteLike$$super$run:1559->FunSuite.runTests:1559->ATestSuite.runTest:18->ATestSuite.org$scalatest$BeforeAndAfterEach$$super$runTest:18->FunSuite.withFixture:1559 » Hive
  MLSupporterSuite.<init>:10->ATestSuite.createTableAirlineSmall:54 » QueryExecution
  StatisticsSupporterTest.setUp:24->BaseTest.createTableAirline:31 » QueryExecution
  StatisticsSupporterTest.setUp:24->BaseTest.createTableAirline:31 » QueryExecution
  StatisticsSupporterTest.setUp:24->BaseTest.createTableAirline:31 » QueryExecution
  StatisticsSupporterTest.setUp:24->BaseTest.createTableAirline:31 » QueryExecution
  StatisticsSupporterTest.setUp:24->BaseTest.createTableAirline:31 » QueryExecution
  StatisticsSupporterTest.setUp:24->BaseTest.createTableAirline:31 » QueryExecution
  StatisticsSupporterTest.setUp:24->BaseTest.createTableAirline:31 » QueryExecution
  FactorSuite.<init>:8->ATestSuite.createTableMtcars:36 » QueryExecution FAILED:...
  ListDDFSuite.<init>:8->ATestSuite.createTableMtcars:36 » QueryExecution FAILED...
  RepresentationHandlerSuite.<init>:20->ATestSuite.createTableAirline:70 » QueryExecution
  SampleSuite.<init>:8->ATestSuite.createTableMtcars:36 » QueryExecution FAILED:...
  ViewHandlerTest.testRemoveColumns:17->BaseTest.createTableAirline:31 » QueryExecution
  MissingDataHandlerTest.setUp:24->BaseTest.createTableAirlineWithNA:48 » QueryExecution
  MissingDataHandlerTest.setUp:24->BaseTest.createTableAirlineWithNA:48 » QueryExecution
  TransformationHandlerTest.setUp:20->BaseTest.createTableAirline:31 » QueryExecution
  TransformationHandlerTest.setUp:20->BaseTest.createTableAirline:31 » QueryExecution
  TransformationHandlerTest.setUp:20->BaseTest.createTableAirline:31 » QueryExecution
  TransformationHandlerTest.setUp:20->BaseTest.createTableAirline:31 » QueryExecution
  ALSTest.TestALS:13->BaseTest.createTableRatings:64 » QueryExecution FAILED: Se...
  KMeansTest.TestKMeans:14->BaseTest.createTableAirline:31 » QueryExecution FAIL...
  SparkDDFManagerTests.testSimpleSparkDDFManager:30->BaseTest.createTableAirline:31 » QueryExecution

Tests run: 32, Failures: 0, Errors: 30, Skipped: 0



[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] ddf ............................................... SUCCESS [3.347s]
[INFO] ddf_core .......................................... SUCCESS [15.668s]
[INFO] ddf_spark ......................................... FAILURE [3:15.629s]
[INFO] ddf_examples ...................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3:34.902s
[INFO] Finished at: Thu Apr 02 21:25:45 SGT 2015
[INFO] Final Memory: 22M/285M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.15:test (default-test) on project ddf_spark_2.10: There are test failures.
[ERROR]
[ERROR] Please refer to /Users/mb_old/scala/DDF/spark/target/scala-2.10/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :ddf_spark_2.10

I've tried a few things, but I'm not sure what the error is or what needs to be fixed. Any ideas?

Thanks In Advance

Michael

Question: project ended?

I am looking for a framework to help with ML on data in both batch and streaming scenarios.
DDF looks very simple to use, but there have been no issues closed for months.
I would like to know if the project is dead.

Thanks in advance

how to set spark worker/executor/driver memory in R environment?

And how do I set these memory options when running ./bin/ddf-shell on Spark standalone with master=local?

I found that the Spark params are restricted in the code of io.spark.ddf.SparkDDFManager to:

private static final String[][] SPARK_ENV_VARS = new String[][] {
// @Formatter:off
{ "SPARK_APPNAME", "spark.appname" },
{ "SPARK_MASTER", "spark.master" },
{ "SPARK_HOME", "spark.home" },
{ "SPARK_SERIALIZER", "spark.kryo.registrator" },
{ "HIVE_HOME", "hive.home" },
{ "HADOOP_HOME", "hadoop.home" },
{ "DDFSPARK_JAR", "ddfspark.jar" }
// @Formatter:on
};

Does that mean I cannot set the other Spark parameters in ddf? That would be terrible if true!
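A possible workaround, sketched below under two assumptions: that Spark 1.x reads spark.* JVM system properties when the SparkContext is created, and that the manager is bootstrapped via DDFManager.get("spark") as in the project README. This is guesswork, not a documented DDF feature:

import io.ddf.DDFManager;

public class MemorySettings {
  public static void main(String[] args) throws Exception {
    // Assumption: Spark 1.x picks up spark.* system properties at
    // SparkContext creation time, so setting them before the manager
    // is created may take effect even though SPARK_ENV_VARS ignores them.
    System.setProperty("spark.executor.memory", "4g");
    System.setProperty("spark.driver.memory", "2g");

    DDFManager manager = DDFManager.get("spark"); // per the DDF README
    // ... create DDFs and run queries as usual ...
  }
}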

NoClassDefFoundError when running DDF on YARN

I downloaded the latest code of the ddf-sparksql-1.1.0 branch and tried to run it on YARN following the guidance written by khangich in #15. However, after running "bin/sbt 'project spark' console" and entering the command "manager.sql2txt("drop table if exists airline")", I encountered the following NoClassDefFoundError:

Loading /home/qxw/DDF/bin/sbt.dir/bin/sbt-launch-lib.bash
[info] Loading project definition from /home/qxw/DDF/project
[info] Set current project to root (in build file:/home/qxw/DDF/)
[info] Set current project to ddf_spark (in build file:/home/qxw/DDF/)
[info] Starting scala interpreter...
[info]
SparkDDFManager available as manager
import io.ddf.DDFManager
manager: io.ddf.DDFManager = io.spark.ddf.SparkDDFManager@4223c913
Welcome to Scala version 2.10.3 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_05).
Type in expressions to have them evaluated.
Type :help for more information.

scala> manager.sql2txt("drop table if exists airline");
java.lang.NoClassDefFoundError: org/apache/spark/sql/catalyst/types/DateType$
at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:99)
at io.spark.ddf.etl.SqlHandler.sql2txt(SqlHandler.java:175)
at io.ddf.DDFManager.sql2txt(DDFManager.java:392)
at io.ddf.DDFManager.sql2txt(DDFManager.java:387)
at io.ddf.DDFManager.sql2txt(DDFManager.java:382)
at .(:10)
at .()
at .(:7)
at .()
at $print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:734)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:983)
at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:604)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:568)
at scala.tools.nsc.interpreter.ILoop.reallyInterpret$1(ILoop.scala:756)
at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:801)
at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:713)
at scala.tools.nsc.interpreter.ILoop.processLine$1(ILoop.scala:577)
at scala.tools.nsc.interpreter.ILoop.innerLoop$1(ILoop.scala:584)
at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:587)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:878)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:833)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:833)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:833)
at scala.tools.nsc.interpreter.ILoop.main(ILoop.scala:900)
at xsbt.ConsoleInterface.run(ConsoleInterface.scala:69)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:102)
at sbt.compiler.AnalyzingCompiler.console(AnalyzingCompiler.scala:77)
at sbt.Console.sbt$Console$$console0$1(Console.scala:23)
at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply$mcV$sp(Console.scala:24)
at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply(Console.scala:24)
at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply(Console.scala:24)
at sbt.Logger$$anon$4.apply(Logger.scala:90)
at sbt.TrapExit$App.run(TrapExit.scala:244)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.catalyst.types.DateType$
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
... 45 more

I am very confused by this error, because DateType$ is contained in spark-catalyst_2.10-1.2.0-SNAPSHOT.jar.

Has anybody encountered this and have any idea about it? Any help will be appreciated.

Spark tests unsuccessful

Hi,

Pulled from master. With -DskipTests it compiles successfully, but I cannot run the examples:
$ ./bin/run-example io.ddf.spark.examples.RowCount
Failed to find DDF examples assembly in /var/DDF/lib or /var/DDF/examples/target
You need to build DDF with mvn install before running this program

Running io.ddf.spark.SparkDDFManagerTests
org.apache.spark.serializer.KryoSerializer
null
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.647 sec - in io.ddf.spark.SparkDDFManagerTests

Results :

Tests in error:
RepresentationHandlerSuite.Can do sql queries after Transform Rserve » Spark J...
TransformationHandlerTest.testTransformNativeRserve » Spark Job aborted due to...

Tests run: 79, Failures: 0, Errors: 2, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 05:46 min
[INFO] Finished at: 2015-08-19T14:59:07-05:00
[INFO] Final Memory: 59M/3944M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.15:test (default-test) on project ddf_spark_2.10: There are test failures.
[ERROR]
[ERROR] Please refer to /var/DDF/spark/target/scala-2.10/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException

why are the algorithm entries "KMeans" and "als" additionally provided, since the entry "train" exists?

I have seen that in io.ddf.facades.MLFacade the general entry point for Spark MLlib algorithms is provided:

public IModel train(String trainMethodName, Object... params) throws DDFException {
  return this.getMLSupporter().train(trainMethodName, params);
}

Since the method "train" already exists, I wonder why the entries "KMeans" and "als" are additionally provided. I also found that "KMeans" and "als" are actually implemented via the method "train".

Is there any difference between these entries?

Thanks.
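For illustration, the two entry points do appear interchangeable; here is a sketch (argument values are arbitrary, and io.ddf.ml.IModel is assumed as the package for IModel):

import io.ddf.DDF;
import io.ddf.exception.DDFException;
import io.ddf.ml.IModel;

public class EntryPointComparison {
  public static void compare(DDF ddf) throws DDFException {
    // Convenience wrapper defined in MLFacade:
    IModel viaWrapper = ddf.ML.KMeans(5, 10, 2);
    // Generic entry point; "kmeans" is mapped to the MLlib class
    // through ddf-conf/ddf.ini:
    IModel viaTrain = ddf.ML.train("kmeans", 5, 10, 2);
  }
}

If so, the difference would be discoverability and a fixed, type-checked argument list rather than behavior.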

summary vs fivenum

The R summary is a fivenum summary, plus most-common-item counts for non-numeric columns, plus a count of NAs.
A ddf fivenum is a fivenum summary that drops non-numeric columns and ignores NAs.
A ddf summary is a method for an R base package generic, and as such should be semantically close to, and if possible identical to, the method for data.frames, particularly in this case where in-memory and distributed data frames are implementations of the same abstraction (summary for vectors is by necessity a little different). Instead, a ddf summary has mean, min, max and a count of NAs in common with the R summary, adds stddev and a row count repeated for each column, and returns NAs for string columns (not even the rows that make sense, like count and the NA count).

My suggestion: keep fivenum as it is, since everybody knows the five-number summary. Make summary a true method for base::summary; the results should be identical.

Add a function for moments like mean, stddev, skewness, kurtosis, etc.

Whether the changes in the definition of summary should apply to other languages is a bit more subtle. I guess no one has a monopoly on the word summary. Unless there are specific expectations in Java or Python, it'd be simpler to make them all the same. In R there are clear expectations for what summary should do.

branch ddf-sparksql-1.1.0 debugging error

I found the "debug" patch in this branch; however, the function "sql2txt" in SqlHandler.java now ends with:
SchemaRDD rdd = ((SparkDDFManager) this.getManager()).getHiveContext().sql(command);
Long size = rdd.count();
Row row = rdd.first();
// Row[] arrRow = (Row[]) rdd.collect();
List lsString = new ArrayList();
lsString.add(row.mkString("\t"));
// for (Row row : arrRow) {
// lsString.add(row.mkString("\t"));
// }
return lsString;

The line Row row = rdd.first(); throws an exception whenever rdd is empty, which happens frequently.

I wonder whether this version serves some special purpose, or whether these changes should be reverted?
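For what it's worth, a minimal guard would avoid the crash while keeping the debug behavior; this sketch uses only the calls already present in the snippet above:

SchemaRDD rdd = ((SparkDDFManager) this.getManager()).getHiveContext().sql(command);
List<String> lsString = new ArrayList<String>();
// rdd.first() throws on an empty result set, so check the count first.
if (rdd.count() > 0) {
  lsString.add(rdd.first().mkString("\t"));
}
return lsString;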

classNotFound error when executing ddf.ML.als(m,n,p) BUG

There is a small spelling error in the method

public IModel als(int rank, int iteration, double lamda) throws DDFException {
  return this.train("collaborateFiltering", rank, iteration, lamda);
}

of class io.ddf.facades.MLFacade.

The string name "collaborateFiltering" is inconsistent with the one defined in ddf-conf/ddf.ini,
line 48: collaborativeFiltering = org.apache.spark.mllib.recommendation.ALS

so changing "collaborativeFiltering" in ddf.ini to "collaborateFiltering" is recommended.

sql doesn't accept a DDF

R help(sql) is pretty clear that the first argument should be a ddf, but instead it's a manager. Looking around other docs, this looks like a documentation slip, but it also suggests the question: why can't we do SQL on a DDF? (This is also potentially important for interfacing with dplyr.)
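For context, the Java layer already runs SQL through the manager (manager.sql2txt appears elsewhere on this page), and io.ddf.DDF.sql2txt shows up in the stack traces above, so a DDF-scoped variant may mostly be a binding question. A rough sketch of the two shapes; the DDF-level signature, the @this placeholder, and the List<String> return type are assumptions, not confirmed API:

import io.ddf.DDF;
import io.ddf.DDFManager;
import io.ddf.exception.DDFException;
import java.util.List;

public class SqlEntryPoints {
  // What works today: SQL against the manager's catalog by table name.
  public static List<String> viaManager(DDFManager manager) throws DDFException {
    return manager.sql2txt("SELECT count(*) FROM airline");
  }

  // Hypothetical DDF-scoped form the R docs seem to promise: SQL against
  // a single DDF, with some placeholder (here "@this") naming its table.
  // public static List<String> viaDDF(DDF ddf) throws DDFException {
  //   return ddf.sql2txt("SELECT count(*) FROM @this");
  // }
}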

run-example fails if spaces included in path

Not a biggie, but it could perplex other users.

Exception in thread "main" shark.api.QueryExecutionException: FAILED: Error in semantic analysis: Line 1:23 Invalid path ''resources/test/airline.csv'': No files matching path file:/Users/antonio/Per%2520Data%2520LLC/Clients/Adatao/projects/DDF/resources/test/airline.csv

Some URL sanitization seems to be applied instead of quoting, but I am just guessing. The workaround is to rename the offending directory.

DF as a Window to Streaming Data

When processing live data, like say stock prices, it would be helpful to have a DF which is a window onto the changing data (say the last 30 seconds of data, or the last 30 items).

So is it possible to provide a way to construct such DFs easily?

Backends for DDF

Hi,

I just recently found DDF and find it very appealing for data analytics, due to its promise of programming expressiveness (R-style), scalability, .... I am still very new to DDF, and I want to ask a question here, as I can't find a suitable forum or other discussion channel for this kind of question.

This is my question: are there other backends built for DDF (in production, experiment, coding, or planning stage)? Out of the box, DDF seems to run on top of Spark natively. I feel that Spark is very heavy to begin with, though, especially if we are just starting some analytics. I think DDF could also use some lighter-weight frameworks, such as Pandas or R itself, as the backend. Later on one could move to Spark if necessity dictates (i.e. the data has outgrown the backend). In this way, the analytics code does not change.

Wirawan

DDF as a SpreadSheet with Cells, Rows, Columns defined through Formulas / Functions which update when the DF data changes or is accessed

Being able to treat a DDF as a spreadsheet, with rows, columns, and cells defined through formulas/functions that update when the internal DF data changes or is accessed, would increase the usefulness and power of DDF.

Have a look at the following projects to get an idea:

If this is coupled with the functionality where you can define a new DF through a formula using multiple DFs and DF operations which update when data changes, this would make DDF very powerful. (#56)

In this case the readable view of the DF should be different from the writable view of the DF, so combining DFs will not mess up rows and columns calculated by formulas.

could use head as print method

Since data is the star of the show, printing a ddf should not return some odd abstraction-killing object information, but the actual data. Since printing all the data is in general a bad idea, printing some is a better one. A sample or the first few lines are both acceptable options (the former is more statistically motivated; the latter is faster to implement and the R tradition).

spark-cluster running failed

Hi guys,
I am confused: the DDF application works well with SPARK_MASTER=local[4], the default setting.
However, when I add the line
export JAVA_OPTS+=" -Dspark.master=spark://9.91.8.145:7077"
to the ./bin/run-example script and run the example with the command
bin/run-example io.spark.ddf.examples.RowCount
on a 3-worker cluster, the job never works.
It seems the worker cannot connect to the master, or something like that.

My environment:
Hadoop 2.4.0
Spark 1.1.0 or 1.0.0
I have changed nothing except adding the JAVA_OPTS line. The running info follows. Thanks for your help!

/home/ljz/DDF131/DDF
/home/ljz/DDF131/DDF/examples/target/scala-2.10/ddf_examples_2.10-1.2.0.jar
/home/ljz/DDF131/DDF/examples/target/scala-2.10/ddf_examples_2.10-1.2.0.jar:/home/ljz/DDF131/DDF/examples/target/scala-2.10/lib/*
/home/ym/jdk1.7.0_11/bin/java
-Dhive.metastore.warehouse.dir=/tmp/hive/warehouse -Dspark.master=spark://9.91.8.145:7077 -agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005
io.spark.ddf.examples.RowCount
Listening for transport dt_socket at address: 5005
0 [main] DEBUG io.ddf.util.ConfigHandler - Using config file found at /home/ljz/DDF131/DDF/ddf-conf/ddf.ini

128 [main] DEBUG org.apache.commons.configuration.ConfigurationUtils - ConfigurationUtils.locate(): base is null, name is /home/ljz/DDF131/DDF/ddf-conf/ddf.ini
132 [main] DEBUG org.apache.commons.configuration.ConfigurationUtils - Loading configuration from the absolute path /home/ljz/DDF131/DDF/ddf-conf/ddf.ini
1158 [main] INFO io.spark.ddf.SparkDDFManager - >>>>>>> params = {"SPARK_SERIALIZER":"io.spark.content.KryoRegistrator","HIVE_HOME":"/home/ym/hive","SPARK_APPNAME":"DDFClient","SPARK_MASTER":"spark://9.91.8.145:7077","HADOOP_HOME":"/home/favor/hadoop/2.4.0","SPARK_HOME":"/home/favor/spark/1.1.0"}
1159 [main] INFO io.spark.ddf.SparkDDFManager - >>>>> ddfSparkJar = null
1159 [main] INFO io.spark.ddf.SparkDDFManager - >>>> key = SPARK_SERIALIZER, value = io.spark.content.KryoRegistrator
1159 [main] INFO io.spark.ddf.SparkDDFManager - >>>> key = HIVE_HOME, value = /home/ym/hive
1159 [main] INFO io.spark.ddf.SparkDDFManager - >>>> key = SPARK_APPNAME, value = DDFClient
1159 [main] INFO io.spark.ddf.SparkDDFManager - >>>> key = SPARK_MASTER, value = spark://9.91.8.145:7077
1159 [main] INFO io.spark.ddf.SparkDDFManager - >>>> key = HADOOP_HOME, value = /home/favor/hadoop/2.4.0
1159 [main] INFO io.spark.ddf.SparkDDFManager - >>>> key = SPARK_HOME, value = /home/favor/spark/1.1.0
4884 [main] INFO org.apache.spark.SecurityManager - Changing view acls to: root
4886 [main] INFO org.apache.spark.SecurityManager - Changing modify acls to: root
4889 [main] INFO org.apache.spark.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
4924 [main] DEBUG org.apache.spark.util.AkkaUtils - In createActorSystem, requireCookie is: off
6303 [sparkDriver-akka.actor.default-dispatcher-4] INFO akka.event.slf4j.Slf4jLogger - Slf4jLogger started
6444 [sparkDriver-akka.actor.default-dispatcher-4] INFO Remoting - Starting remoting
6625 [sparkDriver-akka.actor.default-dispatcher-4] INFO Remoting - Remoting started; listening on addresses :[akka.tcp://sparkDriver@ydatasight1:55383]
6649 [main] INFO org.apache.spark.util.Utils - Successfully started service 'sparkDriver' on port 55383.
6663 [main] DEBUG org.apache.spark.SparkEnv - Using serializer: class org.apache.spark.serializer.JavaSerializer
6713 [main] INFO org.apache.spark.SparkEnv - Registering MapOutputTracker
6759 [main] INFO org.apache.spark.SparkEnv - Registering BlockManagerMaster
6769 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.storage.BlockManagerMasterActor - [actor] received message ExpireDeadHosts from Actor[akka://sparkDriver/user/BlockManagerMaster#1587151872]
6771 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.storage.BlockManagerMasterActor - [actor] handled message (1.988192 ms) ExpireDeadHosts from Actor[akka://sparkDriver/user/BlockManagerMaster#1587151872]
6855 [main] INFO org.apache.spark.util.Utils - Successfully started service 'Connection manager for block manager' on port 36208.
6863 [main] INFO org.apache.spark.network.nio.ConnectionManager - Bound socket to port 36208 with id = ConnectionManagerId(ydatasight1,36208)
6877 [main] DEBUG org.apache.spark.util.Utils - Getting/creating local root dirs at '/tmp'
6885 [main] INFO org.apache.spark.storage.DiskBlockManager - Created local directory at /tmp/spark-local-20141103233003-df31
6899 [main] INFO org.apache.spark.storage.MemoryStore - MemoryStore started with capacity 14.4 GB
6918 [main] INFO org.apache.spark.storage.BlockManagerMaster - Trying to register BlockManager
6922 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.storage.BlockManagerMasterActor - [actor] received message RegisterBlockManager(BlockManagerId(, ydatasight1, 36208),15420629975,Actor[akka://sparkDriver/user/BlockManagerActor1#-297451495]) from Actor[akka://sparkDriver/temp/$a]
6924 [sparkDriver-akka.actor.default-dispatcher-3] INFO org.apache.spark.storage.BlockManagerMasterActor - Registering block manager ydatasight1:36208 with 14.4 GB RAM
6929 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.storage.BlockManagerMasterActor - [actor] handled message (6.135115 ms) RegisterBlockManager(BlockManagerId(, ydatasight1, 36208),15420629975,Actor[akka://sparkDriver/user/BlockManagerActor1#-297451495]) from Actor[akka://sparkDriver/temp/$a]
6931 [main] INFO org.apache.spark.storage.BlockManagerMaster - Registered BlockManager
6977 [main] INFO org.apache.spark.HttpFileServer - HTTP File server directory is /tmp/spark-194698e4-091b-4880-9060-79a6042bb22b
7008 [main] INFO org.apache.spark.HttpServer - Starting HTTP Server
7150 [main] DEBUG org.eclipse.jetty.util.log - Logging to org.slf4j.impl.Log4jLoggerAdapter(org.eclipse.jetty.util.log) via org.eclipse.jetty.util.log.Slf4jLog
7185 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.Server@1cacdf69 + SocketConnector@0.0.0.0:0 as connector
7200 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.Server@1cacdf69 + qtp961873472{8<=0<=0/254,-1} as threadpool
7333 [main] DEBUG org.apache.spark.HttpServer - HttpServer is not using security
7333 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.HandlerList@6e53c251 + org.eclipse.jetty.server.handler.ResourceHandler@3cfa8c6d as handler
7333 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.HandlerList@6e53c251 + org.eclipse.jetty.server.handler.DefaultHandler@65196761 as handler
7333 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.Server@1cacdf69 + org.eclipse.jetty.server.handler.HandlerList@6e53c251 as handler
7333 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting org.eclipse.jetty.server.Server@1cacdf69
7339 [main] INFO org.eclipse.jetty.server.Server - jetty-8.1.14.v20131031
7363 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting org.eclipse.jetty.server.handler.HandlerList@6e53c251
7363 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting org.eclipse.jetty.server.handler.ResourceHandler@3cfa8c6d
7377 [main] DEBUG org.eclipse.jetty.server.handler.AbstractHandler - starting org.eclipse.jetty.server.handler.ResourceHandler@3cfa8c6d
7378 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED org.eclipse.jetty.server.handler.ResourceHandler@3cfa8c6d
7378 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting org.eclipse.jetty.server.handler.DefaultHandler@65196761
7378 [main] DEBUG org.eclipse.jetty.server.handler.AbstractHandler - starting org.eclipse.jetty.server.handler.DefaultHandler@65196761
7378 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED org.eclipse.jetty.server.handler.DefaultHandler@65196761
7378 [main] DEBUG org.eclipse.jetty.server.handler.AbstractHandler - starting org.eclipse.jetty.server.handler.HandlerList@6e53c251
7378 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED org.eclipse.jetty.server.handler.HandlerList@6e53c251
7378 [main] DEBUG org.eclipse.jetty.server.handler.AbstractHandler - starting org.eclipse.jetty.server.Server@1cacdf69
7379 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting qtp961873472{8<=0<=0/254,-1}
7382 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED qtp961873472{8<=7<=8/254,0}
7382 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting SocketConnector@0.0.0.0:0
7382 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting null/null
7393 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED PooledBuffers [0/1024@6144,0/1024@16384,0/1024@-]/PooledBuffers [0/1024@6144,0/1024@32768,0/1024@-]
7395 [main] INFO org.eclipse.jetty.server.AbstractConnector - Started SocketConnector@0.0.0.0:47443
7396 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED SocketConnector@0.0.0.0:47443
7396 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED org.eclipse.jetty.server.Server@1cacdf69
7397 [main] INFO org.apache.spark.util.Utils - Successfully started service 'HTTP file server' on port 47443.
7399 [main] DEBUG org.apache.spark.HttpFileServer - HTTP file server started at: http://9.91.8.145:47443
7699 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7699 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7700 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7700 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$1-14f953df}
7700 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-14f953df=org.apache.spark.ui.JettyUtils$$anon$1-14f953df}
7707 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7707 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7707 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7707 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$1-256f07b9}
7707 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-256f07b9=org.apache.spark.ui.JettyUtils$$anon$1-256f07b9}
7707 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7708 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7708 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7708 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$1-1c6415e2}
7708 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-1c6415e2=org.apache.spark.ui.JettyUtils$$anon$1-1c6415e2}
7708 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7708 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7709 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7709 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$1-6ed8b6fd}
7709 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-6ed8b6fd=org.apache.spark.ui.JettyUtils$$anon$1-6ed8b6fd}
7709 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7709 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7709 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7710 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$1-55d5d4e5}
7710 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-55d5d4e5=org.apache.spark.ui.JettyUtils$$anon$1-55d5d4e5}
7710 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7710 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7710 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7710 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$1-498c3269}
7710 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-498c3269=org.apache.spark.ui.JettyUtils$$anon$1-498c3269}
7719 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7719 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7719 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7719 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$1-6c76c638}
7719 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-6c76c638=org.apache.spark.ui.JettyUtils$$anon$1-6c76c638}
7720 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7720 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7720 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7720 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$1-69cfbe29}
7720 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-69cfbe29=org.apache.spark.ui.JettyUtils$$anon$1-69cfbe29}
7720 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7720 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7721 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7721 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$1-45cdac04}
7721 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-45cdac04=org.apache.spark.ui.JettyUtils$$anon$1-45cdac04}
7721 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7721 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7721 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7722 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$1-41d85e69}
7722 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-41d85e69=org.apache.spark.ui.JettyUtils$$anon$1-41d85e69}
7726 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7727 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7727 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7727 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$1-5dad3677}
7727 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-5dad3677=org.apache.spark.ui.JettyUtils$$anon$1-5dad3677}
7727 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7727 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7727 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7728 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$1-16d2eead}
7728 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-16d2eead=org.apache.spark.ui.JettyUtils$$anon$1-16d2eead}
7733 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7733 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7733 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7734 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$1-636ff712}
7734 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-636ff712=org.apache.spark.ui.JettyUtils$$anon$1-636ff712}
7734 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7734 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7734 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7734 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$1-4bc5f1be}
7734 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-4bc5f1be=org.apache.spark.ui.JettyUtils$$anon$1-4bc5f1be}
7744 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7744 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7744 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7744 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.eclipse.jetty.servlet.DefaultServlet-758c7ddc}
7744 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.eclipse.jetty.servlet.DefaultServlet-758c7ddc=org.eclipse.jetty.servlet.DefaultServlet-758c7ddc}
7747 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7747 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7747 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7747 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$2-3f2dea31}
7747 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$2-3f2dea31=org.apache.spark.ui.JettyUtils$$anon$2-3f2dea31}
7750 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7750 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7750 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7750 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$2-5c481da6}
7750 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$2-5c481da6=org.apache.spark.ui.JettyUtils$$anon$2-5c481da6}
7793 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.Server@25695aa2 + [email protected]:4040 as connector
7793 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.Server@25695aa2 + qtp666719454{8<=0<=0/254,-1} as threadpool
7793 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/stages,null} as handler
7794 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/stages/json,null} as handler
7794 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/stages/stage,null} as handler
7794 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/stages/stage/json,null} as handler
7794 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/stages/pool,null} as handler
7794 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/stages/pool/json,null} as handler
7795 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/storage,null} as handler
7795 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/storage/json,null} as handler
7795 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/storage/rdd,null} as handler
7795 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/storage/rdd/json,null} as handler
7795 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/environment,null} as handler
7796 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/environment/json,null} as handler
7796 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/executors,null} as handler
7796 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/executors/json,null} as handler
7796 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/static,null} as handler
7797 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/,null} as handler
7797 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 + o.e.j.s.ServletContextHandler{/stages/stage/kill,null} as handler
7797 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.server.Server@25695aa2 + org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33 as handler
7797 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting org.eclipse.jetty.server.Server@25695aa2
7797 [main] INFO org.eclipse.jetty.server.Server - jetty-8.1.14.v20131031
7797 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33
7799 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting o.e.j.s.ServletContextHandler{/stages,null}
7800 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.servlet.ServletHandler@cd7e6f5 + org.apache.spark.ui.JettyUtils$$anon$1-14f953df as servlet
7800 [main] DEBUG org.eclipse.jetty.util.component.Container - Container org.eclipse.jetty.servlet.ServletHandler@cd7e6f5 + [/]=>org.apache.spark.ui.JettyUtils$$anon$1-14f953df as servletMapping
7800 [main] DEBUG org.eclipse.jetty.util.component.Container - Container o.e.j.s.ServletContextHandler{/stages,null} + org.eclipse.jetty.servlet.ServletHandler@cd7e6f5 as handler
7800 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting org.eclipse.jetty.servlet.ServletHandler@cd7e6f5
7800 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - filterNameMap={}
7800 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - pathFilters=null
7801 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletFilterMap=null
7801 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletPathMap={/=org.apache.spark.ui.JettyUtils$$anon$1-14f953df}
7801 [main] DEBUG org.eclipse.jetty.servlet.ServletHandler - servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-14f953df=org.apache.spark.ui.JettyUtils$$anon$1-14f953df}
7801 [main] DEBUG org.eclipse.jetty.server.handler.AbstractHandler - starting org.eclipse.jetty.servlet.ServletHandler@cd7e6f5
7801 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED org.eclipse.jetty.servlet.ServletHandler@cd7e6f5
7801 [main] DEBUG org.eclipse.jetty.server.handler.AbstractHandler - starting o.e.j.s.ServletContextHandler{/stages,null}
7802 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting org.apache.spark.ui.JettyUtils$$anon$1-14f953df
7806 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED org.apache.spark.ui.JettyUtils$$anon$1-14f953df
7806 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED o.e.j.s.ServletContextHandler{/stages,null}
[ ... the same starting/STARTED DEBUG sequence repeats for the remaining sixteen SparkUI handlers (/stages/json, /stages/stage, /stages/stage/json, /stages/pool, /stages/pool/json, /storage, /storage/json, /storage/rdd, /storage/rdd/json, /environment, /environment/json, /executors, /executors/json, /static, /, /stages/stage/kill); near-identical lines elided for brevity. The one non-boilerplate detail in that stretch is the /static handler's resource base: jar:file:/home/ljz/DDF131/DDF/examples/target/scala-2.10/lib/spark-core_2.10-1.2.0-adatao.jar!/org/apache/spark/ui/static ... ]
7850 [main] DEBUG org.eclipse.jetty.server.handler.AbstractHandler - starting org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33
7850 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED org.eclipse.jetty.server.handler.ContextHandlerCollection@7f0ecf33
7850 [main] DEBUG org.eclipse.jetty.server.handler.AbstractHandler - starting org.eclipse.jetty.server.Server@25695aa2
7850 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting qtp666719454{8<=0<=0/254,-1}
7853 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED qtp666719454{8<=6<=8/254,0}
7853 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting [email protected]:4040
7853 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting null/null
7855 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED PooledBuffers [0/1024@6144,0/1024@16384,0/1024@-]/PooledBuffers [0/1024@6144,0/1024@32768,0/1024@-]
7855 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - starting org.eclipse.jetty.server.nio.SelectChannelConnector$ConnectorSelectorManager@381cc7bf
7972 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED org.eclipse.jetty.server.nio.SelectChannelConnector$ConnectorSelectorManager@381cc7bf
7973 [qtp666719454-46 Selector1] DEBUG org.eclipse.jetty.io.nio - Starting Thread[qtp666719454-46 Selector1,5,main] on org.eclipse.jetty.io.nio.SelectorManager$1@58af3ba8
[ ... the other seven qtp666719454 Selector threads (Selector0, Selector2-Selector7) start identically; elided ... ]
7987 [main] INFO org.eclipse.jetty.server.AbstractConnector - Started [email protected]:4040
7988 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED [email protected]:4040
7988 [main] DEBUG org.eclipse.jetty.util.component.AbstractLifeCycle - STARTED org.eclipse.jetty.server.Server@25695aa2
7988 [main] INFO org.apache.spark.util.Utils - Successfully started service 'SparkUI' on port 4040.
7998 [main] INFO org.apache.spark.ui.SparkUI - Started SparkUI at http://ydatasight1:4040
8292 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Rate of successful kerberos logins and latency (milliseconds)], always=false, type=DEFAULT, sampleName=Ops)
8295 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Rate of failed kerberos logins and latency (milliseconds)], always=false, type=DEFAULT, sampleName=Ops)
8298 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl - UgiMetrics, User and group related metrics
8522 [main] DEBUG org.apache.hadoop.security.authentication.util.KerberosName - Kerberos krb5 configuration not found, setting default realm to empty
8527 [main] DEBUG org.apache.hadoop.security.Groups - Creating new Groups object
8533 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
8533 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
8534 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
8534 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
8534 [main] DEBUG org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback - Falling back to shell based
8535 [main] DEBUG org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
8922 [main] DEBUG org.apache.hadoop.security.Groups - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000
9421 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
9430 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (8.101242 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
9456 [sparkDriver-akka.actor.default-dispatcher-3] INFO org.apache.spark.deploy.client.AppClient$ClientActor - Connecting to master spark://9.91.8.145:7077...
9683 [sparkDriver-akka.actor.default-dispatcher-2] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message Disassociated [akka.tcp://sparkDriver@ydatasight1:55383] -> [akka.tcp://[email protected]:7077] from Actor[akka://sparkDriver/deadLetters]
9684 [sparkDriver-akka.actor.default-dispatcher-2] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.445697 ms) Disassociated [akka.tcp://sparkDriver@ydatasight1:55383] -> [akka.tcp://[email protected]:7077] from Actor[akka://sparkDriver/deadLetters]
9688 [sparkDriver-akka.actor.default-dispatcher-2] WARN akka.remote.ReliableDeliverySupervisor - Association with remote system [akka.tcp://[email protected]:7077] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
9691 [sparkDriver-akka.actor.default-dispatcher-2] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message Disassociated [akka.tcp://sparkDriver@ydatasight1:55383] -> [akka.tcp://[email protected]:7077] from Actor[akka://sparkDriver/deadLetters]
9691 [sparkDriver-akka.actor.default-dispatcher-2] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.007899 ms) Disassociated [akka.tcp://sparkDriver@ydatasight1:55383] -> [akka.tcp://[email protected]:7077] from Actor[akka://sparkDriver/deadLetters]
[ ... the ReviveOffers received/handled DEBUG pair above repeats about once per second (timestamps 10428 through 29440) while the driver sits with no registered executors; twenty near-identical pairs elided ... ]
29490 [sparkDriver-akka.actor.default-dispatcher-2] INFO org.apache.spark.deploy.client.AppClient$ClientActor - Connecting to master spark://9.91.8.145:7077...
29515 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message Disassociated [akka.tcp://sparkDriver@ydatasight1:55383] -> [akka.tcp://[email protected]:7077] from Actor[akka://sparkDriver/deadLetters]
29515 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.009049 ms) Disassociated [akka.tcp://sparkDriver@ydatasight1:55383] -> [akka.tcp://[email protected]:7077] from Actor[akka://sparkDriver/deadLetters]
29516 [sparkDriver-akka.actor.default-dispatcher-3] WARN akka.remote.ReliableDeliverySupervisor - Association with remote system [akka.tcp://[email protected]:7077] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
29516 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message Disassociated [akka.tcp://sparkDriver@ydatasight1:55383] -> [akka.tcp://[email protected]:7077] from Actor[akka://sparkDriver/deadLetters]
29516 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.021897 ms) Disassociated [akka.tcp://sparkDriver@ydatasight1:55383] -> [akka.tcp://[email protected]:7077] from Actor[akka://sparkDriver/deadLetters]
[ ... after this second failed attempt the ReviveOffers received/handled pair again repeats about once per second (timestamps 30438 through 49439); twenty near-identical pairs elided ... ]
49478 [sparkDriver-akka.actor.default-dispatcher-4] INFO org.apache.spark.deploy.client.AppClient$ClientActor - Connecting to master spark://9.91.8.145:7077...
49502 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message Disassociated [akka.tcp://sparkDriver@ydatasight1:55383] -> [akka.tcp://[email protected]:7077] from Actor[akka://sparkDriver/deadLetters]
49502 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.023787 ms) Disassociated [akka.tcp://sparkDriver@ydatasight1:55383] -> [akka.tcp://[email protected]:7077] from Actor[akka://sparkDriver/deadLetters]
49503 [sparkDriver-akka.actor.default-dispatcher-3] WARN akka.remote.ReliableDeliverySupervisor - Association with remote system [akka.tcp://[email protected]:7077] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
49503 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message Disassociated [akka.tcp://sparkDriver@ydatasight1:55383] -> [akka.tcp://[email protected]:7077] from Actor[akka://sparkDriver/deadLetters]
49503 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.016451 ms) Disassociated [akka.tcp://sparkDriver@ydatasight1:55383] -> [akka.tcp://[email protected]:7077] from Actor[akka://sparkDriver/deadLetters]
50439 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
50439 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.268887 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
51439 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
51440 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.255084 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
52438 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
52439 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.254834 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
53438 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
53439 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.259327 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
54439 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
54440 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.254597 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
55439 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
55439 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.25466 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
56439 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
56439 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.255264 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
57439 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
57439 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.256474 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
58439 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
58439 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.25629 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
59438 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
59439 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.255096 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
60439 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
60440 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.25814 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
61439 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
61439 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.255467 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
62438 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
62439 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.25609 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
63438 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
63439 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.25578 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
64439 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
64440 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.26166 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
65439 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
65439 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.260454 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
66438 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
66439 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.260594 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
66769 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.storage.BlockManagerMasterActor - [actor] received message ExpireDeadHosts from Actor[akka://sparkDriver/user/BlockManagerMaster#1587151872]
66770 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.storage.BlockManagerMasterActor - [actor] handled message (0.165651 ms) ExpireDeadHosts from Actor[akka://sparkDriver/user/BlockManagerMaster#1587151872]
67439 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
67439 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.258233 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
68439 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
68439 [sparkDriver-akka.actor.default-dispatcher-4] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.256537 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
69439 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] received message ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
69440 [sparkDriver-akka.actor.default-dispatcher-3] DEBUG org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - [actor] handled message (0.254436 ms) ReviveOffers from Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-883616774]
69480 [sparkDriver-akka.actor.default-dispatcher-3] ERROR org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend - Application has been killed. Reason: All masters are unresponsive! Giving up.
69481 [sparkDriver-akka.actor.default-dispatcher-3] ERROR org.apache.spark.scheduler.TaskSchedulerImpl - Exiting due to error from cluster scheduler: All masters are unresponsive! Giving up.
69487 [delete Spark temp dirs] DEBUG org.apache.spark.util.Utils - Shutdown hook called
69487 [delete Spark local dirs] DEBUG org.apache.spark.storage.DiskBlockManager - Shutdown hook called

Incremental / Reactive / DataFlow DataFrames

Say you have a DDF A defined as:

A = B + C

Is it possible to provide this functionality, perhaps via a special DDF implementation, with either of the following semantics (see the sketch after this list):

  1. A is computed lazily when accessed, using the latest values in B and C (pull)
  2. When B's or C's values change, A is automatically updated (push)
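For concreteness, here is a minimal Java sketch of the two semantics, reduced to scalar cells rather than DataFrames. The Cell and DerivedCell classes and every name below are hypothetical illustration, not part of the DDF API:

import java.util.ArrayList;
import java.util.List;
import java.util.function.BinaryOperator;

// Hypothetical illustration of pull vs. push semantics for A = B + C.
class Cell {
  private double value;
  private final List<Runnable> listeners = new ArrayList<>();

  Cell(double value) { this.value = value; }

  double get() { return value; }

  void set(double v) {
    value = v;
    listeners.forEach(Runnable::run);  // push: notify dependents on change
  }

  void onChange(Runnable r) { listeners.add(r); }
}

class DerivedCell {
  private final Cell b, c;
  private final BinaryOperator<Double> op;
  private Double cached;               // invalidated on push, rebuilt on pull

  DerivedCell(Cell b, Cell c, BinaryOperator<Double> op) {
    this.b = b; this.c = c; this.op = op;
    Runnable invalidate = () -> cached = null;
    b.onChange(invalidate);            // push: a change in B or C marks A stale
    c.onChange(invalidate);
  }

  double get() {                       // pull: computed lazily on access
    if (cached == null) cached = op.apply(b.get(), c.get());
    return cached;
  }
}

public class ReactiveSketch {
  public static void main(String[] args) {
    Cell b = new Cell(1), c = new Cell(2);
    DerivedCell a = new DerivedCell(b, c, Double::sum);  // A = B + C
    System.out.println(a.get());       // 3.0, computed on first access
    b.set(10);                         // push invalidates the cached value
    System.out.println(a.get());       // 12.0, recomputed from the latest B, C
  }
}

A DDF-level version would presumably follow the same shape: the cached scalar becomes a cached materialized table, and the invalidation hook fires whenever an upstream DDF is mutated.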

Compilation failure

I cloned the latest code and ran bin/run-once.sh, and got the following compilation error:

DDF/core/src/main/java/io/basic/ddf/BasicDDFManager.java (at line 20)
[ERROR] public String getEngine() {
[ERROR] ^^^^^^^^^^^
[ERROR] The method getEngine() of type BasicDDFManager must override a superclass method
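One plausible cause, offered as a guess rather than a confirmed diagnosis: this is the error a compiler emits for an @Override annotation it cannot match to a superclass declaration, which happens when the parent class on the classpath is out of date, or when the build runs with a pre-1.6 source level (where @Override is not allowed on interface-method implementations). A reconstructed sketch of the failing spot, with a hypothetical method body:

// io/basic/ddf/BasicDDFManager.java, around line 20 (reconstruction):
@Override                     // rejected when the compiler finds no
public String getEngine() {   // getEngine() to override: a stale superclass,
  return "basic";             // or -source 1.5 with an interface method.
}                             // (return value here is hypothetical)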

R example failed with

When I run Rscript examples/basics.R, all the commands work well until

fivenum(ddf)
which fails with:

Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl, :
java.lang.OutOfMemoryError: PermGen space
Calls: fivenum ... -> .jrcall -> .jcall -> .jcheck -> .Call
Execution halted

So in an R shell I ran the commands from basics.R one at a time.

If I don't run
daggr(mpg ~ vs + carb, ddf, FUN=mean)
or
daggr(ddf, agg.cols="sum(mpg), min(hp)", by="vs, am")

all the other statements work fine.

However, once I run either of those two statements, the error becomes:
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl, :
io.ddf.exception.DDFException: Unable to get fivenum summary of the given columns from table SparkDDF_spark_5be72817_9687_40eb_bab2_8c1fb190bb9c

Sometimes the error changes to:
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl, :
java.lang.OutOfMemoryError: PermGen space
Calls: fivenum ... -> .jrcall -> .jcall -> .jcheck -> .Call
Execution halted

I am quite confused as to why this happens.

Is there anything I am missing?
