
pickling's Introduction

scala/pickling

Join the chat at https://gitter.im/scala/pickling

Scala Pickling is an automatic serialization framework made for Scala. It's fast, boilerplate-free, and allows users to easily swap in/out different serialization formats (such as binary or JSON), or even to provide their own custom serialization format.

Defaults mode

scala> import scala.pickling.Defaults._, scala.pickling.json._
scala> case class Person(name: String, age: Int)

scala> val pkl = Person("foo", 20).pickle
pkl: pickling.json.pickleFormat.PickleType =
JSONPickle({
  "$type": "Person",
  "name": "foo",
  "age": 20
})

scala> val person = pkl.unpickle[Person]
person: Person = Person(foo,20)

For more, flip through the slides of the ScalaDays 2013 presentation, or watch the talk!
For deeper technical details, we've also written an OOPSLA 2013 paper on scala/pickling, Instant Pickles: Generating Object-Oriented Pickler Combinators for Fast and Extensible Serialization.

Get Scala Pickling

Scala Pickling is available on Sonatype for Scala 2.10 and Scala 2.11! You can use Scala Pickling in your sbt project by simply adding the following dependency to your build file:

libraryDependencies += "org.scala-lang.modules" %% "scala-pickling" % "0.10.1"

Please don't use version 0.11.0-M1; it is not production-ready and is still under active development.

What makes it different?

Scala Pickling...

  • can be Language-Neutral if you want it to be. Changing the format of your serialized data is as easy as importing the correct implicit pickle format into scope. Out of the box, we currently support a fast Scala binary format, as well as JSON. Support for other formats is planned. Or, you can even roll your own custom pickle format!
  • is Automatic. That is, without any boilerplate at all, one can instruct the framework to figure out how to serialize an arbitrary class instance. No need to register classes, no need to implement any methods.
  • Allows For Unanticipated Evolution. That means that you don’t have to extend some marker trait in order to serialize a given Scala class. Just import the scala.pickling package and call pickle on the instance that you would like to serialize.
  • gives you more Typesafety. No more errors from serialization/deserialization propagating to arbitrary points in your program. Unlike Java Serialization, errors manifest themselves either as compile-time errors or as runtime errors only at the point of unpickling.
  • has Robust Support For Object-Orientation. While Scala Pickling is based on the elegant notion of pickler combinators from functional programming, it goes on to extend the traditional form of pickler combinators to be able to handle open class hierarchies. That means that if you pickle an instance of a subclass, and then try to unpickle it as a superclass, you will still get back an instance of the original subclass (see the sketch just after this list).
  • Happens At Compile-Time. That means that it's super-performant because serialization-related code is typically generated at compile time and inlined where it is needed in your code. Scala Pickling is essentially fully static; reflection is only used as a fallback when static (compile-time) generation fails.
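
To make the open-hierarchy point above concrete, here is a quick sketch (it assumes the same Defaults/JSON imports as the first example; unsealed hierarchies may fall back to runtime-generated picklers):

import scala.pickling.Defaults._, scala.pickling.json._

trait Fruit
case class Apple(kind: String) extends Fruit

val pkl = Apple("honeycrisp").pickle    // pickled as an Apple
val fruit: Fruit = pkl.unpickle[Fruit]  // still an Apple at run time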

Optimizing performance

Pickling lets you optimize performance through configuration when the pickled objects are known to be simpler than the general case.

Disabling cyclic object graphs

By default, Pickling can serialize cyclic object graphs (for example, doubly-linked lists). However, this requires bookkeeping at run time. If pickled objects are known not to be cyclic (for example, simple lists or trees), then this additional bookkeeping can be disabled using the following import:

import scala.pickling.shareNothing._

If objects are pickled in a tight loop, this import can lead to a significant performance improvement.
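
For instance, a minimal sketch of the pattern (assuming the Defaults and JSON imports from the examples above):

import scala.pickling.Defaults._, scala.pickling.json._
import scala.pickling.shareNothing._  // skip sharing/cycle bookkeeping

case class Point(x: Int, y: Int)

// Pickling many acyclic values in a tight loop; with shareNothing in scope,
// no sharing state needs to be tracked for each pickled object.
val pickles = (1 to 1000).map(i => Point(i, 2 * i).pickle.value)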

Static serialization without reflection

To pickle objects of types like Any, Pickling uses run-time reflection, since not enough information is available at compile time. However, Pickling supports a static-only mode that ensures no run-time reflection is used. In this mode, pickling objects that would otherwise require run-time reflection causes compile-time errors.

The following import enables static-only serialization:

import scala.pickling.static._  // Avoid run-time reflection
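
As a hedged illustration (a sketch, not library documentation) of what static-only mode rules out: a field typed as Any would normally be handled by falling back to run-time reflection, but with the static import in scope that fallback becomes a compile-time error instead.

import scala.pickling.Defaults._, scala.pickling.json._
import scala.pickling.static._   // no run-time fallback

case class Tagged(label: String)   // picklable with compile-time generation
case class Loose(payload: Any)     // the Any field would need run-time reflection

Tagged("ok").pickle                // fine: pickler generated statically
// Loose("boom").pickle            // assumption: rejected at compile time in static-only mode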

A la carte import

If you want, Pickling lets you import specific parts (functions, ops, picklers, and format) so you can customize each part.

import scala.pickling._         // This imports names only
import scala.pickling.json._    // Imports PickleFormat
import scala.pickling.static._  // Avoid runtime pickler

// Import pickle ops
import scala.pickling.Defaults.{ pickleOps, unpickleOps } 
// Alternatively import pickle function
// import scala.pickling.functions._

// Import picklers for specific types
import scala.pickling.Defaults.{ stringPickler, intPickler, refPicklerUnpickler, nullPickler }

case class Pumpkin(kind: String)
// Manually generate a pickler using macro
implicit val pumpkinPickler = Pickler.generate[Pumpkin]
implicit val pumpkinUnpickler = Unpickler.generate[Pumpkin]

val pckl = Pumpkin("Kabocha").pickle
val pump = pckl.unpickle[Pumpkin]

DIY protocol stack

There are also traits available that you can mix and match into your own convenience object to import from. If you're a library author, you can provide such a convenience object as your protocol stack, bundling some or all of the pickling parts:

  • ops
  • functions
  • picklers
  • format
scala> case class Apple(kind: String)
defined class Apple

scala> val appleProtocol = {
     |              import scala.pickling._
     |              new pickler.PrimitivePicklers with pickler.RefPicklers
     |                  with json.JsonFormats {
     |                // Manually generate pickler for Apple
     |                implicit val applePickler = PicklerUnpickler.generate[Apple]
     |                // Don't fall back to runtime picklers
     |                implicit val so = static.StaticOnly
     |                // Provide custom functions
     |                def toJsonString[A: Pickler](a: A): String =
     |                  functions.pickle(a).value
     |                def fromJsonString[A: Unpickler](s: String): A =
     |                  functions.unpickle[A](json.JSONPickle(s))
     |              }
     |            }
appleProtocol: scala.pickling.pickler.PrimitivePicklers with scala.pickling.pickler.RefPicklers with scala.pickling.json.JsonFormats{implicit val applePickler: scala.pickling.Pickler[Apple] with scala.pickling.Unpickler[Apple] with scala.pickling.Generated; implicit val so: scala.pickling.static.StaticOnly.type; def toJsonString[A](a: A)(implicit evidence$1: scala.pickling.Pickler[A]): String; def fromJsonString[A](s: String)(implicit evidence$2: scala.pickling.Unpickler[A]): A} = $anon$1@2b033c35

Now your library user can import appleProtocol as follows:

scala> import appleProtocol._
import appleProtocol._

scala>  toJsonString(Apple("honeycrisp"))
res0: String =
{
  "$type": "Apple",
  "kind": "honeycrisp"
}

scala> fromJsonString(res0)
res1: Apple = Apple(honeycrisp)

Other ways of getting Pickling

If you would like to run the latest development version of scala/pickling (0.10.2-SNAPSHOT), you also need to add the Sonatype "snapshots" repository resolver to your build file:

libraryDependencies += "org.scala-lang.modules" %% "scala-pickling" % "0.10.2-SNAPSHOT"

resolvers += Resolver.sonatypeRepo("snapshots")

For a more illustrative example, see a sample sbt project which uses Scala Pickling.

Or you can just directly download the 0.10.1 jar (Scala 2.10, Scala 2.11).

pickling's People

Contributors

ahnfelt, dzufferey, eed3si9n, emchristiansen, gitter-badger, guersam, havocp, heathermiller, jcracknell, jsuereth, jvican, lbliss, phaller, theblackdragon, xeno-by, zaneli


pickling's Issues

Examples do not show pickling to and from a file

I'd like to pickle an object which is resident in memory at over 8 GB. I can't pickle this to a string; is there an interface to pickle directly to a file? If so, could we show it in the README?
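
A minimal sketch of one way to do this today (not an official API): pickle to the binary format and hand the bytes to java.nio. It assumes the binary pickle's value is an Array[Byte] and that BinaryPickle can wrap a byte array read back from disk.

import java.nio.file.{Files, Paths}
import scala.pickling.Defaults._, scala.pickling.binary._

case class Record(id: Int, payload: String)

// Pickle to bytes and write them to a file.
val bytes: Array[Byte] = Record(1, "hello").pickle.value
Files.write(Paths.get("record.bin"), bytes)

// Read the bytes back and unpickle them (assumes BinaryPickle(Array[Byte]) exists).
val restored = BinaryPickle(Files.readAllBytes(Paths.get("record.bin"))).unpickle[Record]

Note that this still buffers the whole pickle in memory, so it only addresses the "to a file" part of the question, not the 8 GB size concern.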

What happens if my case class has a field with the name `tpe`

Not sure if this is the right place to ask, or if it has been brought up elsewhere before, but when I have a case class such as

case class Ticket(id: String, tpe: String)

The pickler will give me back a JSON string with two "tpe"s:

scala> Ticket(java.util.UUID.randomUUID.toString, "A1").pickle
res10: scala.pickling.json.JSONPickle = 
JSONPickle({
  "tpe": "Ticket",
  "id": "e98e47db-255b-4594-82e3-63aeec3df093",
  "tpe": "A1"
})

which would in turn fail to unpickle:

scala> Ticket(java.util.UUID.randomUUID.toString, "A1").pickle.unpickle
scala.reflect.internal.MissingRequirementError: class A1 not found.
...

Is there a way I can avoid the above behavior, e.g. by telling the library to leave out the auto-generated tpe field, other than rolling my own pickler/unpickler? I could always provide the necessary class info when I intend to unpickle, so, for me at least, the full class name saved in tpe seems pretty much redundant.

Thanks.

BTW I'm using 0.8.0-SNAPSHOT with Scala 2.10.3.

Can't generate unpickler in class hierarchy when passing object params to parent

From StackOverflow:

The example below pickles fine, but I get a compile error stating that no unpickler can be generated. Here is a simple test case to reproduce this:

    import scala.pickling._
    import json._
    object JsonTest extends App {
      val simplePickle = new Simple(new SimpleProp("TestProp")).pickle
      val simpleUnpickle = simplePickle.unpickle[Simple]
    }
    abstract class SimpleAbstract(val stringWrapper: SimpleProp) {}
    class Simple(stringWrapper: SimpleProp) extends SimpleAbstract(stringWrapper) {}
    case class SimpleProp(prop: String) {}

Please let me know if you need any additional information on this.

Here is the stacktrace I get with the -Xlog-implicits flag on:

pickling.this.Unpickler.genUnpickler is not a valid implicit value for scala.pickling.Unpickler[com.ft.flexui.modules.pm.Simple] because:
exception during macro expansion: 
java.util.NoSuchElementException: None.get
    at scala.None$.get(Option.scala:313)
    at scala.None$.get(Option.scala:311)
    at scala.pickling.Macro.reflectively(Tools.scala:380)
    at     scala.pickling.UnpicklerMacros$$anonfun$impl$2$$anonfun$16.apply(Macros.scala:248)
    at     scala.pickling.UnpicklerMacros$$anonfun$impl$2$$anonfun$16.apply(Macros.scala:245)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
at scala.pickling.UnpicklerMacros$$anonfun$impl$2.unpickleObject$1(Macros.scala:245)
at scala.pickling.UnpicklerMacros$$anonfun$impl$2.unpickleLogic$1(Macros.scala:273)
at scala.pickling.UnpicklerMacros$$anonfun$impl$2.apply(Macros.scala:291)
at scala.pickling.UnpicklerMacros$$anonfun$impl$2.apply(Macros.scala:184)
at scala.pickling.Macro.preferringAlternativeImplicits(Tools.scala:357)
at scala.pickling.UnpicklerMacros$class.impl(Macros.scala:184)
at scala.pickling.Compat$$anon$3.impl(Compat.scala:24)
at scala.pickling.Compat$.UnpicklerMacros_impl(Compat.scala:25)
at sun.reflect.GeneratedMethodAccessor270.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at scala.tools.nsc.typechecker.Macros$$anonfun$scala$tools$nsc$typechecker$Macros$$macroRuntime$3$$anonfun$apply$8.apply(Macros.scala:544)
at scala.tools.nsc.typechecker.Macros$$anonfun$scala$tools$nsc$typechecker$Macros$$macroRuntime$3$$anonfun$apply$8.apply(Macros.scala:544)
at scala.tools.nsc.typechecker.Macros$class.scala$tools$nsc$typechecker$Macros$$macroExpandWithRuntime(Macros.scala:830)
at scala.tools.nsc.typechecker.Macros$$anonfun$scala$tools$nsc$typechecker$Macros$$macroExpand1$1.apply(Macros.scala:796)
at scala.tools.nsc.typechecker.Macros$$anonfun$scala$tools$nsc$typechecker$Macros$$macroExpand1$1.apply(Macros.scala:787)
at scala.tools.nsc.Global.withInfoLevel(Global.scala:190)
at scala.tools.nsc.typechecker.Macros$class.scala$tools$nsc$typechecker$Macros$$macroExpand1(Macros.scala:787)
val simpleUnpickle = simplePickle.unpickle[Simple]
                                        ^

SPickler does not support traits

I try to defined a dummy SPickler this way:

class CustomPickler[T](implicit val format: PickleFormat) extends SPickler[T]{
  def pickle(t: T, builder: PBuilder): Unit = {
    println("In my pickle")
    builder.beginEntry(t)
    builder.endEntry
  }
}

then I defined an implicit this way:

  implicit def cPickler = new CustomPickler[X]
  println((new Y(0)).pickle)

where Y extends the trait X:

trait X {
  def id: Int
}

class Y(val id: Int) extends X

but it does not apply the pickle method I defined in the CustomPickler class, whereas it does if I replace X by Y in the definition of the implicit. Conclusion: SPickler does not support traits (if I coded things correctly), which is a big limitation.

Out of memory while compiling

I'm getting java.lang.OutOfMemoryError: Java heap space while compiling my project with pickling since a recent update. It used to compile fine last week, and it still works fine as long as I comment out simulation.pickle in class Parser.

I'm running sbt compile with default parameters:
/usr/bin/java -Xmx512M -jar /usr/local/Cellar/sbt/0.13.0/libexec/sbt-launch.jar compile
but I have tried to increase my heap up to 4GB, only to get: java.lang.OutOfMemoryError: GC overhead limit exceeded

The process uses 100% of one cpu until it reaches the heap limit, then it uses every cpu as it is forced to GC.

Most recent error log
Another error log

I'm currently trying to reproduce this with a smaller part of the project, but I only manage to get java.io.IOException: File name too long as in #10, instead of the intended OutOfMemoryError.

Design of PickleFormat trait

Hi!

I've created a library using the power of scala.pickling.

While I was working on that library I was struggling with PickleFormat.

The problem is that this is a single type for both pickle and unpickle operations. If you're serializing into a database, there are different types of objects for reading and writing, such as a Statement class (for writing parameters) and ResultSet/Row classes for reading.
For example, in PickleFormat you have to define a createBuilder method without parameters, which should create an instance of an empty output type. This is problematic with a database, where you have a session/connection object and the output (Statement) is always bound to the connection. My proposal is to define separate traits for input/output types:

trait PickleInputFormat {
  def createReader(pickle: PickleType, mirror: Mirror): PReader
}

trait PickleOutputFormat {
  type OutputType
  def createBuilder(): PBuilder
  def createBuilder(out: OutputType): PBuilder
}

This allows you to create a pickle method with implicit parameters that accepts the database's Statement/Session where it's available. The same applies to PickleOutputFormat.

If you're interested in the library, please see it here: https://github.com/InnovaCo/capickling
The library is for Cassandra and allows binding (which is actually serialization) to a database statement and mapping (deserialization) from database results.

Support for OpenJDK 6

The API of sun.misc.Unsafe in OpenJDK 6 is slightly smaller than in Oracle JDK 6. It would be good to also support OpenJDK 6.

Apparent regression

I think a bug was added within the last 11 days to 0.8.0-SNAPSHOT.
I haven't been able to reproduce it as a test case in the scala-pickling project, but I do have a fairly simple external project in which the bug shows up.

The project is https://github.com/emchristiansen/PersistentMap.
It passed its builds on 3 October, but failed when it was rebuilt today.
Nothing of significance changed in that time (the logs show the Scala version changed, but I have also tested the previous version).

The build error comes from this file: https://github.com/emchristiansen/PersistentMap/blob/master/src/test/scala/st/sparse/persistentmap/testPersistentMap.scala

The error is

[error] /media/psf/Home/Dropbox/t/2013_q4/persistentmap/src/test/scala/st/sparse/persistentmap/testPersistentMap.scala:52: Cannot generate an unpickler for st.sparse.persistentmap.MyValue. Recompile with -Xlog-implicits for details
[error]     val map = PersistentMap.create[MyKey, MyValue]("test", database)

Enabling -Xlog-implicits, we have

[info] /media/psf/Home/Dropbox/t/2013_q4/persistentmap/src/test/scala/st/sparse/persistentmap/testPersistentMap.scala:93: pickling.this.Unpickler.genUnpickler is not a valid implicit value for scala.pickling.Unpickler[st.sparse.persistentmap.MyValue] because:
[info] hasMatchingSymbol reported error: value forcefulSet is not a member of reflect.runtime.universe.FieldMirror
[info]     val map2 = PersistentMap.connect[MyKey, MyValue]("test", database).get

I have also distilled the error into a test case here: https://github.com/emchristiansen/pickling/blob/picklerGenerationTest/core/src/test/scala/pickling/generic-spickler.scala.
See the test labeled "possible regression".

POM file doesn't validate

I was trying to download the snapshot version (0.8.0-SNAPSHOT) through a proxy repository (Artifactory); however, that failed because the XML in the POM doesn't validate against the XSD schema for POMs.

The problem is that there are two organization elements, while the XSD for the POM only allows one.

In Build.scala an extra organization element is defined in pomExtra. Just removing the organization element from pomExtra would fix this problem. Sbt already has a setting key that can be used to set the organization name.

Pickling Java classes yields empty objects

Hi,
It seems that Java classes aren't currently handled well, whether pickled directly or, say, embedded in fields of pure Scala classes.
The examples below compile successfully, and generate pickled representations.

However in each case:
a) unpickling to the specific type successfully returns an object; however, it's incorrectly initialised (i.e. its fields don't match those of the original object)

b) unpickling to [Any] fails at runtime.

This is rather unfortunate, since case (a) in particular could lead to hard-to-detect errors.
It would be great if you could take a look.

package picklingtests
import scala.pickling._
import scala.pickling.json._

import org.joda.time.LocalDate

import org.junit._
import org.junit.Assert._

case class X(date: org.joda.time.LocalDate)

class PickleTest {

  @Test def testPickleCaseClassJavaField_Specific = {
    val dt = LocalDate.now().plusDays(3)
    val x = X(dt)
    val pickled = x.pickle
    println(pickled)
    val restored = pickled.unpickle[X]

    assertEquals(x, restored)
  }

    @Test def testPickleCaseClassJavaField_Any = {
    val dt = LocalDate.now().plusDays(3)
    val x = X(dt)
    val pickled = x.pickle
    println(pickled)
    val restored = pickled.unpickle[Any]

    assertEquals(x, restored)
  }


  @Test def testPickleJavaClass_Any = {
    val x = LocalDate.now().plusDays(3)
    val pickled = x.pickle
    println(pickled)
    val restored = pickled.unpickle[Any]

    assertEquals(x, restored)
  }

  @Test def testPickleJavaClass_Specific = {
    val x = LocalDate.now().plusDays(3)
    val pickled = x.pickle
    println(pickled)
    val restored = pickled.unpickle[LocalDate]

    assertEquals(x, restored)
  }

  @Test def testPickleJavaClass2_Any = {
    val x = new java.awt.Rectangle(1, 2, 3, 4)
    val pickled = x.pickle
    println(pickled)
    val restored = pickled.unpickle[Any]

    assertEquals(x, restored)
  }

  @Test def testPickleJavaClass2_Specific = {
    val x = new java.awt.Rectangle(1, 2, 3, 4)
    val pickled = x.pickle
    println(pickled)
    val restored = pickled.unpickle[java.awt.Rectangle]

    assertEquals(x, restored)
  }

}

Output:

JSONPickle({
  "tpe": "picklingtests.X",
  "date": {

  }
})
JSONPickle({
  "tpe": "picklingtests.X",
  "date": {

  }
})
JSONPickle({
  "tpe": "java.awt.Rectangle"
})
JSONPickle({
  "tpe": "org.joda.time.LocalDate"
})
JSONPickle({
  "tpe": "org.joda.time.LocalDate"
})
JSONPickle({
  "tpe": "java.awt.Rectangle"
})

picklingtests.PickleTest

testPickleCaseClassJavaField_Specific(picklingtests.PickleTest)
java.lang.AssertionError: expected:<X(2013-10-24)> but was:<X(2013-10-21)>


testPickleCaseClassJavaField_Any(picklingtests.PickleTest)
java.util.NoSuchElementException: key not found: x$1
    at scala.collection.MapLike$class.default(MapLike.scala:228)
    at scala.collection.AbstractMap.default(Map.scala:58)
    at scala.collection.MapLike$class.apply(MapLike.scala:141)
    at scala.collection.AbstractMap.apply(Map.scala:58)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:235)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:159)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2$$anonfun$fieldVals$1$1.apply(Runtime.scala:174)



testPickleJavaClass2_Specific(picklingtests.PickleTest)
java.lang.AssertionError: expected:<java.awt.Rectangle[x=1,y=2,width=3,height=4]> but was:<java.awt.Rectangle[x=0,y=0,width=0,height=0]>


testPickleJavaClass_Any(picklingtests.PickleTest)
java.util.NoSuchElementException: key not found: x$1
    at scala.collection.MapLike$class.default(MapLike.scala:228)
    at scala.collection.AbstractMap.default(Map.scala:58)
    at scala.collection.MapLike$class.apply(MapLike.scala:141)
    at scala.collection.AbstractMap.apply(Map.scala:58)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:235)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:159)

testPickleJavaClass_Specific(picklingtests.PickleTest)
java.lang.AssertionError: expected:<2013-10-24> but was:<2013-10-21>


testPickleJavaClass2_Any(picklingtests.PickleTest)
java.util.NoSuchElementException: key not found: x$1
    at scala.collection.MapLike$class.default(MapLike.scala:228)
    at scala.collection.AbstractMap.default(Map.scala:58)
    at scala.collection.MapLike$class.apply(MapLike.scala:141)
    at scala.collection.AbstractMap.apply(Map.scala:58)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:235)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:159)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2$$anonfun$fieldVals$1$1.apply(Runtime.scala:174)



Benchmark of usefulness of endCollection having the collection size

Currently, the correct collection size is only supplied at endCollection; beginCollection receives it only for indexed sequences:

if (coll.isInstanceOf[IndexedSeq[_]]) builder.beginCollection(coll.size)
else builder.beginCollection(0)

What would be the impact of

builder.beginCollection(coll.size)

and getting rid of the length argument at endCollection.

Benefits:

  • Incremental
  • Simpler API

generic function with pickling

Is it possible to write a generic function that would pickle a given argument of a yet-unknown type?
Something like:

object Barrel {
    def put[A](objx: A) = objx.pickle.value
    def get[A](valx: String) = valx.unpickle[A]
}
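
A hedged sketch of how this could work (mirroring the context-bound pattern that appears in other issues here, e.g. def p[A: SPickler: FastTypeTag]): require the pickler/unpickler evidence at the call site, so the macros can resolve it where the concrete type is known.

import scala.pickling._
import json._

object Barrel {
  // Pickling needs an SPickler and a FastTypeTag for A at the call site.
  def put[A: SPickler: FastTypeTag](objx: A): String =
    objx.pickle.value

  // Unpickling needs an Unpickler and a FastTypeTag for A at the call site.
  def get[A: Unpickler: FastTypeTag](valx: String): A =
    JSONPickle(valx).unpickle[A]
}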

Pickling case classes with objects and supertypes

Hi,

I am having some problems with pickling. It seems to be doing something odd when trying to pickle case classes that contain objects. Consider this REPL session:

scala> import scala.pickling._; import json._
import scala.pickling._
import json._

scala> trait A
defined trait A

scala> case class B(x: Option[Int]) extends A
defined class B

scala> B(Some(1)).pickle.unpickle[B] == B(Some(1))
res1: Boolean = true 

scala> B(None).pickle.unpickle[B] == B(None)
res2: Boolean = true

scala> B(None).pickle.unpickle[A]
res4: A = B(None)

!!!!!!!!!
scala> B(None).pickle.unpickle[A] == B(None)
res3: Boolean = false 
!!!!!!!!!

scala> (B(None): A) == B(None)
res6: Boolean = true

As you can see, there is something odd going on with the == method in the highlighted example above. Am I doing something wrong? This happens only if we unpickle B using the unpickle[A] method (as far as I know).
Thanks,
Tomas

Scala pickle creates needless copies of objects

In this code

object Test extends App {
        val x = new X
        val p = x.pickle
        val y = p.unpickle[X]
        x.a(0) = 4
        y.a(0) = 4
        println(x.b.mkString) //prints 423
        println(y.b.mkString) //prints 123
}

class X {
        var a = Array(1,2,3)
        var b = a
}

X will pickle into:

JSONPickle({
  "tpe": "X",
  "a": [
    1,
    2,
    3
  ],
"b": [
  1,
  2,
  3
]
})

This will unpickle into an X with two copies of the same array. In the above example, y is an unpickled x. Instead of b pointing to the same array object as a, it now points to a new, different array.

This is related to issue #1

List as type parameter error

Hello,

I have this function:

def p [ A : SPickler : FastTypeTag ](a: A) = a.pickle

when I try to call it with Seq[Int] => p(Seq(1,2,3)) it returns

scala.pickling.json.JSONPickle = 
JSONPickle({
  "tpe": "scala.collection.Seq[scala.Int]",
  "elems": [
    1,
    2,
    3
  ]
})

but if I try to call it with List[Int] => p(List(1,2,3)) it causes an exception

scala.NotImplementedError: an implementation is missing
    at scala.Predef$.$qmark$qmark$qmark(Predef.scala:252)
    at scala.pickling.PicklerUnpicklerNotFound.pickle(Custom.scala:19)
    at .p(<console>:17)
    at .<init>(<console>:19)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:734)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:983)
    at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:604)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:568)
    at scala.tools.nsc.interpreter.ILoop.reallyInterpret$1(ILoop.scala:745)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:790)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:702)
    at scala.tools.nsc.interpreter.ILoop.processLine$1(ILoop.scala:566)
    at scala.tools.nsc.interpreter.ILoop.innerLoop$1(ILoop.scala:573)
    at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:576)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:867)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:822)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:822)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:822)
    at scala.tools.nsc.interpreter.ILoop.main(ILoop.scala:889)
    at xsbt.ConsoleInterface.run(ConsoleInterface.scala:57)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:73)
    at sbt.compiler.AnalyzingCompiler.console(AnalyzingCompiler.scala:64)
    at sbt.Console.console0$1(Console.scala:23)
    at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply$mcV$sp(Console.scala:24)
    at sbt.TrapExit$.executeMain$1(TrapExit.scala:33)
    at sbt.TrapExit$$anon$1.run(TrapExit.scala:42)

Thanks for your help,
Milan

Running into problems when pickling Map and HashMap

steps (0.10.0)

scala-pickling> console
[info] Starting scala interpreter...
[info] 
Welcome to Scala version 2.11.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_51).
Type in expressions to have them evaluated.
Type :help for more information.

scala> import scala.collection.immutable
import scala.collection.immutable

scala> import scala.pickling._, Defaults._, json._
import scala.pickling._
import Defaults._
import json._

scala> immutable.HashMap(( for (j <- 0 until 20) yield j -> "Test" ): _* ).pickle.value

problem (0.10.0)

scala> println(immutable.HashMap(( for (j <- 0 until 20) yield j -> "Test" ): _* ).pickle.value)
{
  "$type": "scala.collection.immutable.HashMap.HashTrieMap",
  "bitmap": 2044127639,
  "elems": {
    "elems": [
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    }
    ]
  },
  "size0": 20
}

original report

My data structure is stored as a Map and when I try to pickle it I run into some weird issues.

When pickling an immutable.HashMap using the following example code:

import scala.collection.immutable
import scala.pickling._
import json._

object Test {
  def main(args : Array[String]) {
    println(immutable.HashMap(( for (j <- 0 until 20) yield j -> "Test" ): _* ).pickle.toString)
  }
}

as output I get

JSONPickle({
  "tpe": "scala.collection.immutable.HashMap.HashTrieMap",
  "bitmap": 2044127639,
  "elems": {

  },
  "size0": 20
})

If I use instead the following snippet of code

import scala.collection.Map
import scala.pickling._
import json._

object Test {
  def main(args : Array[String]) {
    println(Map(( for (j <- 0 until 20) yield j -> "Test" ): _* ).pickle.toString)
  }
}

I also end up with

JSONPickle({
  "tpe": "scala.collection.immutable.HashMap.HashTrieMap",
  "bitmap": 2044127639,
  "elems": {

  },
  "size0": 20
})

Only when I don't explicitly import Map do I end up with the correct output.

"invalid index 1 in unpicklee cache of length 1" during unpickling where object ref is repeated

steps (0.10.0)

scala-pickling> console
[info] Starting scala interpreter...
[info] 
Welcome to Scala version 2.11.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_51).
Type in expressions to have them evaluated.
Type :help for more information.

scala> import scala.pickling._, Defaults._, json._
import scala.pickling._
import Defaults._
import json._

scala> case class StringProp(prop: String) {}
defined class StringProp

scala> class PropTest(val prop1: StringProp, val prop2: StringProp) {}
defined class PropTest

scala> val obj = StringProp("test")
obj: StringProp = StringProp(test)

scala> val pickle = new PropTest(obj, obj).pickle.unpickle[PropTest]

problems

scala> val pickle = new PropTest(obj, obj).pickle.unpickle[PropTest]
scala.pickling.PicklingException: error in unpickle of primitive unpickler 'scala.pickling.refs.Ref':
tag in unpickle: 'scala.pickling.refs.Ref'
message:
fatal error: invalid index 1 in unpicklee cache of length 1
  at scala.pickling.pickler.PrimitivePickler.unpickle(Primitives.scala:31)
  at $line12$readiwiwiwiwPropTestUnpickler$macro$4$2$$line11$readiwiwiwiwStringPropUnpickler$macro$12$2$.unpickle(<console>:20)
  at $line12$readiwiwiwiwPropTestUnpickler$macro$4$2$.unpickle(<console>:20)
  at scala.pickling.Unpickler$class.unpickleEntry(Pickler.scala:79)
  at $line12$readiwiwiwiwPropTestUnpickler$macro$4$2$.unpickleEntry(<console>:20)
  at scala.pickling.functions$.unpickle(functions.scala:11)
  at scala.pickling.UnpickleOps.unpickle(Ops.scala:23)
  ... 43 elided

original report

Here is a sample use case to produce an Invalid Index exception when unpickling:

import scala.pickling._
import json._
object JsonTest extends App {
  val obj = StringProp("test")
  val pickle = new PropTest(obj, obj).pickle.unpickle[PropTest]
}
case class StringProp(prop: String) {}
class PropTest(val prop1: StringProp, val prop2: StringProp) {}

As reported on StackOverflow:
http://stackoverflow.com/questions/19436070/invalid-index-unpickling-where-object-ref-is-repeated

publish roadmap

It would be handy to have, as a rough estimate of course, a roadmap to the first release that can be considered more or less stable. Thanks!

Schema evolution handling

Hi,

Scala Pickling looks highly promising, so I want to migrate an existing Eventsourced application which uses default Java serialization for event storage.

Currently the biggest concern other than #27 is schema versioning. The application requirements change as time goes by, so eventually I'll need to add/rename/remove some fields from event messages, and then compatibility with previous versions of messages will be an issue.

Is there any plan or recommendation about this?

Salat

Is it possible to see if there is a possibility to interop with Salat, to avoid duplication and framework fragmentation?

CustomPicklers cannot be used in generic functions.

object Test extends App {
        implicit def genCustomPersonPickler[T <: Person](implicit format: PickleFormat) = new CustomPersonPickler
        def fn[T <: Person](x: T) = x.pickle
}

case class Person(name: String, age: Int, salary: Int)

class CustomPersonPickler(implicit val format: PickleFormat) extends SPickler[Person] {
        def pickle(picklee: Person, builder: PBuilder) {}
}

The above code will fail to compile even though a normal implicit conversion can be used in fn. The error is:

[error] exception during macro expansion: 
[error] scala.ScalaReflectionException: type T is not a class
[error]         at scala.reflect.api.Symbols$SymbolApi$class.asClass(Symbols.scala:323)
[error]         at scala.reflect.internal.Symbols$SymbolContextApiImpl.asClass(Symbols.scala:73)
[error]         at scala.pickling.PickleMacros$class.pickleInto(Macros.scala:276)
[error]         at scala.pickling.Compat$$anon$5.pickleInto(Compat.scala:30)
[error]         at scala.pickling.Compat$.PickleMacros_pickleInto(Compat.scala:31)
[error] one error found

This was inspired by issue #3.

edit: My code was slightly inaccurate, so I fixed it. A regular implicit conversion with bounds like [T <: Person] can be used in fn, but conversion to CustomPersonPickler does not work though it should be able to.

Case class with List/Array/Seq does not unpickle to its parent type

steps

Hello, I have an issue with unpickling derived case classes.
Example (an additional import of Defaults._ is needed for 0.10.x):

scala> import scala.pickling._, Defaults._
import scala.pickling._

scala> import json._
import json._

scala> trait A
defined trait A

scala> case class B(a: List[Int]) extends A
defined class B

scala> B(List(1,2,3,4)).pickle.unpickle[B]
res1: B = B(List(1, 2, 3, 4))

scala> B(List(1,2,3,4)).pickle.unpickle[A]

problem

scala> B(List(1,2,3,4)).pickle.unpickle[A]
java.util.NoSuchElementException: key not found: hd
    at scala.collection.MapLike$class.default(MapLike.scala:228)
    at scala.collection.AbstractMap.default(Map.scala:58)
    at scala.collection.MapLike$class.apply(MapLike.scala:141)
    at scala.collection.AbstractMap.apply(Map.scala:58)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:234)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:158)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2$$anonfun$fieldVals$1$1.apply(Runtime.scala:172)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2$$anonfun$fieldVals$1$1.apply(Runtime.scala:171)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.AbstractTraversable.map(Traversable.scala:105)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2.fieldVals$1(Runtime.scala:171)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2.unpickle(Runtime.scala:202)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2$$anonfun$fieldVals$1$1.apply(Runtime.scala:188)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2$$anonfun$fieldVals$1$1.apply(Runtime.scala:171)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.AbstractTraversable.map(Traversable.scala:105)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2.fieldVals$1(Runtime.scala:171)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2.unpickle(Runtime.scala:202)
    at .<init>(<console>:17)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:734)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:983)
    at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:604)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:568)
    at scala.tools.nsc.interpreter.ILoop.reallyInterpret$1(ILoop.scala:745)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:790)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:702)
    at scala.tools.nsc.interpreter.ILoop.processLine$1(ILoop.scala:566)
    at scala.tools.nsc.interpreter.ILoop.innerLoop$1(ILoop.scala:573)
    at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:576)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:867)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:822)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:822)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:822)
    at scala.tools.nsc.interpreter.ILoop.main(ILoop.scala:889)
    at xsbt.ConsoleInterface.run(ConsoleInterface.scala:57)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:73)
    at sbt.compiler.AnalyzingCompiler.console(AnalyzingCompiler.scala:64)
    at sbt.Console.console0$1(Console.scala:23)
    at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply$mcV$sp(Console.scala:24)
    at sbt.TrapExit$.executeMain$1(TrapExit.scala:33)
    at sbt.TrapExit$$anon$1.run(TrapExit.scala:42)

note

this works fine:

scala> trait A
defined trait A

scala> case class B(a: Int) extends A
defined class B

scala> B(2).pickle.unpickle[A]
res3: A = B(2)

Unable to deserialize Option[X] member of case class.

This is the minimum test necessary to reproduce the issue. I have a more complicated case class than this in my code base, but the Option[X] member appears to be the problem. The October 13th snapshot update seems to have introduced the bug. I get the same error in 2.10.3-RC2 as well as 2.10.2.

import scala.pickling._
import scala.pickling.binary._


object Test {
  case class TestA (x: Option[Int])
  TestA(Some(1)).pickle.unpickle[TestA]

  case class TestB (x: Option[String])
  TestB(Some("hello")).pickle.unpickle[TestB]
}

I turned on -Xlog-implicits to get the following error on compile

...
[info] /Users/ashton/Projects/event_api/src/main/scala/Pickling.scala:7: pickling.this.Unpickler.genUnpickler is not a valid implicit value for scala.pickling.Unpickler[Test.TestA] because:
[info] hasMatchingSymbol reported error: value forcefulSet is not a member of reflect.runtime.universe.FieldMirror
[info]   TestA(Some(1)).pickle.unpickle[TestA]
[info]                                 ^
[error] /Users/ashton/Projects/event_api/src/main/scala/Pickling.scala:7: Cannot generate an unpickler for Test.TestA. Recompile with -Xlog-implicits for details
[error]   TestA(Some(1)).pickle.unpickle[TestA]
[error]
...

I was able to find the FieldMirror class in the nightly docs
http://www.scala-lang.org/files/archive/nightly/docs-2.10.2/library/index.html#scala.reflect.api.Mirrors$FieldMirror

forcefulSet does not appear to be a member, but it does appear to be added with RichFieldMirror implicit in core/src/main/scala/pickling/internal/package.scala

The following does work

scala> val x:Option[Int] = Some(1)
x: Option[Int] = Some(1)

scala> val y = x.pickle
y: scala.pickling.binary.BinaryPickle = BinaryPickle([0,0,0,21,115,99,97,108,97,46,83,111,109,101,91,115,99,97,108,97,46,73,110,116,93,0,0,0,1])

scala> y.unpickle[Option[Int]]
res18: Option[Int] = Some(1)

Cannot generate a pickler for a class named 'Unit' when not namespaced.

steps

To reproduce, run from the REPL:

scala> import scala.pickling._, Defaults._
import scala.pickling._
import Defaults._

scala> import json._
import json._

scala> object Container { case class Unit(x: Int) }
defined module Container

scala> import Container._
import Container._

scala> case class C(x: Unit)
defined class C

scala> val c = C(Unit(3))
c: C = C(Unit(3))

scala> c.pickle

problem (0.10.0)

scala> c.pickle
<console>:24: error: Cannot generate a pickler for C. Recompile with -Xlog-implicits for details
              c.pickle
                ^

problem from the original report

fails with:

<console>:20: pickling.this.SPickler.genPickler is not a valid implicit value for scala.pickling.SPickler[C.type] because:
hasMatchingSymbol reported error: type mismatch;
 found   : scala.Unit
 required: Container.Unit
              C.pickle

expectations

Note that redefining C with a fully qualified path, without importing Container._:

case class C(x: Container.Unit)
val c = C(Container.Unit(3))
c.pickle

evaluates to

JSONPickle({
  "tpe": "C",
  "x": {
    "x": 3
  }
})

which is correct.

Weird behavior when pickling org.joda.time.DateTime

When pickling and then unpickling a DateTime, the original value of the DateTime is lost, replaced with the current time.

Here's a console session:

import scala.pickling._
import binary._
import org.joda.time.DateTime

val date = DateTime.now
// date: org.joda.time.DateTime = 2013-09-16T13:04:39.214+02:00

// The unpickled date is different.
date.pickle.unpickle[DateTime]
// org.joda.time.DateTime = 2013-09-16T13:05:51.761+02:00

// Calling the same line again produces yet another result.
date.pickle.unpickle[DateTime]
// org.joda.time.DateTime = 2013-09-16T13:06:08.124+02:00

Runtime crash when pickling a FastTypeTag

Maybe it's a bit weird, but I want to serialize a FastTypeTag, so I can do type verification for a type-safe key-value store I hacked together.
Unfortunately, while the pickling seems to work, unpickling causes a runtime error.

steps

This code:

import scala.pickling._, Defaults._
import scala.pickling.binary._

val ftt = implicitly[FastTypeTag[Int]]
val pickled = ftt.pickle
// Crashes in this line.
val unpickled = pickled.unpickle[FastTypeTag[Int]]

problem

fails with this runtime error:

[info]   scala.reflect.internal.MissingRequirementError: class $anon$1 not found.
[info]   at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
[info]   at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
[info]   at scala.reflect.internal.Mirrors$RootsBase.ensureClassSymbol(Mirrors.scala:90)
[info]   at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:119)
[info]   at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:21)
[info]   at scala.pickling.package$.typeFromString(package.scala:65)
[info]   at scala.pickling.FastTypeTag$.apply(FastTags.scala:56)
[info]   at scalatestextra.TestPersistentMap$$anonfun$1.apply$mcV$sp(testPersistentMap.scala:29)
[info]   at scalatestextra.TestPersistentMap$$anonfun$1.apply(testPersistentMap.scala:26)
[info]   at scalatestextra.TestPersistentMap$$anonfun$1.apply(testPersistentMap.scala:26)

Scala pickle cannot handle Map

For this code

import scala.pickling._
import json._


object Test extends App {
  val elems = Map("elem1" -> 1,
                  "elem2" -> 2,
                  "elem3" -> 3)

  val pickle = elems.pickle
  println(pickle)
}

elems will pickle into:

JSONPickle({
  "tpe": "scala.collection.immutable.Map.Map3"
})

which is wrong.

It would be nice to have something like the following (by the way, the "tpe" below doesn't correspond to a real thing; it's just added as an example):

JSONPickle({
  "tpe": "scala.collection.immutable.Map.Map3[scala.Predef.String, scala.Int]",
  "elems" : {
    "elem1": 1,
    "elem2": 2,
    "elem3": 3
  }
})

Please explain how to deal with PicklerUnpicklerNotFound

I'm currently getting a runtime NotImplementedError from PicklerUnpicklerNotFound, and I'd like to know what the general strategies should be for working around it.
As this is the most common bug I've found while using pickling, maybe it deserves a section in the README.

I'm also wondering why it even exists; it seems to push detectable compile-time errors (no pickler found) into a runtime error.
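
One strategy, sketched here on the assumption that the a la carte imports from the README above apply (this is not an official recommendation): generate the needed picklers explicitly and use the static-only import, so a missing pickler surfaces as a compile-time error rather than as a runtime PicklerUnpicklerNotFound.

import scala.pickling._
import scala.pickling.json._
import scala.pickling.static._   // no runtime fallback
import scala.pickling.Defaults.{ pickleOps, unpickleOps, stringPickler }

case class Payload(data: String)

// Explicitly generated picklers; if generation is impossible,
// the error shows up here at compile time.
implicit val payloadPickler = Pickler.generate[Payload]
implicit val payloadUnpickler = Unpickler.generate[Payload]

val pckl = Payload("x").pickle
val back = pckl.unpickle[Payload]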

BigDecimal unpickling doesn't work

I'll let the console history speak for itself:

scala> import scala.pickling._
import scala.pickling._

scala> import binary._
import binary._

scala> (2.3: BigDecimal).pickle.unpickle[BigDecimal]
<console>:29: error: Cannot generate an unpickler for BigDecimal. Recompile with -Xlog-implicits for details
              (2.3: BigDecimal).pickle.unpickle[BigDecimal]

Akka Support

Hey, thanks a lot for the great work! This would solve a big problem for me if I could use it with Akka.

Viktor mentions @ https://groups.google.com/forum/#!searchin/akka-user/pickling/akka-user/CHeusozMtuQ/lfMtqHu0FvoJ that there might be some issues with Akka support because Pickling generates code at compile-time.

Are there any plans for an Akka extension? Can you elaborate in more detail what potential problem Viktor is referring to and what approaches might solve it?

Thank you very much and kind regards,
Philip

scala.NotImplementedError when using custom function to unpickle case class to its parent trait type

Hello, here is the problem:

scala> import scala.pickling._
import scala.pickling._

scala> import json._
import json._

scala> trait Fruit
defined trait Fruit

scala> case class Apple() extends Fruit
defined class Apple

// This works just fine: 
scala> Apple().pickle.unpickle[Fruit]
res0: Fruit = Apple()

but when I define a function to unpickle any instance of type T, or any instance derived from trait T, it fails

scala> def unpickle[T : Unpickler : FastTypeTag](p : JSONPickle) = p.unpickle[T]
unpickle: [T](p: scala.pickling.json.JSONPickle)(implicit evidence$1: scala.pickling.Unpickler[T], implicit evidence$2: scala.pickling.FastTypeTag[T])T

scala> unpickle[Fruit](Apple().pickle)

problem

scala> unpickle[Fruit](Apple().pickle)
scala.NotImplementedError: an implementation is missing
    at scala.Predef$.$qmark$qmark$qmark(Predef.scala:252)
    at scala.pickling.PicklerUnpicklerNotFound.unpickle(Custom.scala:20)
    at .unpickle(<console>:13)
    at .<init>(<console>:18)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:734)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:983)
    at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:604)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:568)
    at scala.tools.nsc.interpreter.ILoop.reallyInterpret$1(ILoop.scala:745)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:790)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:702)
    at scala.tools.nsc.interpreter.ILoop.processLine$1(ILoop.scala:566)
    at scala.tools.nsc.interpreter.ILoop.innerLoop$1(ILoop.scala:573)
    at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:576)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:867)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:822)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:822)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:822)
    at scala.tools.nsc.interpreter.ILoop.main(ILoop.scala:889)
    at xsbt.ConsoleInterface.run(ConsoleInterface.scala:69)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:102)
    at sbt.compiler.AnalyzingCompiler.console(AnalyzingCompiler.scala:77)
    at sbt.Console.sbt$Console$$console0$1(Console.scala:23)
    at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply$mcV$sp(Console.scala:24)
    at sbt.TrapExit$.sbt$TrapExit$$executeMain$1(TrapExit.scala:33)
    at sbt.TrapExit$$anon$1.run(TrapExit.scala:42)

The function works fine with a concrete type:
scala> unpickle[Apple](Apple().pickle)
res1: Apple = Apple()

Thanks a lot,
Milan

Support for common collections traits

First of all, thanks for doing this work -- very cool and practical idea.

Is there a plan in place to handle automatic pickler dispatch for e.g.

Seq(1, 2, 3).pickle

Unfortunately, it's one of the first examples I tried. I think I understand the concept; custom pickler plug-ins could potentially provide a DPickler to handle this reflectively. However, would there be a way to relieve every plug-in author of having to do this, perhaps by importing some optional implicits?

I started work on custom picklers for Apache Avro in topology-io/pickling-avro. This is currently on hold at least until the stable release, mod spare time. I have another project which uses runtime reflection to serialize/deserialize arbitrary types to Avro in GenslerAppsPod/scalavro and am hoping to improve upon that implementation by taking advantage of the macros provided in scala-pickling.

Pickling - possible race condition?

Hey guys,

I have the following problem:

I am trying to pickle this case class:

trait GuiMessage
case class SkillMatrix(operators: Seq[String], skills: Seq[String], enabledSkills: Seq[(String,String)]) extends GuiMessage

The problem is that when I try to pickle this class from two different threads at the same time, I get the following:

{x:Any => val pickled = x.asInstanceOf[SkillMatrix].pickle.value; println(s"pickling $x into $pickled");pickled}) //this function is called from two different threads
pickling SkillMatrix(List(1234, 4312),List(skill1, skill2),List((1234,skill1), (1234,skill2), (4312,skill1))) into {
  "tpe": "com.spinoco.horus.message.GuiMessage.SkillMessage.SkillMatrix",
  "operators": {
    "tpe": "scala.collection.Seq[java.lang.String]",
    "elems": [
      "1234",
      "4312"
    ]
  },
  "skills": {
    "tpe": "scala.collection.Seq[java.lang.String]",
    "elems": [
      "skill1",
      "skill2"
    ]
  },
  "enabledSkills": {
    "tpe": "scala.collection.Seq[scala.Tuple2[java.lang.String,java.lang.String]]",
    "elems": [
      { "$ref": 2 },
      {
      "tpe": "scala.Tuple2[java.lang.String,java.lang.String]",
      "a": "1234",
      "b": "skill2"
    },
      { "$ref": 4 }
    ]
  }
}
pickling SkillMatrix(List(1234, 4312),List(skill1, skill2),List((1234,skill1),(1234,skill2),(4312,skill1))) into {
  "tpe": "com.spinoco.horus.message.GuiMessage.SkillMessage.SkillMatrix",
  "operators": {
    "tpe": "scala.collection.Seq[java.lang.String]",
    "elems": [
      "1234",
      "4312"
    ]
  },
  "skills": {
    "tpe": "scala.collection.Seq[java.lang.String]",
    "elems": [
      "skill1",
      "skill2"
    ]
  },
  "enabledSkills": {
    "tpe": "scala.collection.Seq[scala.Tuple2[java.lang.String,java.lang.String]]",
    "elems": [
      {
      "tpe": "scala.Tuple2[java.lang.String,java.lang.String]",
      "a": "1234",
      "b": "skill1"
    },
      { "$ref": 3 },
      {
      "tpe": "scala.Tuple2[java.lang.String,java.lang.String]",
      "a": "4312",
      "b": "skill1"
    }
    ]
  }
}

As you can see, there are some really odd "$ref"s that I don't care for. Am I doing something wrong, or is that an error on your end? To me, it looks like some kind of race condition when pickling the same object in two places at the same time.

Thanks

Pickling a class where one field is an enum

Hi, enums are not serialized nicely.
The most specific type of the enum is not in the pickled version, and the information about which value from the enumeration was pickled is completely missing.

 "source": {
   "tpe": "Enumeration.this.Val"
 }

It would be nice if you'd look at it.

Get rid of EncodedOutput

Use binary.Util directly for the encoding/decoding details, have a simple Output, and get rid of EncodedOutput.
