
upickle's Introduction

uPickle: a simple Scala JSON and Binary (MessagePack) serialization library

If you use uPickle and like it, you will probably enjoy the following book by the author:

Hands-on Scala uses uPickle extensively throughout, and dedicates the entirety of Chapter 8: JSON and Binary Data Serialization to uPickle. Hands-on Scala is a great way to level up your skills in Scala in general and uPickle in particular.

For a hands-on introduction to this library, take a look at the following blog post:

If you use uPickle and like it, please support it by donating to our Patreon:


Join the chat at https://gitter.im/lihaoyi/upickle

upickle's People

Contributors

anatoliykmetyuk, boopboopbeepboop, bwbecker, diebauer, dorsev, fdietze, fomkin, gitter-badger, htmldoug, japgolly, jodersky, jppellet, k3d3, krever, lefou, lihaoyi, lolgab, masseguillaume, matthewdunbar, nicolasstucki, quafadas, ramnivas, romansky, sake92, scala-steward, sebnozzi, sjrd, stewsquared, tindzk, vreuter


upickle's Issues

Type inference is failing with R/W type classes, scalac gets confused

I tried to use the uPickle R/W type classes as part of interfaces exposed to users. This relies on a good deal of type inference so that the user doesn't have to repeat themselves:

trait Req[I,O]
case class Insert[I : W, O : R](arg: I) extends Req[I,O]
object Problem {
  def execute[I,O](req: Req[I,O]): O = null.asInstanceOf[O]
  val result: Int = execute(Insert("test"))  // both return type and parameter type should be inferred
}

[error] found : String
[error] required: Int
[error] val result: Int = execute(Insert("test"))

Without the W/R type classes it compiles fine. In my opinion it's a serious problem because:

  • you have to write things like:
execute(Append[String,Int,List](...))
  • it is quite hard to design APIs without having uPickle type classes right at the front
  • the compilation errors don't mention uPickle at all, so you are easily misled
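A minimal sketch of the workaround the first bullet describes, using the same `R`/`W` aliases as the repro above (assuming old-uPickle's `upickle.Aliases` import): spelling the type parameters out by hand compiles where inference fails.

```scala
import upickle.Aliases.{R, W}

trait Req[I, O]
case class Insert[I: W, O: R](arg: I) extends Req[I, O]

object Workaround {
  def execute[I, O](req: Req[I, O]): O = null.asInstanceOf[O]

  // Passing the type parameters explicitly sidesteps the inference failure:
  val result: Int = execute(Insert[String, Int]("test"))
}
```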

Configurable sum encoding

I need to parse the Pandoc format which is produced via aeson. aeson supports configuring the sum encoding which is done as follows in Pandoc:

Aeson.sumEncoding = Aeson.TaggedObject "t" "c"

Obviously, this consumes more space than the current approach chosen by uPickle:

  def annotate[V: ClassTag](rw: R[V], n: String) = R[V]{
    case Js.Arr(Js.Str(`n`), x) => rw.read(x)
  }

Would it be possible to not import annotate by default, so that I could define it manually?

This feature would significantly increase interoperability with other picklers as there doesn't seem to be a common standard on how to serialise classes.
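For illustration, here is a hypothetical variant of the `annotate` shown above that would match aeson's TaggedObject "t"/"c" layout instead of the `["tag", {...}]` array — a sketch against the same 0.2.x-era Js AST, not an existing uPickle API:

```scala
// Hypothetical sketch: read a {"t": <tag>, "c": <contents>} object
// instead of the ["<tag>", <contents>] array form shown above.
def annotateTagged[V: ClassTag](rw: R[V], n: String) = R[V] {
  case Js.Obj(("t", Js.Str(`n`)), ("c", x)) => rw.read(x)
}
```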

error: uPickle does not know how to write [scala.collection.immutable.Seq[Int]]

I notice the documentation says uPickle can write immutable Seqs, but I find the opposite to be true - it only seems to be able to write the mutable (Predef) version.

As shown below, it can write scala.collection.immutable.List so I imagine the fix should be simple.

Welcome to Scala version 2.11.2 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_05).
Type in expressions to have them evaluated.
Type :help for more information.

scala> import upickle._
import upickle._

scala> write(Seq(1, 2, 3))
res0: String = [1,2,3]

scala> write(List(1,2,3))
res1: String = [1,2,3]

scala> import scala.collection.immutable.Seq
import scala.collection.immutable.Seq

scala> write(Seq(1, 2, 3))
<console>:12: error: uPickle does not know how to write [scala.collection.immutable.Seq[Int]]s; define an implicit Writer[scala.collection.immutable.Seq[Int]] to teach it how
              write(Seq(1, 2, 3))
                   ^
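Until this is fixed, one workaround is to follow the error message's own suggestion — a sketch against the 0.2.x-era API (the exact `Writer` constructor may differ): delegate to the existing `List` writer.

```scala
import scala.collection.immutable

// Teach uPickle about immutable.Seq by converting to List, whose
// Writer already exists (0.2.x-era API sketch; names may differ).
implicit def immSeqWriter[T: upickle.Writer]: upickle.Writer[immutable.Seq[T]] =
  upickle.Writer[immutable.Seq[T]](xs => upickle.writeJs(xs.toList))
```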

NPE when writing a null string field

Here is my stacktrace:

Exception in thread "main" java.lang.NullPointerException
    at upickle.Json$.writeToBuffer(Js.scala:112)
    at upickle.Json$.writeToBuffer(Js.scala:145)
    at upickle.Json$.writeToBuffer(Js.scala:145)
    at upickle.Json$.writeToBuffer(Js.scala:156)
    at upickle.Json$.writeToBuffer(Js.scala:145)
    at upickle.Json$.writeToBuffer(Js.scala:156)
    at upickle.Json$.writeToBuffer(Js.scala:145)
    at upickle.Json$.writeToBuffer(Js.scala:156)
    at upickle.Json$.write(Js.scala:167)
    at upickle.Types$class.write(Types.scala:115)
    at upickle.package$.write(package.scala:10)

Although this indicates an issue with the underlying data, uPickle should serialize the null string as a Js.Null, e.g.
case Js.String(s) => if (s == null) sb.append("null") else ... @ Js.scala:109
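Until the library guards against this, a caller-side mitigation is to keep null out of String fields entirely, e.g. by modeling nullable values as Option and normalizing at the boundary (plain Scala; no uPickle API assumed):

```scala
// Model nullable fields as Option and normalize nulls from Java land,
// so the writer never sees a null String.
case class User(name: Option[String])

def fromJava(rawName: String): User =
  User(Option(rawName)) // Option(null) == None
```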

Can't parse JSON map to Map[String, T]

Hi Li,

Thanks for your work on this project, it's very nice! I've faced a problem when trying to parse the following JSON (output from CouchDb):

{  
   "_id":"481e7e36693c1ae3f1871be7e506aade",
   "_rev":"2-7bc08b6426dd0a165d9c7d1f9e08aad5",
   "name":"Alice",
   "age":25,
   "_attachments":{  
      "attachment":{  
         "content_type":"application/octet-stream",
         "revpos":2,
         "digest":"md5-boWfl561SDppW4MTLQhVyA==",
         "length":5,
         "stub":true
      }
   }
}

I have the following classes:

abstract class CouchDoc {
  val _id: String
  val _rev: String
  val _attachments: Map[String, CouchAttachment]
}

case class CouchAttachment(
  content_type: String,
  revpos: Int,
  digest: String,
  length: Int,
  stub: Boolean)

case class Person(
  name: String,
  age: Int,
  _id: String,
  _rev: String,
  _attachments: Map[String, CouchAttachment] = Map()) extends CouchDoc

When I'm trying to parse the JSON with something like:

upickle.read[Person](json)

I get the following exception:

data: Obj(ArrayBuffer((attachment,Obj(ArrayBuffer((content_type,Str(application/octet-stream)), (revpos,Num(2.0)), (digest,Str(md5-boWfl561SDppW4MTLQhVyA==)), (length,Num(5.0)), (stub,True)))))) msg: Array(n)
upickle.Invalid$Data: data: Obj(ArrayBuffer((attachment,Obj(ArrayBuffer((content_type,Str(application/octet-stream)), (revpos,Num(2.0)), (digest,Str(md5-boWfl561SDppW4MTLQhVyA==)), (length,Num(5.0)), (stub,True)))))) msg: Array(n)
    at upickle.Implicits$$anonfun$validate$1.applyOrElse(Implicits.scala:16)
    at upickle.Implicits$$anonfun$validate$1.applyOrElse(Implicits.scala:16)
    at upickle.Implicits$$anonfun$MapR$1.applyOrElse(Implicits.scala:109)
    at upickle.Implicits$$anonfun$MapR$1.applyOrElse(Implicits.scala:109)
    at upickle.Reader$$anonfun$read$1.applyOrElse(Types.scala:42)
    at upickle.Reader$$anonfun$read$1.applyOrElse(Types.scala:42)
    at upickle.Types$class.readJs(Types.scala:139)
    at upickle.package$.readJs(package.scala:10)
    at upickle.Generated$$anonfun$Tuple5R$1.applyOrElse(Generated.scala:49)
    at upickle.Generated$$anonfun$Tuple5R$1.applyOrElse(Generated.scala:49)
    at upickle.Reader$$anonfun$read$1.applyOrElse(Types.scala:42)
    at upickle.Reader$$anonfun$read$1.applyOrElse(Types.scala:42)
    at upickle.Types$class.readJs(Types.scala:139)
    at upickle.package$.readJs(package.scala:10)
    at upickle.Generated$InternalGenerated$$anonfun$Case5R$1.applyOrElse(Generated.scala:233)
    at upickle.Generated$InternalGenerated$$anonfun$Case5R$1.applyOrElse(Generated.scala:233)
    at upickle.GeneratedUtil$$anonfun$readerCaseFunction$1.applyOrElse(GeneratedUtil.scala:15)
    at upickle.GeneratedUtil$$anonfun$readerCaseFunction$1.applyOrElse(GeneratedUtil.scala:15)
    at upickle.Reader$$anonfun$read$1.applyOrElse(Types.scala:42)
    at upickle.Reader$$anonfun$read$1.applyOrElse(Types.scala:42)
    at upickle.Reader$$anonfun$read$1.applyOrElse(Types.scala:42)
    at upickle.Reader$$anonfun$read$1.applyOrElse(Types.scala:42)
    at upickle.Types$class.readJs(Types.scala:139)
    at upickle.package$.readJs(package.scala:10)
    at upickle.Types$class.read(Types.scala:135)
    at upickle.package$.read(package.scala:10)

As I understand, it's having problems with creating an instance of Map. Do you know any way to resolve this? Thanks!
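One possible workaround until the Map reader handles this, sketched against the 0.2.x-era Js AST (where Js.Obj exposes its fields as a sequence of key/value pairs): read the raw tree and build the Map by hand.

```scala
// Parse the raw Js tree and convert _attachments manually instead of
// relying on the Map[String, T] reader (0.2.x-era API sketch).
val tree = upickle.json.read(json).asInstanceOf[Js.Obj]
val attachments: Map[String, CouchAttachment] =
  tree.value.collectFirst { case ("_attachments", o: Js.Obj) =>
    o.value.map { case (k, v) => k -> upickle.readJs[CouchAttachment](v) }.toMap
  }.getOrElse(Map())
```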

Missing quasiquotes dependency on 2.10.4 causes compilation errors

With Scala 2.10.4 the following code fails to compile:

package pickle

import upickle._

sealed trait Root
case class A(value: Int) extends Root
case class B(value: String) extends Root

object Main extends App {
  println(write(A(10)))
  println(write(B("a")))
}

Errors:

Error:(10, 16) uPickle does not know how to write [pickle.A]s; define an implicit Writer[pickle.A] to teach it how
  println(write(A(10)))
               ^
Error:(11, 16) uPickle does not know how to write [pickle.B]s; define an implicit Writer[pickle.B] to teach it how
  println(write(B("a")))
               ^

When passing macros explicitly:

object Main extends App {
  println(write(A(10))(Writer.macroW))
  println(write(B("a"))(Writer.macroW))
}

the errors turn out to be these:

Error:(10, 31) exception during macro expansion: 
java.lang.ClassNotFoundException: scala.quasiquotes.QuasiquoteCompat$
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at upickle.Macros$.picklerFor(Macros.scala:161)
    at upickle.Macros$.macroWImpl(Macros.scala:51)
  println(write(A(10))(Writer.macroW))
                              ^
Error:(11, 32) exception during macro expansion: 
java.lang.NoClassDefFoundError: scala/quasiquotes/QuasiquoteCompat$
    at upickle.Macros$.picklerFor(Macros.scala:161)
    at upickle.Macros$.macroWImpl(Macros.scala:51)
  println(write(B("a"))(Writer.macroW))
                               ^

When I've added quasiquotes package to the dependency list in SBT:

libraryDependencies ++= Seq(
  "com.lihaoyi" %% "upickle" % "0.2.6-RC1",
  "org.scalamacros" %% "quasiquotes" % "2.0.1"
)

these errors go away completely, and implicits start to work again.

uPickle doesn't understand case classes with varargs

Consider:

package example

import upickle._

case class Sentence(words: String*)

object TestUPickle extends App {
  println(write(new Sentence("a", "b")))
}

Compiling this leads to an error:

Error:(8, 16) uPickle does not know how to write [example.Sentence]s; define an implicit Writer[example.Sentence] to teach it how
  println(write(new Sentence("a", "b")))
               ^
Error:(8, 16) not enough arguments for method write: (implicit evidence$1: upickle.Writer[example.Sentence])String.
Unspecified value parameter evidence$1.
  println(write(new Sentence("a", "b")))
               ^

If I change words to be of type Seq[String], it works fine. It should work either way.
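Until the macro supports varargs, a possible workaround is a hand-written pickler pair that round-trips the varargs through a plain Seq — a sketch against the 0.2.x-era API (exact Writer/Reader constructors may differ):

```scala
// Round-trip the varargs through a plain Seq by hand
// (0.2.x-era API sketch).
implicit val sentenceW: upickle.Writer[Sentence] =
  upickle.Writer[Sentence](s => upickle.writeJs(s.words.toSeq))

implicit val sentenceR: upickle.Reader[Sentence] =
  upickle.Reader[Sentence] {
    case arr: Js.Arr => Sentence(upickle.readJs[Seq[String]](arr): _*)
  }
```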

Exponential compiletime blowup, with nested classes

The code below kills the scala-compiler.
It will lead to an exponential number of implicit-lookups.
This can be quite annoying when defining complex protocols.
It also leads to the same amount of classfiles.

Workaround:
For non-cyclic data types, it's possible to define the "later" Writers/Readers first, to break the exponential growth.

/* workaround:
implicit val ww15 = implicitly[upickle.Writer[A15]]
implicit val ww10 = implicitly[upickle.Writer[A10]]
implicit val ww5 = implicitly[upickle.Writer[A5]]
*/
implicit val ww1 = implicitly[upickle.Writer[A1]] // kills the compiler
  case class A1 (x: A2 , y: A2 )
  case class A2 (x: A3 , y: A3 )
  case class A3 (x: A4 , y: A4 )
  case class A4 (x: A5 , y: A5 )
  case class A5 (x: A6 , y: A6 )
  case class A6 (x: A7 , y: A7 )
  case class A7 (x: A8 , y: A8 )
  case class A8 (x: A9 , y: A9 )
  case class A9 (x: A10, y: A10)
  case class A10(x: A11, y: A11)
  case class A11(x: A12, y: A12)
  case class A12(x: A13, y: A13)
  case class A13(x: A14, y: A14)
  case class A14(x: A15, y: A15)
  case class A15(x: A16, y: A16)
  case class A16(x: A17, y: A17)
  case class A17(x: A18, y: A18)
  case class A18()

Issue with sealed traits

I think this particular bug has to do with a sealed trait that has only a single case class or case object extending it (see random notes).

In MacroTests.scala, adding the following lines:

object DeepHierarchy {
  sealed trait A
  case class B(i: Int) extends A
  sealed trait C extends A
  case class D(s: String) extends C
  case class E(b: Boolean) extends C
  sealed trait Q //new line
  case object AnQ extends Q //new line 
  case class F(q: Q) extends C //new line
}

and running test:compile causes:

[error] one error found
[error] /home/nick/projects/upickle/js/../shared/test/scala/upickle/MacroTests.scala:138: could not find implicit value for evidence parameter of type upickle.Writer[upickle.DeepHierarchy.A]
[error] rw(B(1): A, """["upickle.DeepHierarchy.B", {"i": 1}]""

...but then adding

case object AnOtherQ extends Q

leads to success ???

Random notes:

Adding a third Q is still a success...

going back to just one AnQ recreates the error...

Using:

case object AnQ 
case class F(q: AnQ.type) extends C

Throws the same compile error (with sealed trait Q removed).

However,

case class AnQ(i: Int)
case class F(q: AnQ) extends C

Is a success...

But

  sealed trait Q
  case class AnQ(i: Int) extends Q
  case class F(q: Q) extends C

errors...

Which is again fixed if the second case object (AnOtherQ extends Q) is added...

An Implementation is missing at runtime

Hi,
I use upickle through autowire in a scalatra client/server application.
My server depends on JVM stuff (for autowire, upickle, ...) and my client one on JS stuff (for autowire, upickle, ...).

Everything compiles and runs fine, except that when I make a request through autowire, I get a weird exception when uPickle reads the JSON data. Is some library or something missing?
Thanks

scala.NotImplementedError: an implementation is missing
    at scala.Predef$.$qmark$qmark$qmark(Predef.scala:225)
    at scala.scalajs.js.JSON$.parse$default$2(JSON.scala:33)
    at upickle.json.package$.read(package.scala:20)
    at upickle.Types$class.read(Types.scala:135)
    at upickle.package$.read(package.scala:10)
    at org.openmole.gui.server.core.GUIServlet$$anonfun$2.apply(GUIServlet.scala:69)
    at org.openmole.gui.server.core.GUIServlet$$anonfun$2.apply(GUIServlet.scala:65)

forcing serialization of default values?

Is a custom Writer the only way to force default case class values to also be written?

P.S. By the way, is the GitHub issue tracker the appropriate place for questions? Or have I missed some other communication channel more suitable for uPickle discussions?

Macro implicits don't work outside package

Hey. Check this out:
japgolly@9a976af

Eg2 compiles but Eg1 doesn't.

[error] upickle/jvm/shared/test/scala/upickle/blah/blah.scala:7: uPickle does not know how to read [upickle.elsewhere.YYY]s; define an implicit Reader[upickle.elsewhere.YYY] to teach it how
[error]   def blah(j: String): YYY = read[YYY](j)
[error]                                       ^
[warn] one warning found
[error] one error found
[error] upickle/js/shared/test/scala/upickle/blah/blah.scala:7: uPickle does not know how to read [upickle.elsewhere.YYY]s; define an implicit Reader[upickle.elsewhere.YYY] to teach it how
[error]   def blah(j: String): YYY = read[YYY](j)
[error]                                       ^
[warn] one warning found
[error] one error found

Implicit R/W expansion fails for covariant types

It turned out to be a covariance problem; the following doesn't resolve implicits:

import upickle.Aliases.R
object Main extends App {
  trait Request[+O]
  case class Append[O: R](storeName: String) extends Request[O]
  val append: Request[Int] = Append("play")
}

diverging implicit expansion for type upickle.Aliases.R[O]
[error] starting with macro method macroR in object Reader
[error] val append: Request[Int] = Append("play")

This is OK:

import upickle.Aliases.R
object Main extends App {
  trait Request[O]
  case class Append[O: R](storeName: String) extends Request[O]
  val append: Request[Int] = Append("play")
}

Customizable discrimination annotations

It would be nice if annotation format could be customized, for example, in order to have

{ "$variant": "MyClass", "key": "value" }

instead of

["your.package.MyClass", {"key": "value"}]

While such an approach can lead to collisions, with a sufficiently distinctive discriminator key they will be very unlikely (and it is not really a problem if the key can be configured). It would also be nice to be able to store only the last part of the discriminator value (or, for example, the last two parts if MyClass is defined inside an object).

As far as I can see (I may be wrong, though) it should not be very difficult - currently the autogenerated picklers depend on the Internal.annotate behavior; I think it should be possible to extract the annotation behavior into another implicit and provide a default one with the current behavior.

Having the discriminator as a part of the object is important (at least for our project) for migration reasons (a lot of other serializers use this format) and for working with the serialized objects - for example, mongodb queries tend to be simpler without arrays.

Recursive data types

It appears to me that recursive data types are not well-supported. The following code leads to an infinite loop in the typer phase of the Scala compiler:

import upickle._
sealed case class IntTree(value: Int, children: List[IntTree])
read[IntTree]("")
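Until recursion is supported by the macro, one possible workaround is to tie the knot by hand with a lazy implicit — a hypothetical sketch against the 0.2.x-era Js AST (encoding chosen here for illustration only):

```scala
// Manual recursive pickler: the lazy val lets writeJs refer back to
// treeW itself without re-triggering macro derivation (0.2.x sketch).
implicit lazy val treeW: upickle.Writer[IntTree] =
  upickle.Writer[IntTree] { case IntTree(v, ch) =>
    Js.Arr(upickle.writeJs(v) +: ch.map(upickle.writeJs(_)): _*)
  }
```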

reading into a case class when a single field is missing

Hi!

Very exciting project, thanks!

Say I have a case class (with a given and unchangeable signature) with plenty of fields, and the JSON string I'm going to read contains all the fields except a single one, whose value I already have in hand.

The direct way is to declare a new case class without that field, read into it, and then, field by field, combine the second class's values with the value in hand to create an instance of the original case class. That is tedious, error-prone and, well, not elegant (while uPickle is such an elegant library).

Is there a smarter, more efficient and concise way?
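One possible approach, sketched against the 0.2.x-era Js AST (`missingField`, `knownValue` and `TheCaseClass` are placeholders, not real names): parse to the Js tree, splice in the value you already have, then read the patched tree.

```scala
// Parse, patch in the missing field, then read (0.2.x-era API sketch;
// missingField, knownValue and TheCaseClass are placeholders).
val tree    = upickle.json.read(jsonStr).asInstanceOf[Js.Obj]
val patched = Js.Obj(tree.value :+ ("missingField" -> upickle.writeJs(knownValue)): _*)
val result  = upickle.readJs[TheCaseClass](patched)
```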

Allow custom handling of nulls

Currently the way Reader/Writer handle nulls is hardcoded: null is always converted to and from Js.Null, and corresponding methods are final so they can't be overridden. However, sometimes it makes sense to handle nulls differently, for example, when overriding [de]serialization of options.

Is it possible to allow custom handling of nulls?

Diverging implicit with 1-class-recursive-discriminated-union

This causes uPickle to fail with a diverging implicit expansion error:

sealed trait SingleTree
case class SingleNode(value: Int, children: List[SingleTree]) extends SingleTree

write(
  SingleNode(123, List(SingleNode(456, Nil), SingleNode(789, Nil)))
)(Writer.macroW)

It really shouldn't, but I haven't managed to figure out why. Oddly enough, adding a : SingleTree type annotation or removing the children: List[SingleTree] from SingleNode is sufficient to make the problem go away.

Case objects unsupported

uPickle cannot deal with case objects as shown in the following code:

trait Type
object Type {
  case object A extends Type
  case object B extends Type
}

upickle.read[Type]("") // Error: uPickle does not know how to read [Type]s; define an implicit Reader[Type] to teach it how

Edit: When the trait is sealed, the above compiles without problems. Could the macro issue an error in such cases?

json to map not working ?

Hi ,
I am trying to convert a JSON string to a Map

read[Map[String,Int]]("""{ "i" : 123}""")

it fails with the message

Uncaught upickle.Invalid$Data: data: Obj(ArrayBuffer((i,Num(123)))) msg: Array(n) 

StackOverflow

Hi,
I am seeing weird behavior using upickle 2.1.

It is easily reproducible by cloning my demo project https://github.com/mathieuleclaire/scalaTraJSTagsWireRx, a client/server application based on scalatra/autowire/upickle. With upickle 1.7 it works fine, but upgrading uPickle to 2.1 produces the error below at compile time (upgrading autowire to 2.4 does not fix the issue).
Thanks

java.lang.StackOverflowError
    at scala.tools.nsc.typechecker.Namers$Namer.typeErrorHandler(Namers.scala:111)
    at scala.tools.nsc.typechecker.Namers$Namer.typeSig(Namers.scala:1535)
    at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1$$anonfun$apply$1.apply$mcV$sp(Namers.scala:778)
    at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1$$anonfun$apply$1.apply(Namers.scala:777)
    at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1$$anonfun$apply$1.apply(Namers.scala:777)
    at scala.tools.nsc.typechecker.Namers$Namer.scala$tools$nsc$typechecker$Namers$Namer$$logAndValidate(Namers.scala:1561)
    at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1.apply(Namers.scala:777)
    at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1.apply(Namers.scala:769)
    at scala.tools.nsc.typechecker.Namers$$anon$1.completeImpl(Namers.scala:1676)
    at scala.tools.nsc.typechecker.Namers$LockingTypeCompleter$class.complete(Namers.scala:1684)
    at scala.tools.nsc.typechecker.Namers$$anon$1.complete(Namers.scala:1674)
    at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1429)
    at scala.reflect.internal.Symbols$Symbol.initialize(Symbols.scala:1576)
    at scala.tools.nsc.typechecker.Typers$Typer.typedValDefImpl(Typers.scala:1949)
    at scala.tools.nsc.typechecker.Typers$Typer.typedValDef(Typers.scala:1944)
    at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$56.apply(Typers.scala:2939)
    at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$56.apply(Typers.scala:2939)
    at scala.collection.immutable.List.loop$1(List.scala:172)
    at scala.collection.immutable.List.mapConserve(List.scala:188)
    at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedFunction(Typers.scala:2939)
    at scala.tools.nsc.typechecker.Typers$Typer.typedFunction$1(Typers.scala:5193)
    at scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1(Typers.scala:5230)
    at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5262)
    at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5269)
    at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5305)
    at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5332)
    at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5279)
    at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5283)
    at scala.tools.nsc.typechecker.Typers$Typer.typedBlock(Typers.scala:2364)
    at scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1(Typers.scala:5226)
    at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5262)
    at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5269)
    at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5305)
    at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5332)
    at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5279)
    at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5283)
    at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5375)
    at scala.tools.nsc.typechecker.Typers$Typer.instantiateToMethodType$1(Typers.scala:876)
    at scala.tools.nsc.typechecker.Typers$Typer.adapt(Typers.scala:1168)
    at scala.tools.nsc.typechecker.Typers$Typer.adapt(Typers.scala:1163)
    at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5319)
    at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5332)
    at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5279)
    at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5283)
    at scala.tools.nsc.typechecker.Typers$Typer.typedArg(Typers.scala:3114)
    at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedArgToPoly$1(Typers.scala:3450)
    at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$70.apply(Typers.scala:3458)
    at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$70.apply(Typers.scala:3458)
    at scala.reflect.internal.util.Collections$class.map2(Collections.scala:79)
    at scala.reflect.internal.SymbolTable.map2(SymbolTable.scala:16)

6 case limit

Any specific reason to limit the number of fields to 6? Are patches welcome there?

UPickle doesn't work with overloaded apply methods

Consider:

package com.example

import upickle._

case class Container(value: String)

object Container {
  def apply(intValue: Int): Container = Container(intValue.toString)
}

object ContainerApp extends App {
  write(Container("V"))
}

Compiling this leads to an error:

[error] /Users/ramnivas/test-dev/upickle-issue/src/main/scala/com/example/OverloadedApply.scala:14: uPickle does not know how to write [com.example.Container]s; define an implicit Writer[com.example.Container] to teach it how
[error]   val cWritten = write(c)

If I remove the apply(Int) method, it compiles fine.

Even the following compiles fine (has an auxiliary constructor; requires new when using that constructor):

package com.example

import upickle._

case class Container(value: String) {
  def this(intValue: Int) = this(intValue.toString)
}

object ContainerApp extends App {
  write(new Container(5))
}

I think uPickle should consider only the primary apply method, much the same way it seems to consider only the primary constructor.
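Until the macro handles overloaded apply methods, a workaround is to do what the error message suggests and define the pickler by hand — a sketch against the 0.2.x-era Js AST (the exact Writer constructor may differ):

```scala
// Hand-written Writer bypassing macro derivation, which chokes on
// the overloaded apply (0.2.x-era API sketch).
implicit val containerW: upickle.Writer[Container] =
  upickle.Writer[Container] { case Container(v) =>
    Js.Obj("value" -> Js.Str(v))
  }
```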

read(write(string)) crashes

Check this out:

scala> def test(a: String) = a == read[String](write(a))
test: (a: String)Boolean

scala> test("\u001e")
res24: Boolean = true

scala> test("\ud8a3")
res25: Boolean = true

scala> test("\u001e\ud8a3")
upickle.Invalid$Json: JsonParse Error: jawn.IncompleteParseException: exhausted input in "\u001e?"
    at upickle.json.package$.read(package.scala:11)
    at upickle.Types$class.read(Types.scala:135)
    at upickle.package$.read(package.scala:10)
    at .test(<console>:11)
    at .<init>(<console>:13)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:734)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:983)
    at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:604)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:568)
    at scala.tools.nsc.interpreter.ILoop.reallyInterpret$1(ILoop.scala:760)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:805)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:717)
    at scala.tools.nsc.interpreter.ILoop.processLine$1(ILoop.scala:581)
    at scala.tools.nsc.interpreter.ILoop.innerLoop$1(ILoop.scala:588)
    at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:591)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:882)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:837)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:837)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:837)
    at scala.tools.nsc.interpreter.ILoop.main(ILoop.scala:904)
    at xsbt.ConsoleInterface.run(ConsoleInterface.scala:62)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:101)
    at sbt.compiler.AnalyzingCompiler.console(AnalyzingCompiler.scala:76)
    at sbt.Console.sbt$Console$$console0$1(Console.scala:22)
    at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply$mcV$sp(Console.scala:23)
    at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply(Console.scala:23)
    at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply(Console.scala:23)
    at sbt.Logger$$anon$4.apply(Logger.scala:85)
    at sbt.TrapExit$App.run(TrapExit.scala:248)
    at java.lang.Thread.run(Thread.java:745)

upickle.Invalid$Data: data: Str(sValue) msg: Array(n)

Hi!

With this code

package fun

import upickle._

object TestUpickle extends App {

  case class In(s: Option[String] = None)
  case class Out(in: In)

  def test1: Unit = {
    val str = """{"in": {"s": "sValue"}}"""
    println(read[Out](str))
  }

  test1
}

I get

Exception in thread "main" upickle.Invalid$Data: data: Str(sValue) msg: Array(n)
    at upickle.Implicits$$anonfun$validate$1.applyOrElse(Implicits.scala:16)
    at upickle.Implicits$$anonfun$validate$1.applyOrElse(Implicits.scala:16)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
    at upickle.Implicits$$anonfun$upickle$Implicits$$SeqLikeR$1.applyOrElse(Implicits.scala:78)
    at upickle.Implicits$$anonfun$upickle$Implicits$$SeqLikeR$1.applyOrElse(Implicits.scala:78)
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:162)
    at upickle.Types$class.readJs(Types.scala:127)
    at upickle.package$.readJs(package.scala:10)
    at upickle.Generated$$anonfun$Tuple1R$1.applyOrElse(Generated.scala:17)
    at upickle.Generated$$anonfun$Tuple1R$1.applyOrElse(Generated.scala:17)
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:162)
    at upickle.Types$class.readJs(Types.scala:127)
    at upickle.package$.readJs(package.scala:10)
    at upickle.Generated$InternalGenerated$$anonfun$Case1R$1.applyOrElse(Generated.scala:197)
    at upickle.Generated$InternalGenerated$$anonfun$Case1R$1.applyOrElse(Generated.scala:197)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
    at upickle.GeneratedUtil$$anonfun$readerCaseFunction$1.applyOrElse(GeneratedUtil.scala:15)
    at upickle.GeneratedUtil$$anonfun$readerCaseFunction$1.applyOrElse(GeneratedUtil.scala:15)
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:162)
    at upickle.Types$class.readJs(Types.scala:127)
    at upickle.package$.readJs(package.scala:10)
    at upickle.Generated$$anonfun$Tuple1R$1.applyOrElse(Generated.scala:17)
    at upickle.Generated$$anonfun$Tuple1R$1.applyOrElse(Generated.scala:17)
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:162)
    at upickle.Types$class.readJs(Types.scala:127)
    at upickle.package$.readJs(package.scala:10)
    at upickle.Generated$InternalGenerated$$anonfun$Case1R$1.applyOrElse(Generated.scala:197)
    at upickle.Generated$InternalGenerated$$anonfun$Case1R$1.applyOrElse(Generated.scala:197)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
    at upickle.GeneratedUtil$$anonfun$readerCaseFunction$1.applyOrElse(GeneratedUtil.scala:15)
    at upickle.GeneratedUtil$$anonfun$readerCaseFunction$1.applyOrElse(GeneratedUtil.scala:15)
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:162)
    at upickle.Types$class.readJs(Types.scala:127)
    at upickle.package$.readJs(package.scala:10)
    at upickle.Types$class.read(Types.scala:123)
    at upickle.package$.read(package.scala:10)
    at fun.TestUpickle$.test1(TestUpickle.scala:12)
...

Where is my mistake?

Control serialization of a certain property across many classes in a DRY manner

I am wondering what would be a good (i.e., DRY) way to modify the serialization and deserialization of a certain property across many classes.

I have a bunch of model classes of the following structure (somewhat simplified). I am persisting them using Slick, where such a structure is fairly common:

trait Entity {
  def id: Option[Long]
}

case class Person(fName: String, lName: String, id: Option[Long] = None) extends Entity

case class Organization(name: String, id: Option[Long] = None) extends Entity

// and so on...

Currently, uPickle serializes the id property as an array:

{
   ...
   id: [1234]
}

I am looking for a way to map the id property as follows:

  • If id is None, omit the property. uPickle already does this, since the id property's default is None.
  • If id is Some(value), just serialize the value:
{
   ...
   id: 1234
}

What would be a good way to do this, short of writing a custom apply()/unapply() pair (and the attendant parallel set of classes), or writing a Reader and Writer for each model class?
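One DRY-leaning option is to post-process the generated JSON tree once, rewriting every `id` field, instead of hand-writing a Reader and Writer per model class. The sketch below is self-contained and uses a hypothetical minimal JSON AST (`Js`, `JsNum`, `JsArr`, `JsObj` are stand-ins, not uPickle's actual types):

```scala
// Hypothetical minimal JSON AST, standing in for the library's real one.
sealed trait Js
case class JsNum(value: Double) extends Js
case class JsArr(items: List[Js]) extends Js
case class JsObj(fields: List[(String, Js)]) extends Js

// Rewrite {"id": [1234]} into {"id": 1234}, recursively, in one place
// for every model class.
def flattenIds(js: Js): Js = js match {
  case JsObj(fields) =>
    JsObj(fields.map {
      case ("id", JsArr(single :: Nil)) => ("id", flattenIds(single))
      case (k, v)                       => (k, flattenIds(v))
    })
  case JsArr(items) => JsArr(items.map(flattenIds))
  case other        => other
}
```

The same transformation, inverted, can be applied before reading, so the per-class picklers never need to change.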

Evidence case classes must be declared in the class that the read method is called from

This works:

object A extends B {
   case class SomeEvidenceClass(blablala)
   read[SomeEvidenceClass](something)
}

But if SomeEvidenceClass is declared in the parent trait B, it won't work.

In other words:

Types#read[T: Reader](expr: String): T

If the evidence type T is a case class that can be read by the generated tuple Reader, it must be declared in the class that the read method is called from. If you declare it in a parent trait, for instance, you get:

uPickle does not know how to read SomeEvidenceCaseClass; define an implicit Reader[SomeEvidenceCaseClass]

Examples of defining implicit Readers and Writers?

I've got

package models

sealed trait A

case class B(s : String) extends A

case class C(i :Int) extends A

How are implicit Readers and Writers defined? I can't find any info in the readme, and there are at least five different choices in the Implicits package in the source.

Sometimes my calls to read[A] and write(foo) compile, and then they randomly stop working when I make minor changes, such as adding some random (unrelated) helper code in package models:

object BUtils { def validateB(b: B): Option[B] = { ... } }

Just because that is in package models, the read and write functions break. uPickle is feeling very fragile to me, but I don't know whether all of these problems would go away if I learned which readers/writers to extend for my implicit objects, and how to use them.
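A minimal self-contained sketch of the pattern (the `Reader` here is a stand-in, not uPickle's real trait): put the implicit instances in the sealed trait's companion object. Companion objects of a type's base classes are part of the implicit scope, so the instances are found at any call site without imports, and unrelated code in the same package cannot shadow them:

```scala
// Stand-in typeclass: a Reader[T] is just a decoding function, found implicitly.
trait Reader[T] { def read(s: String): T }

sealed trait A
case class B(s: String) extends A
case class C(i: Int) extends A

object A {
  // Instances in the base trait's companion are in the implicit scope of
  // Reader[B] and Reader[C] everywhere, with no imports needed.
  implicit val bReader: Reader[B] = new Reader[B] { def read(s: String): B = B(s) }
  implicit val cReader: Reader[C] = new Reader[C] { def read(s: String): C = C(s.toInt) }
}

def read[T](s: String)(implicit r: Reader[T]): T = r.read(s)
```

This is also why `upickle.write(1)` works with no imports: uPickle's built-in instances live in implicit scope, not in an import you control.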

Obtaining parse tree

The following fails with an expansion error:

upickle.read[Js.Arr](contents)

I came up with a quick workaround, but I'd prefer if the above line worked:

case class Tree(tree: Js.Arr)
implicit val treeReader = upickle.Reader[Tree] {
  case t: Js.Arr => Tree(t)
}

upickle.read[Tree](contents)

Strange serialization of scala Map

Hey,

I tried to serialize this with uPickle 0.2.2, Scala 2.11.2, and Scala.js 0.5.3:

case class RequestMessage(requestId: String, op: String, processor: String, args: Map[String, String])

write(RequestMessage(
  "1d6d02bd-8e56-421d-9438-3bd6d0079ff1",
  "eval",
  "",
  Map(
    "gremlin" -> "g.v(x).out",
    "bindings" -> "",
    "language" -> "gremlin-groovy"
  )
))

{"requestId":"1d6d02bd-8e56-421d-9438-3bd6d0079ff1","op":"eval","processor":"","args":[["gremlin","g.v(x).out"],["bindings",""],["language","gremlin-groovy"]]}

Do I need to implement an unapply method for Maps in case classes too? The Map is serialized into some sort of list of arrays. Even if I serialize it directly, it resolves to:

[["gremlin","g.v(x).out"],["bindings",""],["language","gremlin-groovy"]]

Reading unquoted numbers as longs

It seems counterintuitive that a number that fits in an Int (and thus also a Long) can be read as an Int, but not as a Long, unless the number is expressed as a string (i.e. quoted). I know the uPickle documentation states that Longs are written as strings, but reading should work even with unquoted numbers, as long as they fit. This would allow uPickle to consume data produced by other serializers (such as Jackson) that write numbers unquoted by default.

scala> import upickle._
import upickle._

scala> case class IntWrapper(value: Int)
defined class IntWrapper

scala> read[IntWrapper]("""{"value" : 1422937970}""")
res0: IntWrapper = IntWrapper(1422937970)

scala> case class LongWrapper(value: Long)
defined class LongWrapper

scala> read[LongWrapper]("""{"value" : 1422937970}""")
upickle.Invalid$Data: data: Num(1.42293797E9) msg: Number
  at upickle.Implicits$$anonfun$validate$1.applyOrElse(Implicits.scala:16)
  at upickle.Implicits$$anonfun$validate$1.applyOrElse(Implicits.scala:16)
  at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
  at upickle.Implicits$$anonfun$upickle$Implicits$$numericStringReaderFunc$1.applyOrElse(Implicits.scala:45)
  at upickle.Implicits$$anonfun$upickle$Implicits$$numericStringReaderFunc$1.applyOrElse(Implicits.scala:45)
...

// quoted number
scala> read[LongWrapper]("""{"value" : "1422937970"}""")
res2: LongWrapper = LongWrapper(1422937970)
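A lenient reader along the lines the issue suggests can be sketched self-contained (`lenientLong` is a hypothetical helper, not uPickle API): accept either form, guarding against the precision loss that makes JSON numbers above 2^53 unsafe as Doubles:

```scala
// Hypothetical lenient Long decoder: accepts a quoted string or an
// unquoted JSON number (which a JSON parser typically surfaces as Double).
def lenientLong(raw: Any): Long = raw match {
  case s: String => s.toLong
  // Only accept a Double that is whole and round-trips exactly,
  // i.e. was not truncated past 53 bits of precision.
  case d: Double if d.isWhole && d.toLong.toDouble == d => d.toLong
  case other => throw new IllegalArgumentException(s"cannot read $other as Long")
}
```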

Can't opt-out of default ReadWriters

It's awesome that uPickle comes with so many default Readers and Writers but it seems that I can't opt out.

Scala still continues to surprise me when it comes to implicits. I'm not importing any, but somehow they're always in scope. For example, upickle.write(1) works without any imports, whereas I would've imagined that import upickle._ or import upickle.IntRW would be necessary.

I'm happy to submit a PR. Are you ok with me changing this behaviour?

Failing IR checks

Hey Haoyi, check this out.

Excerpt of SBT settings in my project which depends on upickle:

    emitSourceMaps in fullOptJS := false,
    checkScalaJSIR in fullOptJS := true,
    inliningMode in fullOptJS := InliningMode.Batch,

And error message:

[info] Direct Optimizing /home/golly/xxxx/target/scala-2.11/xxxx-opt.js
[error] file:/Users/haoyi/Dropbox%20(Personal)/Workspace/upickle/js/shared/main/scala/upickle/Implicits.scala(139:16:Apply): Cannot call Scala method value__T on non-class type NullType
[error] file:/Users/haoyi/Dropbox%20(Personal)/Workspace/upickle/js/shared/main/scala/upickle/Implicits.scala(139:16:Apply): Cannot call Scala method value__T on non-class type NullType
java.lang.RuntimeException: There were 2 IR checking errors.
    at scala.sys.package$.error(package.scala:27)
    at scala.scalajs.tools.optimizer.ScalaJSOptimizer.scala$scalajs$tools$optimizer$ScalaJSOptimizer$$checkIR(ScalaJSOptimizer.scala:171)
    at scala.scalajs.tools.optimizer.ScalaJSOptimizer$$anonfun$3$$anonfun$apply$1.apply$mcV$sp(ScalaJSOptimizer.scala:106)
    at scala.scalajs.tools.optimizer.ScalaJSOptimizer$$anonfun$3$$anonfun$apply$1.apply(ScalaJSOptimizer.scala:105)
    at scala.scalajs.tools.optimizer.ScalaJSOptimizer$$anonfun$3$$anonfun$apply$1.apply(ScalaJSOptimizer.scala:105)
    at scala.scalajs.tools.optimizer.IncOptimizer$.logTime(IncOptimizer.scala:731)
    at scala.scalajs.tools.optimizer.ScalaJSOptimizer$$anonfun$3.apply(ScalaJSOptimizer.scala:104)
    at scala.scalajs.tools.optimizer.ScalaJSOptimizer$$anonfun$3.apply(ScalaJSOptimizer.scala:96)
    at scala.scalajs.tools.optimizer.IncOptimizer$.logTime(IncOptimizer.scala:731)
    at scala.scalajs.tools.optimizer.ScalaJSOptimizer.optimizeIR(ScalaJSOptimizer.scala:96)
    at scala.scalajs.tools.optimizer.ScalaJSClosureOptimizer.directOptimizeIR(ScalaJSClosureOptimizer.scala:88)
    at scala.scalajs.tools.optimizer.ScalaJSClosureOptimizer$$anonfun$directOptimizeCP$1.apply$mcV$sp(ScalaJSClosureOptimizer.scala:74)
    at scala.scalajs.tools.io.CacheUtils$.cached(CacheUtils.scala:41)
    at scala.scalajs.tools.optimizer.ScalaJSClosureOptimizer.directOptimizeCP(ScalaJSClosureOptimizer.scala:67)
    at scala.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$24$$anonfun$apply$6.apply(ScalaJSPluginInternal.scala:241)
    at scala.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$24$$anonfun$apply$6.apply(ScalaJSPluginInternal.scala:240)
    at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
    at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
    at sbt.std.Transform$$anon$4.work(System.scala:63)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
    at sbt.Execute.work(Execute.scala:235)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
    at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Using sjs 0.5.4, Scala 2.11.2.

enhancement to build ReadWriter classes by composition

I occasionally need to implement my own ReadWriter[X]. So far I've always been able to do this by converting to some other type that already has an instance. This requires two functions: one to map to that type and one to map back from it. This is common enough that I've made an implicit class to add the syntax.

object Helper {
  implicit class EnhancedReadWriter[A](val _rw: ReadWriter[A]) extends AnyVal {
    def compose[B](down: B => A, up: A => B): ReadWriter[B] = new ReadWriter[B](
      write = b => _rw.write(down(b)),
      read = { case b => up(_rw.read(b)) }
    )
  }
}

I use it like this:

implicit val dateReadWrite: ReadWriter[Date] = implicitly[ReadWriter[Long]].compose(
  down = _.getTime,
  up = new Date(_)
)

Is this worth adding centrally?
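For the record, recent uPickle versions ship essentially this operation under the name bimap (as in `readwriter[Long].bimap[Date](...)`). The mechanics can be shown self-contained with a stand-in ReadWriter (not uPickle's real type):

```scala
// Stand-in ReadWriter: serialize A to a String and back.
case class RW[A](write: A => String, read: String => A) {
  // Derive an RW[B] from an RW[A] plus a pair of conversion functions.
  def compose[B](down: B => A, up: A => B): RW[B] =
    RW[B](b => write(down(b)), s => up(read(s)))
}

val longRW = RW[Long](_.toString, _.toLong)

// Dates serialize via their epoch-millis Long representation.
val dateRW: RW[java.util.Date] = longRW.compose(_.getTime, new java.util.Date(_))
```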

Circular data structures?

Hi,
does 0.2.0 fix the circular data structure limitation?
If not, do you plan to fix it soon?
Thanks

Is it possible to read parameterized/polymorphic case classes?

I'm sorry for flooding uPickle with issues :-) But I got utterly stuck on reading:

case class What[E](a: TypeThatHasAReaderAvailable[E])

read[What[Ever]](xyz)

The problem is here :
https://github.com/lihaoyi/upickle/blob/ccd23c491ebec51a091ffc49b709d067fa1c1df8/shared/main/scala/upickle/Macros.scala#L121

[info] exception during macro expansion: 
[info] scala.ScalaReflectionException: value apply encapsulates multiple overloaded alternatives and cannot be treated as a method. Consider invoking `<offending symbol>.asTerm.alternatives` and manually picking the required method
[info]  at scala.reflect.api.Symbols$SymbolApi$class.asMethod(Symbols.scala:228)
[info]  at scala.reflect.internal.Symbols$SymbolContextApiImpl.asMethod(Symbols.scala:82)
[info]  at upickle.Macros$.picklerFor(Macros.scala:121)
[info]  at upickle.Macros$.macroRImpl(Macros.scala:32)
[info]       val res = read[ResponseMessage[Vertex]](x.data.toString)
[info]   

I even tried invoking the macro with the type parameters supplied directly, but it failed the same way:

    import scala.language.experimental.macros
    def macroR[T]: Reader[T] = macro Macros.macroRImpl[T]
    val a = macroR[What[Ever]]

Would you please give me a hint here?
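Until the macro handles type parameters, the usual workaround is a hand-written generic instance that threads the element reader through. Sketched self-contained with a stand-in Reader (not uPickle's real one; `What` is simplified from the issue):

```scala
// Stand-in typeclass.
trait Reader[T] { def read(s: String): T }

case class What[E](a: E) // simplified stand-in for the class in the issue

// Generic instance: given a Reader for E, derive one for What[E] by hand,
// which is exactly what the macro fails to do for parameterized classes.
implicit def whatReader[E](implicit er: Reader[E]): Reader[What[E]] =
  new Reader[What[E]] { def read(s: String): What[E] = What(er.read(s)) }

implicit val intReader: Reader[Int] =
  new Reader[Int] { def read(s: String): Int = s.toInt }

def read[T](s: String)(implicit r: Reader[T]): T = r.read(s)
```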

Dynamic reading by composite Reader of tagged case classes

Hi,

imagine you have a set of 10 sealed/tagged case classes whose instances are persisted, and you want to read them dynamically (you don't know in advance which one you are currently reading):

@key("A") case class A(i: Int) extends SealedTrait
@key("A") case class B(s: String) extends SealedTrait

One possible solution is to introduce a composite Reader with a partial function that pattern matches on the key and reads the particular type:

case key if key == A.getClass.getSimpleName => read[A](str)
case key if key == B.getClass.getSimpleName => read[B](str)

It would all work nicely, except it is not easy to construct such a polymorphic composite Reader that scalac would apply prior to Reader[A] or Reader[B], matching the tagged string ["A", {"i": 1}] or ["B", {"s": "x"}] and applying the partial function:

Reader[(String, ???)]  // this seems to be a dead end cause ??? type varies

Another solution is to persist and read these two pieces of information (key, obj) separately, which is neither elegant nor maintainable...

Do you have any idea how this might be done with uPickle? I have a feeling there is a fancy solution...
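One workable shape, sketched self-contained (the class names follow the issue; the tag/body split and the dispatcher are hypothetical): match on the tag first, then delegate to a per-type read. This sidesteps the `Reader[(String, ???)]` dead end because the result type is the common sealed trait:

```scala
sealed trait SealedTrait
case class A(i: Int) extends SealedTrait
case class B(s: String) extends SealedTrait

// Hypothetical dispatcher: assumes the tagged payload has already been
// split into its tag and body, e.g. from ["A", {"i": 1}]. Each branch
// would call the statically-typed read for that case class.
def readTagged(tag: String, body: String): SealedTrait = tag match {
  case "A"   => A(body.toInt)
  case "B"   => B(body)
  case other => throw new IllegalArgumentException(s"unknown tag: $other")
}
```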

Omit field keys in serialised ADTs

As mentioned in #55, I'd like to use uPickle to parse an aeson-produced file. Another interoperability issue I encountered is that uPickle doesn't encode field keys. Take the following example:

case class Format(wrapped: String)
case class RawBlock(format: Format, str: String)

RawBlock(Format("html"), "<center>")

Here is a comparison of the differences:

aeson

{"t":"RawBlock","c":["html","<center>"]}

uPickle

["RawBlock",{"format":{"wrapped":"html"},"str":"<center>"}]

As long as a class constructor doesn't use any optional parameters, keys should not matter. I assume this behaviour could even be beneficial as a default, because it significantly reduces the payload size.

Another minor issue I noticed is that uPickle uses the full class name, i.e., it includes the package names. As a workaround I could just rewrite the aeson JSON tree that I'm parsing (as I do with t and c right now), but setting a relative path could further shorten the JSON.

It may make sense to create a Configuration trait that contains these kinds of settings.

Add license

I've just stumbled across uPickle while looking for an alternative to play-json, and it looks great.
Unfortunately, we wouldn't be able to use it at our company without a license.
Could you please add one?

Alternate constructor in companion causes read to choke?

Took a while to puzzle this one out. I have an apparently innocuous case class:

case class HtmlWikitext(html: Html) extends Wikitext {
  // Irrelevant methods elided
}
object HtmlWikitext {
  def apply(html: String): HtmlWikitext = HtmlWikitext(Html(html))
}

I am trying to read it:

read[models.HtmlWikitext](pickled)

But I get an error:

[error] ...\querki\scalajs\src\main\scala\querki\pages\ThingPage.scala:35: uPickle does not know how to read [models.HtmlWikitext]s; define an implicit Reader[models.HtmlWikitext] to teach it how
[error]     read[models.HtmlWikitext](pickled)

Commenting out the alternate String constructor makes the error go away, as does renaming the companion object. AFAICT, the read macro seems to be getting confused by the presence of the secondary constructor; I assume it introduces some sort of ambiguity.

I could believe that this is user error, and that I need to specify something explicit to tell read() which constructor to use, but the docs aren't terribly clear on this point, and I don't yet grok the macros well enough to know what the implicit Reader should look like to work around this...
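The ambiguity can be reproduced without uPickle at all: once the companion gains a second apply, `HtmlWikitext.apply` is overloaded, and anything that reflects on "the" apply method (a derivation macro included) has multiple alternatives to choose from. A human disambiguates by ascribing a function type, which is exactly what the macro cannot do automatically (`Html` here is a minimal stand-in):

```scala
case class Html(raw: String)
case class HtmlWikitext(html: Html)
object HtmlWikitext {
  // This secondary apply overloads the case-class-generated one.
  def apply(html: String): HtmlWikitext = HtmlWikitext(Html(html))
}

// Ascribing an explicit function type pins down one overload each time.
val fromHtml: Html => HtmlWikitext = HtmlWikitext(_)
val fromString: String => HtmlWikitext = HtmlWikitext(_)
```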

Doesn't run - bad repository for 0.2.5?

Currently not working out of the box.

scala> import upickle._
import upickle._

scala> write(1 :Int)
java.lang.ClassCastException: java.lang.Double cannot be cast to scala.scalajs.js.Any
  at upickle.json.package$.write(package.scala:38)
  at upickle.Types$class.write(Types.scala:127)
  at upickle.package$.write(package.scala:10)
  ... 43 elided

I can't imagine this is the update from 0.2.4; I'm guessing the wrong version hit the repository at "bintray/non" at "http://dl.bintray.com/non/maven".

Using Scala.js 0.5.5.

Serialization of Option types

Option types are serialized as a list/array ("[]").
From the tests:

'option{
  'Some-rw(Some(123), "[123]")
  'None-rw(None: Option[String], "[]")
  'Option{
    rw(Some(123): Option[Int], "[123]")
    rw(None: Option[Int], "[]")
  }
}

This behaviour is wrong: Some(123) should serialize to 123, and None to null.
The problem is in the macro expansion, because Option is lifted.
I will try to create a pull request to fix the bug.

Object field order should not matter

Using the built-in unpicklers, it appears that the order of fields in a JSON string must exactly match the declaration order in the case class for parsing to succeed. For example, the generated reader for

case class Foo(a: String, b: String)

will fail to parse the JSON string

{ "b": "blah", "a": "blah" }

whereas it will succeed in parsing the JSON string

{ "a": "blah", "b": "blah" }

Switching around the order of fields in the case class declaration does the "opposite", i.e. it will fail to parse the latter string and succeed in parsing the first string.

I would expect both to succeed.

Use case: I'm using an API in which field order is not guaranteed (and the API cannot easily be modified to guarantee the ordering).

EDIT: Oh, yes, this is using uPickle 0.2.5 on Scala.js.
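A reader that looks fields up by name, rather than consuming them positionally, is order-insensitive by construction. A self-contained stand-in for the object lookup (a `Map` plays the role of the parsed JSON object):

```scala
case class Foo(a: String, b: String)

// Stand-in for a parsed JSON object: a field-name -> value map.
// Looking fields up by key makes both declaration order and wire
// order irrelevant.
def readFoo(obj: Map[String, String]): Foo =
  Foo(obj("a"), obj("b"))
```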

uPickle does not know how to write [...]

Hi!

While looking for a nice way to pickle Scala objects to and from JSON, to exchange data between Play and a Scala.js-enabled client, uPickle seems a good candidate.

Unfortunately, until now I kept getting the error "uPickle does not know how to write..." while trying to pickle a custom case class (like the 'Thing' example given on the main page) on the JVM (I didn't even try with Scala.js).

Finally, I discovered the source of the issue: the Scala version.
When compiling with Scala 2.10.4 the message keeps popping up; with Scala 2.11.x it compiles fine.

The configuration is as simple as possible, a standalone sbt project with only one libraryDependency (uPickle), and a single file:

import upickle._

case class Buddy(a: Int)

object Blah {
  def blah = write(Buddy(42))
}

As a side note, uPickle < 0.2.0 (e.g. 0.1.6) works with 2.10.x; uPickle >= 0.2.0 does not.

Cheers,

Jean Luc

Compiler StackOverflowError

case class Foo(x: Int)
upickle.read[List[Foo]]("")

At the same time, upickle.write(List(Foo(0))) works fine.

Support alternative ASTs

I was wondering how difficult it would be for uPickle to support alternative ASTs (such as Json4s, which is what I personally use). I mention Json4s since its API is written in pure Scala.

I would love to use uPickle, but it only supports Jawn (and all of our JSON-handling code uses Json4s). Note that Jawn itself does support other ASTs, so I'm not sure if this is helpful.
