
neotypes's Introduction


neotype - a type specimen that is selected subsequent to the description of a species to replace a preexisting type that has been lost or destroyed.


neotypes

A lightweight, type-safe, asynchronous Scala driver for Neo4j (not opinionated about effect systems).

  • Scala - the driver supports all standard Scala types out of the box, without converting between Scala and Java types back and forth, and you can easily add support for your own types.
  • Lightweight - the core module depends only on the Neo4j Java driver, and the generic module depends only on Shapeless.
  • Type-safe - the driver leverages typeclasses to derive all needed conversions at compile time.
  • Asynchronous - the driver sits on top of the asynchronous Java driver.
  • Not opinionated about side-effect implementation - you can use it with any effect system of your preference (Future, cats-effect, Monix, ZIO) by implementing a simple typeclass.
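
For a first impression, a minimal usage sketch assuming the Future-based core plus the generic module (the connection details and the Movie class are illustrative, the Java driver packages are those of the 4.x line, and import paths and the session API have varied between neotypes versions):

import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

import org.neo4j.driver.{AuthTokens, GraphDatabase}
import neotypes.implicits.syntax.all._ // cypher interpolator and asScala syntax
import neotypes.generic.auto._         // automatic mapper derivation for case classes

final case class Movie(title: String, released: Int)

val driver  = GraphDatabase.driver("bolt://localhost:7687", AuthTokens.basic("neo4j", "****"))
val session = driver.session().asScala[Future]

// The query is built with the c interpolator and decoded into the case class.
val movies: Future[List[Movie]] =
  c"MATCH (m: Movie) RETURN m".query[Movie].list(session)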

Setup

Supports Scala 2.13 and 3.3
"io.github.neotypes" %% "neotypes-core" % version Core functionality. Supports scala.concurrent.Future.
"io.github.neotypes" %% "neotypes-generic" % version Automatic derivation of mappers for case classes.
"io.github.neotypes" %% "neotypes-cats-effect" % version Async support for cats.effect.Async[F]
"io.github.neotypes" %% "neotypes-monix" % version Async support for monix.eval.Task
"io.github.neotypes" %% "neotypes-zio" % version Async support for zio.Task
"io.github.neotypes" %% "neotypes-akka-stream" % version Stream support for akka.stream.scaladsl.Source
"io.github.neotypes" %% "neotypes-fs2-stream" % version Stream support for fs2.Stream
"io.github.neotypes" %% "neotypes-monix-stream" % version Stream support for monix.reactive.Observable
"io.github.neotypes" %% "neotypes-zio-stream" % version Stream support for zio.ZStream
"io.github.neotypes" %% "neotypes-refined" % version Support for insert and retrieve refined values.
"io.github.neotypes" %% "neotypes-cats-data" % version Support for insert and retrieve cats.data values.
"io.github.neotypes" %% "neotypes-enumeratum" % version Support for insert and retrieve Enumeratum enums.

Resources

Code of Conduct

We are committed to providing a friendly, safe and welcoming environment for all, regardless of level of experience, gender, gender identity and expression, sexual orientation, disability, personal appearance, body size, race, ethnicity, age, religion, nationality, or other such characteristics.

Everyone is expected to follow the Scala Code of Conduct when discussing the project on the available communication channels.

Special thanks

neotypes's People

Contributors

balmungsan, dimafeng, frne, geoffjohn11, i10416, irevive, jacoby6000, masonedmison, pbylicki, scala-steward, tjarvstrand


neotypes's Issues

EitherNel[SomeErrorType, T] vs Either[Throwable, ?]

Hi, I had a look at the code: awesome work, well structured and readable! And thanks for the test containers too 😉 I started writing a Value Reader typeclass of my own, then found this library and stopped.

What are your thoughts on using some accumulating error type to gather multiple parsing errors? EitherNel, ValidatedNel or similar.

Feature: parameter mapper for case classes

Right now, we do not support case classes as parameters on a Cypher query.
However, I think this would be very useful.

For example, consider the following case class.

final case class User(name: String, age: Int, ...) // Many more fields.

Then, as the number of fields grows, having to write something like this becomes more and more tedious.

val user = User("Luis", 22, ...)

c"CREATE (user: User { name: ${user.name}, age: ${user.age}, ... })"

Rather, I would like to write it like this:

c"CREATE (user: User { $user })"

Or even:

c"CREATE $user"

Before diving into the implementation, I would like to see what others think about this.

Discussion: Implement the bolt protocol from scratch

As the title says: what if, instead of depending on the Java driver, we create our own native driver by implementing the Bolt protocol?

Pros:

  • Smaller size, since we would be totally dependency-free.
  • No need to deal with more problems of the underlying driver.
  • We could better represent async and streaming operations using a single Transaction class.

Cons:

  • A lot more work.
  • A lot less testing and confidence.
  • For simplicity, we would probably only support the latest version of the protocol, so users could only use a smaller set of server versions.

Discussion: Session + Concurrency

The javadoc for Session explicitly states that:

"Multiple sessions should be used when working with concurrency; session implementations are not thread safe."
"At most one transaction may exist in a session at any point in time. To maintain multiple concurrent transactions, use multiple concurrent sessions."

Thus, in the context of a web application, it seems the best approach would be to have a pool of open Sessions which map one-to-one to a thread pool (ExecutionContext) of the same size.

Also, it seems to me that this is something the library should provide out of the box, but it is still not clear to me how this should work. I am only sure the underlying ExecutionContext should be supplied by the user. A rough sketch of the idea follows.
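
A minimal sketch of that idea, independent of neotypes (SessionPool is a hypothetical helper, not part of the library):

import java.util.concurrent.{ArrayBlockingQueue, Executors}
import scala.concurrent.{ExecutionContext, Future}

// A fixed pool of sessions, used from an ExecutionContext of the same size.
final class SessionPool[S](create: () => S, size: Int)(implicit ec: ExecutionContext) {
  private val sessions = new ArrayBlockingQueue[S](size)
  (1 to size).foreach(_ => sessions.put(create()))

  def use[A](f: S => A): Future[A] = Future {
    val s = sessions.take()           // borrow a session (blocks if all are in use)
    try f(s) finally sessions.put(s)  // always return it to the pool
  }
}

// The user supplies an ExecutionContext sized to match the pool.
val poolSize = 8
implicit val ec: ExecutionContext =
  ExecutionContext.fromExecutor(Executors.newFixedThreadPool(poolSize))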

Discussion: Use explicit decoders instead of typeclass based ones

Following the ideas of other libraries like skunk and dynosaur, I believe we should also provide explicit decoders instead of implicit ones.

Obviously, the downside of that is extra boilerplate for simple cases.
The idea would be to provide an API that makes it easy to query and decode simple types like primitives, provides generic support for explicit derivation of decoders for case classes (and maybe ADTs in general), but allows customization when needed.

Related to #235

Add License

So, I was reading about licenses today (mainly because I am thinking of starting an open source project), and I was surprised to notice that this project does not have one, which means that nobody can use it (legally speaking). 😧

I strongly believe that is not what you wanted. Thus, I hope this just works as a reminder. 😄


Edit

I just noticed that in the build.sbt you specify this is under the MIT License. 👍
Anyways, I still believe it would be good to mention it in the README and / or add a LICENSE file to the root of this repo.

Discussion: Extract more modules from core?

As mentioned in #201 it may be good to extract some things out of the core module in order to make it smaller.

The idea of this issue is to make proposals on what to split and discuss the pros and cons of each proposal.

So here goes mine:

  • generic - Provides automatic derivation for case classes. - This removes the dependency on Shapeless from the core module and may also help, in the future, to support Scala 3.
  • cypher - Provides the cypher string interpolator. - Personally, I think this should be part of the core, but since it is implemented as a macro it is probably better to move it out of the core just to make it easier to support Scala 3.
  • extras - Provides things like ORM capabilities (see #204) and other stuff. - The idea is that we would be very open to accepting anything into it as experimentation; if things become stable / useful enough we may start to move them from extras into independent modules. It may also be named experimental.

Also, I am not sure if we should move the streaming things into another module or not.

Properly test: should rollback on cancellation

While developing #164 I noticed that I made a big mistake in some of my previous implementations: I am committing a Transaction when the effect was cancelled. I just realized that is an error; I believe most people would expect an effect running some queries in the same Transaction to be rolled back on cancellation.

However, #164 is too big and this change is also big enough, so I decided to create this issue so as not to forget about it, and to fix it after finishing #164.

Feature: refined support

It would be great if this could be integrated with refined (probably via an external module).

I see two big opportunities.

  1. Using refined types in query builders. For example:
type Level = Int Refined Interval.Closed[W.`1`.T, W.`99`.T]
val L1: Level = 1

// This does not work.
c"CREATE (level: Level { value: ${L1} })"

// Right now, we have to do this to manually unwrap the Int from the Refined,
// which for many values is kind of tedious.
c"CREATE (level: Level { value: ${L1.value} })"
  2. Returning refined values from queries; if the value does not satisfy the refinement, return a Throwable.
type Level = Int Refined Interval.Closed[W.`1`.T, W.`99`.T]

c"MATCH (level: Level) RETURN level.value".query[Level].list(session)
// res: Future[List[Level]]

Add a derive mapper method

As a best practice to reduce compile times, it is better to use semi-automatic derivation instead of the fully automatic one.
This allows us to cache important instances, especially for recursive or nested data structures.

For example, I usually do this with Circe:

final case class Foo(data: String)
final case class Bar(id: Int, foo: Foo)
final case class Baz(flag: Boolean, bar: Bar)

private implicit final val fooEncoder: Encoder[Foo] = deriveEncoder
private implicit final val barEncoder: Encoder[Bar] = deriveEncoder
private implicit final val bazEncoder: Encoder[Baz] = deriveEncoder

I would like to do the same with neotypes Mappers.
But, currently, we do not provide any way to do this.
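
A hypothetical sketch of the analogous neotypes usage (deriveResultMapper and its import path are illustrative names for an API that does not exist yet):

import neotypes.mappers.ResultMapper
import neotypes.generic.semiauto.deriveResultMapper // hypothetical method name

final case class Foo(data: String)
final case class Bar(id: Int, foo: Foo)

// Cache the instances once instead of re-deriving them at every use site.
implicit val fooMapper: ResultMapper[Foo] = deriveResultMapper
implicit val barMapper: ResultMapper[Bar] = deriveResultMapper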

Feature: case class mapper for maps

If using map projections, neotypes is unable to map the returned values to case classes. A common use case I have (which I believe could be useful for others) is having a node with a lot of fields to which you want to add some other computed fields and return that as a case class.
Example:

c"""MATCH (u: User { name: $name })
      ...
     RETURN u { .*, score: sum(???) }
    ORDER BY u.score
    LIMIT 10""".query[ScoredUser].list(session)

Right now we are solving the problem like this:

c"""MATCH (u: User { name: $name })
      ...
     RETURN
       u.name AS name,
       ...
       sum(???) AS score
    ORDER BY u.score
    LIMIT 10""".query[ScoredUser].list(session)

Not a big deal, but it is kind of tedious to add every property to the return.


Disclaimer: I do not know anything about shapeless, so I cannot help with this. Being honest, I do not even know if this is easy, doable, difficult or impossible lol 😅
But I guess I do not lose anything by opening this, just to see if anyone knows how to do it.

Divergent implicits when using `cats.implicits._`

I've read #35 and I understand it refers to the import neotypes.Async._? I am currently experiencing the same issue, but also with import neotypes.implicits._, using version 0.9.0.

Here's the original code I had using the Neo4j java driver:

import cats.Applicative
import cats.effect._
import cats.effect.implicits._
import cats.implicits._
import org.neo4j.driver.v1._

object Neo {

  case class NeoConfig(uri: String, user: String, pass: String)

  // TODO: do proper logging
  def putStrLn[F[_]: Sync, A](a: A): F[Unit] = Sync[F].delay(println(a))

  private def loadConfig[F[_]: Applicative]: F[NeoConfig] =
    NeoConfig("bolt://localhost:7687", "neo4j", "test").pure[F]

  def mkClient[F[_]: Sync]: Resource[F, NeoClient[F]] =
    Resource.make[F, NeoClient[F]](
      loadConfig[F].flatMap { c =>
        putStrLn("NeoClient[F] >> Connecting to Neo4j...") *>
          Sync[F]
            .delay(GraphDatabase.driver(c.uri, AuthTokens.basic(c.user, c.pass)))
            .map(new LiveNeoClient(_))
            .widen[NeoClient[F]] <*
          putStrLn("NeoClient[F] >> Connected")
      }
    )(_.close)

}

trait NeoClient[F[_]] {
  def close: F[Unit]
  def transaction: Resource[F, Transaction]
}

class LiveNeoClient[F[_]: Sync] private[persistence] (driver: Driver) extends NeoClient[F] {
  def close: F[Unit] = Neo.putStrLn("NeoClient[F] >> Closing connection") *> Sync[F].delay(driver.close())
  def transaction: Resource[F, Transaction] =
    Resource.make(Sync[F].delay(driver.session.beginTransaction))(tx => Sync[F].delay(tx.close))
}

This compiles fine until I add any of the imports neotypes.Async._ and/or neotypes.implicits._

[error] /workspace/.../Neo.scala:22:17: type mismatch;
[error]  found   : F[Neo.NeoConfig]
[error]  required: ?{def flatMap: ?}
[error] Note that implicit conversions are not applicable because they are ambiguous:
[error]  both method toFlatMapOps in trait ToFlatMapOps of type [F[_], C](target: F[C])(implicit tc: cats.FlatMap[F])cats.FlatMap.Ops[F,C]{type TypeClassType = cats.FlatMap[F]}
[error]  and method AsyncExt in object implicits of type [F[_], T](m: F[T])neotypes.implicits.AsyncExt[F,T]
[error]  are possible conversion functions from F[Neo.NeoConfig] to ?{def flatMap: ?}
[error]       loadConfig[F].flatMap { c =>
[error]                 ^
[error] one error found
[error] (Compile / compileIncremental) Compilation failed
[error] Total time: 2 s, completed Jun 28, 2019 10:58:31 AM

Are there any known workarounds to this? I see there's been some work done in #33 but the error still seems to be there.

Add scaladoc

It would be good to also have the scaladoc of the library published together with the documentation.

This also implies that we should add all the scaladoc comments that are missing right now.

Parameterized queries not consistent

This may be an upstream issue, but I'm wondering why the DeferredQueryBuilder only correctly interpolates queries where the parameters are node properties in the Cypher query, whereas if we want to parameterize, say, the node label or a property name, we need to use DeferredQueryBuilder's + operator instead.
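
For illustration, a sketch of the two cases described above (the + composition follows the issue's description; exact semantics may differ between versions):

// Interpolating a property value works as expected:
val name = "Luis"
c"MATCH (u: User { name: $name }) RETURN u"

// Interpolating the label itself does not, and has to be appended
// with the + operator instead:
val label = "User"
c"MATCH (u: " + label + c") RETURN u"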

Consider using newtype

Proposal

I think we should consider using scala-newtype for our wrapper classes, like Session & Transaction, instead of using value classes.

Motivation

Value classes are not guaranteed to avoid boxing at runtime. Thus, my original idea was to propose removing their use, given that I would rather have a single guaranteed instantiation than zero-to-many possible instantiations that we cannot control.
However, it is true that many of the types we provide are just wrappers over other, simpler types plus some convenience methods. Thus, it was convenient and type-safe to view them as different types at compile time while leaving the primitives at runtime.

For those reasons, I believe it is worth investigating the use of the newtype library in the core module. It does not have any transitive dependencies, so IMHO it is a good trade-off.
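
For reference, a sketch of the difference using the io.estatico newtype library (requires the -Ymacro-annotations flag on Scala 2.13; the type names are illustrative):

import io.estatico.newtype.macros.newtype

// A value class: may still box at runtime, e.g. when used generically
// or stored in a collection.
final class SessionId(val value: String) extends AnyVal

object types {
  // A newtype: erased to the underlying String at runtime, so it never boxes,
  // while remaining a distinct type at compile time.
  @newtype case class TransactionId(value: String)
}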

DeferredQuery single F[T] return type

DeferredQuery is defined as:
def single[F[_]](session: Session[F])(implicit F: Async[F], rm: ResultMapper[T]): F[T]

However, issues might arise when the neo4j datastore doesn't have a corresponding value. In that case I imagine a null will be returned.

I propose a change to single to reflect the possibility that the value might not exist:

def single[F[_]](session: Session[F])(implicit F: Async[F], rm: ResultMapper[T]): F[Option[T]]
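
A usage sketch of the proposed signature (the User class and the findUser helper are illustrative, and the same imports as in the usage sketch near the top of this document are assumed):

final case class User(name: String, age: Int)

// None when no row matched, Some(user) otherwise, instead of a null or a failed Future.
def findUser(name: String, session: neotypes.Session[Future]): Future[Option[User]] =
  c"MATCH (u: User { name: $name }) RETURN u".query[User].single(session)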

Making schema evolution safer

Do you have any plans or ideas to make queries more amenable to schema changes? Specifically, one thing that bothers me is writing labels and property names in the query, which makes evolving the schema very hard.
I've made two attempts at this, for example:
https://github.com/fredshonorio/neu/blob/master/src/test/java/com/fredhonorio/neu/query/VarTest.java
and
https://github.com/fredshonorio/nod/blob/master/src/main/scala/nod/QueryExample.scala

and I'm not particularly satisfied with any of them.

Add support to Enumeratum

Enumeratum is a common library in the Scala ecosystem (especially for the typelevel community).

It would be good to provide out of the box support for encoding and decoding enumerations.
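
Until then, a manual workaround sketch using the ValueMapper.fromCast helper that appears elsewhere in these issues (the import path and the Color enum are illustrative; Color.withName throws on unknown values):

import enumeratum._
import neotypes.mappers.ValueMapper

sealed trait Color extends EnumEntry
object Color extends Enum[Color] {
  case object Red  extends Color
  case object Blue extends Color
  val values = findValues

  // Decode a string property into the corresponding enum entry.
  implicit val colorValueMapper: ValueMapper[Color] =
    ValueMapper.fromCast(v => Color.withName(v.asString))
}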

Question: Why streaming queries always rollback?

Hi,

Not sure if I am missing something, but it seems to me that when you stream a query, at the end, you always roll back the transaction.

F.success(sb.onComplete(stream(t)) {
  t.rollback()
})

-- taken from here.

Is there any reason for this?
Or am I missing something?


BTW, another question.
It seems that streaming is only valid when requesting data. Is it possible to insert data coming from a Stream?

To be honest, I am new to Neo4j too, so I don't know if that does even make sense.
Maybe the best way to do that would be to use the same transaction for each create and execute them from my Stream?

Upgrade to Neo4j 4

A new version of the Neo4j Java driver has been released.

The changelogs state that probably the most important change is the support for an Rx driver. It would be great if we could also provide wrappers for it in our Streaming implementations; for example, since I use http4s, I end up converting everything into an fs2 Stream.

Automatic wrapping/unwrapping of `AnyVal`

Many libraries, such as circe, json4s or caliban, provide automatic wrapping and unwrapping of the inner value of AnyVal types without explicit mapping. E.g. case class Id(value: String) extends AnyVal will be treated as a String for serialization, and a string value will automatically be wrapped in an Id where required during deserialization.

It would be great if this were supported by neotypes as well.

Add support to NewType

scala-newtype is a common library in the Scala ecosystem (especially for the typelevel community).

It would be good to provide out-of-the-box support for encoding and decoding newtypes whenever the underlying type is supported.

Discussion: Should we be read-only by default?

Related to #178

It may be worth adding a readOnlyTransact, which is basically transact but setting / overriding the transaction config to AccessMode.READ. However, IMHO, it only makes sense to add this if we also use it for operations like single or list.

The pros of this would be to improve the performance of such operations for users having a multi-node cluster with nodes specialized in read-only operations.
The cons are that this change may break existing applications without anyone noticing until execution (be it runtime or a test).

Finally, note that if we do not do this, then people wanting to take advantage of the read-only optimizations need to write extra code; whereas if we do, then people wanting to write inside a "read" query would need the extra code instead.

Document and improve the library errors

It would be good to document all the errors that the library can generate, as well as being very explicit about when and why each of them can happen, how you can catch them, and a guideline on how to fix them (if possible).

As such, it would also be very important to improve them first; for example, what is the difference between an IncoercibleException and a ConversionException? We also need to be sure we are consistent in using the same exception everywhere it makes sense.
Additionally, I think we should catch all the underlying Java driver exceptions but not any other kind of exception. For many of the Java driver exceptions we can probably provide appropriate wrappers, and for the rest we should have an UnknownNeo4jException or something similar.

neotypes.Async implicit imports cause divergent implicits

I have code that uses cats-effect with higher-kinded types in most of my functions. Here's an example that should compile fine using the cats.effect.Async typeclass:

import cats.implicits._
import cats.effect._

def flatMapF[F[_]: Async](f: F[Any]): F[Unit] = f.flatMap(_ => Async[F].unit)

This code compiles and works fine unless I import the implicits for neotypes.Async by importing neotypes.implicits.all._ or neotypes.implicits.syntax.all._. The code fails to compile due to a divergent implicit leading the compiler to not know which flatMap method to call.

Currently using the latest snapshot build from master, built locally (0.9.1-SNAPSHOT).

Wrap UUID parsing exceptions into Neotypes exceptions

UUIDs are mapped using this ValueMapper:

implicit final val UUIDValueMapper: ValueMapper[UUID] =
  ValueMapper.fromCast(s => UUID.fromString(s.asString))

But UUID.fromString throws IllegalArgumentException, NumberFormatException and probably other types of exceptions which are not instances of NeotypesException. Ultimately, these exceptions will be returned by the library, and I think it would be nicer if they were included in the NeotypesException hierarchy, in order to allow the user to implement reliable logic when these errors appear.

Discussion: Transaction as a Resource

Ok so, back in #39 I made Transaction[F] a Resource. Thus, the transaction method on Session[F] returns an R[Transaction[F]]. That had the negative consequence that we couldn't use Id as the Resource for Futures. But it seemed like the best thing to do.
However, I have been thinking about it, and I now believe I was wrong!

First, it is true that a Transaction should always be closed at the end (that was the main reason for making it a Resource).
But the problem is that it has two ways of being closed: committed & rolled back.

Second, our close implementation tries to commit the transaction and, if that fails, it tries to roll back. At first, that made sense to me...

I thought about a case like creating multiple nodes in the same transaction: if one of the nodes breaks a unique constraint, the commit would fail, and the rollback seemed reasonable.

... However, the Java driver already does this. If a transaction fails to be committed, nothing will happen, and calling commit will return an exception explaining the problem.

Third, if one wants to manually roll back the transaction (for any reason), then when the resource closes and calls commit, it will throw an exception (because the transaction was already rolled back and closed)!

Note: all of this applies only to IO & friends™ - Future does not have this problem, as its Resource is not doing anything.


Thus, my question is:

Should we make a Transaction just shared state instead of a Resource? That means returning F[Transaction[F]] instead.
Or maybe, should we keep some internal (mutable) state that changes if the transaction was rolled back, and check that in the close operation?

Does anyone have any other idea or opinion about this?
I am especially interested in hearing from @gvolpe & @dimafeng.

Add support for read only transactions

The Neo4j Java driver provides a method on AsyncSession that allows a user to specify that a transaction should not mutate data on the Neo4j server: readTransactionAsync. This allows optimisations in clustered environments, such as routing read-only queries to read replicas.

It should be noted that no introspection occurs on the Neo4j driver side: it does not check that a query won't mutate data before routing the request, but the read replica will reject it.
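
For reference, a sketch against the Java async driver directly, showing the call this issue refers to (the countUsers helper is illustrative; class and method names are those of the 4.x driver line):

import java.util.concurrent.CompletionStage
import org.neo4j.driver.Record
import org.neo4j.driver.async.{AsyncSession, AsyncTransaction, AsyncTransactionWork, ResultCursor}

// Runs the query inside a read transaction, so a cluster can route it to a read replica.
def countUsers(session: AsyncSession): CompletionStage[Record] =
  session.readTransactionAsync(new AsyncTransactionWork[CompletionStage[Record]] {
    override def execute(tx: AsyncTransaction): CompletionStage[Record] =
      tx.runAsync("MATCH (u: User) RETURN count(u)")
        .thenCompose((cursor: ResultCursor) => cursor.singleAsync())
  })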

neotypes.generic not available in 0.15.1 jar from sbt

The getting-started example found at https://neotypes.github.io/neotypes/ is not working fully (the connection works, the query implicits don't).

Reason:
// import neotypes.generic.auto._ is not working

The whole generic package is missing from the sbt artifact com.dimafeng:neotypes_2.13:0.15.1:jar (internal and implicits are there).

//SBT
libraryDependencies += "com.dimafeng" %% "neotypes" % "0.15.1"

//checked also 0.15.0 and 0.14.0, but not found since generic is a recent addition :)

Thank you for the great library!

Provide support for decoding ADTs from Nodes / Relationships

Since we already support decoding key-value structures like Node or Relationship into case classes, it may make sense to allow decoding into an ADT where (following what other libraries, e.g. circe, do) we read a discriminator property such as type to determine which case class to create; see the sketch below.

Related #234 & #201
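
A sketch of that dispatch logic, independent of the library (props stands in for a node's property map; the Shape types are illustrative):

sealed trait Shape
final case class Circle(radius: Double) extends Shape
final case class Square(side: Double)   extends Shape

// e.g. a node { type: "Circle", radius: 1.5 } would decode to Circle(1.5).
def decodeShape(props: Map[String, Any]): Either[Throwable, Shape] =
  props.get("type") match {
    case Some("Circle") => Right(Circle(props("radius").asInstanceOf[Double]))
    case Some("Square") => Right(Square(props("side").asInstanceOf[Double]))
    case other          => Left(new Exception(s"Unknown or missing type discriminator: $other"))
  }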

Jekyll install error during PR build

During the build of #28, the following error occurred:

ERROR:  Error installing jekyll:
	The last version of public_suffix (>= 2.0.2, < 4.0) to support your Ruby & RubyGems was 3.0.3. Try installing it with `gem install public_suffix -v 3.0.3` and then running the current command again
	public_suffix requires Ruby version >= 2.3. The current ruby version is 2.2.0.
The command "gem install jekyll -v 3.4.3" failed and exited with 1 during .
Your build has been stopped.

Doesn't seem to be change-related.

Graphs of objects?

Have you considered supporting saving and loading graphs of objects?
I remember using Neo4j OGM years ago; it saved and loaded graphs of related objects, like Hibernate ORM does. Is there any plan for that? Is this something in your thoughts?

Add more streaming tests

This issue builds on #221.

For now, we provide a basic set of tests for our Stream abstraction and implementations.
While those are enough to guarantee the correct behaviour of the implementation for simple usage, they leave a lot of things open which would be good to test.

For example, it would be good to add:
(in order of importance)

  • ConcurrentStreamingDriverSpec: for ensuring the correct behaviour of concurrent queries using a StreamingDriver; similar to ConcurrentDriverSpec.
  • StreamResourceSpec: for ensuring the correct behaviour of the resource method; similar to AsyncGuaranteeSpec.
  • StreamingQueryExecutionSpec: as an extension of QueryExecutionSpec that also validates the stream execution method (it would be even better if this one could also run all the tests in QueryExecutionSpec but using a StreamingDriver instead).
  • StreamingDriverSpec: as a showcase of many basic queries using a StreamingDriver; similar to DriverSpec (it would be even better if those two could inherit from a single trait that provides all the tests).

Feature: Expose test harness as a separate library

I'm just getting into Neo4j with Scala and found out that the official Neo4j test harness doesn't even support 2.13 yet :/

I think it would be a huge boon to the community if neotypes exposed their own test harness as a library so that users don't have to roll their own.

Creation of session should return `F[Session[F]]`

Creation of mutable state should be wrapped in F for the reasons stated here and here. Fabio Labella also gave a great talk on the topic.

val session: Session[IO] = driver.session().asScala[IO] // this can leak

val makeSession: IO[Session[IO]] = driver.session.asScala[IO] // this is the action of creating a session

makeSession.flatMap { s => // here we create a region of sharing
  useSession(s) // whoever needs access to the session should take it as a parameter 
 // in this way the sharing of the session is under our control in a narrow scope (inside `flatMap`)
}

Right now I get around it by just wrapping its creation in IO, but ideally the library should do this.

This probably doesn't make sense when using Future, since it breaks referential transparency anyway, but for those of us using FP libraries this is important.

Let me know if I can be of any help.

Unify tests styles

Currently, our tests are very different from each other.
It would be good if we can unify them under some "standard".

I propose using Matchers everywhere instead of plain asserts.
And using WordSpecLike for all our specs, since it is the one that gives us the most flexibility and we already have a couple of tests written in that style. The FlatSpecLike specs can easily be adapted to this style without modifying too much.

Additionally, it may be good to check whether we can mix in TypeCheckedTripleEquals from Scalactic, to ensure all our checks are typesafe.

Bug: AkkaStreamSpec randomly failing in Travis

For a while now, the Akka Streams test suite has been randomly failing in Travis.
I haven't been able to reproduce the error locally. Thus, I tend to assume the problem is related to the running environment, but this is not confirmed right now.

Fix the AkkaStreams.guarantee implementation (or test)

Right now we are ignoring the whole AkkaStreamsSuite, since the StreamGuaranteeSpec started to fail for AkkaStreams without any apparent reason (I believe the implementation always had a bug, but we got lucky with the CI tests when we merged the PR).

The branch https://github.com/neotypes/neotypes/tree/debug-akka can be used as a starting point to play with the code and try to figure out what is wrong.


Note: another thing that would be good to fix is that right now we are not using the custom ExecutionContext provided by Scalatest when creating the ActorSystem used as the Materializer for the whole spec.
However, when trying to use it like this:

implicit val system =
  ActorSystem(name = "QuickStart", defaultExecutionContext = Some(ec))

The whole suite fails with some weird error about a timeout waiting for a logger actor.

Shapeless derivation ignores instances in companion object

Background

import neotypes.implicits.mappers.results._

final case class ObjectScope(value: String)

object ObjectScope {
  val const = ObjectScope("const")
  implicit val resultMapper: ResultMapper[ObjectScope] = ResultMapper.const(const)
}

ResultMapper[ObjectScope].to(Nil, None) // Left(neotypes.exceptions$PropertyNotFoundException: Property value not found)

// Expected Right(ObjectScope(const))

ResultMapper[ObjectScope] expands into
ResultMapper[ObjectScope](neotypes.implicits.mappers.results.ccMarshallable).

From the compiler's perspective, everything is correct. From the user's perspective, it's a painful experience.

Instances for alias/opaque/newtype types are ignored.

Workaround

import neotypes.implicits.mappers.results.{ccMarshallable => _, _}

Solution

What if we follow the auto/semiauto pattern here?

  1. Move the shapeless-based derivation into a separate package, neotypes.generic
  2. Introduce auto and semiauto derivation
  3. Use dedicated types for the shapeless-specific result mappers: DerivedResultMapper for products and RepResultMapper for HLists.

Pros/Cons

Pros:

  • neotypes.generic.semiauto can work together with https://github.com/scalaz/scalaz-deriving
  • primitive instances are isolated
  • more control about imports
  • more options for external instances: manual, magnolia-based, etc

Cons:

  • Wildcard import of neotypes.generic.auto._ still leads to the problem explained in this post, but at least instances for primitives can be imported safely via import neotypes.implicits.mappers.results._
  • the change is source/binary breaking

What's your opinion on this topic? If you are interested, I can pick it up and contribute.

Compile documentation examples

Since we are already using tut (via sbt-microsite), it would be great to ensure all our examples are tested to compile against the current version of the library.

Ideally, this should have an impact on the CI.
E.g. if a PR introduces a change that breaks some example, the PR should not be merged until the documentation is updated accordingly.

Improve microsite

It would be good to improve the microsite so it provides a better experience for our users.
Right now I can think about the following things:

  1. Use the new (and recommended) light theme, instead of the old pattern one. I believe this forces us to remove our current custom styles, so we would need to see if we like the new theme or work out how to customize it.
  2. Remove the current index.md and use the README.md instead, so that we do not need to maintain both. Also, it would be good to preserve the compatibility matrix that currently is not in the README.
  3. Make the README / index.md show the current latest version in the Setup section instead of saying "version".
  4. Add new sections? It would be good if users could point out what they find missing in the current docs.
  5. Support multiple versions.
  6. Add a favicon.
