scala-redox's People

Contributors

andrewzurn, apatzer, dominics, mkokho, pulasthibandara, teriu, velitheda, vital-software-bot

scala-redox's Issues

Switch to Codeship Pro

This project is still using Codeship Basic, and needs to be upgraded to the Codeship Pro project type to match our other projects.

Redox returns errors inside "Meta" object for 400

Here is an example:

{
  "Meta": {
    "DataModel": "Clinical Summary",
    "EventType": "Query",
    "Message": {
      "ID": 96619801
    },
    "Source": {
      "ID": "5e6353f9-e28a-4295-b5f3-12b1ecae783a",
      "Name": "Redox API Endpoint"
    },
    "Errors": [
      {
        "ID": 1492613,
        "Text": "Invalid synchronous destination: more than one synchronous destination was specified",
        "Type": "message",
        "Module": "DataModels"
      }
    ]
  }
}

Robust parsing failing on multiple failures in arrays

Describe the bug
Robust parsing fails if there are multiple deep JSON errors in arrays. This is because play-json exits as soon as it sees a parsing error in an array. We would need to recursively apply robust parsing to sanitise the rest of the errors.

  • Recursively apply robust parsing when a failure is seen in an array (see the sketch below)
  • Avoid getting into an infinite loop by checking previously sanitised JSON paths.
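As an illustration of the per-element idea (this is not the library's existing robust-parsing code), a lenient Reads for sequences could parse each element on its own and simply drop the failures, instead of relying on play-json's all-or-nothing array handling:

import play.api.libs.json._

object LenientArrayReads {
  // Parse each array element individually; keep the successes, drop the
  // failures. A fuller fix would also record the dropped paths to avoid
  // re-sanitising them in a loop, as noted above.
  def lenientSeq[A](implicit r: Reads[A]): Reads[Seq[A]] = Reads {
    case JsArray(values) => JsSuccess(values.flatMap(v => r.reads(v).asOpt).toSeq)
    case other           => JsError(s"Expected a JSON array, got $other")
  }
}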

Update VisitQueryResponse Tests

One test was failing because the Redox test endpoint is no longer returning the same information it used to (our testing infrastructure is a bit fragile here, to say the least).

See https://github.com/vital-software/scala-redox/blob/master/src/test/scala/com/github/vitalsoftware/scalaredox/ClinicalSummaryTest.scala#L192
// TODO: {Allergies, Assessment, Encounters, Results} No longer returned in the test response from Redox!

    // Allergies
    //val allergies = visitQueryResponse.Allergies
    //allergies.size must be_>(2)

    // Assessment
    //val assesment = visitQueryResponse.Assessment
    //assesment must beSome
    //assesment.head.Diagnoses.size must be_>(1)

    // Encounters
    //val encounters = visitQueryResponse.Encounters
    //encounters.size must be_>(0)

    // Results
    //val results = visitQueryResponse.Results
    //results.size must be_>(0)

Interest in Upload Functionality?

Hey there,

Awesome project! We've been working on an implementation recently, and I'm pretty happy I found this nice little client.

Would there be any interest in adding some functionality to this project around performing uploads through the '/upload' Redox endpoint?

Here's the docs on uploading files.
https://developer.redoxengine.com/data-exchange/sending-files-through-redox/

The interface could be something like:

def postUpload(file: File): Future[RedoxResponse[UploadResponse]]

where UploadResponse is

case class UploadResponse(URI: URI)

or

case class UploadResponse(URI: String)
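For what it's worth, here's a very rough sketch of what that might look like using Play WS's multipart support directly. The endpoint URL, the "file" part name and the response shape are assumptions to be checked against the Redox docs, it isn't wired into RedoxClient's auth/lifecycle handling, and it uses the String variant of UploadResponse above:

import java.io.File
import scala.concurrent.{ ExecutionContext, Future }
import akka.stream.scaladsl.{ FileIO, Source }
import play.api.libs.ws.WSClient
import play.api.mvc.MultipartFormData.FilePart

class UploadSketch(ws: WSClient, token: String)(implicit ec: ExecutionContext) {
  // NB: the multipart BodyWritable must be in implicit scope; where it comes
  // from depends on your Play WS version.
  def postUpload(file: File): Future[UploadResponse] = {
    val part = FilePart("file", file.getName, Option("application/octet-stream"),
      FileIO.fromPath(file.toPath))
    ws.url("https://blob.redoxengine.com/upload") // assumed endpoint
      .addHttpHeaders("Authorization" -> s"Bearer $token")
      .post(Source.single(part))
      .map(resp => UploadResponse((resp.json \ "URI").as[String]))
  }
}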

If so, I could work on it as a part of our implementation, and submit a PR for this issue. If there isn't interest, or there is some technical reason I can't think of as to why it shouldn't be added, that's perfectly fine as well.

Thanks,
Andrew

BasicPerson in AdvanceDirective should have Credentials as Option[String] not Seq[String]

In the Redox JSON schema v4 files (patientqueryresponse.json) we see these definitions for VerifiedBy and Custodians of an AdvanceDirective:

          "VerifiedBy": {
            "type": "array",
            "items": {
              "type": "object",
              "properties": {
...
                "Credentials": {
                  "type": "string"
                },
...
              }
            }
          },
          "Custodians": {
            "type": "array",
            "items": {
              "type": "object",
              "properties": {
...
                "Credentials": {
                  "type": "string"
                },
...
              }
            }
          }

An example is something like:

      "Custodians": [
        {
          "FirstName": "Robert",
          "LastName": "Dolin",
          "Credentials": "Dr.",
          "Address": {
            "StreetAddress": "21 North Ave.",
            "City": "Burlington",
            "State": "MA",
            "Country": "USA",
            "ZIP": "02368"
          }
        }
      ]

In this library, we map the object inside those arrays as a BasicPerson, but that object has a Seq[String] for Credentials instead of an Option[String].
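A possible adjustment would be something along these lines (a hypothetical shape, not the library's current class), with Credentials as a single optional string to match the v4 schema excerpt above:

// Hypothetical: a dedicated person type for AdvanceDirective rather than
// reusing BasicPerson. Other BasicPerson fields (Address, PhoneNumber, ...)
// would carry over unchanged.
case class AdvanceDirectivePerson(
  FirstName: Option[String] = None,
  LastName: Option[String] = None,
  Credentials: Option[String] = None
)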

More robust Json reads

In the example below SocialHistory.Observations will be empty, because Observations(0).Value contains all null values. Value is defined as a BasicCode object, which is not valid unless Code and CodeSystem are defined. Because BasicCode fails, the whole Observation sequence silently fails.

This gets hard to debug, and is not very fault tolerant. Since Value is an Option[BasicCode], the failure should simply set Value = None rather than cascading. This seems to be an issue with Play Json; a sketch of a more lenient Reads follows the example below.

"SocialHistory": {
    "Observations": [
      {
        "Code": "160573003",
        "CodeSystem": "2.16.840.1.113883.6.96",
        "CodeSystemName": "SNOMED CT",
        "Name": "Alcohol Consumption",
        "Value": {
          "Code": null,
          "CodeSystem": null,
          "CodeSystemName": null,
          "Name": null
        },
        "ValueText": "None",
        "StartDate": "1990-05-01T04:00:00.000Z",
        "EndDate": null
      }
    ]
}
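One possible approach (a sketch of a fix, not current behaviour) is a lenient Reads for optional fields that turns a nested failure into None:

import play.api.libs.json._

// Sketch only: a Reads for Option[A] that turns a nested parse failure (such
// as a BasicCode object whose required Code/CodeSystem are null) into None
// instead of failing the surrounding Observation.
object LenientOptionReads {
  def lenientOption[A](implicit r: Reads[A]): Reads[Option[A]] = Reads {
    case JsNull => JsSuccess(Option.empty[A])
    case value  => JsSuccess(r.reads(value).asOpt)
  }
}

The Observation Reads would then read the Value path with this combinator (plus a fallback for a missing path) instead of the usual readNullable.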

Use sensible default fallbacks for Enum values

Since we can't always rely on an EHR to send messages with properties that adhere to a standard set of values, message processing can fail when our enum values don't match the values sent through Redox.

To detect parsing errors but still carry on processing, we should log when we fail to parse and fall back to a sensible default enum value.
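A sketch of what such a fallback Reads could look like for an Enumeration-based type (the logging call and default are placeholders, not the library's existing code):

import play.api.libs.json._

object EnumFallbackReads {
  def readsWithFallback[E <: Enumeration](e: E)(default: e.Value): Reads[e.Value] = Reads {
    case JsString(s) =>
      e.values.find(_.toString == s) match {
        case Some(v) => JsSuccess(v)
        case None =>
          // A real implementation would use the project's logger here.
          Console.err.println(s"Unknown enum value '$s'; falling back to $default")
          JsSuccess(default)
      }
    case other => JsError(s"Expected a JSON string, got $other")
  }
}

A call site might then look like readsWithFallback(Priority)(Priority.Routine), for some hypothetical Enumeration-based Priority, with the actual default chosen per enum.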

Set up CI to perform releases

Similar to what we've done with our other Scala libs, this library should support versioning the project during the CI build of master commits:

  - name: ":sbt: Release new version"
    branches: master
    command:
      - git checkout -B ${BUILDKITE_BRANCH}
      - git branch -u origin/${BUILDKITE_BRANCH}
      - git config branch.${BUILDKITE_BRANCH}.remote origin
      - git config branch.${BUILDKITE_BRANCH}.merge refs/heads/${BUILDKITE_BRANCH}
      - sbt -batch "release with-defaults skip-tests"

The CI system should have Sonatype access, and use it to publish artifacts.

Make non-critical fields more robust

What is the purpose of this epic/feature?
We shouldn't fail to parse an entire message when a non-critical field like Contacts.FirstName fails to parse. The message may still be useful without these non-critical fields.

Detail out your proposed approach

  • Provide a fallback mechanism where we can substitute a valid default for the non-critical field (probably not a null value, because that would complicate things further)
  • Add a @fallback(<value>) annotation in the json-annotations library to handle parsing failures per field.
  • Maybe log the failure to parse?

Are there any risks or unknown factors?
Use of macros can be risky and hard to maintain. However, the final outcome of being able to define a fallback value for each field outweighs the risk.

ex:

@json case class DataModel(
  criticalField: String,
  @fallback("default") nonCriticalField: String
)
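For comparison, here is roughly what @fallback("default") would desugar to using plain play-json combinators, without macros (a sketch, not the proposed macro output):

import play.api.libs.functional.syntax._
import play.api.libs.json._

case class DataModel(criticalField: String, nonCriticalField: String)

object DataModel {
  // The critical field must parse; the non-critical one falls back to
  // "default" whenever it is missing or fails to parse.
  implicit val reads: Reads[DataModel] = (
    (__ \ "criticalField").read[String] and
    (__ \ "nonCriticalField").read[String].orElse(Reads.pure("default"))
  )(DataModel.apply _)
}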

Json support for Auth request

I'm currently running into some issues using the latest version (0.90) when making requests to the auth endpoint. It seems like the docs state that the auth endpoint should use a JSON content type (http://developer.redoxengine.com/data-models/authentication.html).

When attempting to use the current client (from one of my deployed environments) I'm receiving a 401.

2017-11-09 02:12:16,769 DEBUG p.s.a.o.a.n.r.NettyRequestSender - Using pooled Channel '[id: 0x02c744e4, L:/100.97.0.15:50900 - R:api.redoxengine.com/52.5.137.221:443]' for 'POST' to 'https://api.redoxengine.com/auth/authenticate'
2017-11-09 02:12:16,769 DEBUG p.s.a.o.a.n.r.NettyRequestSender - Using open Channel [id: 0x02c744e4, L:/100.97.0.15:50900 - R:api.redoxengine.com/52.5.137.221:443] for POST '/auth/authenticate'
2017-11-09 02:12:16,813 DEBUG p.s.a.o.a.n.h.HttpHandler -

Request DefaultFullHttpRequest(decodeResult: success, version: HTTP/1.1, content: UnpooledHeapByteBuf(freed))
POST /auth/authenticate HTTP/1.1
Content-Type: application/x-www-form-urlencoded
Content-Length: 129
Host: api.redoxengine.com
Accept: */*
User-Agent: AHC/2.0

Response DefaultHttpResponse(decodeResult: success, version: HTTP/1.1)
HTTP/1.1 401 Unauthorized
Date: Thu, 09 Nov 2017 02:12:16 GMT
Content-Type: text/html; charset=utf-8
Content-Length: 15
Connection: keep-alive
Cache-Control: private, no-cache, no-store, must-revalidate
Expires: -1
Pragma: no-cache
ETag: W/"f-RGtBbVNe0/VH/NZT3BHUnA"
Vary: Accept-Encoding
set-cookie: sails.sid=s%3AIE-kJgq3ux5NgrGS9Fzbpmfj9ps4c-5v.AINMYtpmcfxbKlEPZx9%2B3mJcnM6yLTuP4yWxbajg9hE; Path=/; Expires=Fri, 10 Nov 2017 02:12:16 GMT; HttpOnly; Secure

2017-11-09 02:12:16,813 DEBUG p.s.a.o.a.n.h.i.Unauthorized401Interceptor - Can't handle 401 as there's no realm

Locally, however, I'm seeing this succeed (which is rather odd I would say).

2017-11-08 18:01:11,579 DEBUG p.s.a.o.a.n.c.NettyConnectListener - Using new Channel '[id: 0x2a5d2cd3, L:/192.34.100.6:60088 - R:api.redoxengine.com/34.196.161.187:443]' for 'POST' to '/auth/authenticate'
2017-11-08 18:01:11,581 DEBUG p.s.a.i.n.h.s.SslHandler - [id: 0x2a5d2cd3, L:/192.34.100.6:60088 - R:api.redoxengine.com/34.196.161.187:443] HANDSHAKEN: TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
 2017-11-08 18:01:11,972 DEBUG p.s.a.o.a.n.h.HttpHandler -

 Request DefaultFullHttpRequest(decodeResult: success, version: HTTP/1.1, content: UnpooledHeapByteBuf(freed))
 POST /auth/authenticate HTTP/1.1
 Content-Type: application/x-www-form-urlencoded
 Content-Length: 123
 Host: api.redoxengine.com
 Accept: */*
 User-Agent: AHC/2.0
 Response DefaultHttpResponse(decodeResult: success, version: HTTP/1.1)
 HTTP/1.1 200 OK
 Date: Thu, 09 Nov 2017 02:01:12 GMT
 Content-Type: application/json; charset=utf-8
 Content-Length: 145
 Connection: keep-alive
 Cache-Control: private, no-cache, no-store, must-revalidate
 Expires: -1
 Pragma: no-cache
 ETag: W/"91-9IBvEKIjz9qzhA71S6BuhQ"
 Vary: Accept-Encoding
 set-cookie: sails.sid=s%3A7dAQS5v3RjkFlXXa8tApkPPK3eLbSsB-.cpNRmRak01kbcFDaNk5ME4C8AfkoBKgpvzZZLfOPgxk; Path=/; Expires=Fri, 10 Nov 2017 02:01:11 GMT; HttpOnly; Secure

When attempting this call using JSON, I get a 200 result (code and logs below).

wsClient
    .url("https://api.redoxengine.com/auth/authenticate")
    .withBody(JsObject(Seq("apiKey" -> JsString(RedoxApiKey.trim), "secret" -> JsString(RedoxSecret.trim))))
    .withMethod("POST")
    .execute()
    .foreach { result =>
      logger.info(s"Result of custom auth call: $result")
      logger.info(s"Token: ${result.body}")
    }(scala.concurrent.ExecutionContext.Implicits.global)

Logs from deployed env:

Request DefaultFullHttpRequest(decodeResult: success, version: HTTP/1.1, content: UnpooledHeapByteBuf(freed))
POST /auth/authenticate HTTP/1.1
Content-Type: application/json
Content-Length: 133
Host: api.redoxengine.com
Accept: */*
User-Agent: AHC/2.0

Response DefaultHttpResponse(decodeResult: success, version: HTTP/1.1)
HTTP/1.1 200 OK
Date: Thu, 09 Nov 2017 04:56:26 GMT
Content-Type: application/json; charset=utf-8
Content-Length: 145
Connection: keep-alive
Cache-Control: private, no-cache, no-store, must-revalidate
Expires: -1
Pragma: no-cache
ETag: W/"91-U1mxrXJB8Yk5WGjrQLuCpA"
Vary: Accept-Encoding
set-cookie: sails.sid=s%3AMCSET6YaBiKbzPheTEAbyL2bU7i-vzBp.hefIVxutr2M8m77jKSRT2uIYwWP06vFMC8CWFF%2BdKyI; Path=/; Expires=Fri, 10 Nov 2017 04:56:25 GMT; HttpOnly; Secure

2017-11-09 04:56:26,173 DEBUG p.s.a.o.a.n.c.ChannelManager - Adding key: https://api.redoxengine.com:443 for channel [id: 0xd4812bf8, L:/100.105.128.5:54128 - R:api.redoxengine.com/52.5.137.221:443]
2017-11-09 04:56:26,179 INFO c.d.s.r.j.DeliverReportRunnerFlow - Result of custom auth call: AhcWSResponse(StandaloneAhcWSResponse(200, OK))
2017-11-09 04:56:26,179 INFO c.d.s.r.j.DeliverReportRunnerFlow - Token: {"accessToken":"xxxxxxx","expires":"2017-11-10T04:56:26.131Z","refreshToken":"xxxxxxx"}

Is this something you've run into before (the inconsistency in requests made with a content-type of application/x-www-form-urlencoded)? I'm really at wit's end as to why a call made with the same client would have different results between envs, and also why using JSON produces a successful result.

Anywho, given that JSON seems to be the documented approach to auth for Redox, would you be open to that change? Kind of an odd request, but potentially worth a discussion (and maybe I'll figure out what's wrong with my current setup in the process).

I also noticed that the authorize() method in RedoxClient is overridable, but it can't be used to hook into the other lifecycle events for the class. Might be fun to have it isolated from the setAuth process, and have the internals handle calling authorize and then re-setting the auth token. (It's late, and I doubt that makes sense; I'll maybe get back to this when I've gotten a little sleep.)

New Maintainer Wanted!

We no longer maintain this project, and have marked it as deprecated.

We'd be happy to move ownership of this project to a new maintainer instead. Let us know in the comments below if you'd like to take on maintenance of this project.

Add property based testing

What is the purpose of this epic/feature?
Only testing against the Redox spec may not be enough, as we may encounter messages that don't match the spec exactly. We can perform property-based testing for each data model and verify parsing works as intended.

Detail out your proposed approach

  • Write properties for each field type we have on data-models
  • Use properties on data models to generate json and test them individually.

Are there any risks or unknown factors?
How do we generate the JSON from the data models hydrated with properties? Won't they always parse, because we use the hydrated data models to generate the JSON? One option is to create looser data models to accommodate invalid properties for testing (maybe make all properties optional strings on the test data models?).
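As a rough sketch of the idea, using ScalaCheck and a hypothetical stand-in model (not one of the library's real data models), a property could generate loosely-typed JSON and assert that parsing never throws:

import org.scalacheck.Prop.forAll
import org.scalacheck.{ Gen, Properties }
import play.api.libs.json._

// Hypothetical stand-in for one of the library's data models.
case class Observation(Code: Option[String], Name: Option[String])
object Observation {
  implicit val reads: Reads[Observation] = Json.reads[Observation]
}

object ObservationParsingProps extends Properties("Observation parsing") {
  // Values of the "wrong" type on purpose: strings, numbers and nulls.
  val looseValue: Gen[JsValue] = Gen.oneOf(
    Gen.alphaNumStr.map(s => JsString(s): JsValue),
    Gen.choose(0, 1000000).map(n => JsNumber(BigDecimal(n)): JsValue),
    Gen.const(JsNull: JsValue)
  )

  property("parsing never throws") = forAll(looseValue, looseValue) { (code, name) =>
    val json = Json.obj("Code" -> code, "Name" -> name)
    // A JsError is an acceptable outcome; an exception is not.
    scala.util.Try(json.validate[Observation]).isSuccess
  }
}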

Create robust parsers for primitive types

play-json fails to parse primitive JSON values whose type does not exactly match, e.g.:

  • json value { "Low": "14.3" } will failed to be parsed as a Double because its a json string. However we can convert this string value to a scala Double.
  • json value { "StringId": 123123 } will fail to be parsed as a String because its a json number. However we can convert this number value to a scala String.

We have seen instances where Redox's JSON type doesn't exactly match our model type but can easily be coerced. If we define more tolerant JSON Reads for these types, we can avoid these issues.
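A sketch of what such tolerant Reads could look like (an assumption about the fix, not existing code):

import scala.util.Try
import play.api.libs.json._

object TolerantPrimitiveReads {
  // { "Low": "14.3" } -> Double, even though the JSON value is a string.
  val tolerantDoubleReads: Reads[Double] = Reads {
    case JsNumber(n) => JsSuccess(n.toDouble)
    case JsString(s) => Try(JsSuccess(s.toDouble)).getOrElse(JsError(s"Not numeric: '$s'"))
    case other       => JsError(s"Expected a number or numeric string, got $other")
  }

  // { "StringId": 123123 } -> String, even though the JSON value is a number.
  val tolerantStringReads: Reads[String] = Reads {
    case JsString(s) => JsSuccess(s)
    case JsNumber(n) => JsSuccess(n.toString)
    case other       => JsError(s"Expected a string or number, got $other")
  }
}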

Support for Scala 2.12.x

Hello again,

I have been spending a little time updating our projects to the latest version of Play (and thus Scala 2.12 as well), and noticed that this library isn't being cross published to Scala 2.12.

Not sure of the feasibility of doing that without digging around a little, but I figured I'd ask and get it on the project's radar regardless.

When I get a little time, I can give updating the build file a go to see what happens (unless someone else is available or knows that it can/can't be done without a fair amount of effort).

Thanks,
Andrew

Fails when Note.Authenticator.ID is null when a Note is pushed

The field Note.Authenticator.ID is Probable, so it is not always present. We do handle missing fields by making their types Option[...]. In this case the Note.Authenticator object is given, but the fields inside it are null. An example is below.

We might want a general solution, not just for one field.

"Authenticator": {
         "ID": null,
         "IDType": null,
         "FirstName": null,
         "LastName": null,
         "Credentials": [],
         "Address": {
            "StreetAddress": null,
            "City": null,
            "State": null,
            "ZIP": null,
            "County": null,
            "Country": null
         },
         "Location": {
            "Type": null,
            "Facility": null,
            "Department": null,
            "Room": null
         },
         "PhoneNumber": {
            "Office": null
         }
      },

Visit object has 22+ fields

models.Visit has 22+ fields, meaning our macros for automatic JSON serialization/deserialization will not work. The case class itself is fine, as Scala 2.11 removes the 22-field restriction on case classes.

Right now I've arbitrarily commented out some of the fields...that's not a good solution!

A possible solution is to use play-json-extensions to provide an OFormat for 22+ field case classes (a usage sketch is below). See the pull request at bizzabo/play-json-extensions#34

See also https://stackoverflow.com/questions/20258417/how-to-get-around-the-scala-case-class-limit-of-22-fields
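Usage would look roughly like this, assuming play-json-extensions is pulled in (the exact imports and any extra implicits depend on the version; check that project's README):

import ai.x.play.json.Jsonx

// Illustrative stand-in; picture 22+ fields here rather than three.
case class Visit(
  VisitNumber: Option[String] = None,
  VisitDateTime: Option[String] = None,
  Duration: Option[Double] = None
  // ...the remaining 20+ fields...
)

object Visit {
  // Jsonx.formatCaseClass is not subject to the 22-field limit that
  // play-json's Json.format macro inherits from FunctionN/TupleN.
  implicit val format = Jsonx.formatCaseClass[Visit]
}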

Invalid enum value for Orders.Priority

We have been getting the invalid value Orders.Priority = "ST" from Redox. This could be mapped to "Stat" at Redox as a mapping/autoconversion.

  • Apply a fix for this in scala-redox (see the sketch below)
  • Get Redox to send standard values for the Priority enum
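A sketch of the scala-redox side of the fix, using a hypothetical stand-in for the library's Priority enum and its Reads:

import play.api.libs.json._

// Hypothetical stand-in for the library's Priority enum.
object Priority extends Enumeration {
  val Stat, ASAP, Routine = Value

  // Treat Redox's non-standard "ST" as an alias for Stat before doing the
  // normal enum lookup.
  implicit val reads: Reads[Value] = Reads {
    case JsString("ST") => JsSuccess(Stat)
    case JsString(s) =>
      values.find(_.toString == s)
        .map(JsSuccess(_))
        .getOrElse(JsError(s"Unknown Priority value '$s'"))
    case other => JsError(s"Expected a JSON string, got $other")
  }
}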
