pathling's People

Contributors

chgl, dependabot-preview[bot], dependabot[bot], dionmcm, fongsean, jkiddo, johngrimes, kaicode, kapsner, piotrszul

pathling's Issues

Test that Coding typed groupings work

A grouping expression that results in a Coding type should be "materializable", and should result in the unique Coding elements being included as labels within the response to the aggregate operation.
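As a sketch of the kind of response the test would assert, the unique Codings would appear as grouping labels. The use of valueCoding for the label, and the values themselves, are illustrative assumptions rather than a confirmed response shape:

{
  "resourceType": "Parameters",
  "parameter": [
    {
      "name": "grouping",
      "part": [
        {
          "name": "label",
          "valueCoding": {
            "system": "http://snomed.info/sct",
            "code": "12345",
            "display": "Example concept"
          }
        },
        {
          "name": "result",
          "valueUnsignedInt": 42
        }
      ]
    }
  ]
}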

Implementation URL in CapabilityStatement does not respect X-Forwarded-Host

We have a problem with caching in AnalyticsServerCapabilities.

If you take a look at one of the /fhir/metadata endpoints in production, you will see that http://localhost:8080 is reported as the implementation URL. This happens even if the X-Forwarded-Host and X-Forwarded-Proto headers are correctly set.

I think the reason for this is caching of CapabilityStatements within the HAPI RestfulServer implementation. The response to the first request for the CapabilityStatement (which probably comes from Docker and does not contain the proxy headers) is cached and served for all subsequent requests.

There does not seem to be an obvious way to disable the caching within the public API of HAPI, i.e. IServerConformanceProvider.
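A request like the following (host names illustrative) reproduces the problem: even with the forwarding headers set, the implementation URL in the returned CapabilityStatement still reports http://localhost:8080.

curl \
 -H "Accept: application/fhir+json" \
 -H "X-Forwarded-Host: server.pathling.app" \
 -H "X-Forwarded-Proto: https" \
 "http://localhost:8080/fhir/metadata"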

Update documentation to mention memory requirements

Relates to #84 and #85.

The "Getting Started" section of the documentation needs to be updated to flag that the memory allocated to Docker will need to be set to something greater than what the application requires, which is currently 3GB.

Once we implement #85, we can update this again to align with whatever the default configuration is, and point out that this value is configurable.
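For example, the documentation could include something along these lines (flag, values and image name illustrative) to ensure the container is granted enough memory:

docker run -p 8080:8080 --memory=4g aehrc/pathling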

Check CodeSystems ahead of $expand for memberOf

In the new implementation of memberOf, we use ValueSet intersections to calculate the membership of concepts, rather than the $validate-code operation.

One of the challenges with this approach is that if the input set contains any concepts from code systems that are unknown to the terminology server, an error will be thrown and the entire operation will fail.

To mitigate this, we need to know ahead of the operation which code systems the terminology server knows about, so that we can filter concepts from unknown systems out of the requests. This could be done via a CodeSystem search operation.
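As a hedged sketch of such a pre-check, the standard FHIR CodeSystem search could be issued for each system URI in the input set (terminology server endpoint illustrative); any system that returns no results would then be filtered out of the subsequent $expand request:

curl \
 -H 'Accept: application/fhir+json' \
 'https://tx.example.org/fhir/CodeSystem?url=http://snomed.info/sct&_summary=count'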

Expression with where and comparison operator causes AnalysisError

Here is the expression:

@2018-05-06 > reverseResolve(MedicationRequest.subject).where(
$this.medicationCodeableConcept.coding contains http://snomed.info/sct|407317001
).first().authoredOn

And here is the error:

Caused by: org.apache.spark.sql.AnalysisException: Resolved attribute(s) d736d2f#17162 missing from id#16716,resource#16762,01122a9#16793,(to_date('2018-05-06') > to_date(`d736d2f`))#17197,01122a9_id#16789,d736d2f_id#17156 in operator !Aggregate [(to_date(2018-05-06, None) > to_date(d736d2f#17162, None))], [(to_date(2018-05-06, None) > to_date(d736d2f#17162, None)) AS (to_date('2018-05-06') > to_date(`d736d2f`))#17227, count(01122a9#16793) AS count(01122a9)#17228L].;;
!Aggregate [(to_date(2018-05-06, None) > to_date(d736d2f#17162, None))], [(to_date(2018-05-06, None) > to_date(d736d2f#17162, None)) AS (to_date('2018-05-06') > to_date(`d736d2f`))#17227, count(01122a9#16793) AS count(01122a9)#17228L]
+- Join Inner, (01122a9_id#16789 = d736d2f_id#17156)
   :- Project [id#16716, resource#16762, 01122a9_id#16789, resource#16762 AS 01122a9#16793]
   :  +- Project [id#16716, resource#16762, id#16716 AS 01122a9_id#16789]
   :     +- Project [id#16716, resource#16762]
   :        +- Project [id#16716, meta#16717, implicitRules#16718, language#16719, text#16720, identifier#16721, active#16722, name#16723, telecom#16724, gender#16725, birthDate#16726, deceasedBoolean#16727, deceasedDateTime#16728, address#16729, maritalStatus#16730, multipleBirthInteger#16731, multipleBirthBoolean#16732, photo#16733, contact#16734, communication#16735, generalPractitioner#16736, managingOrganization#16737, link#16738, named_struct(id, id#16716, meta, meta#16717, implicitRules, implicitRules#16718, language, language#16719, text, text#16720, identifier, identifier#16721, active, active#16722, name, name#16723, telecom, telecom#16724, gender, gender#16725, birthDate, birthDate#16726, deceasedBoolean, deceasedBoolean#16727, ... 22 more fields) AS resource#16762]
   :           +- Relation[id#16716,meta#16717,implicitRules#16718,language#16719,text#16720,identifier#16721,active#16722,name#16723,telecom#16724,gender#16725,birthDate#16726,deceasedBoolean#16727,deceasedDateTime#16728,address#16729,maritalStatus#16730,multipleBirthInteger#16731,multipleBirthBoolean#16732,photo#16733,contact#16734,communication#16735,generalPractitioner#16736,managingOrganization#16737,link#16738] parquet
   +- Deduplicate [d736d2f_id#17156, (to_date('2018-05-06') > to_date(`d736d2f`))#17197]
      +- Project [d736d2f_id#17156, (to_date(2018-05-06, None) > to_date(d736d2f#17162, None)) AS (to_date('2018-05-06') > to_date(`d736d2f`))#17197]
         +- Project [3a83148_id#17121, first(3a83148, true)#17144, 792a80a_id#17147, 792a80a#17151, d736d2f_id#17156, d736d2f#17162, (to_date(2018-05-06, None) > to_date(d736d2f#17162, None)) AS comparisonResult#17169]
            +- Project [3a83148_id#17121, first(3a83148, true)#17144, 792a80a_id#17147, 792a80a#17151, d736d2f_id#17156, 792a80a#17151.authoredOn AS d736d2f#17162]
               +- Project [3a83148_id#17121, first(3a83148, true)#17144, 792a80a_id#17147, 792a80a#17151, 792a80a_id#17147 AS d736d2f_id#17156]
                  +- Project [3a83148_id#17121, first(3a83148, true)#17144, 792a80a_id#17147, first(3a83148, true)#17144 AS 792a80a#17151]
                     +- Project [3a83148_id#17121, first(3a83148, true)#17144, 3a83148_id#17121 AS 792a80a_id#17147]
                        +- Aggregate [3a83148_id#17121], [3a83148_id#17121, first(3a83148#17128, true) AS first(3a83148, true)#17144]
                           +- Project [ac0ad56#16976, 7099464_id#17031, max(equality)#17106, f7524ee_id#17110, f7524ee#17115, 3a83148_id#17121, ac0ad56#16976 AS 3a83148#17128]
                              +- Project [ac0ad56#16976, 7099464_id#17031, max(equality)#17106, f7524ee_id#17110, f7524ee#17115, f7524ee_id#17110 AS 3a83148_id#17121]
                                 +- Filter (f7524ee#17115 = true)
                                    +- Project [ac0ad56#16976, 7099464_id#17031, max(equality)#17106, f7524ee_id#17110, max(equality)#17106 AS f7524ee#17115]
                                       +- Project [ac0ad56#16976, 7099464_id#17031, max(equality)#17106, 7099464_id#17031 AS f7524ee_id#17110]
                                          +- Aggregate [ac0ad56#16976, 7099464_id#17031], [ac0ad56#16976, 7099464_id#17031, max(equality#17070) AS max(equality)#17106]
                                             +- Project [id#16716, resource#16762, 01122a9_id#16789, 01122a9#16793, id#16798, 2071abb#16923, 2071abb_id#16919, b5e3374_id#16927, b5e3374#16932, ac0ad56_id#16965, ac0ad56#16976, 28667b4_id#16988, 28667b4#17001, explodeResult#17016, 7099464_id#17031, 7099464#17047, CASE WHEN isnull(named_struct(id, null, system, http://snomed.info/sct, version, null, code, 407317001, display, null, userSelected, false)) THEN cast(null as boolean) WHEN isnull(7099464#17047) THEN false ELSE CASE WHEN (((isnull(7099464#17047.system) || isnull(7099464#17047.code)) || isnull(named_struct(id, null, system, http://snomed.info/sct, version, null, code, 407317001, display, null, userSelected, false).system)) || isnull(named_struct(id, null, system, http://snomed.info/sct, version, null, code, 407317001, display, null, userSelected, false).code)) THEN cast(null as boolean) WHEN (isnull(7099464#17047.version) || isnull(named_struct(id, null, system, http://snomed.info/sct, version, null, code, 407317001, display, null, userSelected, false).version)) THEN ((7099464#17047.system = named_struct(id, null, system, http://snomed.info/sct, version, null, code, 407317001, display, null, userSelected, false).system) && (7099464#17047.code = named_struct(id, null, system, http://snomed.info/sct, version, null, code, 407317001, display, null, userSelected, false).code)) ELSE (((7099464#17047.system = named_struct(id, null, system, http://snomed.info/sct, version, null, code, 407317001, display, null, userSelected, false).system) && (7099464#17047.code = named_struct(id, null, system, http://snomed.info/sct, version, null, code, 407317001, display, null, userSelected, false).code)) && (7099464#17047.version = null)) END END AS equality#17070]
                                                +- Project [id#16716, resource#16762, 01122a9_id#16789, 01122a9#16793, id#16798, 2071abb#16923, 2071abb_id#16919, b5e3374_id#16927, b5e3374#16932, ac0ad56_id#16965, ac0ad56#16976, 28667b4_id#16988, 28667b4#17001, explodeResult#17016, 7099464_id#17031, explodeResult#17016 AS 7099464#17047]
                                                   +- Project [id#16716, resource#16762, 01122a9_id#16789, 01122a9#16793, id#16798, 2071abb#16923, 2071abb_id#16919, b5e3374_id#16927, b5e3374#16932, ac0ad56_id#16965, ac0ad56#16976, 28667b4_id#16988, 28667b4#17001, explodeResult#17016, 28667b4_id#16988 AS 7099464_id#17031]
                                                      +- Project [id#16716, resource#16762, 01122a9_id#16789, 01122a9#16793, id#16798, 2071abb#16923, 2071abb_id#16919, b5e3374_id#16927, b5e3374#16932, ac0ad56_id#16965, ac0ad56#16976, 28667b4_id#16988, 28667b4#17001, explodeResult#17016]
                                                         +- Generate explode(28667b4#17001.coding), true, [explodeResult#17016]
                                                            +- Project [id#16716, resource#16762, 01122a9_id#16789, 01122a9#16793, id#16798, 2071abb#16923, 2071abb_id#16919, b5e3374_id#16927, b5e3374#16932, ac0ad56_id#16965, ac0ad56#16976, 28667b4_id#16988, ac0ad56#16976.medicationCodeableConcept AS 28667b4#17001]
                                                               +- Project [id#16716, resource#16762, 01122a9_id#16789, 01122a9#16793, id#16798, 2071abb#16923, 2071abb_id#16919, b5e3374_id#16927, b5e3374#16932, ac0ad56_id#16965, ac0ad56#16976, ac0ad56_id#16965 AS 28667b4_id#16988]
                                                                  +- Project [id#16716, resource#16762, 01122a9_id#16789, 01122a9#16793, id#16798, 2071abb#16923, 2071abb_id#16919, b5e3374_id#16927, b5e3374#16932, ac0ad56_id#16965, 2071abb#16923 AS ac0ad56#16976]
                                                                     +- Project [id#16716, resource#16762, 01122a9_id#16789, 01122a9#16793, id#16798, 2071abb#16923, 2071abb_id#16919, b5e3374_id#16927, b5e3374#16932, 01122a9_id#16789 AS ac0ad56_id#16965]
                                                                        +- Join LeftOuter, (01122a9_id#16789 = b5e3374#16932.reference)
                                                                           :- Project [id#16716, resource#16762, 01122a9_id#16789, resource#16762 AS 01122a9#16793]
                                                                           :  +- Project [id#16716, resource#16762, id#16716 AS 01122a9_id#16789]
                                                                           :     +- Project [id#16716, resource#16762]
                                                                           :        +- Project [id#16716, meta#16717, implicitRules#16718, language#16719, text#16720, identifier#16721, active#16722, name#16723, telecom#16724, gender#16725, birthDate#16726, deceasedBoolean#16727, deceasedDateTime#16728, address#16729, maritalStatus#16730, multipleBirthInteger#16731, multipleBirthBoolean#16732, photo#16733, contact#16734, communication#16735, generalPractitioner#16736, managingOrganization#16737, link#16738, named_struct(id, id#16716, meta, meta#16717, implicitRules, implicitRules#16718, language, language#16719, text, text#16720, identifier, identifier#16721, active, active#16722, name, name#16723, telecom, telecom#16724, gender, gender#16725, birthDate, birthDate#16726, deceasedBoolean, deceasedBoolean#16727, ... 22 more fields) AS resource#16762]
                                                                           :           +- Relation[id#16716,meta#16717,implicitRules#16718,language#16719,text#16720,identifier#16721,active#16722,name#16723,telecom#16724,gender#16725,birthDate#16726,deceasedBoolean#16727,deceasedDateTime#16728,address#16729,maritalStatus#16730,multipleBirthInteger#16731,multipleBirthBoolean#16732,photo#16733,contact#16734,communication#16735,generalPractitioner#16736,managingOrganization#16737,link#16738] parquet
                                                                           +- Project [id#16798, 2071abb#16923, 2071abb_id#16919, b5e3374_id#16927, 2071abb#16923.subject AS b5e3374#16932]
                                                                              +- Project [id#16798, 2071abb#16923, 2071abb_id#16919, 2071abb_id#16919 AS b5e3374_id#16927]
                                                                                 +- Project [id#16798, 2071abb#16876 AS 2071abb#16923, 2071abb_id#16919]
                                                                                    +- Project [id#16798, 2071abb#16876, id#16798 AS 2071abb_id#16919]
                                                                                       +- Project [id#16798, 2071abb#16876]
                                                                                          +- Project [id#16798, meta#16799, implicitRules#16800, language#16801, text#16802, identifier#16803, status#16804, statusReason#16805, intent#16806, category#16807, priority#16808, doNotPerform#16809, reportedBoolean#16810, reportedReference#16811, medicationCodeableConcept#16812, medicationReference#16813, subject#16814, encounter#16815, supportingInformation#16816, authoredOn#16817, requester#16818, performer#16819, performerType#16820, recorder#16821, ... 16 more fields]
                                                                                             +- Relation[id#16798,meta#16799,implicitRules#16800,language#16801,text#16802,identifier#16803,status#16804,statusReason#16805,intent#16806,category#16807,priority#16808,doNotPerform#16809,reportedBoolean#16810,reportedReference#16811,medicationCodeableConcept#16812,medicationReference#16813,subject#16814,encounter#16815,supportingInformation#16816,authoredOn#16817,requester#16818,performer#16819,performerType#16820,recorder#16821,... 15 more fields] parquet

	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.failAnalysis(CheckAnalysis.scala:43)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.failAnalysis(Analyzer.scala:95)
	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:369)
	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:86)
	at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:126)
	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.checkAnalysis(CheckAnalysis.scala:86)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:95)
	at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:108)
	at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:105)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:201)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:105)
	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:58)
	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:56)
	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:48)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:78)
	at org.apache.spark.sql.RelationalGroupedDataset.toDF(RelationalGroupedDataset.scala:65)
	at org.apache.spark.sql.RelationalGroupedDataset.agg(RelationalGroupedDataset.scala:224)
	at org.apache.spark.sql.RelationalGroupedDataset.agg(RelationalGroupedDataset.scala:223)
	at au.csiro.pathling.query.AggregateExecutor.execute(AggregateExecutor.java:120)
	... 31 common frames omitted

AnalysisException au.csiro.pathling.query.parsing.ParsedExpression in setHashedValue...

Resolved attribute(s) 28667b4#59734 missing from cd8d844_id#59877,result#59772,1b7dd36#59872,1b7dd36_id#59868,28667b4_id#59705 in operator !Project [28667b4_id#59705, result#59772, 1b7dd36_id#59868, 1b7dd36#59872, cd8d844_id#59877, 28667b4#59734 AS cd8d844#59883].;;
!Project [28667b4_id#59705, result#59772, 1b7dd36_id#59868, 1b7dd36#59872, cd8d844_id#59877, 28667b4#59734 AS cd8d844#59883]
+- Project [28667b4_id#59705, result#59772, 1b7dd36_id#59868, 1b7dd36#59872, 1b7dd36_id#59868 AS cd8d844_id#59877]
   +- Filter (1b7dd36#59872 = true)
      +- Project [28667b4_id#59705, result#59772, 1b7dd36_id#59868, CASE WHEN isnull(result#59772) THEN false ELSE result#59772 END AS 1b7dd36#59872]
         +- Project [28667b4_id#59705, result#59772, 28667b4_id#59705 AS 1b7dd36_id#59868]
            +- Project [28667b4_id#59705, result#59772]
               +- Join LeftOuter, (hash(28667b4#59734, 42) = hash#59771)
                  :- Project [id#58844, resource#58890, 01122a9_id#58917, 01122a9#58921, id#58926, 9e2941b#59024, 9e2941b_id#59020, b5e3374_id#59116, b5e3374#59128, 77b682a_id#59066, 77b682a#59077, id#59140, Patient#59186, 2582a3d_id#59228, 2582a3d#59243, id#59259, 01122a9#59336, 01122a9_id#59332, 8b2043c_id#59394, 8b2043c#59414, id#59435, 2071abb#59560, 2071abb_id#59556, b5e3374_id#59564, ... 5 more fields]
                  :  +- Project [id#58844, resource#58890, 01122a9_id#58917, 01122a9#58921, id#58926, 9e2941b#59024, 9e2941b_id#59020, b5e3374_id#59116, b5e3374#59128, 77b682a_id#59066, 77b682a#59077, id#59140, Patient#59186, 2582a3d_id#59228, 2582a3d#59243, id#59259, 01122a9#59336, 01122a9_id#59332, 8b2043c_id#59394, 8b2043c#59414, id#59435, 2071abb#59560, 2071abb_id#59556, b5e3374_id#59564, ... 4 more fields]
                  :     +- Project [id#58844, resource#58890, 01122a9_id#58917, 01122a9#58921, id#58926, 9e2941b#59024, 9e2941b_id#59020, b5e3374_id#59116, b5e3374#59128, 77b682a_id#59066, 77b682a#59077, id#59140, Patient#59186, 2582a3d_id#59228, 2582a3d#59243, id#59259, 01122a9#59336, 01122a9_id#59332, 8b2043c_id#59394, 8b2043c#59414, id#59435, 2071abb#59560, 2071abb_id#59556, b5e3374_id#59564, ... 3 more fields]
                  :        +- Project [id#58844, resource#58890, 01122a9_id#58917, 01122a9#58921, id#58926, 9e2941b#59024, 9e2941b_id#59020, b5e3374_id#59116, b5e3374#59128, 77b682a_id#59066, 77b682a#59077, id#59140, Patient#59186, 2582a3d_id#59228, 2582a3d#59243, id#59259, 01122a9#59336, 01122a9_id#59332, 8b2043c_id#59394, 8b2043c#59414, id#59435, 2071abb#59560, 2071abb_id#59556, b5e3374_id#59564, ... 2 more fields]
                  :           +- Join LeftOuter, (8b2043c_id#59394 = b5e3374#59569.reference)
                  :              :- Project [id#58844, resource#58890, 01122a9_id#58917, 01122a9#58921, id#58926, 9e2941b#59024, 9e2941b_id#59020, b5e3374_id#59116, b5e3374#59128, 77b682a_id#59066, 77b682a#59077, id#59140, Patient#59186, 2582a3d_id#59228, 2582a3d#59243, id#59259, 01122a9#59336, 01122a9_id#59332, 8b2043c_id#59394, 01122a9#59336 AS 8b2043c#59414]
                  :              :  +- Project [id#58844, resource#58890, 01122a9_id#58917, 01122a9#58921, id#58926, 9e2941b#59024, 9e2941b_id#59020, b5e3374_id#59116, b5e3374#59128, 77b682a_id#59066, 77b682a#59077, id#59140, Patient#59186, 2582a3d_id#59228, 2582a3d#59243, id#59259, 01122a9#59336, 01122a9_id#59332, 2582a3d_id#59228 AS 8b2043c_id#59394]
                  :              :     +- Join LeftOuter, ((Patient#59186 = Patient) && (2582a3d#59243 = 01122a9_id#59332))
                  :              :        :- Project [id#58844, resource#58890, 01122a9_id#58917, 01122a9#58921, id#58926, 9e2941b#59024, 9e2941b_id#59020, b5e3374_id#59116, b5e3374#59128, 77b682a_id#59066, 77b682a#59077, id#59140, Patient#59186, 2582a3d_id#59228, id#59140 AS 2582a3d#59243]
                  :              :        :  +- Project [id#58844, resource#58890, 01122a9_id#58917, 01122a9#58921, id#58926, 9e2941b#59024, 9e2941b_id#59020, b5e3374_id#59116, b5e3374#59128, 77b682a_id#59066, 77b682a#59077, id#59140, Patient#59186, b5e3374_id#59116 AS 2582a3d_id#59228]
                  :              :        :     +- Join LeftOuter, (b5e3374#59128.reference = id#59140)
                  :              :        :        :- Project [id#58844, resource#58890, 01122a9_id#58917, 01122a9#58921, id#58926, 9e2941b#59024, 9e2941b_id#59020, b5e3374_id#59116, 77b682a#59077.subject AS b5e3374#59128, 77b682a_id#59066, 77b682a#59077]
                  :              :        :        :  +- Project [id#58844, resource#58890, 01122a9_id#58917, 01122a9#58921, id#58926, 9e2941b#59024, 9e2941b_id#59020, 77b682a_id#59066 AS b5e3374_id#59116, b5e3374#59033, 77b682a_id#59066, 77b682a#59077]
                  :              :        :        :     +- Project [id#58844, resource#58890, 01122a9_id#58917, 01122a9#58921, id#58926, 9e2941b#59024, 9e2941b_id#59020, b5e3374_id#59028, b5e3374#59033, 77b682a_id#59066, 9e2941b#59024 AS 77b682a#59077]
                  :              :        :        :        +- Project [id#58844, resource#58890, 01122a9_id#58917, 01122a9#58921, id#58926, 9e2941b#59024, 9e2941b_id#59020, b5e3374_id#59028, b5e3374#59033, 01122a9_id#58917 AS 77b682a_id#59066]
                  :              :        :        :           +- Join LeftOuter, (01122a9_id#58917 = b5e3374#59033.reference)
                  :              :        :        :              :- Project [id#58844, resource#58890, 01122a9_id#58917, resource#58890 AS 01122a9#58921]
                  :              :        :        :              :  +- Project [id#58844, resource#58890, id#58844 AS 01122a9_id#58917]
                  :              :        :        :              :     +- Project [id#58844, resource#58890]
                  :              :        :        :              :        +- Project [id#58844, meta#58845, implicitRules#58846, language#58847, text#58848, identifier#58849, active#58850, name#58851, telecom#58852, gender#58853, birthDate#58854, deceasedBoolean#58855, deceasedDateTime#58856, address#58857, maritalStatus#58858, multipleBirthInteger#58859, multipleBirthBoolean#58860, photo#58861, contact#58862, communication#58863, generalPractitioner#58864, managingOrganization#58865, link#58866, named_struct(id, id#58844, meta, meta#58845, implicitRules, implicitRules#58846, language, language#58847, text, text#58848, identifier, identifier#58849, active, active#58850, name, name#58851, telecom, telecom#58852, gender, gender#58853, birthDate, birthDate#58854, deceasedBoolean, deceasedBoolean#58855, ... 22 more fields) AS resource#58890]
                  :              :        :        :              :           +- Relation[id#58844,meta#58845,implicitRules#58846,language#58847,text#58848,identifier#58849,active#58850,name#58851,telecom#58852,gender#58853,birthDate#58854,deceasedBoolean#58855,deceasedDateTime#58856,address#58857,maritalStatus#58858,multipleBirthInteger#58859,multipleBirthBoolean#58860,photo#58861,contact#58862,communication#58863,generalPractitioner#58864,managingOrganization#58865,link#58866] parquet
                  :              :        :        :              +- Project [id#58926, 9e2941b#59024, 9e2941b_id#59020, b5e3374_id#59028, 9e2941b#59024.subject AS b5e3374#59033]
                  :              :        :        :                 +- Project [id#58926, 9e2941b#59024, 9e2941b_id#59020, 9e2941b_id#59020 AS b5e3374_id#59028]
                  :              :        :        :                    +- Project [id#58926, 9e2941b#58986 AS 9e2941b#59024, 9e2941b_id#59020]
                  :              :        :        :                       +- Project [id#58926, 9e2941b#58986, id#58926 AS 9e2941b_id#59020]
                  :              :        :        :                          +- Project [id#58926, 9e2941b#58986]
                  :              :        :        :                             +- Project [id#58926, meta#58927, implicitRules#58928, language#58929, text#58930, identifier#58931, clinicalStatus#58932, verificationStatus#58933, category#58934, severity#58935, code#58936, bodySite#58937, subject#58938, encounter#58939, onsetAge#58940, onsetString#58941, onsetPeri... 

Request:

curl \
 -X POST \
 --compressed \
 -H "Accept: application/fhir+json" \
 -H "Accept-Encoding: gzip, deflate, br" \
 -H "Accept-Language: en-AU,en;q=0.5" \
 -H "Connection: close" \
 -H "Content-Length: 983" \
 -H "Content-Type: application/fhir+json" \
 -H "Dnt: 1" \
 -H "Host: server.pathling.app" \
 -H "Origin: https://try.pathling.app" \
 -H "Referer: https://try.pathling.app/" \
 -H "User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.14; rv:76.0) Gecko/20100101 Firefox/76.0" \
 -H "X-Forwarded-Host: server.pathling.app" \
 -H "X-Forwarded-Proto: https" \
 --data "{\"resourceType\":\"Parameters\",\"parameter\":[{\"valueCode\":\"Patient\",\"name\":\"subjectResource\"},{\"part\":[{\"name\":\"expression\",\"valueString\":\"count()\"},{\"name\":\"label\",\"valueString\":\"Number of patients\"}],\"name\":\"aggregation\"},{\"part\":[{\"name\":\"expression\",\"valueString\":\"[Filtered]\"},{\"name\":\"label\",\"valueString\":\"condition\"}],\"name\":\"grouping\"}]}" \
 "http://server.pathling.app/fhir/$aggregate"

Search request causes error when authorization is enabled

<html>

<head>
	<meta http-equiv="Content-Type" content="text/html;charset=utf-8" />
	<title>Error 500 java.lang.NullPointerException: theResource can not be null</title>
</head>

<body>
	<h2>HTTP ERROR 500 java.lang.NullPointerException: theResource can not be null</h2>
	<table>
		<tr>
			<th>URI:</th>
			<td>/fhir/CarePlan</td>
		</tr>
		<tr>
			<th>STATUS:</th>
			<td>500</td>
		</tr>
		<tr>
			<th>MESSAGE:</th>
			<td>java.lang.NullPointerException: theResource can not be null</td>
		</tr>
		<tr>
			<th>SERVLET:</th>
			<td>au.csiro.pathling.fhir.AnalyticsServer-673fdbce</td>
		</tr>
		<tr>
			<th>CAUSED BY:</th>
			<td>java.lang.NullPointerException: theResource can not be null</td>
		</tr>
	</table>
	<h3>Caused by:</h3>
	<pre>java.lang.NullPointerException: theResource can not be null
	at org.apache.commons.lang3.Validate.notNull(Validate.java:225)
	at ca.uhn.fhir.parser.BaseParser.encodeResourceToWriter(BaseParser.java:355)
	at ca.uhn.fhir.parser.BaseParser.encodeResourceToWriter(BaseParser.java:351)
	at ca.uhn.fhir.parser.BaseParser.encodeResourceToString(BaseParser.java:340)
	at au.csiro.pathling.fhir.ErrorReportingInterceptor.buildHttpInterface(ErrorReportingInterceptor.java:94)
	at au.csiro.pathling.fhir.ErrorReportingInterceptor.reportErrorToSentry(ErrorReportingInterceptor.java:57)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at ca.uhn.fhir.interceptor.executor.InterceptorService$HookInvoker.invoke(InterceptorService.java:497)
	at ca.uhn.fhir.interceptor.executor.InterceptorService.doCallHooks(InterceptorService.java:271)
	at ca.uhn.fhir.interceptor.executor.InterceptorService.callHooks(InterceptorService.java:260)
	at ca.uhn.fhir.rest.server.RestfulServer.handleRequest(RestfulServer.java:1073)
	at ca.uhn.fhir.rest.server.RestfulServer.doGet(RestfulServer.java:336)
	at ca.uhn.fhir.rest.server.RestfulServer.service(RestfulServer.java:1650)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:755)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:547)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:190)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.lang.Thread.run(Thread.java:748)
</pre>

</body>

</html>

The underlying error looks something like this:

java.lang.NullPointerException: null
	at au.csiro.pathling.fhir.AuthorisationInterceptor.authoriseRequest(AuthorisationInterceptor.java:70)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at ca.uhn.fhir.interceptor.executor.InterceptorService$HookInvoker.invoke(InterceptorService.java:497)
	at ca.uhn.fhir.interceptor.executor.InterceptorService.doCallHooks(InterceptorService.java:271)
	at ca.uhn.fhir.interceptor.executor.InterceptorService.callHooks(InterceptorService.java:260)
	at ca.uhn.fhir.rest.server.method.BaseMethodBinding.invokeServerMethod(BaseMethodBinding.java:232)
	at ca.uhn.fhir.rest.server.method.SearchMethodBinding.invokeServer(SearchMethodBinding.java:260)
	at ca.uhn.fhir.rest.server.method.SearchMethodBinding.invokeServer(SearchMethodBinding.java:50)
	at ca.uhn.fhir.rest.server.method.BaseResourceReturningMethodBinding.doInvokeServer(BaseResourceReturningMethodBinding.java:243)
	at ca.uhn.fhir.rest.server.method.BaseResourceReturningMethodBinding.invokeServer(BaseResourceReturningMethodBinding.java:380)
	at ca.uhn.fhir.rest.server.RestfulServer.handleRequest(RestfulServer.java:998)
	at ca.uhn.fhir.rest.server.RestfulServer.doGet(RestfulServer.java:336)
	at ca.uhn.fhir.rest.server.RestfulServer.service(RestfulServer.java:1650)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:755)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:547)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:190)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.lang.Thread.run(Thread.java:748)

Concept translation

translate

collection<Coding|CodeableConcept> -> translate(conceptMapUrl: string, reverse = false, equivalence = 'equivalent') : collection<Coding>

When invoked on a Coding- or CodeableConcept-valued element, returns any matching concepts from the ConceptMap specified by conceptMapUrl.

The reverse parameter controls the direction in which the map is traversed: false results in "source to target" mappings, while true results in "target to source".

The equivalence parameter is a comma-delimited list of values from the ConceptMapEquivalence ValueSet, and is used to filter the returned mappings to only those with an equivalence value in this list.

Example:

Condition.code.coding.translate('https://csiro.au/fhir/ConceptMap/some-map', true, 'equivalent,wider').display

Update documentation with v3 changes

Some specific tasks:

  • Update all documentation to refer to FHIRPath 2.0.0
  • Update the configuration section to reflect the changes, and use the application.yml file itself to document the possible values

These changes should also address #84 and #86.

URI value in grouping causes error

Request:

{
  "resourceType": "Parameters",
  "parameter": [
    {
      "name": "subjectResource",
      "valueCode": "Encounter"
    },
    {
      "name": "aggregation",
      "part": [
        {
          "name": "expression",
          "valueString": "count()"
        },
        {
          "name": "label",
          "valueString": "Number of encounters"
        }
      ]
    },
    {
      "name": "grouping",
      "part": [
        {
          "name": "expression",
          "valueString": "reverseResolve(Condition.encounter).code.coding.system"
        },
        {
          "name": "label",
          "valueString": "Code system"
        }
      ]
    }
  ]
}

Error:

Caused by: java.lang.ClassCastException: org.hl7.fhir.r4.model.UriType cannot be cast to org.hl7.fhir.r4.model.StringType
	at au.csiro.pathling.query.parsing.LiteralComposer.getFhirPathForType(LiteralComposer.java:32)
	at au.csiro.pathling.query.AggregateExecutor.buildDrillDown(AggregateExecutor.java:291)
	at au.csiro.pathling.query.AggregateExecutor.lambda$mapRowToGrouping$5(AggregateExecutor.java:241)
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
	at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
	at au.csiro.pathling.query.AggregateExecutor.buildResponse(AggregateExecutor.java:212)
	at au.csiro.pathling.query.AggregateExecutor.execute(AggregateExecutor.java:124)
	... 31 common frames omitted

Set up CircleCI

The goal is to make the status of the build (i.e. passing tests) visible to the public, and to automate the building and publishing of Docker images.

This includes the addition of a CircleCI build badge to the README.
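The badge itself would be a single line of Markdown in the README, along these lines (the project path is an assumption):

[![CircleCI](https://circleci.com/gh/aehrc/pathling.svg?style=svg)](https://circleci.com/gh/aehrc/pathling)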

Skip validation of display term

In Pathling v2, the memberOf function uses the $validate-code operation, and currently sends through the display term for validation.

This is not desirable, as it puts the burden on the user to ensure that all imported data contains the correct display terms. Validation should be the responsibility of the data transformation process that happens ahead of import to Pathling, not the responsibility of the query engine.

This ticket represents the work to fix this within Pathling 2. This issue is already resolved within Pathling 3, as it uses the ValueSet $expand operation.
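For reference, a minimal sketch of a $validate-code request that omits the display parameter (terminology server endpoint and ValueSet URL are illustrative):

curl \
 -H 'Accept: application/fhir+json' \
 'https://tx.example.org/fhir/ValueSet/$validate-code?url=http://example.org/fhir/ValueSet/example-vs&system=http://snomed.info/sct&code=407317001'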

Refactor subsumes function

Reimplement the subsumes/subsumedBy function as part of the refactored design of the v3 implementation.

Resolved attribute missing error in response to "Number of medication requests by status where authored between 2000 and 2018"

Request:

{
    "resourceType": "Parameters",
    "parameter": [
        {
            "name": "subjectResource",
            "valueCode": "MedicationRequest"
        },
        {
            "name": "aggregation",
            "part": [
                {
                    "name": "expression",
                    "valueString": "count()"
                },
                {
                    "name": "label",
                    "valueString": "Number of medication requests"
                }
            ]
        },
        {
            "name": "grouping",
            "part": [
                {
                    "name": "expression",
                    "valueString": "status"
                },
                {
                    "name": "label",
                    "valueString": "Status"
                }
            ]
        },
        {
            "name": "filter",
            "valueString": "authoredOn < @2018 and authoredOn > @2000"
        }
    ]
}

Exception trace:

Caused by: org.apache.spark.sql.AnalysisException: Resolved attribute(s) w8apxp_value#542 missing from hxfln8_id#609,4yvfvk_value#314,gs5esz_value#362,eqcg2i_value#416,hxfln8_value#620,oe2w4p_value#475,p0wj6x_value#520,w8apxp_value#375 in operator !Aggregate [w8apxp_value#542], [w8apxp_value#542, CASE WHEN isnull(count(distinct hxfln8_value#620)) THEN 0 ELSE count(distinct hxfln8_value#620) END AS CASE WHEN (count(hxfln8_value) IS NULL) THEN 0 ELSE count(hxfln8_value) END#648L]. Attribute(s) with the same name appear in the operation: w8apxp_value. Please check if the right attribute(s) are used.;;
!Aggregate [w8apxp_value#542], [w8apxp_value#542, CASE WHEN isnull(count(distinct hxfln8_value#620)) THEN 0 ELSE count(distinct hxfln8_value#620) END AS CASE WHEN (count(hxfln8_value) IS NULL) THEN 0 ELSE count(hxfln8_value) END#648L]
+- Project [hxfln8_id#609, 4yvfvk_value#314, gs5esz_value#362, p0wj6x_value#520, oe2w4p_value#475, eqcg2i_value#416, w8apxp_value#375, hxfln8_value#620]
   +- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362, p0wj6x_value#520, oe2w4p_value#475, eqcg2i_value#416, p0wj6x_id#511, w8apxp_value#375, w8apxp_id#537, hxfln8_id#609, gs5esz_value#362 AS hxfln8_value#620]
      +- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362, p0wj6x_value#520, oe2w4p_value#475, eqcg2i_value#416, p0wj6x_id#511, w8apxp_value#375, w8apxp_id#537, gs5esz_id#358 AS hxfln8_id#609]
         +- Filter p0wj6x_value#520: boolean
            +- Join LeftOuter, (p0wj6x_id#511 = w8apxp_id#537)
               :- Join LeftOuter, (gs5esz_id#358 = p0wj6x_id#511)
               :  :- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362]
               :  :  +- Project [4yvfvk_id#273, 4yvfvk_value#314, gs5esz_id#358, 4yvfvk_value#314 AS gs5esz_value#362]
               :  :     +- Project [4yvfvk_id#273, 4yvfvk_value#314, 4yvfvk_id#273 AS gs5esz_id#358]
               :  :        +- Project [4yvfvk_id#273, 4yvfvk_value#314]
               :  :           +- Project [id#0, meta#1, implicitRules#2, language#3, text#4, identifier#5, status#6, statusReason#7, intent#8, category#9, priority#10, doNotPerform#11, reportedBoolean#12, reportedReference#13, medicationCodeableConcept#14, medicationReference#15, subject#16, encounter#17, supportingInformation#18, authoredOn#19, requester#20, performer#21, performerType#22, recorder#23, ... 17 more fields]
               :  :              +- Project [id#0, meta#1, implicitRules#2, language#3, text#4, identifier#5, status#6, statusReason#7, intent#8, category#9, priority#10, doNotPerform#11, reportedBoolean#12, reportedReference#13, medicationCodeableConcept#14, medicationReference#15, subject#16, encounter#17, supportingInformation#18, authoredOn#19, requester#20, performer#21, performerType#22, recorder#23, ... 16 more fields]
               :  :                 +- Relation[id#0,meta#1,implicitRules#2,language#3,text#4,identifier#5,status#6,statusReason#7,intent#8,category#9,priority#10,doNotPerform#11,reportedBoolean#12,reportedReference#13,medicationCodeableConcept#14,medicationReference#15,subject#16,encounter#17,supportingInformation#18,authoredOn#19,requester#20,performer#21,performerType#22,recorder#23,... 15 more fields] parquet
               :  +- Project [p0wj6x_value#520, oe2w4p_value#475, eqcg2i_value#416, p0wj6x_id#511, w8apxp_value#375]
               :     +- Project [p0wj6x_id#511, 4yvfvk_value#314, gs5esz_value#362, w8apxp_value#375, eqcg2i_value#416, oe2w4p_value#475, p0wj6x_value#520]
               :        +- Project [eqcg2i_id#409, 4yvfvk_value#314, gs5esz_value#362, w8apxp_value#375, eqcg2i_value#416, oe2w4p_value#475, oe2w4p_id#468, p0wj6x_id#511, (eqcg2i_value#416 AND oe2w4p_value#475) AS p0wj6x_value#520]
               :           +- Project [eqcg2i_id#409, 4yvfvk_value#314, gs5esz_value#362, w8apxp_value#375, eqcg2i_value#416, oe2w4p_value#475, oe2w4p_id#468, eqcg2i_id#409 AS p0wj6x_id#511]
               :              +- Join LeftOuter, (eqcg2i_id#409 = oe2w4p_id#468)
               :                 :- Project [eqcg2i_id#409, 4yvfvk_value#314, gs5esz_value#362, w8apxp_value#375, eqcg2i_value#416]
               :                 :  +- Project [w8apxp_id#370, 4yvfvk_value#314, gs5esz_value#362, w8apxp_value#375, 9re1eq_id#385, eqcg2i_id#409, (to_timestamp(w8apxp_value#375, None) < to_timestamp(17532, None)) AS eqcg2i_value#416]
               :                 :     +- Project [w8apxp_id#370, 4yvfvk_value#314, gs5esz_value#362, w8apxp_value#375, 9re1eq_id#385, w8apxp_id#370 AS eqcg2i_id#409]
               :                 :        +- Join LeftOuter, (w8apxp_id#370 = 9re1eq_id#385)
               :                 :           :- Project [w8apxp_id#370, 4yvfvk_value#314, gs5esz_value#362, w8apxp_value#375]
               :                 :           :  +- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362, w8apxp_id#370, gs5esz_value#362.authoredOn AS w8apxp_value#375]
               :                 :           :     +- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362, gs5esz_id#358 AS w8apxp_id#370]
               :                 :           :        +- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362]
               :                 :           :           +- Project [4yvfvk_id#273, 4yvfvk_value#314, gs5esz_id#358, 4yvfvk_value#314 AS gs5esz_value#362]
               :                 :           :              +- Project [4yvfvk_id#273, 4yvfvk_value#314, 4yvfvk_id#273 AS gs5esz_id#358]
               :                 :           :                 +- Project [4yvfvk_id#273, 4yvfvk_value#314]
               :                 :           :                    +- Project [id#0, meta#1, implicitRules#2, language#3, text#4, identifier#5, status#6, statusReason#7, intent#8, category#9, priority#10, doNotPerform#11, reportedBoolean#12, reportedReference#13, medicationCodeableConcept#14, medicationReference#15, subject#16, encounter#17, supportingInformation#18, authoredOn#19, requester#20, performer#21, performerType#22, recorder#23, ... 17 more fields]
               :                 :           :                       +- Project [id#0, meta#1, implicitRules#2, language#3, text#4, identifier#5, status#6, statusReason#7, intent#8, category#9, priority#10, doNotPerform#11, reportedBoolean#12, reportedReference#13, medicationCodeableConcept#14, medicationReference#15, subject#16, encounter#17, supportingInformation#18, authoredOn#19, requester#20, performer#21, performerType#22, recorder#23, ... 16 more fields]
               :                 :           :                          +- Relation[id#0,meta#1,implicitRules#2,language#3,text#4,identifier#5,status#6,statusReason#7,intent#8,category#9,priority#10,doNotPerform#11,reportedBoolean#12,reportedReference#13,medicationCodeableConcept#14,medicationReference#15,subject#16,encounter#17,supportingInformation#18,authoredOn#19,requester#20,performer#21,performerType#22,recorder#23,... 15 more fields] parquet
               :                 :           +- Project [9re1eq_id#385]
               :                 :              +- Project [9re1eq_id#385, 4yvfvk_value#314, gs5esz_value#362]
               :                 :                 +- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362, gs5esz_id#358 AS 9re1eq_id#385]
               :                 :                    +- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362]
               :                 :                       +- Project [4yvfvk_id#273, 4yvfvk_value#314, gs5esz_id#358, 4yvfvk_value#314 AS gs5esz_value#362]
               :                 :                          +- Project [4yvfvk_id#273, 4yvfvk_value#314, 4yvfvk_id#273 AS gs5esz_id#358]
               :                 :                             +- Project [4yvfvk_id#273, 4yvfvk_value#314]
               :                 :                                +- Project [id#0, meta#1, implicitRules#2, language#3, text#4, identifier#5, status#6, statusReason#7, intent#8, category#9, priority#10, doNotPerform#11, reportedBoolean#12, reportedReference#13, medicationCodeableConcept#14, medicationReference#15, subject#16, encounter#17, supportingInformation#18, authoredOn#19, requester#20, performer#21, performerType#22, recorder#23, ... 17 more fields]
               :                 :                                   +- Project [id#0, meta#1, implicitRules#2, language#3, text#4, identifier#5, status#6, statusReason#7, intent#8, category#9, priority#10, doNotPerform#11, reportedBoolean#12, reportedReference#13, medicationCodeableConcept#14, medicationReference#15, subject#16, encounter#17, supportingInformation#18, authoredOn#19, requester#20, performer#21, performerType#22, recorder#23, ... 16 more fields]
               :                 :                                      +- Relation[id#0,meta#1,implicitRules#2,language#3,text#4,identifier#5,status#6,statusReason#7,intent#8,category#9,priority#10,doNotPerform#11,reportedBoolean#12,reportedReference#13,medicationCodeableConcept#14,medicationReference#15,subject#16,encounter#17,supportingInformation#18,authoredOn#19,requester#20,performer#21,performerType#22,recorder#23,... 15 more fields] parquet
               :                 +- Project [oe2w4p_value#475, oe2w4p_id#468]
               :                    +- Project [oe2w4p_id#468, 4yvfvk_value#314, gs5esz_value#362, w8apxp_value#434, oe2w4p_value#475]
               :                       +- Project [w8apxp_id#429, 4yvfvk_value#314, gs5esz_value#362, w8apxp_value#434, 95wnaa_id#444, oe2w4p_id#468, (to_timestamp(w8apxp_value#434, None) > to_timestamp(10957, None)) AS oe2w4p_value#475]
               :                          +- Project [w8apxp_id#429, 4yvfvk_value#314, gs5esz_value#362, w8apxp_value#434, 95wnaa_id#444, w8apxp_id#429 AS oe2w4p_id#468]
               :                             +- Join LeftOuter, (w8apxp_id#429 = 95wnaa_id#444)
               :                                :- Project [w8apxp_id#429, 4yvfvk_value#314, gs5esz_value#362, w8apxp_value#434]
               :                                :  +- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362, w8apxp_id#429, gs5esz_value#362.authoredOn AS w8apxp_value#434]
               :                                :     +- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362, gs5esz_id#358 AS w8apxp_id#429]
               :                                :        +- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362]
               :                                :           +- Project [4yvfvk_id#273, 4yvfvk_value#314, gs5esz_id#358, 4yvfvk_value#314 AS gs5esz_value#362]
               :                                :              +- Project [4yvfvk_id#273, 4yvfvk_value#314, 4yvfvk_id#273 AS gs5esz_id#358]
               :                                :                 +- Project [4yvfvk_id#273, 4yvfvk_value#314]
               :                                :                    +- Project [id#0, meta#1, implicitRules#2, language#3, text#4, identifier#5, status#6, statusReason#7, intent#8, category#9, priority#10, doNotPerform#11, reportedBoolean#12, reportedReference#13, medicationCodeableConcept#14, medicationReference#15, subject#16, encounter#17, supportingInformation#18, authoredOn#19, requester#20, performer#21, performerType#22, recorder#23, ... 17 more fields]
               :                                :                       +- Project [id#0, meta#1, implicitRules#2, language#3, text#4, identifier#5, status#6, statusReason#7, intent#8, category#9, priority#10, doNotPerform#11, reportedBoolean#12, reportedReference#13, medicationCodeableConcept#14, medicationReference#15, subject#16, encounter#17, supportingInformation#18, authoredOn#19, requester#20, performer#21, performerType#22, recorder#23, ... 16 more fields]
               :                                :                          +- Relation[id#0,meta#1,implicitRules#2,language#3,text#4,identifier#5,status#6,statusReason#7,intent#8,category#9,priority#10,doNotPerform#11,reportedBoolean#12,reportedReference#13,medicationCodeableConcept#14,medicationReference#15,subject#16,encounter#17,supportingInformation#18,authoredOn#19,requester#20,performer#21,performerType#22,recorder#23,... 15 more fields] parquet
               :                                +- Project [95wnaa_id#444]
               :                                   +- Project [95wnaa_id#444, 4yvfvk_value#314, gs5esz_value#362]
               :                                      +- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362, gs5esz_id#358 AS 95wnaa_id#444]
               :                                         +- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362]
               :                                            +- Project [4yvfvk_id#273, 4yvfvk_value#314, gs5esz_id#358, 4yvfvk_value#314 AS gs5esz_value#362]
               :                                               +- Project [4yvfvk_id#273, 4yvfvk_value#314, 4yvfvk_id#273 AS gs5esz_id#358]
               :                                                  +- Project [4yvfvk_id#273, 4yvfvk_value#314]
               :                                                     +- Project [id#0, meta#1, implicitRules#2, language#3, text#4, identifier#5, status#6, statusReason#7, intent#8, category#9, priority#10, doNotPerform#11, reportedBoolean#12, reportedReference#13, medicationCodeableConcept#14, medicationReference#15, subject#16, encounter#17, supportingInformation#18, authoredOn#19, requester#20, performer#21, performerType#22, recorder#23, ... 17 more fields]
               :                                                        +- Project [id#0, meta#1, implicitRules#2, language#3, text#4, identifier#5, status#6, statusReason#7, intent#8, category#9, priority#10, doNotPerform#11, reportedBoolean#12, reportedReference#13, medicationCodeableConcept#14, medicationReference#15, subject#16, encounter#17, supportingInformation#18, authoredOn#19, requester#20, performer#21, performerType#22, recorder#23, ... 16 more fields]
               :                                                           +- Relation[id#0,meta#1,implicitRules#2,language#3,text#4,identifier#5,status#6,statusReason#7,intent#8,category#9,priority#10,doNotPerform#11,reportedBoolean#12,reportedReference#13,medicationCodeableConcept#14,medicationReference#15,subject#16,encounter#17,supportingInformation#18,authoredOn#19,requester#20,performer#21,performerType#22,recorder#23,... 15 more fields] parquet
               +- Project [w8apxp_id#537]
                  +- Project [w8apxp_id#537, 4yvfvk_value#314, gs5esz_value#362, w8apxp_value#542]
                     +- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362, w8apxp_id#537, gs5esz_value#362.status AS w8apxp_value#542]
                        +- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362, gs5esz_id#358 AS w8apxp_id#537]
                           +- Project [gs5esz_id#358, 4yvfvk_value#314, gs5esz_value#362]
                              +- Project [4yvfvk_id#273, 4yvfvk_value#314, gs5esz_id#358, 4yvfvk_value#314 AS gs5esz_value#362]
                                 +- Project [4yvfvk_id#273, 4yvfvk_value#314, 4yvfvk_id#273 AS gs5esz_id#358]
                                    +- Project [4yvfvk_id#273, 4yvfvk_value#314]
                                       +- Project [id#0, meta#1, implicitRules#2, language#3, text#4, identifier#5, status#6, statusReason#7, intent#8, category#9, priority#10, doNotPerform#11, reportedBoolean#12, reportedReference#13, medicationCodeableConcept#14, medicationReference#15, subject#16, encounter#17, supportingInformation#18, authoredOn#19, requester#20, performer#21, performerType#22, recorder#23, ... 17 more fields]
                                          +- Project [id#0, meta#1, implicitRules#2, language#3, text#4, identifier#5, status#6, statusReason#7, intent#8, category#9, priority#10, doNotPerform#11, reportedBoolean#12, reportedReference#13, medicationCodeableConcept#14, medicationReference#15, subject#16, encounter#17, supportingInformation#18, authoredOn#19, requester#20, performer#21, performerType#22, recorder#23, ... 16 more fields]
                                             +- Relation[id#0,meta#1,implicitRules#2,language#3,text#4,identifier#5,status#6,statusReason#7,intent#8,category#9,priority#10,doNotPerform#11,reportedBoolean#12,reportedReference#13,medicationCodeableConcept#14,medicationReference#15,subject#16,encounter#17,supportingInformation#18,authoredOn#19,requester#20,performer#21,performerType#22,recorder#23,... 15 more fields] parquet

	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.failAnalysis(CheckAnalysis.scala:49)
	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.failAnalysis$(CheckAnalysis.scala:48)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.failAnalysis(Analyzer.scala:130)
	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$1(CheckAnalysis.scala:582)
	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$1$adapted(CheckAnalysis.scala:92)
	at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:177)
	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.checkAnalysis(CheckAnalysis.scala:92)
	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.checkAnalysis$(CheckAnalysis.scala:89)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:130)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:156)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:201)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:153)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:68)
	at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:133)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:763)
	at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:133)
	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:68)
	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:66)
	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:58)
	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$1(Dataset.scala:91)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:763)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:89)
	at org.apache.spark.sql.RelationalGroupedDataset.toDF(RelationalGroupedDataset.scala:66)
	at org.apache.spark.sql.RelationalGroupedDataset.agg(RelationalGroupedDataset.scala:256)
	at org.apache.spark.sql.RelationalGroupedDataset.agg(RelationalGroupedDataset.scala:255)
	at au.csiro.pathling.fhirpath.function.AggregateFunction.applyAggregation(AggregateFunction.java:43)
	at au.csiro.pathling.fhirpath.function.CountFunction.invoke(CountFunction.java:51)
	at au.csiro.pathling.fhirpath.parser.InvocationVisitor.visitFunctionInvocation(InvocationVisitor.java:157)
	at au.csiro.pathling.fhirpath.parser.InvocationVisitor.visitFunctionInvocation(InvocationVisitor.java:39)
	at au.csiro.pathling.fhir.FhirPathParser$FunctionInvocationContext.accept(FhirPathParser.java:1222)
	at org.antlr.v4.runtime.tree.AbstractParseTreeVisitor.visit(AbstractParseTreeVisitor.java:18)
	at au.csiro.pathling.fhirpath.parser.TermVisitor.visitInvocationTerm(TermVisitor.java:38)
	at au.csiro.pathling.fhirpath.parser.TermVisitor.visitInvocationTerm(TermVisitor.java:26)
	at au.csiro.pathling.fhir.FhirPathParser$InvocationTermContext.accept(FhirPathParser.java:816)
	at au.csiro.pathling.fhirpath.parser.Visitor.visitTermExpression(Visitor.java:45)
	at au.csiro.pathling.fhirpath.parser.Visitor.visitTermExpression(Visitor.java:27)
	at au.csiro.pathling.fhir.FhirPathParser$TermExpressionContext.accept(FhirPathParser.java:405)
	at org.antlr.v4.runtime.tree.AbstractParseTreeVisitor.visit(AbstractParseTreeVisitor.java:18)
	at au.csiro.pathling.fhirpath.parser.Parser.parse(Parser.java:53)
	at au.csiro.pathling.aggregate.FreshAggregateExecutor.parseAggregations(FreshAggregateExecutor.java:171)
	at au.csiro.pathling.aggregate.FreshAggregateExecutor.execute(FreshAggregateExecutor.java:120)
	at au.csiro.pathling.aggregate.AggregateProvider.aggregate(AggregateProvider.java:50)
	... 46 common frames omitted

Add configuration variable to control max heap size

Currently, the Xmx value is left at the default for the JVM in use, which is typically a quarter of the available physical memory. Spark requires a minimum of 471.9MB, and if Xmx (or spark.driver.memory) is set to less than this, an error will be thrown on startup.

See: https://github.com/apache/spark/blob/7296999c4751cfddcca5b77e3348354cff65d069/core/src/main/scala/org/apache/spark/memory/UnifiedMemoryManager.scala#L218

This means that an error will currently be thrown if there is less than about 3GB of RAM available to the execution container, even if the application does not actually need that much memory. We need a configuration variable that gives deployers control over Xmx/spark.driver.memory.

Relates to #84.
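
A usage sketch of such a variable, assuming a hypothetical name of PATHLING_MAX_HEAP that the container entrypoint would translate into -Xmx/spark.driver.memory:

# PATHLING_MAX_HEAP is a hypothetical name, used here for illustration only.
docker run --rm -p 8080:8080 \
  -e PATHLING_MAX_HEAP=2g \
  aehrc/pathling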

Request to metadata causes error when no available resources

Issuing a request to the /metadata endpoint when there are no resources available for query (i.e. an empty database) results in an error.

ca.uhn.fhir.rest.server.exceptions.InternalErrorException: Failed to call access method: java.lang.IllegalArgumentException: Collection is empty
	at ca.uhn.fhir.rest.server.method.BaseMethodBinding.invokeServerMethod(BaseMethodBinding.java:244)
	at ca.uhn.fhir.rest.server.method.ConformanceMethodBinding.invokeServer(ConformanceMethodBinding.java:144)
	at ca.uhn.fhir.rest.server.method.ConformanceMethodBinding.invokeServer(ConformanceMethodBinding.java:49)
	at ca.uhn.fhir.rest.server.method.BaseResourceReturningMethodBinding.doInvokeServer(BaseResourceReturningMethodBinding.java:247)
	at ca.uhn.fhir.rest.server.method.BaseResourceReturningMethodBinding.invokeServer(BaseResourceReturningMethodBinding.java:384)
	at ca.uhn.fhir.rest.server.RestfulServer.handleRequest(RestfulServer.java:1002)
	at ca.uhn.fhir.rest.server.RestfulServer.doGet(RestfulServer.java:336)
	at ca.uhn.fhir.rest.server.RestfulServer.service(RestfulServer.java:1695)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:763)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:190)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:489)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException: null
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at ca.uhn.fhir.rest.server.method.BaseMethodBinding.invokeServerMethod(BaseMethodBinding.java:239)
	... 25 common frames omitted
Caused by: java.lang.IllegalArgumentException: Collection is empty
	at java.util.EnumSet.copyOf(EnumSet.java:174)
	at au.csiro.pathling.fhir.AnalyticsServerCapabilities.buildResources(AnalyticsServerCapabilities.java:133)
	at au.csiro.pathling.fhir.AnalyticsServerCapabilities.buildRestComponent(AnalyticsServerCapabilities.java:97)
	at au.csiro.pathling.fhir.AnalyticsServerCapabilities.getServerConformance(AnalyticsServerCapabilities.java:86)
	... 30 common frames omitted
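
EnumSet.copyOf throws IllegalArgumentException when given an empty collection that is not itself an EnumSet. A minimal sketch of the guard that is needed (the class and method names here are illustrative, not the actual Pathling code):

import java.util.Collection;
import java.util.EnumSet;

import org.hl7.fhir.r4.model.Enumerations.ResourceType;

class AvailableResources {

  // EnumSet.copyOf requires a non-empty source unless it is already an EnumSet,
  // so an empty database needs to map to EnumSet.noneOf instead.
  static EnumSet<ResourceType> toEnumSet(Collection<ResourceType> resourceTypes) {
    return resourceTypes.isEmpty()
        ? EnumSet.noneOf(ResourceType.class)
        : EnumSet.copyOf(resourceTypes);
  }

}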

AggregateExecutorTests causing JVM core dump

This seems to be happening across a range of different tests within the AggregateExecutorTest suite.

Here is an example of one of the errors:

[INFO] a.c.pathling.query.AggregateExecutor - Received $aggregate request: aggregations=[count()] groupings=[item.sequence.first() + 1] filters=[]
[INFO] a.c.pathling.query.SearchExecutor - Received search request: filters=[(item.sequence.first() + 1) = 2]
[INFO] a.c.pathling.query.SearchExecutor - Retrieving search results (1-100)
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGBUS (0xa) at pc=0x000000010b00e41c, pid=92243, tid=0x0000000000002803
#
# JRE version: OpenJDK Runtime Environment (8.0_242-b08) (build 1.8.0_242-b08)
# Java VM: OpenJDK 64-Bit Server VM (25.242-b08 mixed mode bsd-amd64 compressed oops)
# Problematic frame:
# v  ~StubRoutines::jbyte_disjoint_arraycopy
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /Users/xxxxx/Code/pathling/fhir-server/hs_err_pid92243.log
#
# If you would like to submit a bug report, please visit:
#   http://bugreport.java.com/bugreport/crash.jsp
#

Process finished with exit code 134 (interrupted by signal 6: SIGABRT)

This can be reproduced using OpenJDK 8u242 on a Mac.

Extract operation

This change will introduce a new operation called extract. This operation is designed for transforming FHIR data into a flattened form, for use within other tools, such as statistical and machine learning models.

The operation takes a set of expressions that define columns in a tabular view of the data. A URL pointing to a delimited text file is returned, which contains the result of executing the expressions against each subject resource.

Extract operation

Request

The request for the $extract operation is a Parameters resource containing the following parameters (an example request follows the list):

  • subjectResource [1..1] - (code) The subject resource that the expressions within this query are evaluated against. Code must be a member of http://hl7.org/fhir/ValueSet/resource-types.
  • column [1..*] - An expression which is used to extract a value from each resource.
    • expression [1..1] - (string) A FHIRPath expression that defines the column. The context is a single resource of the type specified in the subjectResource parameter. The expression must evaluate to a primitive value. If any preceding column expressions end in an aggregate function, this column expression must also end in an aggregate function.
    • label [0..1] - (string) A short description for the column, for display purposes.
  • filter [0..*] - (string) A FHIRPath expression that can be evaluated against each resource in the data set to determine whether it is included within the result. The context is an individual resource of the type specified in the subjectResource parameter. The expression must evaluate to a Boolean value. Multiple filters are combined using AND logic.
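
For illustration, a request might look like the following (the expressions shown are hypothetical):

{
    "resourceType": "Parameters",
    "parameter": [
        {
            "name": "subjectResource",
            "valueCode": "Patient"
        },
        {
            "name": "column",
            "part": [
                {
                    "name": "label",
                    "valueString": "Gender"
                },
                {
                    "name": "expression",
                    "valueString": "gender"
                }
            ]
        },
        {
            "name": "filter",
            "valueString": "birthDate > @1990-01-01"
        }
    ]
}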

Response

The response for the $extract operation is a Parameters resource containing the following parameters:

  • url [1..1] - (uri) A URL at which the result of the operation can be retrieved.

GET /fhir/$aggregate causes NullPointerException

java.lang.NullPointerException: null
    at au.csiro.pathling.query.AggregateRequest.<init>(AggregateRequest.java:51)
    at au.csiro.pathling.query.AggregateProvider.aggregate(AggregateProvider.java:30)
    at sun.reflect.GeneratedMethodAccessor172.invoke
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at ca.uhn.fhir.rest.server.method.BaseMethodBinding.invokeServerMethod(BaseMethodBinding.java:239)
    at ca.uhn.fhir.rest.server.method.OperationMethodBinding.invokeServer(OperationMethodBinding.java:329)
    at ca.uhn.fhir.rest.server.method.BaseResourceReturningMethodBinding.doInvokeServer(BaseResourceReturningMethodBinding.java:247)
    at ca.uhn.fhir.rest.server.method.BaseResourceReturningMethodBinding.invokeServer(BaseResourceReturningMethodBinding.java:384)
    at ca.uhn.fhir.rest.server.method.OperationMethodBinding.invokeServer(OperationMethodBinding.java:304)
    at ca.uhn.fhir.rest.server.RestfulServer.handleRequest(RestfulServer.java:1002)
    at ca.uhn.fhir.rest.server.RestfulServer.doGet(RestfulServer.java:336)
    at ca.uhn.fhir.rest.server.RestfulServer.service(RestfulServer.java:1695)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:755)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:547)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:190)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:500)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
    at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
    at java.lang.Thread.run(Thread.java:748)

java.lang.reflect.InvocationTargetException: null
    at sun.reflect.GeneratedMethodAccessor172.invoke
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at ca.uhn.fhir.rest.server.method.BaseMethodBinding.invokeServerMethod(BaseMethodBinding.java:239)
    at ca.uhn.fhir.rest.server.method.OperationMethodBinding.invokeServer(OperationMethodBinding.java:329)
    at ca.uhn.fhir.rest.server.method.BaseResourceReturningMethodBinding.doInvokeServer(BaseResourceReturningMethodBinding.java:247)
    at ca.uhn.fhir.rest.server.method.BaseResourceReturningMethodBinding.invokeServer(BaseResourceReturningMethodBinding.java:384)
    at ca.uhn.fhir.rest.server.method.OperationMethodBinding.invokeServer(OperationMethodBinding.java:304)
    at ca.uhn.fhir.rest.server.RestfulServer.handleRequest(RestfulServer.java:1002)
    at ca.uhn.fhir.rest.server.RestfulServer.doGet(RestfulServer.java:336)
    at ca.uhn.fhir.rest.server.RestfulServer.service(RestfulServer.java:1695)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:755)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:547)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:190)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
    at org.eclipse.jetty.server.Server.handle(Server.java:500)
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
    at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
    at java.lang.Thread.run(Thread.java:748)

Issues with serialization of date time literals

There are two problems with the serialization of date time literals:

  1. The date format string used for this purpose is incorrect, causing an error, e.g. when creating groupings with a date time value in the aggregate operation.
  2. Dates must be converted to UTC before being represented in a FHIRPath expression, due to a shortcoming of the grammar which makes it impossible to use date literals with the ISO 8601 time zone offset notation (+HH:MM) (see the sketch below).
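
A minimal sketch of the second point, using java.time (not the actual Pathling code): converting to UTC allows the literal to be rendered with the Z designator, which the grammar can parse.

import java.time.Instant;
import java.time.format.DateTimeFormatter;

class DateTimeLiteral {

  // ISO_INSTANT always renders in UTC with the "Z" designator, avoiding the
  // "+HH:MM" offset notation that the FHIRPath grammar cannot parse.
  static String toLiteral(Instant instant) {
    return "@" + DateTimeFormatter.ISO_INSTANT.format(instant);
  }

}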

CapabilityStatement should advertise security capabilities

Currently the CapabilityStatement is not accessible when authorization is enabled, which is incorrect behaviour.

What we need to do is:

  1. Make sure the CapabilityStatement is always available, even when authorization is enabled.
  2. Add the necessary environment variables to allow configuration of the authorize, token and revoke endpoints.
  3. Add a rest.security element to the CapabilityStatement, including extensions to advertise the authorization endpoints (see the sketch below).
  4. Add a Well-Known Uniform Resource Identifiers (URIs) JSON file containing the authorization endpoints.

This change will add the following environment variables:
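
Returning to item 3, a minimal sketch of what the rest.security element might look like, using the standard SMART on FHIR oauth-uris extension (the endpoint values are placeholders):

{
    "rest": [
        {
            "mode": "server",
            "security": {
                "service": [
                    {
                        "coding": [
                            {
                                "system": "http://terminology.hl7.org/CodeSystem/restful-security-service",
                                "code": "SMART-on-FHIR"
                            }
                        ]
                    }
                ],
                "extension": [
                    {
                        "url": "http://fhir-registry.smarthealthit.org/StructureDefinition/oauth-uris",
                        "extension": [
                            {
                                "url": "authorize",
                                "valueUri": "https://auth.example.com/authorize"
                            },
                            {
                                "url": "token",
                                "valueUri": "https://auth.example.com/token"
                            },
                            {
                                "url": "revoke",
                                "valueUri": "https://auth.example.com/revoke"
                            }
                        ]
                    }
                ]
            }
        }
    ]
}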

Investigate using AggregateFunction for membership operator

The membership operator uses aggregation, as it needs to reduce a number of input values into a single result value. As a result, there is a lot of duplicated code between MembershipOperator and the AggregateFunction class.

MembershipOperator could possibly inherit from AggregateFunction and reuse the same code for the aggregation functionality.

decimal encoding does not preserve precision.

The spec (http://hl7.org/fhir/datatypes.html) says:

The precision of the decimal value has significance:
e.g. 0.010 is regarded as different to 0.01, and the original precision should be preserved
Implementations SHALL handle decimal values in ways that preserve and respect the precision of the value as represented for presentation purposes

The way we store it now always converts values to a scale of 4 decimal places. This is against the specification, which says that precision should be preserved and that 1.0000 is different from 1.00. For currency amounts it adds unnecessary zeros, e.g. 22.22 becomes 22.2000, and for some high-precision values, e.g. from observations, it drops decimal places, e.g. 0.0232323232 -> 0.0233.
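
The significance of scale is easy to demonstrate with BigDecimal (a standalone illustration, not the encoder code):

import java.math.BigDecimal;
import java.math.RoundingMode;

class DecimalPrecision {

  public static void main(String[] args) {
    // 0.010 and 0.01 carry different precision: scale 3 versus scale 2.
    System.out.println(new BigDecimal("0.010").scale()); // 3
    System.out.println(new BigDecimal("0.01").scale());  // 2

    // Forcing everything to a scale of 4 erases that distinction, padding
    // low-precision values and rounding away high-precision ones.
    System.out.println(new BigDecimal("22.22").setScale(4, RoundingMode.HALF_UP));        // 22.2200
    System.out.println(new BigDecimal("0.0232323232").setScale(4, RoundingMode.HALF_UP)); // 0.0232
  }

}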

Error initializing SparkContext.

I would have sent this as a private email if I had an email address.
I am from SNOMED and do most of the demo sessions for our data analysis tool.
Our new tool will be based on Pathling, and the data will be generated by Synthea.
I can now generate scenario data on Synthea and wanted to install Pathling.
I have Docker, Postman, Java etc. installed. When I run (as described in your documentation)

docker run --rm -p 8080:8080 aehrc/pathling

I get an error about initializing Spark, which I will paste below. Basically, there is not enough memory to initialize Spark.
I am looking for help in just getting this up and running so I can start the actual work of loading and analyzing data.

This is on Windows. I tried on Linux but then ran into a different problem about not finding shared 64 bit library files. So I'm back to Windows which seems the easier problem.

Here is the part of the long stack trace where it first goes wrong. I have not separately installed Spark.

20:56:19.802 [main] [] INFO a.c.pathling.fhir.AnalyticsServer - Initializing Spark session
20:56:20.621 [main] [] ERROR org.apache.spark.SparkContext - Error initializing SparkContext.
java.lang.IllegalArgumentException: System memory 464519168 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:217)
at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:199)

Thanks
My email is [email protected]. Happy to take this thread off GitHub, as it may not be an "issue", but it does prevent the "getting started" instructions from working.
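
A possible workaround, assuming the image's entrypoint runs a standard HotSpot JVM (which picks up JAVA_TOOL_OPTIONS automatically), is to raise the heap limit when starting the container:

docker run --rm -p 8080:8080 \
  -e JAVA_TOOL_OPTIONS="-Xmx2g" \
  aehrc/pathling

Note that on Docker Desktop, the memory allocated to the Docker VM must also be large enough to accommodate this setting.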

empty() function should support singular input

Currently, the empty function throws an error when invoked on a singular input element.

The desired behaviour is for the function to work on a singular element, returning true when it is executed in the context of a single resource and that element is missing (see the example below).
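
For example, evaluated against a single Patient resource, the following expression should return true when no birth date is recorded, rather than throwing an error:

birthDate.empty()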

Multiple aggregations on elements in single request produce inflated counts

Aggregations such as name.given.count() and identifier.count() produce different results when run in separate queries than when run together in a single query.

The cause of the issue is that both expressions produce lists with multiple elements per resource; when run together, the aggregation (count()) runs over the Cartesian product of those lists (per resource), resulting in inflated counts. For example, a patient with two given names and three identifiers yields a six-row product, so both counts come back as 6 instead of 2 and 3.

The ignored test AggregateExecutorTest#queryMultipleCountAggregations verifies the correct behaviour.

Refactor first function

Reimplement the first function as part of the refactored design of the v3 implementation.

Tasks:

  • Re-enable AggregateQueryTest#queryWithMathExpression
  • Re-enable AggregateQueryTest#queryWithWhereAsComparisonOperand

InstantType is decoded incorrectly

A decoded InstantType value differs from the value in the original encoded object.
This is an actual bug: the decoder does not compensate for the fact that Spark stores timestamps with microsecond precision, so decoded instants end up far in the future.
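
A minimal illustration of the failure mode (not the actual decoder code): Spark represents a timestamp as microseconds since the epoch, and treating that value as milliseconds overshoots by a factor of 1000.

import java.time.Instant;

class InstantDecoding {

  public static void main(String[] args) {
    long micros = 1_577_836_800_000_000L; // 2020-01-01T00:00:00Z in microseconds

    // Incorrect: treating microseconds as milliseconds lands roughly
    // 50,000 years in the future.
    System.out.println(Instant.ofEpochMilli(micros));

    // Correct: scale down to milliseconds first.
    System.out.println(Instant.ofEpochMilli(micros / 1000));
  }

}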

One of the in parameters url or valueset or context must be provided

Request:

{
    "resourceType": "Parameters",
    "parameter": [
    	{
    		"name": "subjectResource",
    		"valueCode": "Patient"
    	},
        {
            "name": "aggregation",
            "part": [
                {
                    "name": "label",
                    "valueString": "Number of patients"
                },
                {
                    "name": "expression",
                    "valueString": "count()"
                }
            ]
        },
        {
            "name": "grouping",
            "part": [
                {
                    "name": "label",
                    "valueString": "Prescribed TNF inhibitor?"
                },
                {
                    "name": "expression",
                    "valueString": "reverseResolve(MedicationRequest.subject).medicationCodeableConcept.memberOf('http://snomed.info/sct?fhir_vs=ecl/(<< 416897008|Tumour necrosis factor alpha inhibitor product| OR 408154002|Adalimumab 40mg injection solution 0.8mL prefilled syringe|)') contains true"
                }
            ]
        },
        {
            "name": "grouping",
            "part": [
                {
                    "name": "label",
                    "valueString": "Contracted lung infection?"
                },
                {
                    "name": "expression",
                    "valueString": "reverseResolve(Condition.subject).code.memberOf('http://snomed.info/sct?fhir_vs=ecl/(< 64572001|Disease (disorder)| : (363698007|Finding site| = << 39607008|Lung structure|, 370135005|Pathological process| = << 441862004|Infectious process|))') contains true"
                }
            ]
        },
        {
            "name": "filter",
            "valueString": "reverseResolve(Condition.subject).code.memberOf('http://snomed.info/sct?fhir_vs=ecl/(< 64572001|Disease (disorder)| : (363698007|Finding site| = << 39352004|Joint structure|, 370135005|Pathological process| = << 263680009|Autoimmune process|))') contains true"
        },
        {
            "name": "filter",
            "valueString": "reverseResolve(Condition.subject).code.memberOf('http://snomed.info/sct?fhir_vs=ecl/(< 64572001|Disease (disorder)| : (363698007|Finding site| = << 39607008|Lung structure|, 263502005|Clinical course| = << 90734009|Chronic|))') contains true"
        }
    ]
}

Exception:

17:20:37.131 [Executor task launch worker for task 142] [qiRuCCxHaoHHmszO] ERROR org.apache.spark.executor.Executor - Exception in task 1.0 in stage 274.0 (TID 142)
ca.uhn.fhir.rest.server.exceptions.InvalidRequestException: HTTP 400 : One of the in parameters url or valueset or context must be provided
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at ca.uhn.fhir.rest.server.exceptions.BaseServerResponseException.newInstance(BaseServerResponseException.java:302)
	at ca.uhn.fhir.rest.client.impl.BaseClient.invokeClient(BaseClient.java:351)
	at ca.uhn.fhir.rest.client.impl.BaseClient.invokeClient(BaseClient.java:219)
	at ca.uhn.fhir.rest.client.impl.BaseClient.invokeClient(BaseClient.java:215)
	at ca.uhn.fhir.rest.client.impl.ClientInvocationHandler.invoke(ClientInvocationHandler.java:62)
	at com.sun.proxy.$Proxy90.expand(Unknown Source)
	at au.csiro.pathling.fhirpath.function.memberof.MemberOfMapper.call(MemberOfMapper.java:156)
	at org.apache.spark.sql.Dataset.$anonfun$mapPartitions$1(Dataset.scala:2793)
	at org.apache.spark.sql.execution.MapPartitionsExec.$anonfun$doExecute$3(objects.scala:195)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:872)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:872)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
	at org.apache.spark.scheduler.Task.run(Task.scala:127)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:444)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:447)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
17:20:37.131 [Executor task launch worker for task 141] [qiRuCCxHaoHHmszO] ERROR org.apache.spark.executor.Executor - Exception in task 0.0 in stage 274.0 (TID 141)
ca.uhn.fhir.rest.server.exceptions.InvalidRequestException: HTTP 400 : One of the in parameters url or valueset or context must be provided
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at ca.uhn.fhir.rest.server.exceptions.BaseServerResponseException.newInstance(BaseServerResponseException.java:302)
	at ca.uhn.fhir.rest.client.impl.BaseClient.invokeClient(BaseClient.java:351)
	at ca.uhn.fhir.rest.client.impl.BaseClient.invokeClient(BaseClient.java:219)
	at ca.uhn.fhir.rest.client.impl.BaseClient.invokeClient(BaseClient.java:215)
	at ca.uhn.fhir.rest.client.impl.ClientInvocationHandler.invoke(ClientInvocationHandler.java:62)
	at com.sun.proxy.$Proxy90.expand(Unknown Source)
	at au.csiro.pathling.fhirpath.function.memberof.MemberOfMapper.call(MemberOfMapper.java:156)
	at org.apache.spark.sql.Dataset.$anonfun$mapPartitions$1(Dataset.scala:2793)
	at org.apache.spark.sql.execution.MapPartitionsExec.$anonfun$doExecute$3(objects.scala:195)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:872)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:872)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
	at org.apache.spark.scheduler.Task.run(Task.scala:127)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:444)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:447)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
17:20:37.158 [task-result-getter-1] [] WARN  o.a.spark.scheduler.TaskSetManager - Lost task 1.0 in stage 274.0 (TID 142, rbh-wireless-169-87.pool.csiro.au, executor driver): ca.uhn.fhir.rest.server.exceptions.InvalidRequestException: HTTP 400 : One of the in parameters url or valueset or context must be provided
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at ca.uhn.fhir.rest.server.exceptions.BaseServerResponseException.newInstance(BaseServerResponseException.java:302)
	at ca.uhn.fhir.rest.client.impl.BaseClient.invokeClient(BaseClient.java:351)
	at ca.uhn.fhir.rest.client.impl.BaseClient.invokeClient(BaseClient.java:219)
	at ca.uhn.fhir.rest.client.impl.BaseClient.invokeClient(BaseClient.java:215)
	at ca.uhn.fhir.rest.client.impl.ClientInvocationHandler.invoke(ClientInvocationHandler.java:62)
	at com.sun.proxy.$Proxy90.expand(Unknown Source)
	at au.csiro.pathling.fhirpath.function.memberof.MemberOfMapper.call(MemberOfMapper.java:156)
	at org.apache.spark.sql.Dataset.$anonfun$mapPartitions$1(Dataset.scala:2793)
	at org.apache.spark.sql.execution.MapPartitionsExec.$anonfun$doExecute$3(objects.scala:195)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:872)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:872)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
	at org.apache.spark.scheduler.Task.run(Task.scala:127)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:444)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:447)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
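
The error comes from the terminology server's $expand operation being invoked without a url (or valueset/context) parameter. For comparison, a minimal sketch of a well-formed invocation through HAPI's generic client (the value set URI would be one of those from the request above):

import ca.uhn.fhir.rest.client.api.IGenericClient;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r4.model.UriType;
import org.hl7.fhir.r4.model.ValueSet;

class ExpandExample {

  static ValueSet expand(IGenericClient client, String valueSetUri) {
    // The url parameter identifies the implicit value set to be expanded.
    Parameters inParams = new Parameters();
    inParams.addParameter().setName("url").setValue(new UriType(valueSetUri));

    return client.operation()
        .onType(ValueSet.class)
        .named("$expand")
        .withParameters(inParams)
        .returnResourceType(ValueSet.class)
        .execute();
  }

}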

java.io.UTFDataFormatException

We seem to have tests intermittently failing with the following error:

org.apache.spark.SparkException: 
Job aborted due to stage failure: Task 0 in stage 14.0 failed 1 times, most recent failure: Lost task 0.0 in stage 14.0 (TID 14, localhost, executor driver): java.io.UTFDataFormatException
	at java.io.ObjectOutputStream$BlockDataOutputStream.writeUTF(ObjectOutputStream.java:2164)
	at java.io.ObjectOutputStream$BlockDataOutputStream.writeUTF(ObjectOutputStream.java:2007)
	at java.io.ObjectOutputStream.writeUTF(ObjectOutputStream.java:869)
	at au.csiro.pathling.encoders.Bundles$BundleContainer.writeObject(Bundles.java:60)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:1140)
	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
	at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
	at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:43)
	at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:456)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Driver stacktrace:
	at au.csiro.pathling.encoders.r4.BundlesTest.testRetrieveBundle(BundlesTest.java:93)
Caused by: java.io.UTFDataFormatException
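
ObjectOutputStream.writeUTF is limited to strings whose modified UTF-8 encoding fits within 65,535 bytes, so serializing a large Bundle as a string through it will fail intermittently, depending on bundle size. A possible fix, sketched with hypothetical field names (writeObject on a String has no such limit):

import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

class BundleContainerSketch implements Serializable {

  // Hypothetical field holding the Bundle encoded as a JSON string.
  private transient String bundleJson;

  private void writeObject(ObjectOutputStream out) throws IOException {
    // writeUTF throws UTFDataFormatException once the encoding exceeds
    // 65,535 bytes; writeObject handles strings of arbitrary length.
    out.writeObject(bundleJson);
  }

  private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
    bundleJson = (String) in.readObject();
  }

}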

Refactor empty function

Reimplement the empty function as part of the refactored design of the v3 implementation.

This should also close #97.

Refactor where function

Reimplement the where function as part of the refactored design of the v3 implementation.

Tasks:

  • Make join helpers support joining of grouped datasets
  • Join from distinct identities to argument dataset filtered of null values
  • Investigate why there seem to be multiple stages loading the same Parquet file, with the same filters

Make literal types materializable

The literal FHIRPath types are not currently materializable, which means that "1 + 1" works in a grouping expression, but "2" does not.

Enable integration tests within CI

The integration tests are currently disabled within the CircleCI build, as they rely on running up a Docker container.

This doesn't seem to work with the current CircleCI configuration.

Establish integration test suite at the server level

Currently we don’t have tests which verify the behaviour of the HAPI server configuration, or the configuration of the embedded Jetty server. Configuration via environment variables is another blind spot.

These tests would execute the main method, running up the server in a separate process. They would then use an HTTP client to send requests to the server and verify its behaviour.

Aggregation functions cannot be nested

Currently, the aggregation expression within the aggregate operation only works correctly if the aggregation function is at the end of the expression.

So you can't do something like this:

name.given.count() + name.family.count()

The correct result for this should be the number of given names of all patients within the grouping, plus the number of family names for all patients in the grouping.
