delete_me_go-go-duck's Issues

More accurate metadata parameters

When aggregating a file in gogoduck, if the user has requested:

  • lon_min, lon_max
  • lat_min, lat_max
  • time_min, time_max

The aggregated data may contain a minimum value which is larger than the minimum value requested by the user. For instance, the user asked for lat_min 66.0, but the actual lat_min in the file is 66.2.

Currently the metadata is still updated with lat_min 66.0; the desired behaviour is to record lat_min as 66.2 in the metadata.

This behaviour should hold for any dimension in the data set, not just for lat.
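
A minimal sketch of the desired behaviour, assuming the NCO tools already used by gogoduck.sh are available (ncks to read the coordinate values, ncatted to write the global attribute); the file name, the parsing and the attribute type are illustrative only:

Double actualMin(String ncFile, String varName) {
    // ncks -H -C -v VAR -s '%f\n' prints the variable's values, one per line
    def output = ['ncks', '-H', '-C', '-v', varName, '-s', '%f\n', ncFile].execute().text
    output.readLines().findAll { it.trim() }.collect { it.trim() as Double }.min()
}

void setGlobalAttribute(String ncFile, String attName, Double value) {
    // ncatted -a name,global,o,d,value overwrites a double-typed global attribute
    ['ncatted', '-O', '-a', "${attName},global,o,d,${value}", ncFile].execute().waitFor()
}

// e.g. the user asked for lat_min 66.0, but the data only reaches 66.2
def latMin = actualMin('output.nc', 'LATITUDE')
setGlobalAttribute('output.nc', 'geospatial_lat_min', latMin)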

Specifying time constraint for ncks can produce errors

Based on aodn/aodn-portal#1093

This is probably related to rounding/conversion of times as it doesn't happen with all files, just with some.

This will yield an error:

ncks -a -4 -O -d TIME,2014-04-14T03:00:00.000Z,2014-04-14T07:00:00.000Z -d LATITUDE,-90.0,90.0 -d LONGITUDE,-180.0,180.0 /tmp/tmp.4sGKIqOfcX/IMOS_ACORN_V_20140414T070000Z_TURQ_FV00_1-hour-avg.nc /tmp/tmp.YwV1QVi2xh

This wouldn't (note the end time of 07:00:00.001):

ncks -a -4 -O -d TIME,2014-04-14T03:00:00.000Z,2014-04-14T07:00:00.001Z -d LATITUDE,-90.0,90.0 -d LONGITUDE,-180.0,180.0 /tmp/tmp.4sGKIqOfcX/IMOS_ACORN_V_20140414T070000Z_TURQ_FV00_1-hour-avg.nc /tmp/tmp.YwV1QVi2xh

Neither would this (with the TIME constraint removed):

ncks -a -4 -O -d LATITUDE,-90.0,90.0 -d LONGITUDE,-180.0,180.0 /tmp/tmp.4sGKIqOfcX/IMOS_ACORN_V_20140414T070000Z_TURQ_FV00_1-hour-avg.nc /tmp/tmp.YwV1QVi2xh

Problem loading job data back off filesystem

No doubt related to my injection protection changes.

2014-03-27 00:00:00,150 [quartzScheduler_Worker-9] WARN  grails.app.services.au.org.emii.gogoduck.job.JobStoreService - Invalid or corrupt job with ID: fda08d99
org.codehaus.groovy.runtime.typehandling.GroovyCastException: Cannot cast object '{"spatialExtent":{"south":"-32.751220703118","east":"115.75744628897","north":"-30.993408203118","west":"115.21911621085"},"temporalExtent":{"start":"2013-11-17T00:30:00.000Z","end":"2013-11-20T10:30:00.000Z"}}' with class 'org.codehaus.groovy.grails.web.json.JSONObject' to class 'au.org.emii.gogoduck.job.SubsetDescriptor' due to: org.codehaus.groovy.runtime.typehandling.GroovyCastException: Cannot cast object '{"south":"-32.751220703118","east":"115.75744628897","north":"-30.993408203118","west":"115.21911621085"}' with class 'org.codehaus.groovy.grails.web.json.JSONObject' to class 'au.org.emii.gogoduck.job.SpatialExtent' due to: org.codehaus.groovy.runtime.typehandling.GroovyCastException: Cannot cast object '-32.751220703118' with class 'java.lang.String' to class 'java.lang.Double'
        at au.org.emii.gogoduck.job.Job.fromJsonString(Job.groovy:49)
        at au.org.emii.gogoduck.job.JobStoreService.get(JobStoreService.groovy:9)
        at au.org.emii.gogoduck.job.JobStoreService$_list_closure2.doCall(JobStoreService.groovy:33)
        at au.org.emii.gogoduck.job.JobStoreService.list(JobStoreService.groovy:32)
        at au.org.emii.gogoduck.job.CleanupJob.execute(CleanupJob.groovy:15)
        at grails.plugins.quartz.GrailsJobFactory$GrailsJob.execute(GrailsJobFactory.java:104)
        at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
        at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
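
A minimal, self-contained sketch of the kind of fix (not the real Job/SubsetDescriptor classes): converting the quoted JSON numbers to Double explicitly avoids the String-to-Double cast failure above.

import groovy.json.JsonSlurper

class SpatialExtent {
    Double north, south, east, west
}

def json = new JsonSlurper().parseText(
    '{"south":"-32.751220703118","east":"115.75744628897",' +
    '"north":"-30.993408203118","west":"115.21911621085"}')

// 'as Double' parses the quoted numbers instead of relying on an implicit cast
def extent = new SpatialExtent(
    north: json.north as Double,
    south: json.south as Double,
    east:  json.east  as Double,
    west:  json.west  as Double
)

assert extent.south == -32.751220703118d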

Error email should not contain error message

The error emails from GoGoDuck still try to include an error message from the GoGoDuck script. If the log file that we now attach is meant to replace the error message in the email then the wording of the email should be changed.

Perhaps instead of:
"Processing failed with this message: An unexepected problem has occurred."
The error email could say:
"Please see the attached log for more details."

No reason given when aggregation fails because of no data found

Disclaimer: we might have spoken about this before but I don't remember. It's not documented in any GitHub issue as far as I could see. I don't know if there are technical reasons behind this or if it's just something that we haven't got right yet.

Steps to reproduce

  • Start a GoGoDuck aggregation where the geographic boundary will not cover any data
    (NB. The time range needs to be small enough that there aren't more files than GoGoDuck's configured limit)

What does happen

  • Both the status page and email acknowledge there was an error but give no more information (see below)
  • The report does not have any indication that there was an error or what it might have been (see below)

What should happen

  • We should indicate what the problem was (i.e. no data matched the subset criteria) either in the status page/email or in the report (or both)

More info

The status page message

Reason: Internal error running GoGoDuck, please refer to report for more information

The email message

...
Something has gone wrong with your download request. Please see the attached log for more details:
...

The report

Applying subset 'TIME,2014-06-01T00:30:00.000Z,2014-06-06T02:30:00.000999Z;LATITUDE,-0.9677734375,0.6142578125;LONGITUDE,113.6865234375,115.4443359375'
Processing file 'IMOS_ACORN_V_20140604T053000Z_ROT_FV01_1-hour-avg.nc'
...
Processing file 'IMOS_ACORN_V_20140605T213000Z_ROT_FV01_1-hour-avg.nc'

i.e. there is nothing to indicate that anything went wrong

Web-app functionality/workflow

Need some direction about the high level functionality/workflow of the gogoduck web-app.

This is what I've got in my mind, but please edit where appropriate:

  1. request comes in from Portal
  2. job is queued (only 'n' can be processed at a time??)
  3. email sent to user advising that a job has been created
  4. job is executed, with output stored on disk at a location given by a unique ID
  5. email sent to user advising that a job has completed (with a download link), or that a job has failed (and why)
  6. jobs older than 'n' days are deleted

Tmp dir is not being cleaned

The gogoduck tmp dir appears to never be emptied. This can cause the partition on which it resides to fill up.
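
A minimal sketch of a periodic cleanup, assuming a hypothetical tmp dir location and retention period; in the real app this could hang off the existing Quartz CleanupJob:

def tmpDir   = new File('/tmp/gogoduck')            // assumed location of the gogoduck tmp dir
def maxAgeMs = 7L * 24 * 60 * 60 * 1000             // assumed retention: 7 days
def cutoff   = System.currentTimeMillis() - maxAgeMs

// Delete anything in the tmp dir that has not been touched within the retention period
tmpDir.listFiles()?.each { entry ->
    if (entry.lastModified() < cutoff) {
        entry.isDirectory() ? entry.deleteDir() : entry.delete()
    }
}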

Emails need to have proper content

The current content of the emails is essentially placeholder text. Before release these emails need to be replaced with well-written messages that make it clear what has happened and what the user can expect next.

GoGoDuck does not give any specific error messages (e.g. too many files)

It looks like the gogoduck script does not write any output to stderr. This would mean that the current mechanism for getting human-readable error messages from the script into emails for the user is never used.

What does happen
All logging goes to stdout.
So the emails go out with a generic message rather than a message specific to what the problem was (in the cases where we want to do that).

What should happen
In the cases where there is a specific message we want to include with the failure email, we need to write that to stderr.

Can't choose geoserver for GoGoDuck operation

Since we have different environments, we should be able to pass a geoserver_address parameter (or something similar) for GoGoDuck's operation. This parameter should propagate from the portal all the way to GoGoDuck's core.

At the moment http://geoserver-123.aodn.org.au/geoserver is used. It can be the default.
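
A minimal sketch of the idea, assuming a per-job geoserver parameter (the exact name is up for discussion) with the current address as the fallback:

String geoserverFor(Map requestParams) {
    // Fall back to the address that is currently hard-coded when no override is given
    requestParams.geoserver ?: 'http://geoserver-123.aodn.org.au/geoserver'
}

assert geoserverFor([:]) == 'http://geoserver-123.aodn.org.au/geoserver'
assert geoserverFor([geoserver: 'http://geoserver.example.com/geoserver']) ==
       'http://geoserver.example.com/geoserver'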

Filename output

It would be good to create better filenames for the download of aggregated files.
Suggested filename:

IMOS_[geoserverView( trim url from the end) ]_START-[DateStart(gogoduckInput)]_END-[DateEnd(gogoduckOutput)].nc

@smancini any input ?
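
A minimal sketch of the suggested pattern (the exact timestamp format and the rule for trimming the layer name are assumptions):

String aggregationFilename(String layerName, String start, String end) {
    def view    = layerName.replaceAll('_url$', '')                   // "trim url from the end"
    def compact = { String iso -> iso.replaceAll('[-:]|\\.\\d+Z$', '') }
    "IMOS_${view}_START-${compact(start)}_END-${compact(end)}.nc"
}

assert aggregationFilename('acorn_hourly_avg_rot_qc_timeseries_url',
                           '2013-11-10T00:30:00.000Z',
                           '2013-11-20T10:30:00.000Z') ==
       'IMOS_acorn_hourly_avg_rot_qc_timeseries_START-20131110T003000_END-20131120T103000.nc'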

Gogoduck fails with CARS aggregation

Steps to reproduce:
Select layer CARS 2009 - Australian weekly sea water temperature, select the subset 18 Dec 2009, 22:09 UTC - 25 Dec 2009, 23:04 UTC.
Run gogoduck job.
Expected outcome: receive an aggregated netcdf file with no errors
Observed outcome: error, with no aggregation report received

Error emails contain no error message

I was using GoGoDuck from the RC Portal today. My jobs failed and I received an error email, but there was no error message included.

What should happen
An error message should have been included in the email.

Update repo description

Change from "NetCDF aggregation service" to "NetCDF subset and concatenation service". Go-go-duck is not aggregating anything, and since some are talking about a future
NetCDF aggregator it would be good to avoid any confusion.

No indication of error

I tried to download a dataset from portal-rc.aodn.org.au; the dataset was IMOS - SRS Satellite - SST L3S - 01 day mosaic - LOZ GoGoduck test. It was for 2 years of data and a very tiny bounding box.

What happened
The email only told me something had gone wrong and the aggregation report only showed this:

Applying subset 'TIME,2013-03-21T03:20:00.000Z,2015-01-28T03:20:00.000999Z;LATITUDE,-23.32,-23.05;LONGITUDE,113.64,113.86'
Processing file '20130322032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130323032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130324032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130325032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130326032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130327032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130328032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130329032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130330032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130331032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130401032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130402032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130403032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130404032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130405032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130406032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130407032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130408032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130409032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130410032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130411032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130412032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130413032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130414032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130415032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130416032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130417032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130418032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130419032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130420032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130421032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130422032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130423032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130424032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130425032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130426032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130427032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130428032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130429032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130430032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130501032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130502032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130503032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130504032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130505032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130506032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130507032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130508032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130509032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130510032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130511032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130512032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130513032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130514032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130515032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130516032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130517032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130518032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130519032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130520032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130521032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130522032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130523032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130524032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130525032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130526032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130527032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130528032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130529032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130530032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130531032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130601032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130602032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130603032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130604032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130605032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130606032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130607032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130608032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130609032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130610032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130611032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130612032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130613032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130614032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130615032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130616032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130617032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130618032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130619032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130620032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130621032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130622032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130623032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130624032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130625032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130626032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130627032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130628032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130629032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130630032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130701032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130702032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130703032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130704032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130705032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130706032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130707032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130708032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130709032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130710032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130711032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130712032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130713032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130714032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130715032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130716032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130717032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130718032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130719032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130720032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130721032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130722032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130723032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130724032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130725032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130726032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130727032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130728032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130729032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130730032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130731032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130801032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130802032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130803032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130804032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130805032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130806032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130807032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130808032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130809032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130810032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130811032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130812032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130813032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130814032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130815032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130816032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130817032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130818032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130819032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130820032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130821032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130822032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130823032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130824032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130825032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130826032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130827032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130828032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130829032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130830032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130831032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130901032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130902032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130903032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130904032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130905032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130906032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130907032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130908032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130909032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130910032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130911032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130912032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130913032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130914032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130915032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130916032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130917032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130918032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130919032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130920032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130921032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130922032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130923032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130924032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130925032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130926032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130927032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130928032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130929032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20130930032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20131001032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20131002032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20131003032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20131004032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20131005032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20131006032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20131007032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20131008032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20131009032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20131010032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20131011032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'
Processing file '20131012032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv02.0.nc'

Seb and I spoke to Dan and he indicated that it was set to cut out at 15 minutes to prevent anything ever getting stuck.

What should happen:

If it does try to aggregate for too long, the aggregation report should tell me that was the problem.

GoGoDuck metadata on subset/aggregation to be correct

From aodn/aodn-portal#1212.

What to do
In step 1, select a collection for which GoGoDuck is supported, such as CARS 2009 Australia sea water temperature. Filter on time and geographic extent. Download the result and inspect the global attributes time_coverage_start/end and geospatial_lat/lon_min/max against the actual TIME, LATITUDE and LONGITUDE values.

You can use :

ncdump -h yourFile.nc | grep coverage
ncdump -t -v DAY_OF_YEAR yourFile.nc

By the way, the format of time_coverage_start/end is "yyyy-mm-ddTHH:MM:SSZ" and does not need fractional seconds.

and

ncdump -h yourFile.nc | grep geospatial
ncdump -v LATITUDE yourFile.nc
ncdump -v LONGITUDE yourFile.nc

What to expect
The global attributes time_coverage_start/end and geospatial_lat/lon_min/max are consistent with the actual TIME, LATITUDE and LONGITUDE min and max values.

What happens
The global attributes time_coverage_start/end and geospatial_lat/lon_min/max are set from the values used in the filter request, not from the actual values retrieved by the filter.

For example, I once got:

:geospatial_lat_min = "-44.7373046875" ;
:geospatial_lat_max = "-39.7275390625" ;
:geospatial_lon_min = "144.0966796875" ;
:geospatial_lon_max = "149.1943359375" ;

while

LATITUDE = -44.5, -44, -43.5, -43, -42.5, -42, -41.5, -41, -40.5, -40 ;
LONGITUDE = 144.5, 145, 145.5, 146, 146.5, 147, 147.5, 148, 148.5, 149 ;

GoGoDuck gets stuck on "long" aggregations

From my humble observation, it looks like if the aggregation command output is too long, the following code gets jammed:

    Process execute(cmd) {
        log.info("Executing command: '${cmd}'")

        def proc = cmd.execute()
        proc.waitFor()

        return proc
    }

If I remove the logging from the gogoduck.sh executable, the aggregation works. The output buffer of the executed command needs to be drained while it runs.

An example of an aggregation that produces too much output:

{
  layerName: 'acorn_hourly_avg_rot_qc_timeseries_url',
  emailAddress: '[email protected]',
  subsetDescriptor: {
    temporalExtent: {
      start: "2013-11-10T00:30:00.000Z",
      end:   "2013-11-20T10:30:00.000Z",
    },
    spatialExtent: {
      north: '90.0',
      south: '-90.0',
      east:  '180.0',
      west:  '-180.0'
    }
  }
}

A proper fix would also include a time limit on aggregations, so that if gogoduck.sh gets stuck for any reason it will not jam the whole application.
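
A minimal sketch of both ideas, reusing the shape of the execute() helper quoted above; the one-hour limit is illustrative:

Process execute(List<String> cmd, long timeoutMillis = 60 * 60 * 1000) {
    def proc = cmd.execute()

    // Drain stdout/stderr in background threads so the child can never block
    // on a full pipe buffer, however much gogoduck.sh logs
    def out = new StringBuffer()
    def err = new StringBuffer()
    proc.consumeProcessOutput(out, err)

    // Never wait forever: kill the child if it exceeds the time limit
    proc.waitForOrKill(timeoutMillis)

    return proc
}

The captured output in out/err is still available afterwards for logging or for the aggregation report.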

SRS satellite special NetCDF file format

Context:

  • GoGoDuck is a temporal and spatial subsetting and aggregating tool, such that the result file needs to have its temporal and spatial global attributes updated according to what's actually found in the temporal and spatial coordinates.
  • The IMOS NetCDF conventions make the following temporal and spatial global attributes compulsory. These are basically the global attributes that GoGoDuck needs to update:
    • time_coverage_start / end
    • geospatial_lat_min / max
    • geospatial_lon_min / max
    • geospatial_vertical_min / max

Problem:
Satellite NetCDF files seem to include extra temporal and spatial global attributes such as:

  • start_time / stop_time
  • northernmost_latitude / southernmost_latitude
  • easternmost_longitude / westernmost_longitude

See example: http://thredds.aodn.org.au/thredds/dodsC/IMOS/SRS/SST/ghrsst/L3S-1d/dn/2016/20160804092000-ABOM-L3S_GHRSST-SSTfnd-AVHRR_D-1d_dn.nc.html .

What shall we do?
1- Enforce the IMOS NetCDF conventions and get rid of redundant entries in the original files when no strong reason applies:

start_time: 20160803T202039Z
time_coverage_start: 20160803T202039Z
stop_time: 20160804T234329Z
time_coverage_end: 20160804T234329Z

2- Do not include non IMOS compliant spatio-temporal global attributes in the file output by GoGoDuck (sketched below).
3- Update non IMOS compliant spatio-temporal global attributes just like the compliant ones in the file output by GoGoDuck.
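
If option 2 were chosen, a minimal sketch using ncatted in delete mode on the GoGoDuck output file; the attribute list is the one quoted above and the file name is illustrative:

// Sketch only: drop the non-IMOS duplicates from the aggregated output file
def nonImosAttributes = ['start_time', 'stop_time',
                         'northernmost_latitude', 'southernmost_latitude',
                         'easternmost_longitude', 'westernmost_longitude']

nonImosAttributes.each { att ->
    // ncatted mode 'd' deletes the attribute; 'global' targets global attributes
    ['ncatted', '-O', '-a', "${att},global,d,,", 'output.nc'].execute().waitFor()
}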

Job 'successful' but result file was empty

@pblain performed an aggregation. He got a success email but the result file was empty.

GoGoDuck log says:

2014-03-19 13:00:58,256 [quartzScheduler_Worker-7] INFO  au.org.emii.gogoduck.worker.Worker - Executing command: '/bin/bash /mnt/ebs/tomcat7/gogoduck/webapps/gogoduck/resources/worker/gogoduck.sh -p acorn_hourly_avg_turq_nonqc_timeseries_url -s TIME,2014-03-17T17:00:00.000Z,2014-03-18T18:00:00.000Z;LATITUDE,-31.630615234375,-31.388916015625;LONGITUDE,114.29077148436,114.68627929686 -o /mnt/ebs/tomcat7/gogoduck/data/gogoduck/11c6e20b/output.nc -l 100'
2014-03-19 13:00:58,952 [quartzScheduler_Worker-7] INFO  au.org.emii.gogoduck.worker.Worker - worker output: ESC[32m[Wed Mar 19 13:00:58 EST 2014] INFO: Using linking method, could find '' in its locationESC[0m
ESC[32m[Wed Mar 19 13:00:58 EST 2014] INFO: Applying profile '/mnt/ebs/tomcat7/gogoduck/webapps/gogoduck/resources/worker/profiles/default'ESC[0m
ESC[32m[Wed Mar 19 13:00:58 EST 2014] INFO: Applying subset '-d TIME,2014-03-17T17:00:00.000Z,2014-03-18T18:00:00.000Z -d LATITUDE,-31.630615234375,-31.388916015625 -d LONGITUDE,114.29077148436,114.68627929686' to '/tmp/tmp.wuUnkWC2De/*'ESC[0m
ERROR: nco__open() unable to open file "/tmp/tmp.NEJ6J5wUWC"
ERROR NC_ENOTNC Not a netCDF file
...

Params were:

{
   "subsetDescriptor": {
      "spatialExtent": {
         "south": "-31.630615234375",
         "north": "-31.388916015625",
         "east": "114.68627929686",
         "west": "114.29077148436"
      },
      "temporalExtent": {
         "start": "2014-03-17T17:00:00.000Z",
         "end": "2014-03-18T18:00:00.000Z"
      }
   },
   "createdTimestamp": "2014-03-19T13:00:58.195+11:00",
   "emailAddress": "[email protected]",
   "layerName": "acorn_hourly_avg_turq_nonqc_timeseries_url",
   "uuid": "11c6e20b"
}

Injection vulnerability

I think there is a potential injection vulnerability in GoGoDuck.

In the following code job.layerName is a string that comes from the request. I don't think that it is sanitised before we get here.

def cmdOptions = String.format(
    '-p %1s -s TIME,%2s,%3s;LATITUDE,%4s,%5s;LONGITUDE,%6s,%7s -o %8s -l %9$1s',
    job.layerName,
    job.subsetDescriptor.temporalExtent.start,
    job.subsetDescriptor.temporalExtent.end,
    job.subsetDescriptor.spatialExtent.south,
    job.subsetDescriptor.spatialExtent.north,
    job.subsetDescriptor.spatialExtent.west,
    job.subsetDescriptor.spatialExtent.east,
    outputFilename,
    fileLimit
)
shellCmd.call(cmdOptions)

What should happen
The layer name should be validated or sanitised so that no injection could be made to add/change the command that is executed.
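
A minimal sketch of the idea, not the real GoGoDuck code: whitelist the layer name and build the command as a list of discrete arguments so the shell never re-interprets any field. The whitelist pattern and the stand-in command are assumptions.

String sanitizeLayerName(String layerName) {
    // Layer names seen in this tracker are plain word characters; reject anything else
    if (!(layerName ==~ /[A-Za-z0-9_]+/)) {
        throw new IllegalArgumentException("Invalid layer name: ${layerName}")
    }
    layerName
}

def layerName      = 'acorn_hourly_avg_turq_nonqc_timeseries_url'
def subset         = 'TIME,2014-03-17T17:00:00.000Z,2014-03-18T18:00:00.000Z'
def outputFilename = '/tmp/output.nc'
def fileLimit      = 100

// Each value stays a single argv entry, so shell metacharacters in any field
// cannot add or change the command that is executed
def cmd = ['/bin/echo',                          // stand-in for the gogoduck.sh path
           '-p', sanitizeLayerName(layerName),
           '-s', subset,
           '-o', outputFilename,
           '-l', fileLimit.toString()]
println cmd.execute().text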

GoGoDuck should pack output files to reduce their size

Reported by @rogerpoctor

Selecting 3 files, as per the report below:

Aggregation will use only the following variables for srs: 'time,lat,lon,dt_analysis,l2p_flags,quality_level,satellite_zenith_angle,sea_surface_temperature,sses_bias,sses_count,sses_standard_deviation,sst_dtime,wind_speed,wind_speed_dtime_from_sst', any other variable will be omitted!!
Applying subset 'TIME,2015-10-29T03:20:00.000Z,2015-11-01T03:20:00.000999Z;LATITUDE,-90.0,90.0;LONGITUDE,-180.0,180.0'
Processing file '20151031032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv01.0.nc'
Processing file '20151030032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv01.0.nc'
Processing file '20151101032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv01.0.nc'
Your aggregation was successful!

results in a downloaded file size of 546 MB (573,041,346 bytes)
IMOS-Aggregation-20151105T091550.740+1100.nc

whereas on THREDDS the filesizes are:
20151101032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv01.0.nc 67.19 Mbytes 2015-11-04T02:35:35Z
20151031032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv01.0.nc 62.89 Mbytes 2015-11-03T02:38:30Z
20151030032000-ABOM-L3S_GHRSST-SSTskin-AVHRR_D-1d_day-v02.0-fv01.0.nc 57.43 Mbytes 2015-11-02T02:34:40Z

i.e. a total (67.19+62.89+57.43MB) of less than 190MB. Why the huge discrepancy?

@danfruehauf responded:

The SRS files are "packed" (ncpdq), the result file is not packed, hence the size difference.

So the request is that GoGoDuck packs the output files to reduce their size.
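
A minimal sketch of the requested step, assuming ncpdq (the NCO packing tool named above) is on the PATH; the file names are illustrative:

// ncpdq packs variables by default (short integers plus scale_factor/add_offset),
// which is the same packing the source SRS files already use
def unpacked = '/tmp/IMOS-Aggregation-unpacked.nc'
def packed   = '/tmp/IMOS-Aggregation.nc'

['ncpdq', '-O', unpacked, packed].execute().waitFor()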

No limit on aggregation size

I'm sure this has been discussed before; however, it just happened that the disk on 4-nsp-mel filled up because of a large aggregation, i.e.:

{
   "subsetDescriptor": {
      "spatialExtent": {
         "south": -90,
         "north": 90,
         "east": 180,
         "west": -180
      },
      "temporalExtent": {
         "start": "2014-01-01T03:20:00.000Z",
         "end": "2015-07-31T03:20:00.000999Z"
      }
   },
   "createdTimestamp": "2015-08-25T17:01:22.969+10:00",
   "status": "FAILED",
   "reason": "TIMEOUT_EXPIRED",
   "geoserver": "http://geoserver-123.aodn.org.au/geoserver",
   "emailAddress": scrubbed,
   "layerName": "srs_sst_l3c_1d_day_n19_gridded_url",
   "uuid": "f3bdbc46",
   "startedTimestamp": "2015-08-25T17:01:24.383+10:00",
   "finishedTimestamp": "2015-08-25T19:01:26.033+10:00"
}

Travis build fails

If the URL stays valid, here is the output:
https://travis-ci.org/aodn/go-go-duck/builds/52348208

If not, it failed on:

Using worker: worker-linux-bb3654bb-2.bb.travis-ci.org:travis-linux-13
system_info
Build system information
Build language: groovy
Build image provisioning date and time
Wed Feb 4 18:22:50 UTC 2015
Operating System Details
Distributor ID: Ubuntu
Description: Ubuntu 12.04 LTS
Release: 12.04
Codename: precise
Linux Version
2.6.32-042stab090.5
Cookbooks Version
23bb455 https://github.com/travis-ci/travis-cookbooks/tree/23bb455
GCC version
gcc (Ubuntu/Linaro 4.6.3-1ubuntu5) 4.6.3
Copyright (C) 2011 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
LLVM version
clang version 3.4 (tags/RELEASE_34/final)
Target: x86_64-unknown-linux-gnu
Thread model: posix
Pre-installed Ruby versions
ruby-1.9.3-p551
Pre-installed Node.js versions
v0.10.36
Pre-installed Go versions
1.4.1
Redis version
redis-server 2.8.19
riak version
2.0.2
MongoDB version
MongoDB 2.4.12
CouchDB version
couchdb 1.6.1
Neo4j version
1.9.4
Cassandra version
2.0.9
RabbitMQ Version
3.4.3
ElasticSearch version
1.4.0
Installed Sphinx versions
2.0.10
2.1.9
2.2.6
Default Sphinx version
2.2.6
Installed Firefox version
firefox 31.0esr
PhantomJS version
1.9.8
ant -version
Apache Ant(TM) version 1.8.2 compiled on December 3 2011
mvn -version
Apache Maven 3.2.5 (12a6b3acb947671f09b81f49094c53f426d8cea1; 2014-12-14T17:29:23+00:00)
Maven home: /usr/local/maven
Java version: 1.7.0_76, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-7-oracle/jre
Default locale: en, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-042stab090.5", arch: "amd64", family: "unix"
git.checkout
0.53s$ git clone --depth=50 --branch=srs_oc git://github.com/aodn/go-go-duck.git aodn/go-go-duck
Cloning into 'aodn/go-go-duck'...
remote: Counting objects: 1313, done.
remote: Compressing objects: 100% (407/407), done.
remote: Total 1313 (delta 493), reused 1245 (delta 443), pack-reused 0
Receiving objects: 100% (1313/1313), 224.25 KiB | 0 bytes/s, done.
Resolving deltas: 100% (493/493), done.
Checking connectivity... done.
$ cd aodn/go-go-duck
$ git checkout -qf 50f36313394a95b270a48133f12bf3f79426e523
Setting environment variables from .travis.yml
$ export TZ="Australia/Hobart"
$ export JAVA_OPTS="-Xmx1G -Xms512m -XX:PermSize=64m -XX:MaxPermSize=512m"
$ jdk_switcher use openjdk7
Switching to OpenJDK7 (java-1.7.0-openjdk-amd64), JAVA_HOME will be set to /usr/lib/jvm/java-7-openjdk-amd64
$ java -version
java version "1.7.0_75"
OpenJDK Runtime Environment (IcedTea 2.5.4) (7u75-2.5.4-1~precise1)
OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)
$ javac -version
javac 1.7.0_75
before_install.1
11.48s$ sudo apt-get install tzdata
Reading package lists... Done
Building dependency tree
Reading state information... Done
tzdata is already the newest version.
The following package was automatically installed and is no longer required:
libgeos-3.2.2
Use 'apt-get autoremove' to remove them.
0 upgraded, 0 newly installed, 0 to remove and 119 not upgraded.
before_install.2
0.95s$ sudo dpkg-reconfigure tzdata
Current default time zone: 'Etc/UTC'
Local time is now: Thu Feb 26 22:56:45 UTC 2015.
Universal Time is now: Thu Feb 26 22:56:45 UTC 2015.
before_install.3
1.58s$ sudo add-apt-repository -y ppa:groovy-dev/grails
Executing: gpg --ignore-time-conflict --no-options --no-default-keyring --secret-keyring /tmp/tmp.Nht4q6qBb9 --trustdb-name /etc/apt/trustdb.gpg --keyring /etc/apt/trusted.gpg --primary-keyring /etc/apt/trusted.gpg --keyring /etc/apt/trusted.gpg.d//apt.postgresql.org.gpg --keyserver hkp://keyserver.ubuntu.com:80/ --recv 53D38EDBBF34743E0CDBD834E354FBD902A9EC29
gpg: requesting key 02A9EC29 from hkp server keyserver.ubuntu.com
gpg: key 02A9EC29: public key "Launchpad PPA for Groovy Developers" imported
gpg: Total number processed: 1
gpg: imported: 1 (RSA: 1)
before_install.4
8.34s$ sudo apt-get update
Get:1 http://downloads-distro.mongodb.org dist Release.gpg [490 B]
Get:2 http://downloads-distro.mongodb.org dist Release [2,042 B]
Get:3 http://security.ubuntu.com precise-security Release.gpg [198 B]
Get:4 http://downloads-distro.mongodb.org dist/10gen amd64 Packages [27.4 kB]
Hit http://us.archive.ubuntu.com precise Release.gpg
Get:5 http://us.archive.ubuntu.com precise-updates Release.gpg [198 B]
Get:6 http://us.archive.ubuntu.com precise-backports Release.gpg [198 B]
Get:7 http://security.ubuntu.com precise-security Release [53.0 kB]
Get:8 http://downloads-distro.mongodb.org dist/10gen i386 Packages [27.1 kB]
Hit http://us.archive.ubuntu.com precise Release
Get:9 http://us.archive.ubuntu.com precise-updates Release [194 kB]
Ign http://downloads-distro.mongodb.org dist/10gen TranslationIndex
Hit http://apt.postgresql.org precise-pgdg Release.gpg
Get:10 http://security.ubuntu.com precise-security/main Sources [124 kB]
Ign http://downloads-distro.mongodb.org dist/10gen Translation-en_US
Ign http://downloads-distro.mongodb.org dist/10gen Translation-en
Get:11 http://www.rabbitmq.com testing Release.gpg [198 B]
Get:12 http://us.archive.ubuntu.com precise-backports Release [53.1 kB]
Hit http://ppa.launchpad.net precise Release.gpg
Hit http://ppa.launchpad.net precise Release.gpg
Get:13 http://ppa.launchpad.net precise Release.gpg [316 B]
Hit http://ppa.launchpad.net precise Release.gpg
Get:14 http://ppa.launchpad.net precise Release.gpg [316 B]
Hit http://ppa.launchpad.net precise Release.gpg
Get:15 http://ppa.launchpad.net precise Release.gpg [316 B]
Get:16 http://security.ubuntu.com precise-security/restricted Sources [3,759 B]
Get:17 http://security.ubuntu.com precise-security/universe Sources [33.8 kB]
Get:18 http://security.ubuntu.com precise-security/multiverse Sources [1,819 B]
Get:19 http://security.ubuntu.com precise-security/main amd64 Packages [486 kB]
Hit http://us.archive.ubuntu.com precise/main Sources
Hit http://us.archive.ubuntu.com precise/restricted Sources
Hit http://us.archive.ubuntu.com precise/universe Sources
Hit http://us.archive.ubuntu.com precise/multiverse Sources
Hit http://us.archive.ubuntu.com precise/main amd64 Packages
Hit http://us.archive.ubuntu.com precise/restricted amd64 Packages
Hit http://us.archive.ubuntu.com precise/universe amd64 Packages
Hit http://us.archive.ubuntu.com precise/multiverse amd64 Packages
Hit http://us.archive.ubuntu.com precise/main i386 Packages
Hit http://apt.postgresql.org precise-pgdg Release
Hit http://us.archive.ubuntu.com precise/restricted i386 Packages
Hit http://us.archive.ubuntu.com precise/universe i386 Packages
Hit http://us.archive.ubuntu.com precise/multiverse i386 Packages
Hit http://us.archive.ubuntu.com precise/main TranslationIndex
Hit http://us.archive.ubuntu.com precise/multiverse TranslationIndex
Hit http://us.archive.ubuntu.com precise/restricted TranslationIndex
Hit http://us.archive.ubuntu.com precise/universe TranslationIndex
Get:20 http://us.archive.ubuntu.com precise-updates/main Sources [485 kB]
Get:21 http://www.rabbitmq.com testing Release [17.6 kB]
Hit http://ppa.launchpad.net precise Release
Get:22 http://us.archive.ubuntu.com precise-updates/restricted Sources [7,981 B]
Hit http://apt.postgresql.org precise-pgdg/main amd64 Packages
Get:23 http://us.archive.ubuntu.com precise-updates/universe Sources [112 kB]
Get:24 http://security.ubuntu.com precise-security/restricted amd64 Packages [8,943 B]
Get:25 http://security.ubuntu.com precise-security/universe amd64 Packages [107 kB]
Get:26 http://us.archive.ubuntu.com precise-updates/multiverse Sources [9,390 B]
Get:27 http://us.archive.ubuntu.com precise-updates/main amd64 Packages [877 kB]
Get:28 http://security.ubuntu.com precise-security/multiverse amd64 Packages [2,460 B]
Get:29 http://security.ubuntu.com precise-security/main i386 Packages [527 kB]
Hit http://ppa.launchpad.net precise Release
Get:30 http://ppa.launchpad.net precise Release [11.9 kB]
Hit http://ppa.launchpad.net precise Release
Get:31 http://us.archive.ubuntu.com precise-updates/restricted amd64 Packages [13.6 kB]
Get:32 http://us.archive.ubuntu.com precise-updates/universe amd64 Packages [253 kB]
Get:33 http://security.ubuntu.com precise-security/restricted i386 Packages [8,939 B]
Get:34 http://security.ubuntu.com precise-security/universe i386 Packages [114 kB]
Hit http://apt.postgresql.org precise-pgdg/main i386 Packages
Get:35 http://security.ubuntu.com precise-security/multiverse i386 Packages [2,651 B]
Get:36 http://security.ubuntu.com precise-security/main TranslationIndex [208 B]
Get:37 http://security.ubuntu.com precise-security/multiverse TranslationIndex [199 B]
Get:38 http://security.ubuntu.com precise-security/restricted TranslationIndex [202 B]
Get:39 http://security.ubuntu.com precise-security/universe TranslationIndex [205 B]
Get:40 http://us.archive.ubuntu.com precise-updates/multiverse amd64 Packages [16.4 kB]
Get:41 http://us.archive.ubuntu.com precise-updates/main i386 Packages [914 kB]
Get:42 http://security.ubuntu.com precise-security/main Translation-en [216 kB]
Get:43 http://ppa.launchpad.net precise Release [12.9 kB]
Hit http://ppa.launchpad.net precise Release
Get:44 http://ppa.launchpad.net precise Release [13.0 kB]
Hit http://security.ubuntu.com precise-security/multiverse Translation-en
Hit http://security.ubuntu.com precise-security/restricted Translation-en
Get:45 http://us.archive.ubuntu.com precise-updates/restricted i386 Packages [13.6 kB]
Get:46 http://us.archive.ubuntu.com precise-updates/universe i386 Packages [262 kB]
Get:47 http://security.ubuntu.com precise-security/universe Translation-en [65.3 kB]
Get:48 http://us.archive.ubuntu.com precise-updates/multiverse i386 Packages [16.6 kB]
Get:49 http://us.archive.ubuntu.com precise-updates/main TranslationIndex [10.6 kB]
Get:50 http://us.archive.ubuntu.com precise-updates/multiverse TranslationIndex [7,613 B]
Get:51 http://us.archive.ubuntu.com precise-updates/restricted TranslationIndex [7,297 B]
Get:52 http://us.archive.ubuntu.com precise-updates/universe TranslationIndex [8,333 B]
Get:53 http://us.archive.ubuntu.com precise-backports/main Sources [5,411 B]
Get:54 http://us.archive.ubuntu.com precise-backports/restricted Sources [28 B]
Get:55 http://us.archive.ubuntu.com precise-backports/universe Sources [41.3 kB]
Get:56 http://www.rabbitmq.com testing/main amd64 Packages [495 B]
Get:57 http://us.archive.ubuntu.com precise-backports/multiverse Sources [5,750 B]
Get:58 http://us.archive.ubuntu.com precise-backports/main amd64 Packages [5,491 B]
Get:59 http://us.archive.ubuntu.com precise-backports/restricted amd64 Packages [28 B]
Get:60 http://us.archive.ubuntu.com precise-backports/universe amd64 Packages [43.4 kB]
Ign http://apt.postgresql.org precise-pgdg/main TranslationIndex
Get:61 http://us.archive.ubuntu.com precise-backports/multiverse amd64 Packages [5,419 B]
Get:62 http://us.archive.ubuntu.com precise-backports/main i386 Packages [5,484 B]
Get:63 http://us.archive.ubuntu.com precise-backports/restricted i386 Packages [28 B]
Get:64 http://us.archive.ubuntu.com precise-backports/universe i386 Packages [43.2 kB]
Hit http://ppa.launchpad.net precise/main amd64 Packages
Hit http://ppa.launchpad.net precise/main i386 Packages
Hit http://ppa.launchpad.net precise/main TranslationIndex
Get:65 http://us.archive.ubuntu.com precise-backports/multiverse i386 Packages [5,413 B]
Get:66 http://us.archive.ubuntu.com precise-backports/main TranslationIndex [202 B]
Get:67 http://us.archive.ubuntu.com precise-backports/multiverse TranslationIndex [202 B]
Get:68 http://us.archive.ubuntu.com precise-backports/restricted TranslationIndex [193 B]
Get:69 http://us.archive.ubuntu.com precise-backports/universe TranslationIndex [205 B]
Hit http://us.archive.ubuntu.com precise/main Translation-en
Hit http://us.archive.ubuntu.com precise/multiverse Translation-en
Hit http://us.archive.ubuntu.com precise/restricted Translation-en
Hit http://us.archive.ubuntu.com precise/universe Translation-en
Get:70 http://us.archive.ubuntu.com precise-updates/main Translation-en [385 kB]
Get:71 http://www.rabbitmq.com testing/main i386 Packages [495 B]
Ign http://www.rabbitmq.com testing/main TranslationIndex
Get:72 http://us.archive.ubuntu.com precise-updates/multiverse Translation-en [9,533 B]
Hit http://us.archive.ubuntu.com precise-updates/restricted Translation-en
Get:73 http://us.archive.ubuntu.com precise-updates/universe Translation-en [148 kB]
Hit http://us.archive.ubuntu.com precise-backports/main Translation-en
Hit http://us.archive.ubuntu.com precise-backports/multiverse Translation-en
Hit http://us.archive.ubuntu.com precise-backports/restricted Translation-en
Hit http://us.archive.ubuntu.com precise-backports/universe Translation-en
Hit http://ppa.launchpad.net precise/main amd64 Packages
Hit http://ppa.launchpad.net precise/main i386 Packages
Hit http://ppa.launchpad.net precise/main TranslationIndex
Get:74 http://ppa.launchpad.net precise/main Sources [3,658 B]
Get:75 http://ppa.launchpad.net precise/main amd64 Packages [2,477 B]
Get:76 http://ppa.launchpad.net precise/main i386 Packages [2,477 B]
Ign http://ppa.launchpad.net precise/main TranslationIndex
Hit http://ppa.launchpad.net precise/main amd64 Packages
Hit http://ppa.launchpad.net precise/main i386 Packages
Hit http://ppa.launchpad.net precise/main TranslationIndex
Get:77 http://ppa.launchpad.net precise/main amd64 Packages [1,398 B]
Get:78 http://ppa.launchpad.net precise/main i386 Packages [943 B]
Get:79 http://ppa.launchpad.net precise/main TranslationIndex [199 B]
Hit http://ppa.launchpad.net precise/main amd64 Packages
Hit http://ppa.launchpad.net precise/main i386 Packages
Hit http://ppa.launchpad.net precise/main TranslationIndex
Hit http://ppa.launchpad.net precise/main Translation-en
Get:80 http://ppa.launchpad.net precise/main amd64 Packages [3,383 B]
Get:81 http://ppa.launchpad.net precise/main i386 Packages [3,383 B]
Get:82 http://ppa.launchpad.net precise/main TranslationIndex [199 B]
Hit http://ppa.launchpad.net precise/main Translation-en
Hit http://ppa.launchpad.net precise/main Translation-en
Get:83 http://ppa.launchpad.net precise/main Translation-en [735 B]
Hit http://ppa.launchpad.net precise/main Translation-en
Get:84 http://ppa.launchpad.net precise/main Translation-en [1,556 B]
Ign http://apt.postgresql.org precise-pgdg/main Translation-en_US
Ign http://apt.postgresql.org precise-pgdg/main Translation-en
Ign http://www.rabbitmq.com testing/main Translation-en_US
Ign http://www.rabbitmq.com testing/main Translation-en
Ign http://ppa.launchpad.net precise/main Translation-en_US
Ign http://ppa.launchpad.net precise/main Translation-en
Fetched 5,857 kB in 4s (1,400 kB/s)
Reading package lists... Done
before_install.5
12.84s$ sudo apt-get install grails-2.2.0
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following package was automatically installed and is no longer required:
libgeos-3.2.2
Use 'apt-get autoremove' to remove them.
The following NEW packages will be installed:
grails-2.2.0
0 upgraded, 1 newly installed, 0 to remove and 148 not upgraded.
Need to get 116 MB of archives.
After this operation, 151 MB of additional disk space will be used.
Get:1 http://ppa.launchpad.net/groovy-dev/grails/ubuntu/ precise/main grails-2.2.0 all 1.0-1ubuntu0 [116 MB]
Fetched 116 MB in 8s (14.1 MB/s)
Selecting previously unselected package grails-2.2.0.
(Reading database ... 73599 files and directories currently installed.)
Unpacking grails-2.2.0 (from .../grails-2.2.0_1.0-1ubuntu0_all.deb) ...
Setting up grails-2.2.0 (1.0-1ubuntu0) ...
update-alternatives: using /usr/share/grails/2.2.0/bin/grails to provide /usr/bin/grails (grails) in auto mode.
16.15s$ grails refresh-dependencies && grails test-app
| Loading Grails 2.2.0
| Configuring classpath
| Downloading: ivy-1.0.1.RELEASE.xml
| Downloading: ivy-2.4.1.xml
| Downloading: ivy-1.1.xml
| Downloading: ivy-1.45.xml
| Downloading: ivy-1.45.xml
| Downloading: ivy-1.0.xml
| Downloading: ivy-2.2.0.xml
| Downloading: ivy-1.2.1.xml
| Downloading: ivy-3.2.3.xml
| Downloading: ivy-2.7.1.xml
| Downloading: ivy-1.6.2.xml
| Downloading: ivy-1.6.2.xml
| Downloading: ivy-1.8.2.xml
| Downloading: ivy-1.8.2.xml
| Downloading: ivy-1.8.2.xml
| Downloading: ivy-1.7.1.xml
| Downloading: tomcat-2.2.0.pom
| Downloading: ivy-7.0.30.xml
| Downloading: ivy-7.0.30.xml
| Downloading: ivy-7.0.30.xml
| Downloading: ivy-3.7.2.xml
| Downloading: ivy-7.0.30.xml
| Downloading: ivy-7.0.30.xml
| Downloading: org.springframework.uaa.client-1.0.1.RELEASE.jar
| Downloading: protobuf-java-2.4.1.jar
| Downloading: json-simple-1.1.jar
| Downloading: bcpg-jdk15-1.45.jar
| Downloading: bcprov-jdk15-1.45.jar
| Downloading: jline-1.0.jar
| Downloading: ivy-2.2.0.jar
| Downloading: jansi-1.2.1.jar
| Downloading: jna-3.2.3.jar
| Downloading: serializer-2.7.1.jar
| Downloading: grails-docs-2.2.0.jar
| Downloading: grails-bootstrap-2.2.0.jar
| Downloading: grails-scripts-2.2.0.jar
| Downloading: slf4j-api-1.6.2.jar
| Downloading: jcl-over-slf4j-1.6.2.jar
| Downloading: ant-1.8.2.jar
| Downloading: ant-launcher-1.8.2.jar
| Downloading: ant-junit-1.8.2.jar
| Downloading: ant-trax-1.7.1.jar
| Downloading: tomcat-2.2.0.zip
| Downloading: tomcat-embed-core-7.0.30.jar
| Downloading: grails-plugin-tomcat-2.2.0.jar
| Downloading: tomcat-catalina-ant-7.0.30.jar
| Downloading: tomcat-embed-jasper-7.0.30.jar
| Downloading: ecj-3.7.2.jar
| Downloading: tomcat-embed-logging-juli-7.0.30.jar
| Downloading: tomcat-embed-logging-log4j-7.0.30.jar
| Downloading: ivy-2.0.5.xml
| Downloading: ivy-1.8.3.xml
| Downloading: ivy-1.0.xml
| Downloading: ivy-1.3.1.xml
| Downloading: ivy-1.0.xml
| Downloading: ivy-1.2_jdk5.xml
| Downloading: ivy-1.5.xml
| Downloading: ivy-3.2.1.xml
| Downloading: ivy-2.1.xml
| Downloading: ivy-2.6.xml
| Downloading: ivy-1.1.xml
| Downloading: ivy-1.0.1.Final.xml
| Downloading: ivy-2.4.xml
| Downloading: ivy-3.1.2.RELEASE.xml
| Downloading: ivy-3.1.2.RELEASE.xml
| Downloading: ivy-3.1.2.RELEASE.xml
| Downloading: ivy-3.1.2.RELEASE.xml
| Downloading: ivy-3.1.2.RELEASE.xml
| Downloading: ivy-3.1.2.RELEASE.xml
| Downloading: ivy-3.1.2.RELEASE.xml
| Downloading: ivy-3.1.2.RELEASE.xml
| Downloading: ivy-3.1.2.RELEASE.xml
| Downloading: ivy-3.1.2.RELEASE.xml
| Downloading: ivy-3.1.2.RELEASE.xml
| Downloading: ivy-3.1.2.RELEASE.xml
| Downloading: ivy-3.1.2.RELEASE.xml
| Downloading: ivy-3.1.2.RELEASE.xml
| Downloading: ivy-1.1.2.RELEASE.xml
| Downloading: ivy-1.1.2.RELEASE.xml
| Downloading: ivy-1.1.2.RELEASE.xml
| Downloading: cache-1.0.1.pom
| Downloading: webxml-1.4.1.pom
| Downloading: ivy-1.6.10.xml
| Downloading: ivy-1.6.10.xml
| Downloading: ivy-2.2.xml
| Downloading: ivy-3.1.xml
| Downloading: ivy-1.2.2.xml
| Downloading: ivy-2.0.8.xml
| Downloading: ivy-1.4.xml
| Downloading: ivy-1.5.6.xml
| Downloading: ivy-1.3.164.xml
| Downloading: ivy-1.1.2.xml
| Downloading: ivy-1.1.4c.xml
| Downloading: ivy-2.4.6.xml
| Downloading: ivy-1.2.16.xml
| Downloading: ivy-1.6.2.xml
| Downloading: database-migration-1.2.1.pom
| Downloading: ivy-2.0.5.xml
| Downloading: hibernate-2.2.0.pom
| Downloading: ivy-3.2.0.Final.xml
| Downloading: ivy-3.6.10.Final.xml
| Downloading: ivy-4.1.0.Final.xml
| Downloading: ivy-1.0.0.GA.xml
| Downloading: ivy-3.6.10.Final.xml
| Downloading: ivy-1.6.1.xml
| Downloading: ivy-2.7.7.xml
| Downloading: ivy-3.16.1-GA.xml
| Downloading: jquery-1.8.3.pom
| Downloading: resources-1.1.6.pom
| Downloading: ivy-4.10.xml
| Downloading: ivy-3.1.2.RELEASE.xml
| Downloading: spock-grails-support-0.7-groovy-2.0.pom
| Downloading: spock-grails-support-0.7-groovy-2.0.pom.sha1
| Downloading: spock-core-0.7-groovy-2.0.pom
| Downloading: spock-core-0.7-groovy-2.0.pom.sha1
| Downloading: junit-dep-4.10.pom
| Downloading: junit-dep-4.10.pom.sha1
| Downloading: hamcrest-core-1.3.pom
| Downloading: hamcrest-core-1.3.pom.sha1
| Downloading: hamcrest-parent-1.3.pom
| Downloading: hamcrest-parent-1.3.pom.sha1
| Downloading: groovy-all-2.0.5.jar
| Downloading: commons-beanutils-1.8.3.jar
| Downloading: commons-el-1.0.jar
| Downloading: commons-validator-1.3.1.jar
| Downloading: aopalliance-1.0.jar
| Downloading: concurrentlinkedhashmap-lru-1.2_jdk5.jar
| Downloading: commons-codec-1.5.jar
| Downloading: commons-collections-3.2.1.jar
| Downloading: commons-io-2.1.jar
| Downloading: commons-lang-2.6.jar
| Downloading: jta-1.1.jar
| Downloading: hibernate-jpa-2.0-api-1.0.1.Final.jar
| Downloading: sitemesh-2.4.jar
| Downloading: grails-core-2.2.0.jar
| Downloading: grails-crud-2.2.0.jar
| Downloading: grails-hibernate-2.2.0.jar
| Downloading: grails-resources-2.2.0.jar
| Downloading: grails-spring-2.2.0.jar
| Downloading: grails-web-2.2.0.jar
| Downloading: grails-logging-2.2.0.jar
| Downloading: grails-plugin-codecs-2.2.0.jar
| Downloading: grails-plugin-controllers-2.2.0.jar
| Downloading: grails-plugin-domain-class-2.2.0.jar
| Downloading: grails-plugin-converters-2.2.0.jar
| Downloading: grails-plugin-datasource-2.2.0.jar
| Downloading: grails-plugin-filters-2.2.0.jar
| Downloading: grails-plugin-gsp-2.2.0.jar
| Downloading: grails-plugin-i18n-2.2.0.jar
| Downloading: grails-plugin-log4j-2.2.0.jar
| Downloading: grails-plugin-scaffolding-2.2.0.jar
| Downloading: grails-plugin-services-2.2.0.jar
| Downloading: grails-plugin-servlets-2.2.0.jar
| Downloading: grails-plugin-mimetypes-2.2.0.jar
| Downloading: grails-plugin-url-mappings-2.2.0.jar
| Downloading: grails-plugin-validation-2.2.0.jar
| Downloading: spring-core-3.1.2.RELEASE.jar
| Downloading: spring-aop-3.1.2.RELEASE.jar
| Downloading: spring-aspects-3.1.2.RELEASE.jar
| Downloading: spring-asm-3.1.2.RELEASE.jar
| Downloading: spring-beans-3.1.2.RELEASE.jar
| Downloading: spring-context-3.1.2.RELEASE.jar
| Downloading: spring-context-support-3.1.2.RELEASE.jar
| Downloading: spring-expression-3.1.2.RELEASE.jar
| Downloading: spring-jdbc-3.1.2.RELEASE.jar
| Downloading: spring-jms-3.1.2.RELEASE.jar
| Downloading: spring-orm-3.1.2.RELEASE.jar
| Downloading: spring-tx-3.1.2.RELEASE.jar
| Downloading: spring-web-3.1.2.RELEASE.jar
| Downloading: spring-webmvc-3.1.2.RELEASE.jar
| Downloading: grails-datastore-gorm-1.1.2.RELEASE.jar
| Downloading: grails-datastore-core-1.1.2.RELEASE.jar
| Downloading: grails-datastore-simple-1.1.2.RELEASE.jar
| Downloading: cache-1.0.1.zip
| Downloading: webxml-1.4.1.zip
| Downloading: aspectjweaver-1.6.10.jar
| Downloading: aspectjrt-1.6.10.jar
| Downloading: cglib-2.2.jar
| Downloading: asm-3.1.jar
| Downloading: commons-fileupload-1.2.2.jar
| Downloading: oro-2.0.8.jar
| Downloading: commons-dbcp-1.4.jar
| Downloading: commons-pool-1.5.6.jar
| Downloading: h2-1.3.164.jar
| Downloading: jstl-1.1.2.jar
| Downloading: xpp3_min-1.1.4c.jar
| Downloading: ehcache-core-2.4.6.jar
| Downloading: log4j-1.2.16.jar
| Downloading: jul-to-slf4j-1.6.2.jar
| Downloading: database-migration-1.2.1.zip
| Downloading: hibernate-2.2.0.zip
| Downloading: jquery-1.8.3.zip
| Downloading: resources-1.1.6.zip
| Downloading: liquibase-core-2.0.5.jar
| Downloading: hibernate-commons-annotations-3.2.0.Final.jar
| Downloading: hibernate-core-3.6.10.Final.jar
| Downloading: hibernate-validator-4.1.0.Final.jar
| Downloading: validation-api-1.0.0.GA.jar
| Downloading: hibernate-ehcache-3.6.10.Final.jar
| Downloading: dom4j-1.6.1.jar
| Downloading: antlr-2.7.7.jar
| Downloading: javassist-3.16.1-GA.jar
| Downloading: junit-4.10.jar
| Downloading: grails-plugin-testing-2.2.0.jar
| Downloading: grails-test-2.2.0.jar
| Downloading: spring-test-3.1.2.RELEASE.jar
| Downloading: spock-grails-support-0.7-groovy-2.0.jar
| Downloading: spock-grails-support-0.7-groovy-2.0.jar.sha1
| Downloading: spock-core-0.7-groovy-2.0.jar
| Downloading: spock-core-0.7-groovy-2.0.jar.sha1
| Downloading: junit-dep-4.10.jar
| Downloading: junit-dep-4.10.jar.sha1
| Downloading: hamcrest-core-1.3.jar
| Downloading: hamcrest-core-1.3.jar.sha1
| Error Failed to resolve dependencies (Set log level to 'warn' in BuildConfig.groovy for more information):
- org.grails.plugins:joda-time:1.4
- org.grails.plugins:mail:1.0.1
- org.grails.plugins:quartz:1.0.1
The command "grails refresh-dependencies && grails test-app" exited with 1.
Done. Your build exited with 1.
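
The three unresolved artifacts are all Grails plugins, so a plausible cause is that they are not declared (or the Grails plugin repository is not enabled) in BuildConfig.groovy. Below is a minimal sketch of the relevant sections, assuming the fix is simply to declare grailsCentral() and list the three plugins; the scopes shown are assumptions, not taken from the repository:

// grails-app/conf/BuildConfig.groovy -- a minimal sketch only, not the project's actual file
grails.project.dependency.resolution = {
    inherits("global")
    log "warn"               // as the error message suggests, for more resolution detail

    repositories {
        inherits true
        grailsCentral()      // hosts the Grails plugin artifacts listed below
        mavenCentral()
    }

    plugins {
        // the three plugins Ivy failed to resolve in the Travis log above
        compile ":joda-time:1.4"
        compile ":mail:1.0.1"
        compile ":quartz:1.0.1"
    }
}

If the plugins are already declared, the repositories block above is still where an alternative plugin repository would be added should the default one become unreachable.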

OutOfMemoryException when downloading large file

Steps to reproduce
Create an aggregation which will result in a large file (problem spotted on a 1.9GB file)
Try to download the file using the GoGoDuck webapp

What should happen
The file downloads successfully

What does happen
The GoGoDuck webapp throws an OutOfMemoryException (seen in the log)
The download fails
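
A 1.9GB file cannot be buffered in the JVM heap, so a likely culprit is the download action reading the whole aggregated file into memory (for example via file.bytes) before writing it to the response. A minimal Groovy sketch of a streaming alternative follows; the controller name, service helper and content type are hypothetical, chosen only to illustrate the chunked copy:

// Hypothetical download action -- a sketch of streaming rather than buffering,
// not the webapp's actual controller code.
class AggregationDownloadController {

    def jobStoreService   // assumed to know where a job's aggregated file lives

    def download(String jobId) {
        File aggregatedFile = jobStoreService.getAggregatedFile(jobId)   // hypothetical helper

        response.contentType = 'application/x-netcdf'
        response.setHeader('Content-Disposition', "attachment; filename=\"${aggregatedFile.name}\"")
        response.setHeader('Content-Length', aggregatedFile.length().toString())

        // OutputStream << InputStream copies in fixed-size chunks, so heap usage stays
        // constant regardless of file size, unlike rendering aggregatedFile.bytes.
        aggregatedFile.withInputStream { input ->
            response.outputStream << input
        }
        response.outputStream.flush()
    }
}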

Job doesn't progress in queue?

A couple of days ago I tried to use the portal to extract 10 years' worth of satellite SST data for a smallish box near Tas. Maybe it's too big a request, but as far as I can tell it hasn't even tried to do it yet. According to the status page it has been at position 18 in the queue ever since I sent the request:

Job Id: 6cf2cd26
Submitted: May 22, 2016 5:00:12 PM
Status: New
Position in Queue: 18

Is this currently the expected behaviour? In an ideal world, what I would expect is:

  • The job to advance in the queue (at least within a day of the request being sent), OR
  • To be told straight away that I've requested too much data (see the sketch after this list).
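
The second expectation (being told straight away that a request is too large) could be met with a rough size estimate at submission time, before the job ever enters the queue. A minimal sketch, assuming a simple cells-times-timesteps estimate; the class name, the 2GB cap and the 4-byte cell size are all assumptions for illustration:

// Hypothetical pre-submission size check -- a sketch only, not GoGoDuck's actual validation.
class JobSizeEstimator {

    static final long MAX_BYTES = 2L * 1024 * 1024 * 1024   // arbitrary 2GB cap

    static boolean tooLarge(long latCells, long lonCells, long timeSteps,
                            int variables, int bytesPerValue = 4) {
        long estimate = latCells * lonCells * timeSteps * variables * bytesPerValue
        return estimate > MAX_BYTES
    }
}

// A 10-year daily request over a small box, one variable:
assert !JobSizeEstimator.tooLarge(100, 100, 3650, 1)      // ~146 MB, accept and queue
assert  JobSizeEstimator.tooLarge(2000, 2000, 3650, 1)    // ~58 GB, reject at submission

A job rejected here would produce an immediate error in the portal instead of sitting at a fixed queue position.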

Aggregation fails when no geographic area supplied

Steps to reproduce

In the Portal, load an ACORN layer
Do not draw a bounding box or polygon
Change the time range to a short period (around 3 days)
Go to step 3 (Download) of the Portal workflow
Choose to download an aggregation

What should happen

Your aggregation should run using the whole world bounding box (-90, 90, -180, 180)

What does happen

The aggregation fails and you get an email like this:

Job failure: e7a6692d, error message: getopt: invalid option -- '9'
getopt: invalid option -- '0'
getopt: invalid option -- ','
getopt: invalid option -- '1'
getopt: invalid option -- '8'
getopt: invalid option -- '0'
getopt: invalid option -- ','
nco_err_exit(): ERROR Short NCO-generated message (usually name of function that triggered error): nco__open()
nco_err_exit(): ERROR Error code is -51. Translation into English with nc_strerror(-51) is "NetCDF: Unknown file format"
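
The invalid option lines suggest that when no area is supplied, the whole-world extent reaches the aggregator script in a position where getopt reads its leading "-90" as a run of short options. A minimal Groovy sketch of one way the calling side could guard against this: apply whole-world defaults and attach values to long options with '=' so a leading minus sign cannot be read as a new option. The option names and the class are assumptions for illustration, not the aggregator's real interface:

// Hypothetical argument construction for the aggregator call -- a sketch only.
class SubsetArgsBuilder {

    static List<String> build(Double south, Double north, Double west, Double east) {
        // Fall back to the whole world when the user draws no bounding box or polygon.
        double s = (south != null) ? south : -90.0d
        double n = (north != null) ? north :  90.0d
        double w = (west  != null) ? west  : -180.0d
        double e = (east  != null) ? east  :  180.0d

        // Joining option and value with '=' (or inserting a literal "--" before any
        // positional arguments) keeps getopt from parsing "-90,90" as -9, -0, ...
        [
            "--lat-range=${s},${n}".toString(),
            "--lon-range=${w},${e}".toString()
        ]
    }
}

assert SubsetArgsBuilder.build(null, null, null, null) ==
        ['--lat-range=-90.0,90.0', '--lon-range=-180.0,180.0']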
