
ice_discharge's Introduction


Greenland Ice Sheet solid ice discharge from 1986 through last month

This is the source for “Greenland Ice Sheet solid ice discharge from 1986 through March 2020” and previous and subsequent versions.

Before using the data, you should check for any open/active issues tagged WARNING.

Related Work

Citation

Publication

@article{mankoff_2020_ice,
  doi       = {10.5194/essd-12-1367-2020},
  url       = {https://doi.org/10.5194/essd-12-1367-2020},
  year      = {2020},
  volume    = {12},
  number    = {2},
  pages     = {1367--1383},
  publisher = {Copernicus {GmbH}},
  author    = {Kenneth D. Mankoff and Anne Solgaard and William Colgan and
               Andreas P. Ahlstrøm and Shfaqat Abbas Khan and Robert S. Fausto},
  title     = {{G}reenland {I}ce {S}heet solid ice discharge
               from 1986 through March 2020},
  journal   = {Earth System Science Data}
}

Project

@data{mankoff_2020_data,
    author    = {Mankoff, Ken and Solgaard, Anne},
    publisher = {GEUS Dataverse},
    title     = {{G}reenland {I}ce {S}heet solid ice discharge from 1986 through last month: Discharge},
    year      = {2020},
    doi       = {10.22008/promice/data/ice_discharge/},
    url       = {https://doi.org/10.22008/promice/data/ice_discharge/}}

Discharge data

  • Note: The version number updates approximately every two weeks.
@data{mankoff_2020_discharge,
    author    = {Mankoff, Ken and Solgaard, Anne},
    publisher = {GEUS Dataverse},
    title     = {{G}reenland {I}ce {S}heet solid ice discharge from 1986 through last month: Discharge},
    year      = {2020},
    version   = {VERSION_NUMBER},
    doi       = {10.22008/promice/data/ice_discharge/d/v02},
    url       = {https://doi.org/10.22008/promice/data/ice_discharge/d/v02}}

Discharge gates

@data{mankoff_2020_gates,
    author    = {Mankoff, Ken},
    publisher = {GEUS Dataverse},
    title     = {{G}reenland {I}ce {S}heet solid ice discharge from 1986 through last month: Gates},
    UNF       = {UNF:6:/eJSVvL8Rp1NG997hIhUag==},
    year      = {2020},
    version   = {VERSION_NUMBER},
    doi       = {10.22008/promice/data/ice_discharge/gates/v02},
    url       = {https://doi.org/10.22008/promice/data/ice_discharge/gates/v02}}

Funding

| Dates       | Organization | Program                                   | Effort                                  |
|-------------+--------------+-------------------------------------------+-----------------------------------------|
| 2023 –      | NASA GISS    | Modeling Analysis and Prediction program  | Maintenance                             |
| 2022 –      | GEUS         | PROMICE                                   | Distribution (data hosting)             |
| 2018 – 2022 | GEUS         | PROMICE                                   | Development; publication; distribution  |




Open science vs. reproducible science

  • This work is open - every line of code needed to recreate it is included in this git repository, although the ~100 GB of velocity inputs are not included.
  • We recognize that “open” is not necessarily “reproducible”

Source: https://github.com/karthik/rstudio2019

ice_discharge's People

Contributors

mankoff, signehl


ice_discharge's Issues

Use GRASS GIS temporal framework

The initial manuscript assigned each raster to the central time-stamp, even though each raster represents a time-span. The GRASS Temporal Framework [1,2,3] supports time-spans. The algorithm should be modified to take advantage of time-span data storage. Visualizing this is easy, and can be done as in [4] (see also example figure from that paper included below).

[Screenshot: example timeline visualization of raster time-spans, from Hanna /et al./ (2013)]
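As a minimal sketch (assuming the rasters are already imported into a GRASS location; the dataset and map names here are illustrative, not the names used by this pipeline), time-spans could be stored by registering each raster in a space-time raster dataset (STRDS) with explicit start and end dates. The timeline tool [3] can then visualize the registered spans.

#+BEGIN_SRC python
# Hedged sketch: register velocity mosaics with explicit time-spans
# in a GRASS space-time raster dataset (STRDS).
import grass.script as gs

# create an STRDS with absolute time
gs.run_command("t.create", type="strds", temporaltype="absolute",
               output="vel", title="Ice velocity",
               description="Velocity mosaics with start/end time-spans")

# register each map with its time-span; file format is "map|start|end"
with open("register.txt", "w") as f:
    f.write("vel_20191214_20200107|2019-12-14|2020-01-07\n")
gs.run_command("t.register", input="vel", type="raster", file="register.txt")
#+END_SRC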

Limitations and Issues:

  • Converting those boxes to a single numeric value for any given date or annual average is a new challenge.

  • Each pixel within the raster may have a different time-span, or be some weighted average of multiple time-stamps within the time-span. That information is unlikely to be exposed at the level where we access the data.

[1] Gebbert & Pebesma, 2017. http://dx.doi.org/10.1080/13658816.2017.1306862
[2] https://grass.osgeo.org/grass70/manuals/temporalintro.html
[3] https://grass.osgeo.org/grass76/manuals/g.gui.timeline.html
[4] Hanna /et al./, 2013. https://www.nature.com/articles/nature12238

Helheim 10 % drop on 2019-12-26

The following Org workbook demonstrates the error. It can also be seen graphically by looking at the Sentinel-1 velocity product from file IV_20191214_20200107.nc and one before or after, zoomed in near the Helheim gate. Pixels at the N edge of the glacier appear much slower.

* Helheim decline
:PROPERTIES:
:header-args:jupyter-python+: :session helheim :eval no-export
:END:

Last data point [2019-12-26 Thu] has a 10 % drop.

#+BEGIN_SRC jupyter-python :exports results :results raw drawer
import numpy as np
import pandas as pd

sector = pd.read_csv("./out/sector_D.csv", index_col=0, parse_dates=True)
hc = sector.columns[['HELHEIM' in _ for _ in sector.columns]].values[0]
sector.loc[sector.index.year >= 2019][hc]
#+END_SRC

#+RESULTS:
#+begin_example
Date
2019-01-12    32.003
2019-01-24    32.523
2019-02-05    32.588
2019-02-17    32.680
2019-03-01    33.028
2019-03-13    33.544
2019-03-25    33.494
2019-04-05    33.415
2019-04-18    33.661
2019-04-30    33.205
2019-05-12    33.411
2019-05-24    33.436
2019-06-05    33.347
2019-06-17    33.900
2019-06-29    34.478
2019-07-11    34.941
2019-07-23    35.942
2019-08-04    35.447
2019-08-16    35.432
2019-08-28    36.546
2019-09-09    36.386
2019-09-21    37.203
2019-10-03    36.858
2019-10-15    37.184
2019-10-27    36.341
2019-11-08    35.779
2019-11-20    35.570
2019-12-02    34.598
2019-12-14    33.921
2019-12-26    29.730
2020-01-07    34.539
Name: HELHEIMGLETSCHER, dtype: float64
#+end_example

Let's find it in the raw data

#+BEGIN_SRC jupyter-python :exports results :results raw drawer
meta = pd.read_csv("./out/gate_meta.csv")
gate_id = meta[meta['Mouginot_2019'] == hc]['gate'].values[0]
meta[meta['gate'] == gate_id].T
#+END_SRC

#+RESULTS:
|               |               171 |
|---------------+-------------------|
| gate          |               231 |
| mean_x        |            304157 |
| mean_y        |          -2576528 |
| lon           | -38.2680878986995 |
| lat           |  66.3773177878685 |
| n_pixels      |                36 |
| sector        |                63 |
| region        |                SE |
| Bjork_2015    |  Helheim Gletsjer |
| Mouginot_2019 |  HELHEIMGLETSCHER |


#+BEGIN_SRC jupyter-python :exports results :results raw drawer
dat = pd.read_csv("./tmp/dat_100_5000.csv")
dat = dat[dat['gates_gateID@gates_100_5000'] == gate_id]

col_id = ['vel_eff_2019_1' in _ for _ in dat.columns]
vel = dat.iloc[:,col_id]
vel.columns = [_[8:18] for _ in vel.columns]
col_id = ['err_eff_2019_1' in _ for _ in dat.columns]
err = dat.iloc[:,col_id]
err.columns = [_[8:18] for _ in err.columns]

vel[vel.columns[-6:]]
#+END_SRC

#+RESULTS:
|      | 2019_10_27 | 2019_11_08 | 2019_11_20 | 2019_12_02 | 2019_12_14 | 2019_12_26 |
|------+------------+------------+------------+------------+------------+------------|
| 4008 |          0 |          0 |    3.59353 |    6.37899 |    8.73327 |    5.36361 |
| 4009 |          0 |          0 |    6070.31 |    5907.67 |    5636.05 |    829.695 |
| 4010 |          0 |          0 |    6070.31 |    5907.67 |    5636.05 |    829.695 |
| 4011 |          0 |          0 |    6422.81 |    6232.66 |    6049.39 |    1211.98 |
| 4012 |          0 |          0 |    6312.86 |    5996.95 |    5898.27 |    1689.72 |
| 4013 |          0 |          0 |    6615.85 |    6289.21 |    6309.13 |    2509.64 |
| 4014 |          0 |          0 |    6838.14 |    6765.23 |    6147.08 |    6401.22 |
| 4015 |          0 |          0 |    6838.14 |    6765.23 |    6147.08 |    6401.22 |
| 4016 |          0 |          0 |    6638.59 |     6313.4 |    6475.88 |    6064.98 |
| 4017 |    8286.74 |    8114.79 |    7987.67 |    7878.28 |    7861.25 |    7774.74 |
| 4018 |    8286.74 |    8114.79 |    7987.67 |    7878.28 |    7861.25 |    7774.74 |
| 4019 |    8549.14 |    8331.16 |    8154.72 |       8090 |    7988.51 |    7936.16 |
| 4020 |    8549.14 |    8331.16 |    8154.72 |       8090 |    7988.51 |    7936.16 |
| 4021 |    8549.14 |    8331.16 |    8154.72 |       8090 |    7988.51 |    7936.16 |
| 4022 |    9213.67 |    8848.32 |    8753.04 |    8675.07 |    8582.07 |    8592.01 |
| 4023 |    8721.11 |    8466.87 |    8390.01 |    8255.41 |    8177.78 |    8192.03 |
| 4024 |    9119.38 |    8882.95 |    8701.69 |    8660.64 |     8547.2 |    8423.07 |
| 4025 |    9119.38 |    8882.95 |    8701.69 |    8660.64 |     8547.2 |    8423.07 |
| 4026 |    9119.38 |    8882.95 |    8701.69 |    8660.64 |     8547.2 |    8423.07 |
| 4027 |    9116.25 |    8879.54 |    8717.64 |    8598.36 |    8499.84 |    8451.84 |
| 4028 |    9116.25 |    8879.54 |    8717.64 |    8598.36 |    8499.84 |    8451.84 |
| 4029 |    8872.64 |    8610.32 |    8453.23 |    8392.89 |    8407.33 |    8306.79 |
| 4030 |    8872.64 |    8610.32 |    8453.23 |    8392.89 |    8407.33 |    8306.79 |
| 4031 |    8722.84 |    8425.22 |    8256.52 |    8251.52 |    8205.33 |    8182.11 |
| 4032 |    8211.32 |    7042.34 |    7925.64 |    7507.21 |    7008.91 |    7688.17 |
| 4033 |    8211.32 |    7042.34 |    7925.64 |    7507.21 |    7008.91 |    7688.17 |
| 4034 |    5180.69 |    2579.19 |    5577.93 |    5595.76 |    5208.05 |    4644.75 |
| 4035 |    5180.69 |    2579.19 |    5577.93 |    5595.76 |    5208.05 |    4644.75 |
| 4036 |    5180.69 |    2579.19 |    5577.93 |    5595.76 |    5208.05 |    4644.75 |
| 4037 |    5611.34 |    5716.22 |    5504.59 |    4990.52 |    4917.02 |    5809.17 |
| 4038 |       4337 |    4450.64 |    4233.21 |    3826.61 |    4161.36 |    4575.45 |
| 4039 |    835.603 |    1348.97 |    3503.86 |     151.47 |    497.261 |    113.531 |
| 4040 |    835.603 |    1348.97 |    3503.86 |     151.47 |    497.261 |    113.531 |
| 4041 |    835.603 |    1348.97 |    3503.86 |     151.47 |    497.261 |    113.531 |
| 4042 |    7.34474 |    11.9245 |    23.5171 |    2.01749 |   0.626358 |    45.1899 |
| 4043 |    7.34474 |    11.9245 |    23.5171 |    2.01749 |   0.626358 |    45.1899 |

#+CAPTION: Table showing velocities at Helheim gate pixels.
#+CAPTION: It appears the 2019_12_26 slowdown is due to the top right corner of this table.


Let's make a table to explore the errors. Something like:
|          | Date                   |
|----------+------------------------|
| pixel_id | Velocity (err) [err %] |


#+BEGIN_SRC jupyter-python :exports results :results raw drawer
df = vel.round(1).astype(str)
df = df + ' ('
df = df + err.round(1).astype(str)
df = df + ')'
ratio = (err/vel * 100).round(1).fillna(0).astype(str)
df = df + ' ['
df = df + ratio
df = df + '%]'
df[df.columns[-3:]].head(15)
#+END_SRC

#+RESULTS:
|      | 2019_12_02            | 2019_12_14            | 2019_12_26           |
|------+-----------------------+-----------------------+----------------------|
| 4008 | 6.4 (2.1) [33.3%]     | 8.7 (1.8) [20.8%]     | 5.4 (1.2) [23.1%]    |
| 4009 | 5907.7 (87.3) [1.5%]  | 5636.1 (94.5) [1.7%]  | 829.7 (7.8) [0.9%]   |
| 4010 | 5907.7 (87.3) [1.5%]  | 5636.1 (94.5) [1.7%]  | 829.7 (7.8) [0.9%]   |
| 4011 | 6232.7 (134.4) [2.2%] | 6049.4 (145.7) [2.4%] | 1212.0 (25.8) [2.1%] |
| 4012 | 5997.0 (114.4) [1.9%] | 5898.3 (81.3) [1.4%]  | 1689.7 (12.4) [0.7%] |
| 4013 | 6289.2 (170.7) [2.7%] | 6309.1 (129.3) [2.1%] | 2509.6 (35.9) [1.4%] |
| 4014 | 6765.2 (136.6) [2.0%] | 6147.1 (78.2) [1.3%]  | 6401.2 (55.3) [0.9%] |
| 4015 | 6765.2 (136.6) [2.0%] | 6147.1 (78.2) [1.3%]  | 6401.2 (55.3) [0.9%] |
| 4016 | 6313.4 (78.3) [1.2%]  | 6475.9 (77.7) [1.2%]  | 6065.0 (43.8) [0.7%] |
| 4017 | 7878.3 (41.9) [0.5%]  | 7861.3 (46.5) [0.6%]  | 7774.7 (41.2) [0.5%] |
| 4018 | 7878.3 (41.9) [0.5%]  | 7861.3 (46.5) [0.6%]  | 7774.7 (41.2) [0.5%] |
| 4019 | 8090.0 (59.5) [0.7%]  | 7988.5 (46.3) [0.6%]  | 7936.2 (32.4) [0.4%] |
| 4020 | 8090.0 (59.5) [0.7%]  | 7988.5 (46.3) [0.6%]  | 7936.2 (32.4) [0.4%] |
| 4021 | 8090.0 (59.5) [0.7%]  | 7988.5 (46.3) [0.6%]  | 7936.2 (32.4) [0.4%] |
| 4022 | 8675.1 (79.0) [0.9%]  | 8582.1 (75.2) [0.9%]  | 8592.0 (62.3) [0.7%] |

#+CAPTION: The published error product does not capture these bad pixels.
#+CAPTION: *NOTE*: The published error product is not useful here; there is no way we know velocity to < 1 %.

In summary, the velocity drop appears non-physical but is not captured in the error product. An outlier-filtering algorithm could be implemented to catch this type of issue, as in the sketch below.
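For example, a minimal sketch of a Hampel-style filter (rolling median plus scaled median absolute deviation; the window and threshold here are illustrative, not tuned):

#+BEGIN_SRC python
# Hedged sketch: flag points that deviate from a rolling median by more
# than k scaled median absolute deviations (a Hampel-style filter).
import pandas as pd

def hampel_flags(series, window=7, k=3.0):
    """Return a boolean Series: True where the value is a suspected outlier."""
    med = series.rolling(window, center=True, min_periods=1).median()
    mad = (series - med).abs().rolling(window, center=True, min_periods=1).median()
    sigma = 1.4826 * mad  # MAD -> standard deviation for Gaussian data
    return (series - med).abs() > k * sigma

sector = pd.read_csv("./out/sector_D.csv", index_col=0, parse_dates=True)
hc = sector.columns[['HELHEIM' in _ for _ in sector.columns]].values[0]
flags = hampel_flags(sector[hc])
print(sector.loc[flags, hc])  # 2019-12-26 should be flagged
#+END_SRC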

Detangle scripts from Org source

The algorithm itself is stored in the Org source file. I tangle that file to generate the scripts, then run them. This is a barrier for non-Org users who want to run the code. The scripts could be tangled and then added to this repository. The downside is that the scripts would then exist in two places and would be unlikely to stay in sync. They could be maintained only outside of the Org file, but then we lose the benefit of literate programming.

Twitter bot

Tweet the updated top_few.png image each time it is regenerated, and provide a link to the Dataverse. A hedged sketch follows.
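A minimal sketch using the tweepy library (v4); the credential environment-variable names are assumptions, not part of this repository:

#+BEGIN_SRC python
# Hedged sketch: post top_few.png with a link to the Dataverse.
import os
import tweepy

# the v1.1 API is still required for media uploads
auth = tweepy.OAuth1UserHandler(
    os.environ["API_KEY"], os.environ["API_SECRET"],
    os.environ["ACCESS_TOKEN"], os.environ["ACCESS_SECRET"])
media = tweepy.API(auth).media_upload("top_few.png")

# the v2 API posts the tweet itself
client = tweepy.Client(
    consumer_key=os.environ["API_KEY"],
    consumer_secret=os.environ["API_SECRET"],
    access_token=os.environ["ACCESS_TOKEN"],
    access_token_secret=os.environ["ACCESS_SECRET"])
client.create_tweet(
    text="Updated Greenland ice discharge: "
         "https://doi.org/10.22008/promice/data/ice_discharge/d/v02",
    media_ids=[media.media_id])
#+END_SRC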

Improve the dh/dt data set

In the original paper, changes to the ice sheet surface (usually thinning) come from Khan (2016). That is an annual data set that does not cover the beginning or end of the time series.

It would be good to have an improved dh/dt data set that covers the full time series and has sub-annual resolution.

Sub-annual resolution raises a new complication: if snow is included in the ice thickness estimate, then the assumed ice density of 917 kg/m^3 becomes less accurate, because snow and firn are less dense than solid ice. The sketch below quantifies the bias for illustrative numbers.
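A back-of-the-envelope sketch of the size of that bias, with illustrative (not measured) numbers:

#+BEGIN_SRC python
# Hedged sketch: treating a firn layer as solid ice overstates column mass.
rho_ice, rho_firn = 917, 550   # kg/m^3; the firn density is an assumption
H, h_firn = 500, 10            # column thickness and firn layer, m

mass_assumed = rho_ice * H
mass_true = rho_firn * h_firn + rho_ice * (H - h_firn)
bias = (mass_assumed - mass_true) / mass_true * 100
print(f"mass overestimate: {bias:.2f} %")  # ~0.8 % for these numbers
#+END_SRC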

Add a metric for coverage staleness

The coverage metric used in the paper deals with missing data. It could be useful to better understand which data is missing and for how long. For example,

  • Are there some pixels that appear in only one velocity product, and are filled in at all other times?
  • If there are four sequential time stamps (see the table below), is it the same 50 % that is missing at t1 and t2, or is the 50 % missing at t1 covered at t2? In the former case, the discharge estimate is less robust: the 50 % filled in at t1 and t2 is interpolated all the way from t0 to t3. In the latter case, the discharge estimate is more robust: the 50 % filled in at t1 is interpolated only from t0 to t2, and the 50 % filled in at t2 only from t1 to t3.
| Time | Coverage |
|------+----------|
| t0   | 100 %    |
| t1   | 50 %     |
| t2   | 50 %     |
| t3   | 100 %    |

A metric to track the duration of coverage gaps for each pixel (staleness) could help improve our uncertainty estimate; a sketch follows.
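A minimal sketch of such a staleness metric, assuming a hypothetical DataFrame with one column per pixel and NaN where a pixel is unobserved:

#+BEGIN_SRC python
# Hedged sketch: per-pixel staleness as the length of the longest run
# of consecutive missing time stamps.
import numpy as np
import pandas as pd

def max_gap_length(col):
    """Longest run of consecutive NaNs in a column."""
    isna = col.isna()
    runs = (isna != isna.shift()).cumsum()  # label contiguous runs
    gaps = isna.groupby(runs).sum()         # NaN-run lengths; 0 elsewhere
    return int(gaps.max()) if isna.any() else 0

# hypothetical example: 3 pixels, 6 time stamps
vel = pd.DataFrame({"p1": [1, np.nan, np.nan, 4, 5, 6],
                    "p2": [1, 2, np.nan, np.nan, np.nan, 6],
                    "p3": [1, 2, 3, 4, 5, 6]})
print(vel.apply(max_gap_length))  # p1: 2, p2: 3, p3: 0
#+END_SRC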

Forty-six years of Greenland Ice Sheet mass balance from 1972 to 2018

Since this was accepted, Mouginot /et al./ (2019) https://doi.org/10.1073/pnas.1904242116 has been published.

  • Mouginot 2019 extends the time series slightly earlier in time. We opted to begin our time series in 1986 due to noise in the velocity products prior to that.

  • Mouginot 2019 also reports approximately 50 Gt/yr higher discharge than both our estimate and the recent King /et al./ (2018) estimate, similar to many earlier papers, as mentioned in our paper. The reasons for the higher discharge are unknown, as we do not have access to their algorithms or processed input data.

  • Mouginot 2019 also provides additional data: total mass balance, in addition to discharge.

SE slowdown in 2022

There appears to be a decrease in discharge in the SE sector in 2022.

Given the magnitude, and that this has not been reported elsewhere yet, I suspect it is an artifact of low-quality velocity data and not real.

Investigations are ongoing.

Update baseline thickness and dh/dt code

Current baseline thickness uses a GIMP map with widely varying timestamps among pixels.

We should:

  • Use a temporally consistent baseline DEM
  • Improve dh/dt treatment with updated dh/dt datasets from either Khan or Winstrup

MEaSUREs update

Subject: [CRYOLIST] Data set updates: MEaSUREs Greenland Annual/Quarterly/Monthly/6 and 12 day Ice Sheet Velocity Mosaics
To: "[email protected]" [email protected]
Date: 2022-09-27 13:08:20 -0700 [PDT] (Tue 01:08:20 PM)

Dear Colleague,

New versions for the MEaSUREs Greenland Annual/Quarterly/Monthly Ice Sheet Velocity Mosaics from SAR
and Landsat and the MEaSUREs Greenland 6 and 12 day Ice Sheet Velocity Mosaics from SAR data sets
are now available through the NASA National Snow and Ice Data Center Distributed Active Archive
Center (NSIDC DAAC). These annual, quarterly, monthly, and 6 and 12 day data sets have been
recalibrated to reduce biases imposed by quadratic fits to control points. A correction has been
added for the submergence/emergence velocity and interferometric phase data were added where
available. New data have also been added; temporal coverages now span 01 December 2014 - 30 November
2021 for the annual data, 01 December 2014 - 28 February 2022 for the quarterly and monthly data,
and 01 January 2015 - 22 June 2022 for the 6 and 12 day data. These data sets are part of the NASA
Making Earth System Data Records for Use in Research Environments (MEaSUREs) program.

The NSIDC DAAC provides access to documentation and these data sets:

MEaSUREs Greenland Annual Ice Sheet Velocity Mosaics from SAR and Landsat, Version 4

https://nsidc.org/data/nsidc-0725

Data set DOI: https://doi.org/10.5067/RS8GFZ848ZU9

MEaSUREs Greenland Quarterly Ice Sheet Velocity Mosaics from SAR and Landsat, Version 4

https://nsidc.org/data/nsidc-0727

Data set DOI: https://doi.org/10.5067/BGBF7KY84Q2N

MEaSUREs Greenland Monthly Ice Sheet Velocity Mosaics from SAR and Landsat, Version 4

https://nsidc.org/data/nsidc-0731

Data set DOI: https://doi.org/10.5067/LRCN2Z8KYYX6

MEaSUREs Greenland 6 and 12 day Ice Sheet Velocity Mosaics from SAR, Version 2

https://nsidc.org/data/nsidc-0766

Data set DOI: https://doi.org/10.5067/1AMEDB6VJ1NZ

Automatic monitoring

To support auto-updating with minimal human interaction, we should implement some level of automatic monitoring.

  • Implement an algorithm that compares the latest time stamp against the historical record and flags out-of-limit values (see the sketch after this list)
    • e.g. if the latest discharge is > 2 sigma from the historical mean, flag it
  • Mail graphic and/or tabular summaries to key personnel
  • Delay pushing (or publishing?) the update to the Dataverse by ~24 hours, giving key personnel an opportunity to stop the update if necessary.
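A minimal sketch of the 2-sigma check, using the sector discharge file from the examples above; the threshold is illustrative:

#+BEGIN_SRC python
# Hedged sketch: flag sectors whose newest value is > 2 sigma from history.
import pandas as pd

sector = pd.read_csv("./out/sector_D.csv", index_col=0, parse_dates=True)
hist = sector.iloc[:-1]   # everything except the newest time stamp
latest = sector.iloc[-1]  # the newest time stamp

z = (latest - hist.mean()) / hist.std()
flagged = z.abs() > 2
if flagged.any():
    # in production this would e-mail key personnel and pause the upload
    print("Out-of-limit sectors:\n", z[flagged])
#+END_SRC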

Improve time series filtering

The filter used to flag suspected invalid velocities is basic. It could be improved, perhaps using pyts or another library rather than self-coding everything. The imaging time series capabilities look interesting and might be worth exploring; a sketch follows.
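A hedged sketch of pyts's imaging approach, turning one gate's time series into a recurrence plot in which anomalous samples stand apart as distinct rows/columns; the parameters are illustrative:

#+BEGIN_SRC python
# Hedged sketch: recurrence plot of a single sector time series via pyts.
import pandas as pd
from pyts.image import RecurrencePlot

sector = pd.read_csv("./out/sector_D.csv", index_col=0, parse_dates=True)
hc = sector.columns[['HELHEIM' in _ for _ in sector.columns]].values[0]
X = sector[hc].values.reshape(1, -1)  # pyts expects (n_samples, n_timestamps)

rp = RecurrencePlot(threshold="point", percentage=20)
image = rp.fit_transform(X)[0]  # 2-D recurrence matrix to inspect or plot
#+END_SRC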

Code?

Hi @mankoff! Congrats on the great work; I think it will be very useful to the community.

The paper mentions that the code is located here - do you plan to add it? Thanks!
