
Introduction


Welcome to codema-dev projects!

Download, wrangle & explore all Irish energy datasets used by the codema-dev team

⚠️ Some projects use closed-access datasets, for which you will need permission from the codema-dev team. Email us at [email protected]


Setup

Run the projects in your browser by clicking on the following buttons:

Binder ⬅️ click me to launch workspace


Binder can take a few minutes to set up this workspace; click Build logs > show to view the build progress.

  • Double click on the project you want to open

  • Right click on the README.md file, Open With > Notebook and run all cells

[screenshot: open-with-notebook.png]


Binder runs this code in the cloud for free with the help of NumFOCUS; if you find this useful, consider donating to them here


Gitpod ⬅️ click me to launch workspace
  • Double click on the project you want to open

  • Right click README.md > Open Preview to view the project guide

  • Change your Terminal directory to a project folder by running:

    cd NAME-OF-PROJECT

⚠️ Warning! ⚠️

  • If (/workspace/projects/venv) disappears from your prompt, your Terminal no longer has access to the dependencies required to run the projects, so you need to reactivate it by running:
    conda activate /workspace/projects/venv
  • If the Terminal disappears from the bottom of your screen, click ≡ > Terminal > New Terminal

💻 Running locally



  • Install all project dependencies via each project's environment.yml in your Terminal:

    conda env create --file environment.yml && conda activate NAME-OF-ENVIRONMENT

    Open environment.yml to view the environment name

  • Follow the Gitpod instructions


How-To Guides

⚠️ Accessing closed-access data

  • Create a new file called .env in your project directory

  • Add your s3 credentials to the .env file:

AWS_ACCESS_KEY_ID = "AKIA...."
AWS_SECRET_ACCESS_KEY = "KXY6..."

❓ FAQ

  • If after running a project you see ...

    (1)

    botocore.exceptions.NoCredentialsError: Unable to locate credentials

    ... follow the instructions at ⚠️ Accessing closed-access data

    (2)

    ModuleNotFoundError

    ... install the missing module with conda install NAME or pip install NAME, and raise an issue on our GitHub


Tools

Masterplan Tools

Data Storage

All raw data is saved to both Google Drive and Amazon s3. Amazon s3 is easier to query from within code than Google Drive, as it is possible to authenticate via environment variables and so avoid a username/password login step. Google Drive is still used for all data manipulated in Excel or QGIS. Amazon s3 enables sharing data between projects by storing intermediate datasets, which in most cases change only infrequently.
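In Python, this environment-variable authentication looks something like the sketch below (the credentials and dataset path are placeholders, and actually reading the file would require the s3fs package and access to the bucket):

```python
import os

# Placeholder credentials, not real ones; in practice these are set in the
# shell or a .env file rather than hard-coded.
os.environ.setdefault("AWS_ACCESS_KEY_ID", "AKIA-EXAMPLE")
os.environ.setdefault("AWS_SECRET_ACCESS_KEY", "EXAMPLE-SECRET")

# s3 clients (boto3, and s3fs on top of it) read the two variables above
# automatically, so a call like the following needs no interactive login:
#   import pandas as pd
#   df = pd.read_parquet("s3://codema-dev/SOME-DATASET.parquet")  # illustrative
```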

Code Storage & Version Control

All code is saved to GitHub which uses git for version control: updating, reverting, branching, merging etc.

Code Engine

Getting code up and running on a local machine can be somewhat involved. Code engines such as Binder or Gitpod enable running this code on cloud machines for free. They automate building the required environment using configuration files: environment.yml for Binder, and .gitpod.yml + .gitpod.Dockerfile for Gitpod.
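For reference, a minimal environment.yml of the shape these code engines consume might look like the following (the name and package versions are illustrative, not copied from this repository):

```yaml
name: example-project   # illustrative environment name
channels:
  - conda-forge
dependencies:
  - python=3.9
  - pandas
  - geopandas
```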

Package Management

All Python packages are installed, mostly from the conda-forge channel, using the conda package manager.

Code

Package | Use | Equivalent-To | Example-Use
pandas | Data wrangling, visualisation & analysis | Microsoft Excel | Estimating annual residential heat loss by combining columns and constants
GeoPandas | Geodata wrangling, visualisation & analysis | QGIS | Linking small areas to postcode boundaries
Ploomber | Specifying and executing all of the steps needed to generate the output datasets or visualisations | - | Downloading and cleaning building data, and plotting district heating viability on a map
seaborn | Plotting charts and maps | QGIS | Plotting building energy ratings
bokeh | Plotting interactive charts and maps | Tableau | Plotting district heating viability on a map
NetworkX | Graph analysis | - | Finding the nearest substation to each region along the nearest electricity line
Scikit Learn | Machine learning | - | Clustering substations via intersubstation distances

NoCode

Package | Use | Equivalent-To | Example-Use
Microsoft Excel | Data wrangling, visualisation & analysis | pandas | Estimating waste heat source potential
Google Sheets | Data wrangling, visualisation & analysis | pandas | ""
QGIS | Geodata wrangling, visualisation & analysis | GeoPandas | Plotting report-ready images of district heating viability
Tableau | Plotting charts and maps | QGIS | Plotting residential fuel poverty & hosting it online on Tableau Public

Website

Jekyll is used to generate the website from simple text (or Markdown) files and a pre-defined template.

It creates the necessary HTML, CSS & JavaScript files.

GitHub Pages is used to build and deploy the website from the files generated by Jekyll.


Why?

In previous years all data wrangling was performed solely using Microsoft Excel. Although this is useful for small datasets, it soon becomes a burden when working with multiple, large datasets.

For example, when generating the previous residential energy estimates it was necessary to create up to 16 separate workbooks for each local authority, each containing as many as 15 sheets, as the datasets were too large to fit into a single workbook. Although each workbook performed the same logic to clean and merge datasets, changing this logic meant changing all of the separate workbooks one at a time.

Moving to open-source scripting tools enabled using logic written down in scripts (or text files) to wrangle and merge data files, thus separating data from the logic operating on it. This means that if any dataset is updated, re-generating outputs is as simple as running a few scripts. Furthermore these scripts can be shared without sharing the underlying datasets.


Tools Considered

Criteria: a tool capable of modelling retrofitting hundreds of thousands of buildings to estimate energy & carbon savings, BER rating improvement and costs.

EnergyPLAN is an energy system model that works well for comparing aggregated demand against renewable supply profiles. It doesn't, however, model individual buildings and instead requires aggregated inputs for building energy demands.

SEAI's Dwelling Energy Assessment Procedure (DEAP) Excel model, EnergyPlus and RC_BuildingSimulator can model individual buildings using simple physics-based simulations but are difficult to scale. As a result, it is necessary to create a limited number of representative archetypes (<100) in order to use them to model building stocks. At present, archetype creation for these models is a long, manual process. To avoid this limitation some scripting libraries were experimented with to see if this process could be sped up:

  • DEAP: pycel enables replacing individual building characteristics specified in a DEAP Excel model via Python; however, as of January 2020 the pycel library didn't support all of the operations performed in the DEAP spreadsheet.

  • EnergyPlus: eppy enables replacing building characteristics via Python, and geomeppy geometry-specific characteristics. As of September 2020 these libraries are better suited to parameterising existing models than to creating them from scratch.

RC_BuildingSimulator is a Python library and so can be scaled easily. It wasn't used as it is not actively maintained, is cumbersome to adapt to this use case and, not being a widely used library, would require some validation of its accuracy.

CityEnergyAnalyst also models individual buildings using physics-based simulations, but is designed for district-level simulations. However, it is tied to OpenStreetMap as a data source for building geometries and ages, and to Swiss building standards (by building age) for archetypes. As of October 2020 OpenStreetMap coverage of Dublin was not as complete as for Switzerland, and decoupling CityEnergyAnalyst from it proved difficult.

Tool | Barrier
EnergyPLAN | Modelling building energy demands
DEAP | Scaling building energy demands
EnergyPlus | ""
RC_BuildingSimulator | Adaptation & validation for the Dublin building stock
CityEnergyAnalyst | Poor data quality for Dublin buildings

As a consequence, we developed rc-building-model, which re-implements the DEAP model in Python. This model was tested and validated against the DEAP Excel model for individual buildings, and implemented to scale easily and rapidly to the Dublin building stock.


Keeping the global environment.yml up to date

This environment.yml is built by merging the environment.yml from each project. Binder & GitPod use it to create a sandbox environment in which all dependencies are installed.

To update this file run:

conda env create --file environment.meta.yml --name codema-dev-projects-meta
conda activate codema-dev-projects-meta
invoke merge-environment-ymls

conda env create creates a virtual environment by reading environment.meta.yml, in which invoke is defined as a dependency. invoke then runs the function merge_environment_ymls from tasks.py, which merges the environment.yml from each project and from environment.meta.yml together into a single environment.yml.
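The merging step can be pictured with a simplified, hypothetical sketch (the real merge_environment_ymls in tasks.py parses the YAML files; here each project's dependencies are shown as plain Python lists to keep the sketch self-contained):

```python
def merge_dependencies(*dependency_lists):
    """Combine per-project dependency lists into one de-duplicated, sorted list."""
    merged = set()
    for dependencies in dependency_lists:
        merged.update(dependencies)
    return sorted(merged)

project_a = ["pandas", "geopandas", "ploomber"]
project_b = ["pandas", "seaborn"]
print(merge_dependencies(project_a, project_b))
# -> ['geopandas', 'pandas', 'ploomber', 'seaborn']
```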

To speed up Binder builds, Binder reads the codema-dev/projects dependencies from a separate repository, codema-dev/projects-sandbox. You must also update the environment.yml in that repository with your newly generated environment.yml to keep Binder up to date!

Every time any file is changed Binder rebuilds the entire repository and reinstalls the dependencies. By keeping the environment separate from the content, Binder reinstalls dependencies only when the dependencies change, so it no longer has to download & resolve dependency conflicts on every change, which can take ~20 minutes.

People

Contributors: john-oshea, oisindoherty3, rdmolony

Issues

Retrofit impact failed downloading small area boundaries

/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/dag/dag.py:390: UserWarning: 
=========================== DAG render with warnings ===========================
- NotebookRunner: plot_retrofit_costs -> MetaProduct({'nb': File('data/no..._costs.ipynb')}) -
- /home/jovyan/projects/estimate-retrofit-impact-on-heat-pump-viability/plot_retrofit_costs.py -
:1:1 'geopandas as gpd' imported but unused
- NotebookRunner: plot_energy_savings -> MetaProduct({'nb': File('data/no...avings.ipynb')}) -
- /home/jovyan/projects/estimate-retrofit-impact-on-heat-pump-viability/plot_energy_savings.py -
:1:1 'geopandas as gpd' imported but unused
- NotebookRunner: plot_pre_vs_post_retrofit_bers -> MetaProduct({'nb': File('data/no...t_bers.ipynb')}) -
- /home/jovyan/projects/estimate-retrofit-impact-on-heat-pump-viability/plot_pre_vs_post_retrofit_bers.py -
:1:1 'geopandas as gpd' imported but unused
============================== Summary (3 tasks) ===============================
NotebookRunner: plot_retrofit_costs -> MetaProduct({'nb': File('data/no..._costs.ipynb')})
NotebookRunner: plot_energy_savings -> MetaProduct({'nb': File('data/no...avings.ipynb')})
NotebookRunner: plot_pre_vs_post_retrofit_bers -> MetaProduct({'nb': File('data/no...t_bers.ipynb')})
=========================== DAG render with warnings ===========================

  warnings.warn(str(warnings_))
[notebook progress bars trimmed: the build ran through 13/13 tasks, including 'plot_uvalue_distribution', 'plot_retrofit_costs', 'plot_energy_savings' and 'plot_pre_vs_post_retrofit_bers', before failing]
Traceback (most recent call last):
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/cli/io.py", line 20, in wrapper
    fn(**kwargs)
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/cli/build.py", line 51, in main
    report = dag.build(force=args.force, debug=args.debug)
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/dag/dag.py", line 482, in build
    report = callable_()
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/dag/dag.py", line 581, in _build
    raise build_exception
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/dag/dag.py", line 514, in _build
    show_progress=show_progress)
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/executors/serial.py", line 138, in __call__
    raise DAGBuildError(str(exceptions_all))
ploomber.exceptions.DAGBuildError: 
=============================== DAG build failed ===============================
- PythonCallable: download_small_area_boundaries -> File('data/external/dub..._routing_keys.gpkg') -
- /srv/conda/envs/notebook/lib/python3.7/site-packages/codema_dev_tasks/requests.py:8 -
multiprocessing.pool.RemoteTraceback: 
"""
Traceback (most recent call last):
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/fsspec/implementations/http.py", line 394, in _info
    **kwargs,
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/fsspec/implementations/http.py", line 753, in _file_info
    r.raise_for_status()
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/aiohttp/client_reqrep.py", line 1005, in raise_for_status
    headers=self.headers,
aiohttp.client_exceptions.ClientResponseError: 403, message='Forbidden', url=URL('https://codema-dev.s3.eu-west-1.amazonaws.com/dublin_small_area_boundaries_in_routing_keys.gpkg')

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/tasks/abc.py", line 562, in _build
    res = self._run()
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/tasks/abc.py", line 669, in _run
    self.run()
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/tasks/tasks.py", line 124, in run
    out = self.source.primitive(**params)
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/codema_dev_tasks/requests.py", line 34, in fetch_file
    with fsspec.open(url, "rb") as remote_file:
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/fsspec/core.py", line 103, in __enter__
    f = self.fs.open(self.path, mode=mode)
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/fsspec/spec.py", line 1012, in open
    **kwargs,
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/fsspec/implementations/http.py", line 349, in _open
    size = size or self.info(path, **kwargs)["size"]
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/fsspec/asyn.py", line 91, in wrapper
    return sync(self.loop, func, *args, **kwargs)
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/fsspec/asyn.py", line 71, in sync
    raise return_result
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/fsspec/asyn.py", line 25, in _runner
    result[0] = await coro
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/fsspec/implementations/http.py", line 402, in _info
    raise FileNotFoundError(url) from exc
FileNotFoundError: https://codema-dev.s3.eu-west-1.amazonaws.com/dublin_small_area_boundaries_in_routing_keys.gpkg

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/srv/conda/envs/notebook/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/tasks/abc.py", line 581, in _build
    raise TaskBuildError(msg) from e
ploomber.exceptions.TaskBuildError: Error building task "download_small_area_boundaries"
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/executors/serial.py", line 186, in catch_exceptions
    fn()
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/executors/serial.py", line 159, in __call__
    return self.fn(**self.kwargs)
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/executors/serial.py", line 166, in catch_warnings
    result = fn()
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/executors/serial.py", line 159, in __call__
    return self.fn(**self.kwargs)
  File "/srv/conda/envs/notebook/lib/python3.7/site-packages/ploomber/executors/serial.py", line 220, in build_in_subprocess
    report, meta = res.get()
  File "/srv/conda/envs/notebook/lib/python3.7/multiprocessing/pool.py", line 657, in get
    raise self._value
ploomber.exceptions.TaskBuildError: Error building task "download_small_area_boundaries"
=============================== Summary (1 task) ===============================
PythonCallable: download_small_area_boundaries -> File('data/external/dub..._routing_keys.gpkg')
=============================== DAG build failed ===============================

Add functional tests to each project

Is your feature request related to a problem? Please describe.
It won't be possible to update project dependencies without creating bugs unless there are tests to catch where these bugs occur

Describe the solution you'd like
Basic functional tests for each project...

Each project produces the expected output given some sample data

Note: Mock out s3 or 3rd-party sources

Add table of tools tried

Is your feature request related to a problem? Please describe.
We tried several pre-existing tools and ended up making our own. We need to document this.

Freeze the dependencies

Is your feature request related to a problem? Please describe.
These projects will break if the dependencies are not frozen, as 3rd-party APIs will change. If this repo is to be kept up to date it needs a basic functional test for each project, to check that each project does what it says it does from the user's point of view. These tests will then catch breaking API changes on updates, and will enable updating frozen dependencies to newer versions in the future.

Create Residential Stock stuck filling unknowns with archetypes on binder

[progress bars trimmed]
Buildings in Dublin: 298142
Buildings meeting conditions: 203746
Stuck at task 'fill_unknown_buildings_with_archetypes' (10/11)

pandas runs into memory issues on Binder. Replacing it with vaex or dask would be needed to run the same operations on Binder, but neither dask nor vaex supports an equivalent to pd.combine_first, which fills unknown values in one dataframe with known values from another.
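For context, pd.combine_first keeps known values and fills the gaps from a second dataframe; a small made-up example (requires pandas):

```python
import pandas as pd

# Made-up data: known building heat demands with a gap, and archetype estimates.
known = pd.DataFrame({"heat_demand": [100.0, None, 300.0]})
archetypes = pd.DataFrame({"heat_demand": [110.0, 200.0, 310.0]})

# Keep known values where present; fall back to the archetype estimate.
filled = known.combine_first(archetypes)
print(filled["heat_demand"].tolist())
# -> [100.0, 200.0, 300.0]
```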
