
neural-proof-nets

About

Code for the paper Neural Proof Nets (arXiv:2009.12702).

Update 12/2022

No longer maintained. Follow spindle for updates.

Update 19/04/2021

This branch contains an updated version of the original submission, with major improvements to the output data structures, model architecture, and training process. Make sure to also update your trained weights when upgrading.

  • The bi-modal decoder is replaced with an encoder and a highway connection to the supertagging decoder. This way we can still get lexically informed atom representations but without the quadratic memory cost of the cross attention matrix.
  • Beam search now penalizes structural errors (like incorrect type constructions and frames failing an invariance check) -- searching with a high beam width should now return more passing analyses, increasing coverage but at the cost of inference time.
  • RobBert used instead of BERTje.
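The structural-error penalty during beam search can be pictured with a minimal scoring sketch. The constant and the check names below are illustrative only, not the model's actual values or internals:

```python
STRUCTURAL_PENALTY = 5.0  # illustrative constant, not the model's actual value

def beam_score(log_prob, type_check_ok, invariance_ok):
    """Penalized beam score: structural violations push a candidate
    down the beam ordering without discarding it outright."""
    penalty = 0.0
    if not type_check_ok:
        penalty += STRUCTURAL_PENALTY
    if not invariance_ok:
        penalty += STRUCTURAL_PENALTY
    return log_prob - penalty

# A well-formed but less likely candidate outranks an ill-formed one:
good = beam_score(-3.0, type_check_ok=True, invariance_ok=True)
bad = beam_score(-1.0, type_check_ok=False, invariance_ok=True)
```

With a high beam width, penalized candidates stay in the beam but sink below passing analyses, which is why coverage increases at the cost of inference time.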

The new model's benchmarks and relative differences to the original are reported in the table below.

| Metric (%)      | Greedy      | Beam (2)    | Beam (3)    | Beam (5)    | Type Oracle |
|-----------------|-------------|-------------|-------------|-------------|-------------|
| Coverage        | 89.9 (N/A)  | 95.3 (N/A)  | 96.1 (N/A)  | 97.0 (N/A)  | 97.2 (N/A)  |
| Token Accuracy  | 88.5 (+3.0) | 93.0 (+1.6) | 93.3 (+0.9) | 93.7 (+0.5) | --          |
| Frame Accuracy  | 65.7 (+8.1) | 68.1 (+2.8) | 69.1 (+1.1) | 70.0 (+0.4) | --          |
| λ→ Accuracy     | 69.4 (+9.4) | 71.1 (+5.5) | 71.8 (+4.1) | 72.6 (+3.0) | 91.2 (+5.8) |
| λ→◇□ Accuracy   | 66.4 (+9.5) | 68.8 (+5.1) | 69.8 (+3.9) | 70.6 (+2.9) | 91.2 (+5.8) |

Usage

Installation

Python 3.9+

Clone the project locally. In a clean Python venv, run pip install -r requirements.txt

Inference

To run the model in inference mode:

  1. Download pretrained weights from here

  2. Unzip the downloaded file, and place its contents in a directory stored_models, alongside Parser. Your resulting directory structure should look like:

    +--Parser
       +--data
       +--neural
       +--parsing
       +--train.py
       +--__init__.py
    +--stored_models
       +--model_weights.model
    +--README.md
    +--requirements.txt
    
  3. Run a python console from your working directory, and run:

    >>> from Parser.neural.inference import get_model
    >>> model = get_model(device)
    >>> analyses = model.infer(xs, n)
    

    where device is either "cpu" or "cuda", xs is a list of strings to parse, and n is the beam size. analyses will be a list (one item per input string) of lists (one item per beam) of Analysis objects. A non-failing analysis can be converted into a λ-term, as in the example below:

    >>> sent = "Wat is de lambda-term van dit voorbeeld?"
    >>> analysis = model.infer([sent], 1)[0][0]
    >>> proofnet = analysis.to_proofnet()
    >>> proofnet.print_term(show_words=True, show_types=False, show_decorations=True)
    'Wat ▵ʷʰᵇᵒᵈʸ(λx₀.is ▵ᵖʳᵉᵈᶜ(▿ᵖʳᵉᵈᶜ(x₀)) ▵ˢᵘ(▾ᵐᵒᵈ(van ▵ᵒᵇʲ¹(▾ᵈᵉᵗ(dit) voorbeeld?)) ▾ᵈᵉᵗ(de) lambda-term))'
    

Evaluation

To evaluate on the test set data, follow steps 1 and 2 of the previous section. You will also need a binary version of the processed dataset, placed in the outermost project directory.

  1. You can download a preprocessed version here.

    • Alternatively, you can convert the original dataset into the parser format yourself by running the script in Parser.data.convert_aethel (additionally requires a local clone of the extraction code).

    Your directory structure should look like:

    +--Parser
       +--data
       +--neural
       +--parsing
       +--train.py
       +--__init__.py
    +--stored_models
       +--model_weights.p
    +--processed.p
    +--README.md
    +--requirements.txt
    
  2. Open a python console and run

    >>> from Parser.neural.evaluation import fill_table
    >>> results = fill_table(bs)
    

    where bs is the list of beam sizes (ints) to test with. Note that this evaluates a single model rather than an average over multiple runs, so small deviations from the numbers reported in the paper are to be expected.
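Assuming fill_table returns per-beam metrics keyed by name (a guessed shape; inspect the actual return value), the results could be rendered as a small comparison table like so:

```python
# Hypothetical result shape for illustration: {beam_size: {metric: value}}.
results = {
    1: {"coverage": 89.9, "token_acc": 88.5},
    2: {"coverage": 95.3, "token_acc": 93.0},
}

def format_table(results):
    """Render per-beam metrics as a small plain-text table."""
    metrics = list(next(iter(results.values())).keys())
    rows = ["beam | " + " | ".join(metrics)]
    for bs in sorted(results):
        rows.append(f"{bs:>4} | " + " | ".join(f"{results[bs][m]:.1f}" for m in metrics))
    return "\n".join(rows)

table = format_table(results)
print(table)
```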

Training

Follow step 1 of the previous section, then take a look at Parser.train.

Help

If you get stuck and require assistance or encounter something unexpected, feel free to get in touch.
