
paninij's Introduction

@PaniniJ


A realization of Panini via Java compiler plugins. Panini is a capsule-oriented programming model, which introduces a new programming abstraction, the capsule. The main motivation behind this abstraction is to enable more modular reasoning about concurrent programs.

See https://paninij.github.io/ for basic usage and documentation.

paninij's People

Contributors

ddmills, dwtj, ekuntz11, eyhlin, hridesh, jlmaddox, terenberger


paninij's Issues

Current Dynamic Transfer Checks are Unsound

I believe I've discovered that the current method of checking transfer safety is unsound. In particular, the checks cannot prevent the following transfer violation:

@Capsule
public class Foo
{
    @Local Bar b1;
    @Local Bar b2;

    public void proc() {
        Secret s = new Secret();
        b1.send(s);  // ownership of s is transferred to b1
        b2.send(s);  // violation: the same object is also transferred to b2
    }
}

This illustrates that some pointers on the stack also need to be checked by the dynamic ownership checks. We need a new methodology for constructing dynamic transfer checks.

Capsule procedures must be declared public

Capsule procedures should only be generated if the procedure is labelled public. Anything else is considered a private or protected method.

Currently, if a procedure has no modifier (package visibility), the processor will attempt to build an interface which matches the template; however, the interface makes all methods public, and there is then a discrepancy between the template and the generated code.
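A hypothetical template illustrating the mismatch (names assumed): the package-private procedure below would be copied into the generated capsule interface, where all methods are implicitly public, so the template and the generated interface disagree about visibility.

@Capsule
public class LoggerTemplate {
    void log(String msg) {   // package-private: should not be turned into a procedure
        System.out.println(msg);
    }
}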

Procedure Return Type: Classes with public fields are unsafe

Suppose some capsule has a procedure of the form Point ones(), where Point is a Java class defined as

public class Point {
  public int x;
  public int y;
  public void incX() {
    x++;
  }
}

and ones() is defined as

  Point ones() {
    Point rv = new Point();
    rv.x = 1;
    rv.y = 1;
    return rv;
  }

This will trigger @PaniniJ to create a duck which extends Point. This duck, any duck which extends Point, and any other duck which extends a class with public fields are all unsafe in three ways:

  • Data Consistency
  • Data Races
  • Incorrect Aliases

In both @PaniniJ and panc, ducks extend a class in such a way that the user-defined methods of the duck delegate to a "results" object (i.e. the object with which the duck is resolved). This delegation strategy is safe when all user code interacts with the duck's methods: the object's data is kept consistent by consistently delegating to the encapsulated object, and race conditions are avoided by blocking on all method calls until the duck has been resolved. (A simplified sketch of such a duck follows the list below.)

However, by introducing public fields, a user can modify the fields of a duck without also modifying the state of the encapsulated results object. In the Point example, there are two copies of the two fields.

  • The coordinates on the duck itself are what the user will be interacting with when accessing the duck's fields.
  • The coordinates on the encapsulated results object are what the user will be interacting with whenever using the duck's methods.
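Below is a simplified sketch of such a duck, assuming a latch-based resolution mechanism; the class name, the resolve() method, and the blocking mechanism are illustrative only, not the actual generated code.

import java.util.concurrent.CountDownLatch;

public class Point$Duck extends Point {
    private Point result;                          // the encapsulated "results" object
    private final CountDownLatch resolved = new CountDownLatch(1);

    public void resolve(Point r) {                 // called when the procedure's result is ready
        this.result = r;
        resolved.countDown();
    }

    @Override
    public void incX() {
        awaitResolution();                         // block until the duck is resolved
        result.incX();                             // delegates: updates result.x, NOT this.x
    }

    private void awaitResolution() {
        try { resolved.await(); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
    }
}

Reading or writing duck.x touches the fields inherited from Point, while incX() touches result.x; nothing keeps the two copies in sync.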

These two sets of fields are not going to be consistent unless we implement a solution which keeps them consistent. We could implement a solution which copies each of the fields of the results object to the duck's fields when the duck is resolved. We could even implement a solution which syncs the two sets of fields at the entrance and exit of the duck's methods.

However, implementing these synchronization mechanisms will not solve two important problems:

  • Data Races: The user is able to read from and write to the duck's fields before that duck has even been resolved.
  • Incorrect Aliases: Some Java objects in the system may refer to the encapsulated object when they should be pointing to the duck. What would be the correct synchronization behavior if a Point's x coordinate was changed on both the duck and the encapsulated result? I think that the most recent change should be used; however, there would be no way of knowing which one was modified most recently.

This idea of trying to synchronize the fields of two objects just can't work when the user is allowed to write to fields. You might say that the fields need to be declared final, but this solution won't work either: if the duck's fields are final, then they cannot be updated once the duck has been resolved.

All of this seems to lead me to the conclusion that there is no way to safely allow non-private fields on a duck when the delegation strategy is used.

Target Java 1.7

This requires getting rid of a few places where we use lambdas and streams in the code-gen code.
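For example (illustrative only), a sort written with a Java 8 lambda would need to be rewritten as an anonymous class:

import java.util.Collections;
import java.util.Comparator;
import java.util.List;

class SortExample {
    // Java 8 style, not allowed when targeting 1.7:
    //   Collections.sort(names, (a, b) -> a.compareTo(b));

    // Java 7 style: an anonymous class instead of a lambda.
    static void sortNames(List<String> names) {
        Collections.sort(names, new Comparator<String>() {
            @Override
            public int compare(String a, String b) {
                return a.compareTo(b);
            }
        });
    }
}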

Procedure Return Type: Classes defined in java or javax

Original issue by dwtj at: https://github.com/dwtj/panini/issues/41

In order to resolve issue https://github.com/dwtj/panini/issues/40, most ducks are put into the same package as the class which they are extending. This presents an important problem whenever a duck extends a class which is defined somewhere within the java or javax package hierarchies: I believe that the Oracle JVM does not allow the instantiation of any user-defined classes within these package hierarchies. Attempting to do so throws an exception of the form java.lang.SecurityException: Prohibited package name.
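For example, simply placing a user-defined class in a java.* package demonstrates the restriction (the class name is hypothetical):

package java.util;

// Hypothetical user-defined class placed inside the java package hierarchy.
// Defining/loading this class on a stock Oracle/OpenJDK JVM fails with:
//   java.lang.SecurityException: Prohibited package name: java.util
public class MyDuck {
}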

More on this issue is described in a Stack Overflow page; according to some of the answers there, there is no way around this.

There are some cases in which we could work around this issue without causing the problems described in issue https://github.com/dwtj/panini/issues/40: if the class that needs to be extended by a duck is defined within either java or javax and that class has NO package protected methods, then the corresponding duck can be put into the DEFAULT_DUCK_PACKAGE.

In the worst case, where the class that needs to be extended by a duck is defined within either java or javax and that class has a package protected method, then the following workaround could be used: the corresponding duck could be put into the DEFAULT_DUCK_PACKAGE and each of the package protected methods can be defined to just throw a runtime error (e.g. an UnsupportedOperationException). If we did this, then the duck would not be trying to delegate to a method to which it does not have permissions.

(There may also be some tricks that can be performed using reflection, but I don't have any insight on how and I am suspicious about whether there would be some problematic runtime costs.)

Procedure Return Type: Final Classes not supported

We cannot make a duck future which wraps a final class, because we need to be able to extend it.

Final classes also include all of the standard primitive wrappers (Integer, Double, etc.) as well as String.

When a final class is returned from a procedure, we should block and emit a compiler warning.
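For example, a procedure like the following (hypothetical template) cannot be duck-futurized because String is final; under the behavior proposed above, the call would block and a warning would be emitted:

@Capsule
public class GreeterTemplate {
    public String greeting() {   // String is final: no duck can extend it
        return "hello";
    }
}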

All fields of a Capsule that are of capsule type must have @local or @imports annotations

If a capsule references another capsule through a field, that field must be annotated with @Local or @Imports.

foreach field on capsule template {
  if field's type is a capsule interface and the field is not annotated with @Local or @Imports
    error
  else if there does not exist a template currently being compiled whose prefix matches the field's type
    error
  end if
}

For this, we will need a list, populated at the start of the annotation processor, of all the templates that are being built.
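A rough Java sketch of this check as an annotation-processor pass; the helpers isCapsuleInterface, knownTemplatePrefixes, and error are assumed, as are the exact @Local and @Imports annotation class names:

import javax.lang.model.element.TypeElement;
import javax.lang.model.element.VariableElement;
import javax.lang.model.util.ElementFilter;

void checkCapsuleFields(TypeElement template) {
    for (VariableElement field : ElementFilter.fieldsIn(template.getEnclosedElements())) {
        if (!isCapsuleInterface(field.asType())) {
            continue;  // only capsule-typed fields are constrained by this rule
        }
        boolean annotated = field.getAnnotation(Local.class) != null
                         || field.getAnnotation(Imports.class) != null;
        if (!annotated) {
            error(field, "capsule-typed field must be annotated with @Local or @Imports");
        } else if (!knownTemplatePrefixes().contains(field.asType().toString())) {
            error(field, "no template currently being compiled matches this field's type");
        }
    }
}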

Procedure Argument Type: Variable length

We have not investigated whether any of these capsule procedure declarations work:

public void foo(Object[] baz) {...}
public void foo(int[] baz) {...}
public void foo(Object... baz) {...}
public void foo(int... baz) {...}

Arguments are vastly simpler than the return types because they do not need to get futurized.

However, there are two potential problem areas with these argument types.

  1. How the message IDs (for the capsule queues) get serialized: is the current scheme enough? This just needs to be tested.
  2. A variable-length parameter can have a primitive element type (the fourth example above). However, there comes a time when we forward (or delegate) these arguments from a concrete capsule to its template, and values that have been boxed along the way (e.g. stored as an Integer[] or Object[]) cannot be passed where int... is expected:
public void foo(int... baz) {...}

Integer[] boxed = {1, 2, 3, 4};
foo(boxed); // Does not compile: an Integer[] cannot be passed for int...

Document Runtime Package

The runtime package will be the most used and accessed package, so it can serve as the main point of documentation for users.

Rename Project to @PaniniJ

Original issue by dwtj at: https://github.com/dwtj/panini/issues/36

Anything named PaniniPress should be changed to @PaniniJ. This will affect the PaniniPress annotation processor class and maybe some maven-related configuration.

Note that when the PaniniPress class name is changed, the capsule-generation project's service file will also need to be changed.

Procedure Argument Type: Arrays

Original issue by TErenberger at: https://github.com/dwtj/panini/issues/17

Currently, a method that generates a duck cannot take an array as an argument. Class imports are currently handled in a very shallow way, which causes problems with things like type parameters and arrays, where the actual class is beneath the surface.

There needs to be a way of finding these embedded classes and including them in the duck's import list.
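For example (hypothetical template and types), the duck generated for the procedure below needs to import com.example.Widget, but that type only appears as an array component type, so a shallow scan of the parameter types misses it:

import com.example.Widget;   // hypothetical type that the generated duck must also import

@Capsule
public class WorkerTemplate {
    public Widget merge(Widget[] parts) {   // Widget only appears "under" the array type
        return parts[0];
    }
}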

[pipeline-refactor] Rules for @Future and @Block

@Future and @Block are annotations that can be added to a capsule's procedures to explicitly state either that you want the procedure to return a java.util.concurrent.Future or that you want the procedure to block until the value is obtained.

  1. A procedure cannot be annotated with both @Future and @Block
  2. Reserved declarations (run, design, init) cannot be annotated with @Future or @Block
  3. Private capsule methods cannot be annotated with @Future and @Block
  4. Procedures on signatures can be annotated with @Future or @Block; any class that implements the signature and its procedures does not need to annotate them with @Future or @Block
  5. Procedures which return a Final type will by default block
  6. Procedures which return a Primitive type will by default block
  7. Procedures which return a type with a final method will by default block
  8. A procedure which returns a primitive type and has a @Future annotation must autobox the result in order to futurize it

If a procedure returns a type which cannot be converted into a duck future, then it will block by default and a compiler warning will be emitted.
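A hypothetical template illustrating a few of these rules (names assumed):

@Capsule
public class CalculatorTemplate {
    @Future
    public int add(int a, int b) {   // rule 8: the int result is autoboxed so it can be futurized
        return a + b;
    }

    @Block
    public String describe() {       // rules 5/6: String is final, so this would block by default anyway
        return "calculator";
    }

    // Not allowed (rule 1): a procedure annotated with both @Future and @Block.
    // Not allowed (rule 2): annotating run(), design(), or init() with @Future or @Block.
}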

Procedure Return Type: Primitives not supported

Java primitives cannot currently be returned from procedure calls; you must use a non-final, non-package-protected wrapper class instead.

Because of the nature of primitives and the operators that work on them, these procedures will need to block.

We could also potentially return a futurized version of the standard wrapper class (e.g. Integer) when the procedure definition carries an explicit @Future annotation.
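A sketch of the current workaround (the wrapper class here is hypothetical): wrap the primitive in a user-defined, non-final, public class so that a duck can extend it.

public class IntBox {
    private int value;
    public IntBox() { }
    public IntBox(int value) { this.value = value; }
    public int get() { return value; }
}

// In a capsule template, return the wrapper instead of the primitive:
public IntBox count() {
    return new IntBox(42);
}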

Generated Capsules: Conditions About Whether a Capsule Needs a `main()` Method Are Too Naive

Original issue created by dwtj at: https://github.com/dwtj/panini/issues/42

As of commit f722f95, main() methods are added to every capsule which is both active and has no @wired capsules. (In that commit, the term "root" capsule was used to refer to capsules which met both of these conditions.) However, there are examples which demonstrate that these are not sufficient criteria for determining whether a capsule needs a main() method.

Consider a passive capsule type named Supervisor with five child capsule fields, each of type Worker. Suppose that

  • The Supervisor is passive.
  • The Supervisor wires itself into each of the five Worker capsules.
  • Each Worker is active.

In this case, the user should be able to start up an instance of a Supervisor, so a Supervisor should have a main(). However, if we use the current naive criteria, the Supervisor is not given a main() method.
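A sketch of this scenario (the annotations follow the conventions used elsewhere in these issues; the wiring mechanism is omitted because its exact form is not specified here):

@Capsule
public class SupervisorTemplate {          // passive: declares no run()
    @Local Worker w1, w2, w3, w4, w5;      // five active Worker children
    // During initialization, this Supervisor is wired into each Worker.
}

// (separate source file)
@Capsule
public class WorkerTemplate {              // active: declares run()
    @Imports Supervisor supervisor;        // reference provided by the Supervisor
    public void run() { /* ... */ }
}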

I'm not exactly sure about the right way to solve this in general. But ideally, we want a solution which is defined modularly. By this I mean that we should be able to determine whether a given capsule type needs a main() from just the capsule implementation and the interfaces of the capsules to which it has references.

Are there any simple workarounds that we could use to fix this problem? For example, could we use the current naive criteria but then allow the user to opt into the inclusion of a main() method with some sort of syntax (e.g. an annotation or an argument to the @Capsule annotation)?

[capsule-generation/*] Need to generate both active and passive capsules

Original by dwtj at: https://github.com/dwtj/panini/issues/27

What are the differences between active and passive capsules? What parts of the capsule generation process need to be handled differently, depending on whether the capsule being generated is active or passive? One obvious example is the run() method: in the case of passive capsules, we need to auto-generate a run() method which unpacks ducks (and other messages) and handles them appropriately; in the case of active capsules, the user-defined run() method needs to be delegated to.

What other differences are there? Do we even need the queue on active capsules? Maybe. I'm not sure. Maybe active capsules need the queue in order to handle messages other than ducks (e.g. shutdown and terminate). If we don't need the queue, then should active capsules with the Thread execution profile actually inherit from some different class than Capsule$Thread?
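A minimal sketch (all names hypothetical, and not the actual generated code) of the kind of event loop an auto-generated run() for a passive capsule would need:

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

abstract class PassiveCapsuleSketch {
    protected final BlockingQueue<Object> queue = new LinkedBlockingQueue<>();
    private static final Object SHUTDOWN = new Object();

    public void run() {
        try {
            while (true) {
                Object msg = queue.take();   // block until a message arrives
                if (msg == SHUTDOWN) {
                    break;                   // control message: stop the event loop
                }
                dispatch(msg);               // procedure message: invoke it and resolve its duck
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    protected abstract void dispatch(Object procedureMessage);
}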

If indeed there are many differences, then we should consider carefully how we separate and organize passive-related code from active-related code. (The creation of another Make* class may be one way.)

Generated Capsules: Procedure ID's cannot be ambiguous

Generated capsule IDs are currently namespaced, but identifiers cannot contain periods, so the periods are replaced with underscores ("_"). Array types (which end with brackets, "[]") also need to be handled, and the brackets are currently replaced with "Array". However, these names can still be ambiguous, since a class name could potentially collide with the auto-generated IDs.
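For example (hypothetical class and procedure names), after mangling, both of the following overloads would map to the same generated ID:

class intArray { }                        // hypothetical user-defined class

public void process(int[] data) { }       // mangled to something like PROCESS_intArray
public void process(intArray data) { }    // collides: the class name intArray mangles to the same ID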

One solution would be to trash the named IDs and just use integers everywhere.
Another solution could be a reversible hash, which could potentially be slower.

Consider Simplifying Source Generation Using Library

The source code generation in @paninij is complex, inflexible, and prone to breaking with even minor changes. These properties stem from the fact that the source generation process works by concatenating strings in an unstructured way, without any type-safety support.

If at some point we need to make significant changes to the way in which source code is generated, it may be worthwhile to refactor some code to use a library meant for generating Java source code. Just today I became aware of two such libraries:

I suspect that there may also be other such libraries.

Note that it may be necessary to change our source generation code sometime soon to help Ganesha add the hybrid execution profile.
