
paramz's People

Contributors

alexgrig, beckdaniel, befelix, connorfuhrman-episci, martinbubel, mrksr, mzwiessele, robertocalandra, trevorcampbell, vmarkovtsev, zhenwendai


paramz's Issues

parallel optimization is broken

When attempting to optimize in parallel, an error is raised because the optimization objects returned by the pool fail when being unpickled. I submitted PR #21 for this, including a new test that should catch the issue in the future; it turns out the test I added before was not sufficient.
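A minimal regression-test sketch for this kind of failure (names are hypothetical stand-ins; the actual fix lives in PR #21) is to pickle the optimization object and check it survives a round trip, since a pool worker's return value always travels through pickle:

```python
import pickle

class Optimization:
    """Stand-in for the optimization result object sent through the pool."""
    def __init__(self, x_opt, f_opt):
        self.x_opt = x_opt
        self.f_opt = f_opt

def roundtrip(obj):
    # A pool worker's return value is pickled on the worker side and
    # unpickled on the parent side; this mimics that exact path.
    return pickle.loads(pickle.dumps(obj))

opt = Optimization(x_opt=[1.0, 2.0], f_opt=-3.5)
restored = roundtrip(opt)
assert restored.x_opt == opt.x_opt and restored.f_opt == opt.f_opt
```

If the object's `__getstate__`/`__setstate__` pair is broken, this fails in the same way the pool does, but without the multiprocessing machinery in the way.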

What does m.name[0].tie_to(other) do?

Hi everybody,
In the documentation of the Param class I found a mention of a method called tie_to, which, as I understand it, should tie a parameter in one model to some other parameter in the same or another model, but I cannot find where in the code this is implemented. Maybe you can give me some pointers?

Some more details about what I am trying to do.
Say we have a custom Parameterized class, called A, with a parameter alpha and two fields field1 and field2, both instances of another custom Parameterized class B with their own set of parameters [kappa, theta]. I would like to tie the value of A.alpha to A.field1.kappa and A.field2.kappa, while leaving theta independent for the two fields. Ideally, the parameters of A should be [alpha, kappa1, kappa2]. Is it somehow possible to implement this behaviour?

Thanks and best regards,

Giacomo

IS THIS PROJECT STILL ALIVE

Hello,
There are some frameworks using this package (e.g. GPy), so my question is: is this project still alive?
The last activity seems to be from 2019.

If not, it seems to need new maintainers.

tests seem broken

I tried to convert the test setup from .travis.yml to both GitHub Actions and a local run, but it always finishes with "0 tests detected".

Probably nose has changed over time?

Add LICENSE to MANIFEST

Dear all,
could you please add the BSD-3 license file to the MANIFEST, so that it is included in the tarball?

Cheers,
Simon

paramz print unwanted messages in console output

paramz/transformations.py is full of print() calls that cannot be filtered out. This can be cumbersome when running many optimizations (e.g. for Monte Carlo simulations). It would be nice to have a way to disable these messages.
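Until paramz grows a verbosity switch, one workaround is to suppress stdout around the noisy calls with the standard library:

```python
import io
from contextlib import redirect_stdout

def silently(fn, *args, **kwargs):
    """Run fn while discarding anything it print()s to stdout."""
    with redirect_stdout(io.StringIO()):
        return fn(*args, **kwargs)

# e.g. result = silently(m.optimize)  # m.optimize's prints are swallowed
```

Note this only catches print() to sys.stdout; output going through the logging module or C-level writes would need a different approach.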

Minimum required version for decorator

Hey,

please add a minimum required version for the decorator package to the setup.py file. paramz does not seem to work with decorator 3.4, which is the default version on Ubuntu 14.04. Everything works fine after updating decorator to the latest version (4.0.10).
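A sketch of the requested pin in setup.py (the exact minimum working version would need to be bisected; 4.0.10 is simply the version the report above confirms as good):

```python
# setup.py (fragment; version floor assumed from the report above)
setup(
    # ...
    install_requires=['decorator>=4.0.10'],
)
```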

Steps to reproduce:

Install decorator version 3.4.0

import GPy
import numpy as np
k = GPy.kern.Matern32(1)
k.K(np.array([[1]]))
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-7-457009ae13cf> in <module>()
----> 1 k.K(np.array([[1]]))

/usr/local/lib/python2.7/dist-packages/GPy/kern/src/kernel_slice_operations.pyc in wrap(self, X, X2, *a, **kw)
     83     @wraps(f)
     84     def wrap(self, X, X2 = None, *a, **kw):
---> 85         with _Slice_wrap(self, X, X2) as s:
     86             ret = f(self, s.X, s.X2, *a, **kw)
     87         return ret

/usr/local/lib/python2.7/dist-packages/GPy/kern/src/kernel_slice_operations.pyc in __init__(self, k, X, X2, diag, ret_shape)
     52         if (self.k._all_dims_active is not None) and (self.k._sliced_X == 0):
     53             self.k._check_active_dims(X)
---> 54             self.X = self.k._slice_X(X)
     55             self.X2 = self.k._slice_X(X2) if X2 is not None else X2
     56             self.ret = True

/usr/local/lib/python2.7/dist-packages/paramz/caching.pyc in g(obj, *args, **kw)
    274             obj = args[0]
    275             if not hasattr(obj, 'cache'):
--> 276                 obj.cache = FunctionCache()
    277             cache = obj.cache
    278             if self.f in cache:

AttributeError: 'numpy.ndarray' object has no attribute 'cache'

Using deprecated/dangerous method to set 'data' in numpy array

From the numpy 1.12.0 release notes:

Assignment of ndarray object's data attribute

Assigning the 'data' attribute is an inherently unsafe operation as pointed out in gh-7083. Such a capability will be removed in the future.

This appears to happen where we assign directly to the data attribute of param_array and gradient_full. I've noticed warnings from

paramz\parameterized.py:266
paramz\parameterized.py:267
paramz\core\parameter_core.py:284
paramz\core\parameter_core.py:285
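The deprecated pattern and a safe replacement, sketched in isolation. This assumes the intent at those call sites is to update values rather than to re-point the underlying buffer; in-place slice assignment copies the contents and keeps existing views valid:

```python
import numpy as np

param_array = np.zeros(3)
view = param_array[1:]   # e.g. a sub-parameter view held elsewhere

# Deprecated / unsafe (rebinds the buffer, numpy gh-7083):
#     param_array.data = other_array.data
# Safe when the goal is to update values in place:
param_array[:] = np.array([1.0, 2.0, 3.0])
```

After the slice assignment, `view` still aliases the same memory and sees the new values, which buffer rebinding does not guarantee.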

Constraint that makes parameters sum to one?

Hello,

I am wondering whether paramz provides a function that constrains specified parameters to sum to one. For example, if I have two parameters, say pi_1 and pi_2, I want pi_1 + pi_2 = 1.

Does tie_to() perhaps implement this? (I couldn't find an explanation of tie.)

Best,
pc
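paramz does not seem to offer such a joint constraint out of the box; a common workaround is to reparameterize: let the optimizer work on an unconstrained vector and map it through a softmax, so the model only ever sees weights that sum to one (a sketch, not paramz API):

```python
import numpy as np

def softmax(z):
    """Map an unconstrained vector onto the probability simplex."""
    e = np.exp(z - np.max(z))  # subtract the max for numerical stability
    return e / e.sum()

z = np.array([0.3, -1.2])   # free parameters the optimizer sees
pi = softmax(z)             # pi[0] + pi[1] == 1 by construction
```

Gradients with respect to z then come from the chain rule through the softmax, so the sum-to-one constraint never has to be enforced explicitly.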

How can I get the value function in each iteration of optimization?

Hi,

I am using GPy, and since it uses this package for model representation I think I should look here for my answer. Basically, I want to get the value of the cost function at each iteration when m.optimize() is called. I know that if I set message=True I will get some output, but I need the details of every iteration.

Is there a way to get access to the callback function used in scipy's minimize method? Any help would be appreciated.

Thanks,
Sahand
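One framework-agnostic workaround is to wrap the objective so that every evaluation is recorded; optimizers call the objective at least once per iteration, so the log gives a per-evaluation trace even without access to scipy's callback hook (a sketch; the hookup to the model's objective is hypothetical):

```python
class ObjectiveLogger:
    """Wraps a callable and records every value it returns."""
    def __init__(self, fn):
        self.fn = fn
        self.history = []

    def __call__(self, x):
        value = self.fn(x)
        self.history.append(value)
        return value

# Usage with any optimizer that accepts a plain callable, e.g.:
#   logged = ObjectiveLogger(objective)          # objective: hypothetical
#   scipy.optimize.minimize(logged, x0, ...)
#   print(logged.history)                        # one entry per evaluation
```

Note the history contains one entry per objective *evaluation*, which can be more than one per optimizer iteration (e.g. during line searches).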

How can I use the scipy optimizer to optimize my model?

I am trying to accomplish the following: optimize the parameters of a model using grid search.
For this, I want to avoid the model.optimize method and instead change the model's parameters using a vector with the same shape as the parameter vector.

Assume we have the following model (and that we have an evaluation function get_loss):

m = GPy.models.GPRegression(X, Y)

for coordinate_tuple in grid:
    m.assign_parameters(coordinate_tuple) # where coordinate_tuple.shape == m.param_array.shape
    get_loss(m)
    ...

How can I modify the parameter vector? Can I simply re-assign
m.parameters = coordinate_tuple, or does it require more operations?
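In GPy/paramz, the whole flat parameter vector can typically be assigned with slice assignment on the model itself (m[:] = vector), which also triggers the update machinery; assign_parameters above is not an actual method. A pure-numpy sketch of the grid-search loop, with a stand-in loss in place of the model:

```python
import numpy as np
from itertools import product

def get_loss(params):
    # Stand-in for evaluating the model; with a real model you would
    # first do  m[:] = params  (GPy/paramz slice assignment) and then
    # read off e.g. -m.log_likelihood().
    return np.sum((params - 0.5) ** 2)

axis = np.linspace(0.0, 1.0, 5)
grid = product(axis, repeat=2)   # all 2-D parameter combinations

best = min(grid, key=lambda p: get_loss(np.array(p)))
# best is the grid point with the lowest loss, here (0.5, 0.5)
```

The repeat count and axis would be chosen to match m.param_array.shape for the model at hand.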

IPython checking in verbose_optimization.py doesn't work

In optimization/verbose_optimization.py, the following code tries to check whether it's being run from within an IPython notebook:

try:  # pragma: no cover
    from IPython.display import display
    from ipywidgets import IntProgress, HTML, Box, VBox, HBox
    self.text = HTML(width='100%')
    self.progress = IntProgress(min=0, max=maxiters)
    #self.progresstext = Text(width='100%', disabled=True, value='0/{}'.format(maxiters))
    self.model_show = HTML()
    self.ipython_notebook = ipython_notebook
except:
    # Not in IPython notebook
    self.ipython_notebook = False

but when I run code in my shell (not a notebook) that calls this, self.ipython_notebook is True after this block executes.

According to this stackoverflow post, a good way to do this check is

try:
    get_ipython
    self.ipython_notebook = True
except:
    self.ipython_notebook = False
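Wrapped into a self-contained helper (using except NameError rather than a bare except, since NameError is exactly what a missing get_ipython raises; note this detects IPython in general, not the notebook specifically, same as the snippet above):

```python
def in_ipython():
    """True when running under IPython/Jupyter, False in a plain shell."""
    try:
        get_ipython  # injected into builtins by IPython at startup
        return True
    except NameError:
        return False
```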

For reference:
python 3.6.7
ipython 6.5.0
ipywidgets 7.4.0
paramz 0.9.4
