Hello Andreas, I am closing this one, but feel free to reopen it with a feature request. We could indeed provide more convenience tooling, for example an abstraction for derived parameters (they'd have to be functions of the original parameters).
from syne-tune.
Hello,
Syne Tune does not support conditional configuration spaces. This is mostly because we've been focused on Bayesian optimization, where supporting conditional spaces is difficult.
For your first example, I'd use two variables: x [randint] and x_is_none [choice]. For the second, as you said, I'd encode the difference as non-negative. Again, we do not support fixed constraints between parameters, as that would require constrained optimization of the acquisition function.
I think it would be nice to have some sugar that allows encoding factors or exponents; often you want powers of two, or factors of difference between parameters. These can be taken into account downstream, but if Syne Tune could do the translation internally, that would make it easier to use.
Indeed, I can encode the factor myself, but then the space Syne Tune operates in differs from the space I want to see in MLflow.
It wouldn't require any changes to the BO to allow this translation, and it would also allow encoding things like powers of two more naturally.
Anyway, thanks for your replies so far. I'm actually running it on a real system now; let's see how it goes.
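One way to picture that translation layer (the names log2_batch_size, size_factor, etc. are made up for illustration; none of this is Syne Tune API): the optimizer samples in a convenient space, and a small mapping produces the values the training code and MLflow would see.

```python
# Optimizer-side config uses exponents/factors; user-side values are derived.
def to_user_space(config):
    return {
        "batch_size": 2 ** config["log2_batch_size"],                     # power of two
        "hidden_size": config["size_factor"] * config["encoding_size"],   # multiple
    }

opt_config = {"log2_batch_size": 5, "size_factor": 2, "encoding_size": 128}
print(to_user_space(opt_config))  # {'batch_size': 32, 'hidden_size': 256}
```

The BO machinery never changes: it only ever sees the optimizer-side space, and the mapping is applied just before training and logging.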
Hi Andreas,
Regarding powers of two (exponents), this is already supported (unless I misunderstood your question).
For instance:

```python
from syne_tune.config_space import logfinrange

# 5 log-spaced values between 8 and 128: 8, 16, 32, 64, 128
hp = logfinrange(lower=8, upper=128, size=5, cast_int=True)
print(hp.sample(size=10))
```
What do you mean by "factor of differences"?
Oh, it is; thank you, I overlooked that. I think I tried lograndint, but that wasn't the right one. Thank you!
What I mean is something like "I want the hidden size of the transformer to be somewhere between 1x and 4x the encoding size". Basically, you want a conditional distribution of one parameter given another, and one of the simplest forms of that is making one a random multiple of the other.
Oh, I see. We do not support hyperparameters that are functions of other hyperparameters at the moment. I think it would make sense to support general conditional search spaces, but that is a larger effort; in the meantime, having a "ratio" hyperparameter is probably the easiest option, I would say.
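Concretely, the ratio workaround might look like this. Plain `random` stands in for a uniform domain so the sketch is self-contained; the names and ranges are illustrative, not Syne Tune API:

```python
import random

def sample_config(encoding_size, rng=random):
    # Tune the ratio (1x to 4x) instead of hidden_size itself;
    # hidden_size is then derived inside the training script.
    ratio = rng.uniform(1.0, 4.0)
    return {
        "encoding_size": encoding_size,
        "hidden_size": int(round(ratio * encoding_size)),  # derived, not tuned directly
    }

cfg = sample_config(128)
print(cfg)  # hidden_size falls between 128 and 512
```

Because the ratio is an ordinary unconditional hyperparameter, any searcher can handle it, and the 1x-4x constraint is satisfied by construction.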
I guess I was wondering how many of the conditional-space cases can be covered by a reparametrization, which wouldn't require any changes to the optimizer, only to how the search space is presented to it.
I guess that's always possible, but the question is whether any auxiliary variables are required and how easy the transformed space is for a GP to learn.
I believe the auxiliary variables should be learnable without too much difficulty for a GP, but of course it depends on the exact ranges being used; let us know if you observe anything strange.