Comments (3)
Could you provide the code examples to show the problem?
from smt.
The script I use is the following:

from __future__ import print_function, division
import numpy as np
from scipy import linalg
import scipy.interpolate
from smt.utils import compute_rms_error
from smt.problems import WingWeight, Sphere
from smt.sampling_methods import LHS, Random, FullFactorial
from smt.surrogate_models import LS, QP, KPLS, KRG, KPLSK, GEKPLS
try:
    import matplotlib.pyplot as plt
    from mpl_toolkits.mplot3d import Axes3D
    from matplotlib import cm
    import pylab
    plot_status = True
except ImportError:
    plot_status = False
########### Initialization of the problem, construction of the training and validation points
ndim = 10   # number of variables
ndoe = 300  # number of training points
# Construction of the DOE for the function
fun = WingWeight(ndim=ndim)
sampling = FullFactorial(xlimits=fun.xlimits)
b = sampling(ndoe)
# Compute the outputs
f = fun(b)
# Construction of the validation points
nval = 100
sampling = LHS(xlimits=fun.xlimits, criterion='cm')
bval = sampling(nval)
fval = fun(bval)
# Plot of validation vs sampling points
if plot_status:
    fig = plt.figure(1)
    ax = fig.add_subplot(2, 1, 1)
    ax.plot(f, f, 'o', label='sampling points')
    ax.plot(fval, fval, 'r.', label='validation points')
    ax.set_xlabel('F')
    ax.set_ylabel('F')
    ax.legend(loc='upper left')
    plt.title('Plot of validation vs sampling points')
    ax.grid()
    pylab.show()
#-------------------------- The KPLSK model --------------------------------#
t = KPLSK(n_comp=ndim, theta0=[1e-2] * ndim, poly='linear', corr='squar_exp')
t.set_training_values(b, f)
t.train()
# Prediction of the validation points
fnew = t.predict_values(bval)
print('KPLSK model, err: ' + str(compute_rms_error(t, bval, fval)))
if plot_status:
    # Plot the true values against the predicted values
    fig = plt.figure()
    plt.plot(fval, fval, '-', label=r'$F_{true}$')
    plt.plot(fval, fnew, 'r.', label=r'$\hat{F}$')
    plt.xlabel(r'$F_{true}$')
    plt.ylabel(r'$\hat{F}$')
    plt.legend(loc='upper left')
    plt.title('KPLSK model: validation of the prediction model')
    plt.show()
# Value of theta
print('theta values', t.optimal_theta)
Which fails with the following error when the number of training points is ndoe < 600:
*Training ...
/home/dmitris/.local/lib/python3.8/site-packages/sklearn/cross_decomposition/_pls.py:353: UserWarning: X scores are null at iteration 9
  warnings.warn('X scores are null at iteration %s' % k)
capi_return is NULL
Call-back cb_calcfc_in__cobyla__user__routines failed.
Traceback (most recent call last):
  File "KPLSK_easy_wing.py", line 73, in <module>
    t.train()
  File "/home/dmitris/Programs/SMT/smt-master/smt/surrogate_models/surrogate_model.py", line 248, in train
    self._train()
  File "/home/dmitris/Programs/SMT/smt-master/smt/surrogate_models/krg_based.py", line 123, in _train
    self._new_train()
  File "/home/dmitris/Programs/SMT/smt-master/smt/surrogate_models/krg_based.py", line 111, in _new_train
    self.optimal_rlf_value, self.optimal_par, self.optimal_theta = self._optimize_hyperparam(
  File "/home/dmitris/Programs/SMT/smt-master/smt/surrogate_models/krg_based.py", line 450, in _optimize_hyperparam
    optimal_theta = 10.0 ** optimize.fmin_cobyla(
  File "/usr/lib/python3/dist-packages/scipy/optimize/cobyla.py", line 166, in fmin_cobyla
    sol = _minimize_cobyla(func, x0, args, constraints=con,
  File "/usr/lib/python3/dist-packages/scipy/optimize/cobyla.py", line 250, in _minimize_cobyla
    xopt, info = _cobyla.minimize(calcfc, m=m, x=np.copy(x0), rhobeg=rhobeg,
  File "/usr/lib/python3/dist-packages/scipy/optimize/cobyla.py", line 242, in calcfc
    f = fun(x, args)
  File "/home/dmitris/Programs/SMT/smt-master/smt/surrogate_models/krg_based.py", line 414, in minus_reduced_likelihood_function
    return -self._reduced_likelihood_function(theta=10.0 ** log10t)[0]
  File "/home/dmitris/Programs/SMT/smt-master/smt/surrogate_models/krg_based.py", line 207, in _reduced_likelihood_function
    raise Exception(
Exception: F is too ill conditioned. Poor combination of regression model and observations.
For ndoe >= 600, the training process proceeds uninterrupted.
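The final exception says the regression matrix F is too ill conditioned. A numpy-only sketch of how "aligned" design points can cause this (the level counts below are hypothetical, not SMT's actual FullFactorial internals): when a full-factorial budget is spread over many dimensions, some dimensions can end up with a single level, so the corresponding column of the linear regression matrix F = [1, X] is constant and F loses rank.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Aligned" design: 2 levels in the first two dims, but only 1 level in the
# third (hypothetical truncated full factorial).
levels = [np.array([0.0, 1.0]), np.array([0.0, 1.0]), np.array([0.5])]
grids = np.meshgrid(*levels, indexing="ij")
X_grid = np.column_stack([g.ravel() for g in grids])  # shape (4, 3)

# Space-filling design: random points in the same box.
X_rand = rng.random((4, 3))

def linear_F(X):
    """Regression matrix for poly='linear': a column of ones plus X."""
    return np.column_stack([np.ones(len(X)), X])

F_grid, F_rand = linear_F(X_grid), linear_F(X_rand)

# The constant third column of X_grid is a multiple of the ones column,
# so F_grid is rank-deficient; the random design gives a full-rank F.
print(np.linalg.matrix_rank(F_grid), F_grid.shape[1])  # 3 4
print(np.linalg.matrix_rank(F_rand), F_rand.shape[1])  # 4 4
```

With a rank-deficient (or nearly rank-deficient) F, the likelihood evaluation inside the kriging training cannot proceed, which matches the exception raised in krg_based.py above.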
If you try with LHS it works fine. The thing is, with FullFactorial the points are "aligned", and this can lead to an ill-conditioned kriging matrix. Moreover, when using KPLSK you should take a number of components n_comp smaller than ndim. Indeed, the purpose is to solve the problem in the reduced space of principal components.