
montepython_public's Issues

cannot import name CosmoSevereError ?

Running Monte Python v2.0.4

with CLASS v2.2.0

Testing likelihoods for:
-> fake_planck_bluebook

Creating chains/lcdm/p/2014-04-22_1__2.txt

Traceback (most recent call last):
File "montepython/MontePython.py", line 41, in
sys.exit(run())
File "/.../montepython/montepython/run.py", line 40, in run
import sampler
File "/.../montepython/montepython/sampler.py", line 24, in
from classy import CosmoSevereError, CosmoComputationError
ImportError: cannot import name CosmoSevereError

Covariance matrix has negative eigenvalues

I am doing MCMC with the Planck likelihood plus additional custom parameters in the power spectrum (using generate_Pk.py). To get a covariance matrix, I did the following:

(1) Do a short run:

montepython run --conf default.conf -p default.param -c covmat/base.covmat -o res_mcmc_short -N  1000

Here the base.covmat is the default covariance matrix which comes with montepython. Note that base.covmat only has standard parameters in it, not my custom parameters.

(2) After the short run, a covariance matrix is generated with all parameters. I then plan to use the new covariance matrix for a long run. However, some eigenvalues of the new covariance matrix are negative, so at the first chain step the code stops with the error message

File "/home/wangyi/PublicCode/montepython/montepython/mcmc.py", line 116, in get_new_position
rd.gauss(0, 1)*data.jumping_factor

The error occurs because that line of code computes sqrt(eigenvalue_of_covmat), which cannot proceed when an eigenvalue is negative.

Could you check whether this is because I am doing something wrong, or whether it is a bug in the code? Thanks a lot!
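
For reference, a quick way to check the covariance matrix produced by the short run (a sketch; the file name below is a placeholder, and np.loadtxt skips the commented header line of MontePython .covmat files):

import numpy as np

covmat_file = 'res_mcmc_short/output.covmat'      # placeholder path to the generated covmat
cov = np.loadtxt(covmat_file)                     # the '#' header line with parameter names is skipped
eigenvalues = np.linalg.eigvalsh(cov)
print('smallest eigenvalue:', eigenvalues.min())  # a negative value breaks the sqrt in mcmc.py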

===== Attachments =====

(1) My custom spectrum and the generated covariance matrix (in the res_mcmc_short folder).

https://www.dropbox.com/s/1l312qzh2kjobs6/vary_all_mcmc.tar.gz?dl=0

(2) A Mathematica notebook showing that the covariance matrix has negative eigenvalues.

https://www.dropbox.com/s/h6313jrzg15t9jd/load_covmat.nb?dl=0

Execution errors and memory leaks in MCMC for modified CLASS

Hi,

I am working with MontePython and a modified version of CLASS that computes predictions for modified gravity theories. These models often have pathological features for certain values of the parameters (ghosts, instabilities in the perturbations, ...). The code runs several class_test checks, so when a problem occurs CLASS throws an error and stops execution. Although this works fine for normal use, I have found problems when running MCMCs.

Specifically, when I run a chain in the cluster, the code finishes much earlier than expected. Examination of the output shows that

  • first the code encounters several problematic models (instabilities, etc...)
  • the last few errors correspond to class_alloc problems ("could not allocate pth->thermodynamics_table with size 1966272"), then Montepython stops the execution.

It seems that these errors produce memory leaks in the code, possibly because after class_test stops the execution of a model, the modules are not freed (e.g. background_free, etc. are not called).

Please let me know if you have any hint on how to solve this problem.

I would also suggest for Montepython to keep a record of the problematic parameter values, together with the errors that they produced. This would be very useful for debugging and understanding certain models better.

Thanks,
Miguel

sigma8

Hello,
I am trying to insert sigma8 in the derived parameters. From what I can see in the classy.pyx and classy.pxd of the CLASS code version I am using (2.2.0), sigma8 seems to be a parameter that Montepython recognizes.
However when I put it in my param file

data.parameters['sigma8']    = [0.8,-1,-1, 0,     1,   'derived']

I obtain only zeros for sigma8.

Thanks in advance

Find the minimum of a distribution through a temperature option

On behalf of Yves Dirian:

Finally, in order to find the maximum of the likelihood distribution, we realized that the method of taking only a small jumping factor and restarting from the bestfit obtained with the global run was not efficient. By doing so the -Loglike does decrease a bit, and iterating the process leads each time to small decreases, but in the end the method does not seem to converge very fast. This is caused by the fact that the probability of accepting a new point is too large, which leads to the kind of dispersive behavior shown in the attached picture wo_temp.png.

We therefore decreased this probability by adding a ''temperature'' parameter T, such that the probability is now given by exp((Loglike(n) - Loglike(n-1))/T), in analogy with the Boltzmann factor. The method then seems much more appropriate for minimization procedures, as you can see in w_temp.png. For our runs we chose T=10^-2 and kept 0.1 for the jumping factor, in order to have a good acceptance rate.

We thought that you could maybe implement such an option in Montepython, since it appears to be really efficient and does not seem very time consuming. Of course this method has some limits, but it seems to me that if you start from the bestfit of a complete global run, and if your distribution is well peaked, there should be no problem in using it.

Required Numpy version too low

Hello,

I think you might want to change the prerequisite version of numpy (currently 1.4.1 on readthedocs).
I was using 1.7.1, and my numpy.linalg.det had an issue with line 472 of montepython/likelihoods/euclid_lensing/__init__.py

det_theory[:] = np.linalg.det(Cov_theory[:, :, :])

Indeed, my 1.7.1 version of det wanted a rank-2 matrix and could not handle this rank-3 array.
The issue was solved by upgrading to numpy 1.11.1.
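
A version-agnostic workaround is to take the determinant matrix by matrix (a sketch; the (n, nbin, nbin) layout of Cov_theory is assumed from the line above, and the array here is a dummy stand-in):

import numpy as np

Cov_theory = np.stack([np.eye(3) * (i + 1.0) for i in range(5)])   # dummy stand-in, shape (5, 3, 3)
# explicit loop over the leading axis works on both old and new numpy
det_theory = np.array([np.linalg.det(m) for m in Cov_theory])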

I hope this helps!
Aurélien

How to specify priors?

Hi,

I'd like to add a Gaussian prior, let's say:
data.parameters['H0'] = [72., None, None, 1.0, 1, 'cosmo', 'gaussian', 73., 2.00]

Is that the right way of passing the input? In that case, there seems to be a bug in data.py, where the entry self['role'] = array[-1] should be replaced by self['role'] = array[5]. Do you agree?

Indeed, in prior.py it is defined:

self.prior_type = array[6].lower()
self.mu = array[7]
self.sigma = array[8]
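
To illustrate the indexing in question, a hypothetical sketch (here 'array' stands for the bracketed entries of the .param line above once the Gaussian prior fields are appended):

array = [72., None, None, 1.0, 1, 'cosmo', 'gaussian', 73., 2.00]
role = array[5]         # 'cosmo': an explicit index still points at the role
role_last = array[-1]   # 2.00: with prior entries appended, array[-1] now points at sigma
prior_type, mu, sigma = array[6].lower(), array[7], array[8]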

Cheers,
Francesco

montepython with different versions of CLASS

I encountered a problem trying to run Montepython with a modified version of CLASS. I have several folders with different versions of CLASS, all contained in my code/ directory. Even though I edit default.conf to include the proper path to the version I want for the run, Montepython picks up the one which comes first alphabetically ('class_bigravity' in my case). I made sure that the version I wanted contained the word 'class' in the directory name.

This can easily be worked around by adding a symbolic link ('ln -s my-class-version class') pointing to the version of interest, but it would be good to correct the issue to avoid potential confusion.

CAMB instead of CLASS?

I'd like to try to use MontePython with a highly modified CAMB to test it out before implementing all the changes in CLASS. Is there any capability for using CAMB as the Boltzmann code instead of CLASS, or am I better off just starting to work on modifying CLASS with the same physics as I've already implemented in CAMB?

Montepython getting stopped

Hi,

I am trying to run the latest version of montepython, but it stops with the following error.

akhilesh@cosmos:~/cmbsofts/july_14/montepython$ python montepython/MontePython.py -o planck/ -p base.param --conf default.conf -c covmat/base.covmat
Running Monte Python v2.0.4

with CLASS v2.3.2
/!\ Running CLASS from a non version-controlled repository
/!\ Detecting empty folder, logging the parameter file

Testing likelihoods for:
-> Planck_highl, Planck_lowl, lowlike


clik version 5887
CAMspec e61cec87-3a37-43ca-8ed1-edcfcaf5c00a

Checking likelihood '/home/akhilesh/cmbsofts/planck_likelihood/plc-1.0/../CAMspec_v6.2TN_2013_02_26_dist.clik' on test data. got -3908.71 expected -3908.71 (diff -3.67322e-08)


clik version 5887
gibbs d462e865-e178-449a-ac29-5c16ab9b38f5

Checking likelihood '/home/akhilesh/cmbsofts/planck_likelihood/plc-1.0/../commander_v4.1_lm49.clik' on test data. got 3.2784 expected 3.2784 (diff -2.55576e-10)

Initializing Planck low-likelihood, version v2.1

clik version 5887
lowlike "lowlike v222"

Checking likelihood '/home/akhilesh/cmbsofts/planck_likelihood/plc-1.0/../lowlike_v222.clik' on test data. got -1007.04 expected -1007.04 (diff -1.87824e-07)

Creating planck/2014-08-01_10__1.txt

Input covariance matrix:
['omega_b', 'omega_cdm', 'H0', 'A_s', 'n_s', 'tau_reio', 'A_ps_100', 'A_ps_143', 'A_ps_217', 'A_cib_143', 'A_cib_217', 'A_sz', 'r_ps', 'r_cib', 'n_Dl_cib', 'cal_100', 'cal_217', 'xi_sz_cib', 'A_ksz', 'Bm_1_1', 'z_reio', 'Omega_Lambda', 'YHe', 'ln10^{10}A_s\r']
[[ 7.76e-08 -4.15e-07 2.36e-04 2.77e-15 1.14e-06 1.04e-06 -1.30e-03 -6.27e-04 -2.21e-04 -7.73e-05 -1.86e-04 3.02e-05 6.99e-07
-3.16e-06 -7.67e-07 7.57e-10 -2.04e-08 3.83e-06 -4.27e-05 -1.20e-05 5.65e-05 2.95e-06 3.34e-08 1.26e-06]
[ -4.15e-07 7.06e-06 -3.06e-03 -1.31e-14 -1.55e-05 -1.15e-05 6.88e-03 3.46e-03 2.89e-03 -2.08e-04 6.14e-05 -1.30e-04 -1.26e-06
-1.29e-05 6.42e-06 -2.32e-08 1.24e-07 -1.69e-05 -4.92e-05 1.63e-04 -7.05e-04 -4.33e-05 -1.79e-07 -5.92e-06]
[ 2.36e-04 -3.06e-03 1.42e+00 8.28e-12 7.08e-03 5.45e-03 -3.72e+00 -1.83e+00 -1.33e+00 1.56e-02 -1.83e-01 6.79e-02 6.53e-04
2.59e-03 -2.83e-03 8.12e-06 -6.93e-05 1.03e-02 3.09e-03 -7.24e-02 3.29e-01 1.95e-02 1.01e-04 3.75e-03]
[ 2.77e-15 -1.31e-14 8.28e-12 2.89e-21 8.23e-14 6.65e-13 -1.25e-10 -8.99e-12 4.34e-11 -2.31e-11 -3.54e-11 -1.03e-12 1.53e-13
-9.57e-13 -2.98e-14 1.73e-16 -4.65e-16 8.94e-13 -1.05e-11 1.09e-12 5.70e-11 9.97e-14 1.19e-15 1.31e-12]
[ 1.14e-06 -1.55e-05 7.08e-03 8.23e-14 5.30e-05 3.80e-05 -4.96e-02 -8.78e-03 5.77e-03 -4.46e-03 -7.72e-03 -6.37e-05 3.73e-05
-1.94e-04 -1.36e-05 -9.81e-09 -3.62e-08 1.03e-04 -2.57e-03 -1.94e-04 2.58e-03 9.80e-05 4.91e-07 3.73e-05]
[ 1.04e-06 -1.15e-05 5.45e-03 6.65e-13 3.80e-05 1.66e-04 -3.45e-02 -8.20e-03 2.11e-03 -3.61e-03 -6.44e-03 2.72e-04 2.14e-05
-1.23e-04 -3.16e-05 4.40e-08 -5.85e-07 1.91e-04 -1.56e-03 -2.53e-04 1.39e-02 7.41e-05 4.49e-07 3.02e-04]
[ -1.30e-03 6.88e-03 -3.72e+00 -1.25e-10 -4.96e-02 -3.45e-02 3.56e+03 1.24e+02 -1.07e+02 5.77e+01 8.13e+01 -6.74e+01 -8.13e-01
-7.10e-02 6.61e-01 5.07e-04 -3.48e-04 -2.75e-01 -1.50e+01 1.73e+01 -2.45e+00 -4.76e-02 -5.62e-04 -5.68e-02]
[ -6.27e-04 3.46e-03 -1.83e+00 -8.99e-12 -8.78e-03 -8.20e-03 1.24e+02 1.78e+02 2.19e+01 -3.62e+01 5.79e+00 -6.89e+00 -1.64e-01
-1.09e-01 1.03e-01 1.60e-04 9.68e-04 1.01e+00 -1.31e+01 5.15e-01 -4.22e-01 -2.33e-02 -2.70e-04 -3.94e-03]
[ -2.21e-04 2.89e-03 -1.33e+00 4.34e-11 5.77e-03 2.11e-03 -1.07e+02 2.19e+01 2.60e+02 -1.70e+01 -1.01e+02 3.67e+00 4.33e-02
-1.45e+00 -8.57e-01 -1.86e-04 -3.39e-03 5.22e-01 -9.95e+00 1.04e+00 3.28e-01 -1.82e-02 -9.49e-05 1.99e-02]
[ -7.73e-05 -2.08e-04 1.56e-02 -2.31e-11 -4.46e-03 -3.61e-03 5.77e+01 -3.62e+01 -1.70e+01 2.70e+01 1.07e+01 -1.96e+00 1.90e-02
1.88e-01 8.55e-02 -2.31e-05 -4.75e-04 9.08e-02 -5.60e-01 -4.08e-01 -2.99e-01 6.72e-04 -3.32e-05 -1.05e-02]
[ -1.86e-04 6.14e-05 -1.83e-01 -3.54e-11 -7.72e-03 -6.44e-03 8.13e+01 5.79e+00 -1.01e+02 1.07e+01 5.09e+01 -1.67e+00 2.03e-02
6.64e-01 5.19e-01 1.00e-04 1.70e-03 -2.75e-01 -1.78e+00 -3.64e-01 -5.03e-01 -1.55e-03 -8.00e-05 -1.61e-02]
[ 3.02e-05 -1.30e-04 6.79e-02 -1.03e-12 -6.37e-05 2.72e-04 -6.74e+01 -6.89e+00 3.67e+00 -1.96e+00 -1.67e+00 7.50e+00 4.16e-02
2.42e-01 -4.08e-02 3.32e-05 -2.04e-04 5.85e-03 -1.94e-01 1.74e-01 1.19e-02 8.97e-04 1.30e-05 -4.55e-04]
[ 6.99e-07 -1.26e-06 6.53e-04 1.53e-13 3.73e-05 2.14e-05 -8.13e-01 -1.64e-01 4.33e-02 1.90e-02 2.03e-02 4.16e-02 6.18e-03
-5.10e-03 4.18e-03 4.96e-07 -5.04e-06 3.59e-03 -1.15e-02 3.44e-03 1.64e-03 7.59e-06 3.01e-07 6.98e-05]
[ -3.16e-06 -1.29e-05 2.59e-03 -9.57e-13 -1.94e-04 -1.23e-04 -7.10e-02 -1.09e-01 -1.45e+00 1.88e-01 6.64e-01 2.42e-01 -5.10e-03
4.23e-02 -4.73e-03 3.35e-06 2.56e-05 2.14e-03 -4.25e-02 -9.30e-03 -1.01e-02 5.90e-05 -1.36e-06 -4.36e-04]
[ -7.67e-07 6.42e-06 -2.83e-03 -2.98e-14 -1.36e-05 -3.16e-05 6.61e-01 1.03e-01 -8.57e-01 8.55e-02 5.19e-01 -4.08e-02 4.18e-03
-4.73e-03 1.51e-02 1.10e-06 9.94e-06 -7.95e-04 -8.08e-03 6.86e-03 -2.27e-03 -3.99e-05 -3.28e-07 -1.30e-05]
[ 7.57e-10 -2.32e-08 8.12e-06 1.73e-16 -9.81e-09 4.40e-08 5.07e-04 1.60e-04 -1.86e-04 -2.31e-05 1.00e-04 3.32e-05 4.96e-07
3.35e-06 1.10e-06 1.65e-07 2.18e-08 5.66e-07 -4.82e-06 -1.72e-05 2.52e-06 1.15e-07 3.20e-10 7.50e-08]
[ -2.04e-08 1.24e-07 -6.93e-05 -4.65e-16 -3.62e-08 -5.85e-07 -3.48e-04 9.68e-04 -3.39e-03 -4.75e-04 1.70e-03 -2.04e-04 -5.04e-06
2.56e-05 9.94e-06 2.18e-08 1.90e-06 -1.20e-05 6.85e-06 1.19e-04 -4.19e-05 -8.82e-07 -8.78e-09 -2.13e-07]
[ 3.83e-06 -1.69e-05 1.03e-02 8.94e-13 1.03e-04 1.91e-04 -2.75e-01 1.01e+00 5.22e-01 9.08e-02 -2.75e-01 5.85e-03 3.59e-03
2.14e-03 -7.95e-04 5.66e-07 -1.20e-05 8.13e-02 2.43e-02 8.05e-03 1.52e-02 1.27e-04 1.65e-06 4.06e-04]
[ -4.27e-05 -4.92e-05 3.09e-03 -1.05e-11 -2.57e-03 -1.56e-03 -1.50e+01 -1.31e+01 -9.95e+00 -5.60e-01 -1.78e+00 -1.94e-01 -1.15e-02
-4.25e-02 -8.08e-03 -4.82e-06 6.85e-06 2.43e-02 7.97e+00 -2.63e-03 -1.23e-01 2.46e-04 -1.84e-05 -4.76e-03]
[ -1.20e-05 1.63e-04 -7.24e-02 1.09e-12 -1.94e-04 -2.53e-04 1.73e+01 5.15e-01 1.04e+00 -4.08e-01 -3.64e-01 1.74e-01 3.44e-03
-9.30e-03 6.86e-03 -1.72e-05 1.19e-04 8.05e-03 -2.63e-03 3.23e-01 -1.41e-02 -1.00e-03 -5.17e-06 5.00e-04]
[ 5.65e-05 -7.05e-04 3.29e-01 5.70e-11 2.58e-03 1.39e-02 -2.45e+00 -4.22e-01 3.28e-01 -2.99e-01 -5.03e-01 1.19e-02 1.64e-03
-1.01e-02 -2.27e-03 2.52e-06 -4.19e-05 1.52e-02 -1.23e-01 -1.41e-02 1.18e+00 4.51e-03 2.43e-05 2.59e-02]
[ 2.95e-06 -4.33e-05 1.95e-02 9.97e-14 9.80e-05 7.41e-05 -4.76e-02 -2.33e-02 -1.82e-02 6.72e-04 -1.55e-03 8.97e-04 7.59e-06
5.90e-05 -3.99e-05 1.15e-07 -8.82e-07 1.27e-04 2.46e-04 -1.00e-03 4.51e-03 2.72e-04 1.27e-06 4.52e-05]
[ 3.34e-08 -1.79e-07 1.01e-04 1.19e-15 4.91e-07 4.49e-07 -5.62e-04 -2.70e-04 -9.49e-05 -3.32e-05 -8.00e-05 1.30e-05 3.01e-07
-1.36e-06 -3.28e-07 3.20e-10 -8.78e-09 1.65e-06 -1.84e-05 -5.17e-06 2.43e-05 1.27e-06 1.44e-08 5.40e-07]
[ 1.26e-06 -5.92e-06 3.75e-03 1.31e-12 3.73e-05 3.02e-04 -5.68e-02 -3.94e-03 1.99e-02 -1.05e-02 -1.61e-02 -4.55e-04 6.98e-05
-4.36e-04 -1.30e-05 7.50e-08 -2.13e-07 4.06e-04 -4.76e-03 5.00e-04 2.59e-02 4.52e-05 5.40e-07 5.96e-04]]

First treatment (scaling)
['omega_b', 'omega_cdm', 'H0', 'A_s', 'n_s', 'tau_reio', 'A_ps_100', 'A_ps_143', 'A_ps_217', 'A_cib_143', 'A_cib_217', 'A_sz', 'r_ps', 'r_cib', 'n_Dl_cib', 'cal_100', 'cal_217', 'xi_sz_cib', 'A_ksz', 'Bm_1_1', 'z_reio', 'Omega_Lambda', 'YHe', 'ln10^{10}A_s\r']
[[ 7.76e-04 -4.15e-05 2.36e-02 2.77e-04 1.14e-04 1.04e-04 -1.30e-01 -6.27e-02 -2.21e-02 -7.73e-03 -1.86e-02 3.02e-03 6.99e-05
-3.16e-04 -7.67e-05 7.57e-08 -2.04e-06 3.83e-04 -4.27e-03 -1.20e-03 5.65e-03 2.95e-04 3.34e-06 1.26e-04]
[ -4.15e-05 7.06e-06 -3.06e-03 -1.31e-05 -1.55e-05 -1.15e-05 6.88e-03 3.46e-03 2.89e-03 -2.08e-04 6.14e-05 -1.30e-04 -1.26e-06
-1.29e-05 6.42e-06 -2.32e-08 1.24e-07 -1.69e-05 -4.92e-05 1.63e-04 -7.05e-04 -4.33e-05 -1.79e-07 -5.92e-06]
[ 2.36e-02 -3.06e-03 1.42e+00 8.28e-03 7.08e-03 5.45e-03 -3.72e+00 -1.83e+00 -1.33e+00 1.56e-02 -1.83e-01 6.79e-02 6.53e-04
2.59e-03 -2.83e-03 8.12e-06 -6.93e-05 1.03e-02 3.09e-03 -7.24e-02 3.29e-01 1.95e-02 1.01e-04 3.75e-03]
[ 2.77e-04 -1.31e-05 8.28e-03 2.89e-03 8.23e-05 6.65e-04 -1.25e-01 -8.99e-03 4.34e-02 -2.31e-02 -3.54e-02 -1.03e-03 1.53e-04
-9.57e-04 -2.98e-05 1.73e-07 -4.65e-07 8.94e-04 -1.05e-02 1.09e-03 5.70e-02 9.97e-05 1.19e-06 1.31e-03]
[ 1.14e-04 -1.55e-05 7.08e-03 8.23e-05 5.30e-05 3.80e-05 -4.96e-02 -8.78e-03 5.77e-03 -4.46e-03 -7.72e-03 -6.37e-05 3.73e-05
-1.94e-04 -1.36e-05 -9.81e-09 -3.62e-08 1.03e-04 -2.57e-03 -1.94e-04 2.58e-03 9.80e-05 4.91e-07 3.73e-05]
[ 1.04e-04 -1.15e-05 5.45e-03 6.65e-04 3.80e-05 1.66e-04 -3.45e-02 -8.20e-03 2.11e-03 -3.61e-03 -6.44e-03 2.72e-04 2.14e-05
-1.23e-04 -3.16e-05 4.40e-08 -5.85e-07 1.91e-04 -1.56e-03 -2.53e-04 1.39e-02 7.41e-05 4.49e-07 3.02e-04]
[ -1.30e-01 6.88e-03 -3.72e+00 -1.25e-01 -4.96e-02 -3.45e-02 3.56e+03 1.24e+02 -1.07e+02 5.77e+01 8.13e+01 -6.74e+01 -8.13e-01
-7.10e-02 6.61e-01 5.07e-04 -3.48e-04 -2.75e-01 -1.50e+01 1.73e+01 -2.45e+00 -4.76e-02 -5.62e-04 -5.68e-02]
[ -6.27e-02 3.46e-03 -1.83e+00 -8.99e-03 -8.78e-03 -8.20e-03 1.24e+02 1.78e+02 2.19e+01 -3.62e+01 5.79e+00 -6.89e+00 -1.64e-01
-1.09e-01 1.03e-01 1.60e-04 9.68e-04 1.01e+00 -1.31e+01 5.15e-01 -4.22e-01 -2.33e-02 -2.70e-04 -3.94e-03]
[ -2.21e-02 2.89e-03 -1.33e+00 4.34e-02 5.77e-03 2.11e-03 -1.07e+02 2.19e+01 2.60e+02 -1.70e+01 -1.01e+02 3.67e+00 4.33e-02
-1.45e+00 -8.57e-01 -1.86e-04 -3.39e-03 5.22e-01 -9.95e+00 1.04e+00 3.28e-01 -1.82e-02 -9.49e-05 1.99e-02]
[ -7.73e-03 -2.08e-04 1.56e-02 -2.31e-02 -4.46e-03 -3.61e-03 5.77e+01 -3.62e+01 -1.70e+01 2.70e+01 1.07e+01 -1.96e+00 1.90e-02
1.88e-01 8.55e-02 -2.31e-05 -4.75e-04 9.08e-02 -5.60e-01 -4.08e-01 -2.99e-01 6.72e-04 -3.32e-05 -1.05e-02]
[ -1.86e-02 6.14e-05 -1.83e-01 -3.54e-02 -7.72e-03 -6.44e-03 8.13e+01 5.79e+00 -1.01e+02 1.07e+01 5.09e+01 -1.67e+00 2.03e-02
6.64e-01 5.19e-01 1.00e-04 1.70e-03 -2.75e-01 -1.78e+00 -3.64e-01 -5.03e-01 -1.55e-03 -8.00e-05 -1.61e-02]
[ 3.02e-03 -1.30e-04 6.79e-02 -1.03e-03 -6.37e-05 2.72e-04 -6.74e+01 -6.89e+00 3.67e+00 -1.96e+00 -1.67e+00 7.50e+00 4.16e-02
2.42e-01 -4.08e-02 3.32e-05 -2.04e-04 5.85e-03 -1.94e-01 1.74e-01 1.19e-02 8.97e-04 1.30e-05 -4.55e-04]
[ 6.99e-05 -1.26e-06 6.53e-04 1.53e-04 3.73e-05 2.14e-05 -8.13e-01 -1.64e-01 4.33e-02 1.90e-02 2.03e-02 4.16e-02 6.18e-03
-5.10e-03 4.18e-03 4.96e-07 -5.04e-06 3.59e-03 -1.15e-02 3.44e-03 1.64e-03 7.59e-06 3.01e-07 6.98e-05]
[ -3.16e-04 -1.29e-05 2.59e-03 -9.57e-04 -1.94e-04 -1.23e-04 -7.10e-02 -1.09e-01 -1.45e+00 1.88e-01 6.64e-01 2.42e-01 -5.10e-03
4.23e-02 -4.73e-03 3.35e-06 2.56e-05 2.14e-03 -4.25e-02 -9.30e-03 -1.01e-02 5.90e-05 -1.36e-06 -4.36e-04]
[ -7.67e-05 6.42e-06 -2.83e-03 -2.98e-05 -1.36e-05 -3.16e-05 6.61e-01 1.03e-01 -8.57e-01 8.55e-02 5.19e-01 -4.08e-02 4.18e-03
-4.73e-03 1.51e-02 1.10e-06 9.94e-06 -7.95e-04 -8.08e-03 6.86e-03 -2.27e-03 -3.99e-05 -3.28e-07 -1.30e-05]
[ 7.57e-08 -2.32e-08 8.12e-06 1.73e-07 -9.81e-09 4.40e-08 5.07e-04 1.60e-04 -1.86e-04 -2.31e-05 1.00e-04 3.32e-05 4.96e-07
3.35e-06 1.10e-06 1.65e-07 2.18e-08 5.66e-07 -4.82e-06 -1.72e-05 2.52e-06 1.15e-07 3.20e-10 7.50e-08]
[ -2.04e-06 1.24e-07 -6.93e-05 -4.65e-07 -3.62e-08 -5.85e-07 -3.48e-04 9.68e-04 -3.39e-03 -4.75e-04 1.70e-03 -2.04e-04 -5.04e-06
2.56e-05 9.94e-06 2.18e-08 1.90e-06 -1.20e-05 6.85e-06 1.19e-04 -4.19e-05 -8.82e-07 -8.78e-09 -2.13e-07]
[ 3.83e-04 -1.69e-05 1.03e-02 8.94e-04 1.03e-04 1.91e-04 -2.75e-01 1.01e+00 5.22e-01 9.08e-02 -2.75e-01 5.85e-03 3.59e-03
2.14e-03 -7.95e-04 5.66e-07 -1.20e-05 8.13e-02 2.43e-02 8.05e-03 1.52e-02 1.27e-04 1.65e-06 4.06e-04]
[ -4.27e-03 -4.92e-05 3.09e-03 -1.05e-02 -2.57e-03 -1.56e-03 -1.50e+01 -1.31e+01 -9.95e+00 -5.60e-01 -1.78e+00 -1.94e-01 -1.15e-02
-4.25e-02 -8.08e-03 -4.82e-06 6.85e-06 2.43e-02 7.97e+00 -2.63e-03 -1.23e-01 2.46e-04 -1.84e-05 -4.76e-03]
[ -1.20e-03 1.63e-04 -7.24e-02 1.09e-03 -1.94e-04 -2.53e-04 1.73e+01 5.15e-01 1.04e+00 -4.08e-01 -3.64e-01 1.74e-01 3.44e-03
-9.30e-03 6.86e-03 -1.72e-05 1.19e-04 8.05e-03 -2.63e-03 3.23e-01 -1.41e-02 -1.00e-03 -5.17e-06 5.00e-04]
[ 5.65e-03 -7.05e-04 3.29e-01 5.70e-02 2.58e-03 1.39e-02 -2.45e+00 -4.22e-01 3.28e-01 -2.99e-01 -5.03e-01 1.19e-02 1.64e-03
-1.01e-02 -2.27e-03 2.52e-06 -4.19e-05 1.52e-02 -1.23e-01 -1.41e-02 1.18e+00 4.51e-03 2.43e-05 2.59e-02]
[ 2.95e-04 -4.33e-05 1.95e-02 9.97e-05 9.80e-05 7.41e-05 -4.76e-02 -2.33e-02 -1.82e-02 6.72e-04 -1.55e-03 8.97e-04 7.59e-06
5.90e-05 -3.99e-05 1.15e-07 -8.82e-07 1.27e-04 2.46e-04 -1.00e-03 4.51e-03 2.72e-04 1.27e-06 4.52e-05]
[ 3.34e-06 -1.79e-07 1.01e-04 1.19e-06 4.91e-07 4.49e-07 -5.62e-04 -2.70e-04 -9.49e-05 -3.32e-05 -8.00e-05 1.30e-05 3.01e-07
-1.36e-06 -3.28e-07 3.20e-10 -8.78e-09 1.65e-06 -1.84e-05 -5.17e-06 2.43e-05 1.27e-06 1.44e-08 5.40e-07]
[ 1.26e-04 -5.92e-06 3.75e-03 1.31e-03 3.73e-05 3.02e-04 -5.68e-02 -3.94e-03 1.99e-02 -1.05e-02 -1.61e-02 -4.55e-04 6.98e-05
-4.36e-04 -1.30e-05 7.50e-08 -2.13e-07 4.06e-04 -4.76e-03 5.00e-04 2.59e-02 4.52e-05 5.40e-07 5.96e-04]]

Second treatment (partial reordering and cleaning)
['omega_b', 'omega_cdm', 'H0', 'A_s', 'n_s', 'tau_reio', 'A_ps_100', 'A_ps_143', 'A_ps_217', 'A_cib_143', 'A_cib_217', 'A_sz', 'r_ps', 'r_cib', 'n_Dl_cib', 'cal_100', 'cal_217', 'xi_sz_cib', 'A_ksz', 'Bm_1_1', '', '', '', '']
[[ 7.76e-04 -4.15e-05 2.36e-02 2.77e-04 1.14e-04 1.04e-04 -1.30e-01 -6.27e-02 -2.21e-02 -7.73e-03 -1.86e-02 3.02e-03 6.99e-05
-3.16e-04 -7.67e-05 7.57e-08 -2.04e-06 3.83e-04 -4.27e-03 -1.20e-03 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ -4.15e-05 7.06e-06 -3.06e-03 -1.31e-05 -1.55e-05 -1.15e-05 6.88e-03 3.46e-03 2.89e-03 -2.08e-04 6.14e-05 -1.30e-04 -1.26e-06
-1.29e-05 6.42e-06 -2.32e-08 1.24e-07 -1.69e-05 -4.92e-05 1.63e-04 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ 2.36e-02 -3.06e-03 1.42e+00 8.28e-03 7.08e-03 5.45e-03 -3.72e+00 -1.83e+00 -1.33e+00 1.56e-02 -1.83e-01 6.79e-02 6.53e-04
2.59e-03 -2.83e-03 8.12e-06 -6.93e-05 1.03e-02 3.09e-03 -7.24e-02 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ 2.77e-04 -1.31e-05 8.28e-03 2.89e-03 8.23e-05 6.65e-04 -1.25e-01 -8.99e-03 4.34e-02 -2.31e-02 -3.54e-02 -1.03e-03 1.53e-04
-9.57e-04 -2.98e-05 1.73e-07 -4.65e-07 8.94e-04 -1.05e-02 1.09e-03 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ 1.14e-04 -1.55e-05 7.08e-03 8.23e-05 5.30e-05 3.80e-05 -4.96e-02 -8.78e-03 5.77e-03 -4.46e-03 -7.72e-03 -6.37e-05 3.73e-05
-1.94e-04 -1.36e-05 -9.81e-09 -3.62e-08 1.03e-04 -2.57e-03 -1.94e-04 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ 1.04e-04 -1.15e-05 5.45e-03 6.65e-04 3.80e-05 1.66e-04 -3.45e-02 -8.20e-03 2.11e-03 -3.61e-03 -6.44e-03 2.72e-04 2.14e-05
-1.23e-04 -3.16e-05 4.40e-08 -5.85e-07 1.91e-04 -1.56e-03 -2.53e-04 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ -1.30e-01 6.88e-03 -3.72e+00 -1.25e-01 -4.96e-02 -3.45e-02 3.56e+03 1.24e+02 -1.07e+02 5.77e+01 8.13e+01 -6.74e+01 -8.13e-01
-7.10e-02 6.61e-01 5.07e-04 -3.48e-04 -2.75e-01 -1.50e+01 1.73e+01 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ -6.27e-02 3.46e-03 -1.83e+00 -8.99e-03 -8.78e-03 -8.20e-03 1.24e+02 1.78e+02 2.19e+01 -3.62e+01 5.79e+00 -6.89e+00 -1.64e-01
-1.09e-01 1.03e-01 1.60e-04 9.68e-04 1.01e+00 -1.31e+01 5.15e-01 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ -2.21e-02 2.89e-03 -1.33e+00 4.34e-02 5.77e-03 2.11e-03 -1.07e+02 2.19e+01 2.60e+02 -1.70e+01 -1.01e+02 3.67e+00 4.33e-02
-1.45e+00 -8.57e-01 -1.86e-04 -3.39e-03 5.22e-01 -9.95e+00 1.04e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ -7.73e-03 -2.08e-04 1.56e-02 -2.31e-02 -4.46e-03 -3.61e-03 5.77e+01 -3.62e+01 -1.70e+01 2.70e+01 1.07e+01 -1.96e+00 1.90e-02
1.88e-01 8.55e-02 -2.31e-05 -4.75e-04 9.08e-02 -5.60e-01 -4.08e-01 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ -1.86e-02 6.14e-05 -1.83e-01 -3.54e-02 -7.72e-03 -6.44e-03 8.13e+01 5.79e+00 -1.01e+02 1.07e+01 5.09e+01 -1.67e+00 2.03e-02
6.64e-01 5.19e-01 1.00e-04 1.70e-03 -2.75e-01 -1.78e+00 -3.64e-01 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ 3.02e-03 -1.30e-04 6.79e-02 -1.03e-03 -6.37e-05 2.72e-04 -6.74e+01 -6.89e+00 3.67e+00 -1.96e+00 -1.67e+00 7.50e+00 4.16e-02
2.42e-01 -4.08e-02 3.32e-05 -2.04e-04 5.85e-03 -1.94e-01 1.74e-01 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ 6.99e-05 -1.26e-06 6.53e-04 1.53e-04 3.73e-05 2.14e-05 -8.13e-01 -1.64e-01 4.33e-02 1.90e-02 2.03e-02 4.16e-02 6.18e-03
-5.10e-03 4.18e-03 4.96e-07 -5.04e-06 3.59e-03 -1.15e-02 3.44e-03 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ -3.16e-04 -1.29e-05 2.59e-03 -9.57e-04 -1.94e-04 -1.23e-04 -7.10e-02 -1.09e-01 -1.45e+00 1.88e-01 6.64e-01 2.42e-01 -5.10e-03
4.23e-02 -4.73e-03 3.35e-06 2.56e-05 2.14e-03 -4.25e-02 -9.30e-03 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ -7.67e-05 6.42e-06 -2.83e-03 -2.98e-05 -1.36e-05 -3.16e-05 6.61e-01 1.03e-01 -8.57e-01 8.55e-02 5.19e-01 -4.08e-02 4.18e-03
-4.73e-03 1.51e-02 1.10e-06 9.94e-06 -7.95e-04 -8.08e-03 6.86e-03 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ 7.57e-08 -2.32e-08 8.12e-06 1.73e-07 -9.81e-09 4.40e-08 5.07e-04 1.60e-04 -1.86e-04 -2.31e-05 1.00e-04 3.32e-05 4.96e-07
3.35e-06 1.10e-06 1.65e-07 2.18e-08 5.66e-07 -4.82e-06 -1.72e-05 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ -2.04e-06 1.24e-07 -6.93e-05 -4.65e-07 -3.62e-08 -5.85e-07 -3.48e-04 9.68e-04 -3.39e-03 -4.75e-04 1.70e-03 -2.04e-04 -5.04e-06
2.56e-05 9.94e-06 2.18e-08 1.90e-06 -1.20e-05 6.85e-06 1.19e-04 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ 3.83e-04 -1.69e-05 1.03e-02 8.94e-04 1.03e-04 1.91e-04 -2.75e-01 1.01e+00 5.22e-01 9.08e-02 -2.75e-01 5.85e-03 3.59e-03
2.14e-03 -7.95e-04 5.66e-07 -1.20e-05 8.13e-02 2.43e-02 8.05e-03 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ -4.27e-03 -4.92e-05 3.09e-03 -1.05e-02 -2.57e-03 -1.56e-03 -1.50e+01 -1.31e+01 -9.95e+00 -5.60e-01 -1.78e+00 -1.94e-01 -1.15e-02
-4.25e-02 -8.08e-03 -4.82e-06 6.85e-06 2.43e-02 7.97e+00 -2.63e-03 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ -1.20e-03 1.63e-04 -7.24e-02 1.09e-03 -1.94e-04 -2.53e-04 1.73e+01 5.15e-01 1.04e+00 -4.08e-01 -3.64e-01 1.74e-01 3.44e-03
-9.30e-03 6.86e-03 -1.72e-05 1.19e-04 8.05e-03 -2.63e-03 3.23e-01 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00
0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00
0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00
0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00]
[ 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00
0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00 0.00e+00]]

Deduced starting covariance matrix:
['omega_b', 'omega_cdm', 'H0', 'A_s', 'n_s', 'tau_reio', 'A_ps_100', 'A_ps_143', 'A_ps_217', 'A_cib_143', 'A_cib_217', 'A_sz', 'r_ps', 'r_cib', 'n_Dl_cib', 'cal_100', 'cal_217', 'xi_sz_cib', 'A_ksz', 'Bm_1_1']
[[ 7.76e-04 -4.15e-05 2.36e-02 2.77e-04 1.14e-04 1.04e-04 -1.30e-01 -6.27e-02 -2.21e-02 -7.73e-03 -1.86e-02 3.02e-03 6.99e-05
-3.16e-04 -7.67e-05 7.57e-08 -2.04e-06 3.83e-04 -4.27e-03 -1.20e-03]
[ -4.15e-05 7.06e-06 -3.06e-03 -1.31e-05 -1.55e-05 -1.15e-05 6.88e-03 3.46e-03 2.89e-03 -2.08e-04 6.14e-05 -1.30e-04 -1.26e-06
-1.29e-05 6.42e-06 -2.32e-08 1.24e-07 -1.69e-05 -4.92e-05 1.63e-04]
[ 2.36e-02 -3.06e-03 1.42e+00 8.28e-03 7.08e-03 5.45e-03 -3.72e+00 -1.83e+00 -1.33e+00 1.56e-02 -1.83e-01 6.79e-02 6.53e-04
2.59e-03 -2.83e-03 8.12e-06 -6.93e-05 1.03e-02 3.09e-03 -7.24e-02]
[ 2.77e-04 -1.31e-05 8.28e-03 2.89e-03 8.23e-05 6.65e-04 -1.25e-01 -8.99e-03 4.34e-02 -2.31e-02 -3.54e-02 -1.03e-03 1.53e-04
-9.57e-04 -2.98e-05 1.73e-07 -4.65e-07 8.94e-04 -1.05e-02 1.09e-03]
[ 1.14e-04 -1.55e-05 7.08e-03 8.23e-05 5.30e-05 3.80e-05 -4.96e-02 -8.78e-03 5.77e-03 -4.46e-03 -7.72e-03 -6.37e-05 3.73e-05
-1.94e-04 -1.36e-05 -9.81e-09 -3.62e-08 1.03e-04 -2.57e-03 -1.94e-04]
[ 1.04e-04 -1.15e-05 5.45e-03 6.65e-04 3.80e-05 1.66e-04 -3.45e-02 -8.20e-03 2.11e-03 -3.61e-03 -6.44e-03 2.72e-04 2.14e-05
-1.23e-04 -3.16e-05 4.40e-08 -5.85e-07 1.91e-04 -1.56e-03 -2.53e-04]
[ -1.30e-01 6.88e-03 -3.72e+00 -1.25e-01 -4.96e-02 -3.45e-02 3.56e+03 1.24e+02 -1.07e+02 5.77e+01 8.13e+01 -6.74e+01 -8.13e-01
-7.10e-02 6.61e-01 5.07e-04 -3.48e-04 -2.75e-01 -1.50e+01 1.73e+01]
[ -6.27e-02 3.46e-03 -1.83e+00 -8.99e-03 -8.78e-03 -8.20e-03 1.24e+02 1.78e+02 2.19e+01 -3.62e+01 5.79e+00 -6.89e+00 -1.64e-01
-1.09e-01 1.03e-01 1.60e-04 9.68e-04 1.01e+00 -1.31e+01 5.15e-01]
[ -2.21e-02 2.89e-03 -1.33e+00 4.34e-02 5.77e-03 2.11e-03 -1.07e+02 2.19e+01 2.60e+02 -1.70e+01 -1.01e+02 3.67e+00 4.33e-02
-1.45e+00 -8.57e-01 -1.86e-04 -3.39e-03 5.22e-01 -9.95e+00 1.04e+00]
[ -7.73e-03 -2.08e-04 1.56e-02 -2.31e-02 -4.46e-03 -3.61e-03 5.77e+01 -3.62e+01 -1.70e+01 2.70e+01 1.07e+01 -1.96e+00 1.90e-02
1.88e-01 8.55e-02 -2.31e-05 -4.75e-04 9.08e-02 -5.60e-01 -4.08e-01]
[ -1.86e-02 6.14e-05 -1.83e-01 -3.54e-02 -7.72e-03 -6.44e-03 8.13e+01 5.79e+00 -1.01e+02 1.07e+01 5.09e+01 -1.67e+00 2.03e-02
6.64e-01 5.19e-01 1.00e-04 1.70e-03 -2.75e-01 -1.78e+00 -3.64e-01]
[ 3.02e-03 -1.30e-04 6.79e-02 -1.03e-03 -6.37e-05 2.72e-04 -6.74e+01 -6.89e+00 3.67e+00 -1.96e+00 -1.67e+00 7.50e+00 4.16e-02
2.42e-01 -4.08e-02 3.32e-05 -2.04e-04 5.85e-03 -1.94e-01 1.74e-01]
[ 6.99e-05 -1.26e-06 6.53e-04 1.53e-04 3.73e-05 2.14e-05 -8.13e-01 -1.64e-01 4.33e-02 1.90e-02 2.03e-02 4.16e-02 6.18e-03
-5.10e-03 4.18e-03 4.96e-07 -5.04e-06 3.59e-03 -1.15e-02 3.44e-03]
[ -3.16e-04 -1.29e-05 2.59e-03 -9.57e-04 -1.94e-04 -1.23e-04 -7.10e-02 -1.09e-01 -1.45e+00 1.88e-01 6.64e-01 2.42e-01 -5.10e-03
4.23e-02 -4.73e-03 3.35e-06 2.56e-05 2.14e-03 -4.25e-02 -9.30e-03]
[ -7.67e-05 6.42e-06 -2.83e-03 -2.98e-05 -1.36e-05 -3.16e-05 6.61e-01 1.03e-01 -8.57e-01 8.55e-02 5.19e-01 -4.08e-02 4.18e-03
-4.73e-03 1.51e-02 1.10e-06 9.94e-06 -7.95e-04 -8.08e-03 6.86e-03]
[ 7.57e-08 -2.32e-08 8.12e-06 1.73e-07 -9.81e-09 4.40e-08 5.07e-04 1.60e-04 -1.86e-04 -2.31e-05 1.00e-04 3.32e-05 4.96e-07
3.35e-06 1.10e-06 1.65e-07 2.18e-08 5.66e-07 -4.82e-06 -1.72e-05]
[ -2.04e-06 1.24e-07 -6.93e-05 -4.65e-07 -3.62e-08 -5.85e-07 -3.48e-04 9.68e-04 -3.39e-03 -4.75e-04 1.70e-03 -2.04e-04 -5.04e-06
2.56e-05 9.94e-06 2.18e-08 1.90e-06 -1.20e-05 6.85e-06 1.19e-04]
[ 3.83e-04 -1.69e-05 1.03e-02 8.94e-04 1.03e-04 1.91e-04 -2.75e-01 1.01e+00 5.22e-01 9.08e-02 -2.75e-01 5.85e-03 3.59e-03
2.14e-03 -7.95e-04 5.66e-07 -1.20e-05 8.13e-02 2.43e-02 8.05e-03]
[ -4.27e-03 -4.92e-05 3.09e-03 -1.05e-02 -2.57e-03 -1.56e-03 -1.50e+01 -1.31e+01 -9.95e+00 -5.60e-01 -1.78e+00 -1.94e-01 -1.15e-02
-4.25e-02 -8.08e-03 -4.82e-06 6.85e-06 2.43e-02 7.97e+00 -2.63e-03]
[ -1.20e-03 1.63e-04 -7.24e-02 1.09e-03 -1.94e-04 -2.53e-04 1.73e+01 5.15e-01 1.04e+00 -4.08e-01 -3.64e-01 1.74e-01 3.44e-03
-9.30e-03 6.86e-03 -1.72e-05 1.19e-04 8.05e-03 -2.63e-03 3.23e-01]]
Traceback (most recent call last):
File "montepython/MontePython.py", line 41, in
sys.exit(run())
File "/home/akhilesh/cmbsofts/july_14/montepython/montepython/run.py", line 43, in run
sampler.run(cosmo, data, command_line)
File "/home/akhilesh/cmbsofts/july_14/montepython/montepython/sampler.py", line 41, in run
mcmc.chain(cosmo, data, command_line)
File "/home/akhilesh/cmbsofts/july_14/montepython/montepython/mcmc.py", line 285, in chain
loglike = sampler.compute_lkl(cosmo, data)
File "/home/akhilesh/cmbsofts/july_14/montepython/montepython/sampler.py", line 430, in compute_lkl
cosmo.get_current_derived_parameters(data)
File "classy.pyx", line 768, in classy.Class.get_current_derived_parameters (classy.c:8320)
TypeError: Deprecated

I will be grateful if someone can kindly help me in this regard.
Akhilesh

Problem with Planck and CosmoHammer

On behalf of Sebastian Bocquet,

when running CosmoHammer with Planck, at the end of the burn-in phase, the following error is thrown.

 /!\ invalid value encountered in subtract
 /!\ invalid value encountered in greater
Traceback (most recent call last):
  File "montepython/MontePython.py", line 40, in <module>
    sys.exit(run())
  File "/home/moon/bocquet/software/montepython_2.1/montepython/run.py", line 44, in run
    sampler.run(cosmo, data, command_line)
  File "/home/moon/bocquet/software/montepython_2.1/montepython/sampler.py", line 48, in run
    hammer.run(cosmo, data, command_line)
  File "/home/moon/bocquet/software/montepython_2.1/montepython/cosmo_hammer.py", line 146, in
+run
    sampler_hammer.startSampling()
  File "build/bdist.linux-x86_64/egg/cosmoHammer/sampler/CosmoHammerSampler.py", line 117, in
+startSS
ampling
  File "build/bdist.linux-x86_64/egg/cosmoHammer/sampler/CosmoHammerSampler.py", line 160, in
+startSS
ampleBurnin
  File "build/bdist.linux-x86_64/egg/cosmoHammer/sampler/CosmoHammerSampler.py", line 188, in
+samplee
Burnin
  File "build/bdist.linux-x86_64/egg/cosmoHammer/util/SampleFileUtil.py", line 45, in persistB
+urninVV
alues
  File "/home/moon/bocquet/software/montepython_2.1/montepython/cosmo_hammer.py", line 199, in
+persii
stValues
    [[a for a in elem.itervalues()] for elem in data])
AttributeError: 'list' object has no attribute 'itervalues'

Problem generating .pdf plots from chains

I found that I was getting a series of cryptic error messages when trying to generate .pdf plots from precomputed mcmc chains, for example:

kpathsea: Running mktexmf phvr7t
! I can't find file `phvr7t'.

The solution was to update my texlive fonts:

port install texlive-fonts-recommended +doc +src.

Mismatch of parameter names when using nested sampling

I am trying nested sampling with some fixed cosmological parameters and some varying ones. For example,

data.cosmo_arguments['P_k_ini type'] = 'external_Pk'
data.cosmo_arguments['command'] = 'python $(pwd)/generate_Pk.py'

# params: 1: pivot. 2: As. 3: ns
data.parameters['custom1'] = [0.05, 0.05, 0.05, 0, 1, 'cosmo']
data.parameters['custom2'] = [2.16, 2.16, 2.16, 0, 1e-9, 'cosmo']
data.parameters['custom3'] = [0.961,  -1, -1, 0, 1,   'cosmo']

# params: 4: log(xc_kr). 5: xc_p
data.parameters['custom4'] = [-3.0, -4.0, -2.0, 0.1, 1, 'cosmo'] 
data.parameters['custom5'] = [0.0, -15.0, 15.0, 0.3, 1, 'cosmo'] 

Note that custom1, custom2 and custom3 are fixed, while custom4 and custom5 are varying. After running nested sampling and using MontePython.py --info to analyze the result, the code crashes with the error message

File "/home/wangyi/PublicCode/montepython/montepython/nested_sampling.py", line 358, in from_NS_output_to_chains
columns_reorder = [NS_param_names.index(param) for param in param_names]
ValueError: 'custom1' is not in list

In nested_sampling.py, the variable param_names is built by searching all lines of the log.param file that contain 'data.parameters[...]'. Note that both fixed and varying parameters are included. However, NS_param_names only includes the varying parameters, so one cannot create a one-to-one map between them. Currently I am using a workaround: delete the lines in log.param which contain fixed parameters; then the code works. Nevertheless, it would be nice if this issue could be fixed.
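
A possible way to make the reordering tolerant of fixed parameters (a hypothetical sketch, not the actual nested_sampling.py code):

param_names = ['custom1', 'custom2', 'custom3', 'custom4', 'custom5']   # all entries found in log.param
NS_param_names = ['custom4', 'custom5']                                  # only the varying ones
columns_reorder = [NS_param_names.index(p) for p in param_names
                   if p in NS_param_names]                               # -> [0, 1]; fixed parameters are skipped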

Import error of cosmohammer modules

Hello,

I am trying to use cosmohammer in montepython.
I have installed emcee (2.1.0) and cosmohammer (0.5.0) from pip, but montepython cannot import the modules.

I have tested the installation by typing the following command in python, and it gives no error.

>>> from cosmoHammer import LikelihoodComputationChain

Here is the error message from montepython.

$ python montepython/MontePython.py run -p input/lcdm.ini -o chains/test -m CH

run
Running Monte Python v2.1.4

with CLASS v2.4.3

Testing likelihoods for:
 -> fake_planck_bluebook

Traceback (most recent call last):
  File "montepython/MontePython.py", line 40, in <module>
    sys.exit(run())
  File "/home/osatokn/work/montepython/montepython/run.py", line 44, in run
    sampler.run(cosmo, data, command_line)
  File "/home/osatokn/work/montepython/montepython/sampler.py", line 49, in run
    import cosmo_hammer as hammer
  File "/home/osatokn/work/montepython/montepython/cosmo_hammer.py", line 25, in <module>
    from cosmoHammer.likelihood.chain.LikelihoodComputationChain import (
ImportError: No module named likelihood.chain.LikelihoodComputationChain
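
A possible workaround until this is patched (a sketch, assuming the class simply moved to the package top level between cosmoHammer releases, as the working interactive import above suggests):

try:
    # layout expected by montepython's cosmo_hammer.py (older cosmoHammer)
    from cosmoHammer.likelihood.chain.LikelihoodComputationChain import (
        LikelihoodComputationChain)
except ImportError:
    # newer cosmoHammer (e.g. 0.5.0) exposes the class at the top level
    from cosmoHammer import LikelihoodComputationChain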

Many thanks,
Ken

Is there a way of printing out the parameters Montepython uses?

Hi all,

I'm using Montepython to estimate the cosmological parameters through the density angular power spectrum (dCl). I've discovered that the angular power spectrum I get from Montepython is not exactly equal to the one I get directly from CLASS (I mean, fixing all the parameters to the same values). There is a 3% difference.

I'm afraid this difference could be due to Montepython using a set of parameters slightly different from the one I give to CLASS. My question is then: is there a way of printing out the values of the parameters Montepython actually uses?
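
One quick check (a sketch; it assumes data.cosmo_arguments holds the dictionary of arguments that MontePython hands to CLASS, as in the parameter files shown elsewhere on this page, and that you add the print temporarily inside your likelihood or in sampler.py):

# print what is actually sent to the Boltzmann code at the current point
for key, value in sorted(data.cosmo_arguments.items()):
    print(key, '=', value)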

Many thanks,
Adriana

Planck + JLA

Hi

I'm trying to run the MontePython code with the following data experiments:

data.experiments=['Planck_highl_TTTEEE','Planck_lowl','Planck_lensing','JLA','bao','WiggleZ_bao','Hz']

I'm putting the JLA nuisance parameters before the Planck ones, but I'm getting an error message about a non-successful initialisation.

What is the best over-sampling configuration for these experiments?
data.over_sampling=[1, ???]

Thanks a lot

incompatibility with python 2.6

When running from python2.6, executing python montepython/MontePython.py --help, the following error message appears:
(thanks Motonari Tonegawa for finding it)

Traceback (most recent call last):
  File "montepython/MontePython.py", line 13, in <module>
    from run import run
  File "/home/tone/montepython/montepython/run.py", line 7, in <module>
    from initialise import initialise
  File "/home/tone/montepython/montepython/initialise.py", line 8, in <module>
    import parser_mp   # parsing the input command line
  File "/home/tone/montepython/montepython/parser_mp.py", line 156
    helpdict = {k: v for k, v in zip(keys, descriptions)}
                       ^
SyntaxError: invalid syntax
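
For reference, a Python 2.6-compatible equivalent of that line (a sketch; dict comprehensions only appeared in Python 2.7, and the lists below are stand-ins for the real ones in parser_mp.py):

keys = ['run', 'info']                              # stand-in
descriptions = ['launch chains', 'analyse chains']  # stand-in
helpdict = dict(zip(keys, descriptions))            # works on Python 2.6 and later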

Importance sampling

Dear Benjamin,

We are trying to use importance sampling by using the following command line

python montepython/MontePython.py --conf class.conf -p input.param -o chains/output -m IS --IS-starting-folder chains/input

where chains/input contains previous chains and input.param contains additional experiments on top of those in the log file found in chains/input. Is this correct, or are we doing something wrong?

We are running MontePython 2.0.5

Thanks in advance,

Is it compulsory to install Multinest? Montepython not working

Hello,
I just installed montepython via
git clone https://github.com/baudren/montepython_public
cd montepython_public
python setup.py install --user

then I do "python montepython/MontePython.py run --help" and I obtain this

ERROR: Could not load MultiNest library "libmultinest.so"
ERROR: You have to build it first,
ERROR: and point the LD_LIBRARY_PATH environment variable to it!
ERROR: manual: http://johannesbuchner.github.com/PyMultiNest/install.html

ERROR: Could not load MultiNest library: libmultinest.so
ERROR: You have to build MultiNest,
ERROR: and point the LD_LIBRARY_PATH environment variable to it!
ERROR: manual: http://johannesbuchner.github.com/PyMultiNest/install.html

problem: libmultinest.so: cannot open shared object file: No such file or directory

Am I doing something wrong?
Thank you

WiggleZ BAO data post-reconstruction

I would like to see the post-reconstruction WiggleZ BAO measurements included in Montepython. Using the reconstruction technique, they reduce the error bars of the original measurements, and hence they are more powerful in constraining cosmological parameters.
They can be found in this reference: http://arxiv.org/abs/1401.0358.

Importance Sampling with New Likelihood

Dear All,

I would like to use the importance sampling (IS) method with a personal experiment. Everything works fine for the Metropolis-Hastings sampler; however, when I use IS I get the following error message:

 -> reading  COM_CosmoParams_fullGrid_R2.00/base_w/plikHM_TT_lowTEB__4.txt

Exception in thread Thread-3:

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/threading.py", line 810, in __bootstrap_inner
    self.run()
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/threading.py", line 763, in run
    self.__target(*self.__args, **self.__kwargs)
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/multiprocessing/pool.py", line 380, in _handle_results
    task = get()
TypeError: ('__init__() takes exactly 2 arguments (1 given)', <class 'io_mp.CosmologicalModuleError'>, ())

If anyone is familiar with this issue, I would be glad to hear what could be going wrong.
Thank you very much,

Boris

Name 'data' is not defined

I am trying to run Monte Python for the first time (am using the Planck 2015 data release), but running the command
python montepython/MontePython.py -o test/ -p base.param

throws the following error:

Traceback (most recent call last):
File "montepython/MontePython.py", line 40, in
sys.exit(run())
File "/home/koldrakan/Cosmological_Codes/montepython_public-2.2.2/montepython/run.py", line 31, in run
custom_command)
File "/home/koldrakan/Cosmological_Codes/montepython_public-2.2.2/montepython/run.py", line 188, in safe_initialisation
cosmo, data, command_line, success = initialise(custom_command)
File "/home/koldrakan/Cosmological_Codes/montepython_public-2.2.2/montepython/initialise.py", line 33, in initialise
path = recover_local_path(command_line)
File "/home/koldrakan/Cosmological_Codes/montepython_public-2.2.2/montepython/initialise.py", line 124, in recover_local_path
exec(line)
File "", line 1, in
NameError: name 'data' is not defined

I have made sure all the likelihoods are properly linked with clik, and am using the predefined parameter file. What could the problem be?

Acceptance rate comment

Hi,
Would it be possible to indicate some kind of "quality" of a given acceptance rate after running an MCMC chain? E.g., if I get an acceptance rate of 1, just a brief comment that this indicates I should probably re-run my chain with a different step size. I understand the scale is subject to interpretation, but something with broad guidelines, especially for numbers that everyone agrees are bad news.
Thanks.

montepython info fails with chains obtained using --update option

Hi,

I was trying out the new --update option in v2.2.0 (which I'm very excited about!) and I'm running some Planck 2015 chains as a test.
However, when running montepython info on the output chains I got the following error:

ValueError: invalid literal for float(): 0.49]

I looked it up in the chain file, and I found that the line causing the error was this comment in the middle of the chain file:

# After 121 accepted steps: update proposal with R-1 = [ 0.92  0.5   0.3   0.44  2.21  0.77  0.38  0.53  0.71  0.13  0.76  0.83  0.56  0.34  0.32  0.48  0.32  0.23  0.24  0.43  0.11  0.16  0.13  1.64  0.77  
0.39  0.13  0.31  0.2   0.58  0.18  0.31  1.41  0.8   0.78  0.78  0.31  0.43  0.49]

Not sure if the expected behavior is to include such comments in the chain files, but the problem here is that the commented line itself is broken into two lines. Therefore montepython info did not find any problem with the first line (which starts with a '#' sign), but it did find a problem when reading the second line (which starts with "0.39" and ends with "0.49]").

Anyway, I guess the fix for this is simple, but I wanted to flag it before more people encounter the same error.
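
Until it is patched, a tolerant chain reader can simply skip any line that does not parse as numbers (a hypothetical sketch, not the montepython info code; 'chain.txt' is a placeholder):

import numpy as np

rows = []
with open('chain.txt') as f:                # placeholder chain file
    for line in f:
        stripped = line.strip()
        if not stripped or stripped.startswith('#'):
            continue                        # ordinary comment line
        try:
            rows.append([float(x) for x in stripped.split()])
        except ValueError:
            continue                        # e.g. the wrapped tail of an update comment
chain = np.array(rows)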

Thanks, and congrats on the new version!!
Antonio J. Cuesta

Running Monte Python with MPI

Hello!

First, merci beaucoup for all the useful information provided in the website.

I am trying to run MP with mpirun following the indications given in "Example of a complete work session".

I am working with the cluster CC-in2p3.

Here is the command:

$mpirun -np 4 python montepython/MontePython.py run -o chains --conf default.conf -p  base2015.param -c covmat/base.covmat -N 5

Here is the output in the terminal:

run
Running Monte Python v2.1.4

with CLASS v2.4.2

Testing likelihoods for:
 -> Planck_highl

run
 /!\ Appending to an existing folder: using the log.param instead of
     base2015.param
Running Monte Python v2.1.4

Traceback (most recent call last):
  File "montepython/MontePython.py", line 40, in <module>
    sys.exit(run())
  File "montepython/run.py", line 31, in run
    custom_command)
  File "montepython/run.py", line 208, in safe_initialisation
    " Alternatively, there could be a problem with "+e.message)
io_mp.ConfigurationError: 

Configuration Error:
 /|\   You are running in a folder that was created following a non-successful
/_o_\  initialisation (wrong parameter name, wrong likelihood, etc...). If you
       have solved the issue, you should remove completely the output folder,
       and try again. Alternatively, there could be a problem with cosmo

.
[the same appears 4 times, since I used -np 4]
.

Then the code runs, but only creates one chain:

clik version 6dc2a8cf3965
  smica
Checking likelihood '/sps/lsst/data/bbolliet/PlanckMCMC/plc-2.0/../plc_2.0/hi_l/plik/plik_dx11dr2_HM_v18_TT.clik' on test data. got -380.979 expected -380.979 (diff -8.68545e-09)


Creating chains/2015-08-25_5__1.txt
.
[blabbla]
.

Deduced starting covariance matrix:
['n_s', 'A_planck']
[[  5.30e-05   0.00e+00]
 [  0.00e+00   6.25e-02]]

  -LogLkl   n_s             1e+02A_planck   
5  2037.4   9.508254e-01    1.003987e+02    

5 steps done, acceptance rate: 0.2
/sps/lsst/data/bbolliet/PlanckMCMC(1)>ls chains/
2015-08-25_5__1.txt  log.param

The doc about mpi_run in montepython says:

def mpi_run(custom_command=""):
    """
    Launch a simple MPI run, with no communication of covariance matrix

    Each process will make sure to initialise the folder if needed. Then and
    only then, it will send the signal to its next in line to proceed. This
    allows for initialisation over an arbitrary cluster geometry (you can have
    a single node with many cores, and all the chains living there, or many
    nodes with few cores). The speed loss due to the time spend checking if the
    folder is created should be negligible when running decently sized chains.

    Each process will send the number that it found to be the first available
    to its friends, so that the gathering of information post-run is made
    easier. If a chain number is specified, this will be used as the first
    number, and then incremented afterwards with the rank of the process.
    """

Either I am misusing the module, or it seems it is not doing the job...

Is there a way to fix this problem simply?

Many thanks
Boris

Document the implementation of PyMultiNest

Especially:

  • credits to Johannes Buchner for having done all the heavy work
  • the new flags for the command_line arguments
  • Remarks on installing MultiNest on mac (problem: dlopen(libmultinest.so, 6): image not found; solution: simply create a symbolic link between .dylib and .so)

No diagonal posteriors on triangle plots and custom legend

Dear All,

I am trying to customize the contour plots produced by Montepython.

  1. Is it possible to disable the display of the 1-D likelihoods (the diagonal of the triangle plot)?
  2. Can we get the output as a python file, in order to modify individual plots afterwards?

Thank you very much,
Boris

Adding derived parameters to a run without running the full simulation

One should be able to add nuisance parameters to an existing run, rewriting the chains with these added parameters. It would be pointless to do another Markov Chain process, as the analysis will not change.

It should then be possible to rewrite the chains automatically with the added derived parameters.

Generic way to handle parameters like non-degenerate neutrino masses

From Signe Riemer:

Currently, the way to have the standard neutrino from LCDM is to have in the parameter file:

data.cosmo_arguments['N_eff'] = 2.03351
data.cosmo_arguments['N_ncdm'] = 1
data.cosmo_arguments['m_ncdm'] = 0.06
data.cosmo_arguments['T_ncdm'] = 0.715985

But as soon as one wants to have non-degenerate masses, the procedure is unclear. Currently, one needs to modify code/data.py with an additional test case. It would be better to have a generic way to handle these situations.
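
For fixed (non-varying) masses, a hedged param-file sketch along these lines may already work, assuming CLASS accepts comma-separated lists for the ncdm entries (the values below are purely illustrative); varying each mass as an independent MCMC parameter is the part that still needs the generic mechanism discussed above:

data.cosmo_arguments['N_ncdm'] = 3
data.cosmo_arguments['m_ncdm'] = '0.02, 0.03, 0.04'
data.cosmo_arguments['T_ncdm'] = '0.715985, 0.715985, 0.715985'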

Warning while running Montepython in parallel

Hi
I tried to run montepython in parallel using mpi4py on my cluster. The program was running fine, but I got the following warning.

" An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process. Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption. The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

Local host: satpura (PID 37516)
MPI_COMM_WORLD rank: 0

If you are absolutely sure that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
"
Do I need to worry about this warning?

Thanks,
Akhilesh

Euclid fiducial likelihoods

  • The two Euclid fiducial likelihoods were not brought to the newest version of the classy wrapper.
  • the nuisance parameters are defined by default.
  • When creating the fiducial model, the execution does not stop after one step.

These three issues should be fixed in the next release.

Problem with scf_parameters

Hello,

I'm trying to use Monte Python with quintessence, but I'm having trouble passing the scalar field parameters. I'm trying something like this:

data.cosmo_arguments['scf_parameters'] = 10.0, 0.0, 0.0, 0.0, 100.0, 0.01

and it returns the error

Cosmological Module Error:
/|\ Something went wrong when calling CLASS
/o\ Error in Class: input_init(L:402) :error in
input_read_parameters(&(fzw.fc), ppr, pba, pth, ppt, ptr, ppm, psp, pnl,
ple, pop, errmsg);
=>input_read_parameters(L:1004) :error in
parser_read_list_of_doubles(pfc, "scf_parameters",
&(pba->scf_parameters_size), &(pba->scf_parameters), &flag1, errmsg);
=>parser_read_list_of_doubles(L:432) :condition
(sscanf(string_with_one_value,"%lg",&(list[i-1])) != 1) is true; could
not read 1th value of list of parameters scf_parameters in file NOFILE

So, how should I do it?

Thanks!
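
For what it is worth, one workaround worth trying (a sketch, assuming CLASS's parser_read_list_of_doubles expects the list as a single comma-separated string rather than a Python tuple):

data.cosmo_arguments['scf_parameters'] = '10.0, 0.0, 0.0, 0.0, 100.0, 0.01'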

Parameter arrays as MCMC parameters

Hey

I just wondered what the best way is to handle parameters that are arrays in CLASS, when I want to use them as MCMC parameters?

For example, the array of m_ncdm masses. If I wanted two or three separate masses as independent MCMC parameters, how would I write this in the parameter file?

Thanks
Dan

Likelihoods can not be found.

Hi, I've been trying to install this on two different systems and am getting the same error with the likelihoods:

with CLASS v2.5.0
 /!\ Detecting empty folder, logging the parameter file

Testing likelihoods for:
 -> JLA

Configuration Error:
 /|\   Trying to import the JLA likelihood as asked in the parameter file, and
/_o_\  failed. Please make sure it is in the `montepython/likelihoods` folder,
       and is a proper python module. Check also that the name of the class
       defined in the __init__.py matches the name of the folder. In case this
       is not enough, here is the original message: No module named
       likelihoods.JLA

Traceback (most recent call last):
  File "montepython/MontePython.py", line 40, in <module>
    sys.exit(run())
  File "/home/ben.thorne/montepython_public/montepython/run.py", line 31, in run
    custom_command)
  File "/home/ben.thorne/montepython_public/montepython/run.py", line 195, in safe_initialisation
    "The initialisation was not successful, resulting in a "
io_mp.ConfigurationError: 

Configuration Error:
 /|\   The initialisation was not successful, resulting in a potentially half
/_o_\  created `log.param`. Please see the above error message. If you run the
       exact same command, it will not work. You should solve the problem, and
       try again.

The names of folders and classes certainly match. I have just followed the installation docs and am using python 2.7. The problem persists for any choice of the default likelihoods, including the downloaded plc.
Thanks!

adding a module for primordial spectrum

Hi,

I am new to CLASS and the C language. I have worked before in Fortran, and I need to input into CLASS, and later into montepython, a primordial spectrum different from a slow-roll power law, in order to study initial conditions different from those of chaotic inflation. Is this possible to do in CLASS? Also, could I somehow use the module I have already written in Fortran directly in CLASS, so that it inputs the power spectrum it calculates? I am aware this might not be possible. Another question: the system of differential equations I have for the perturbations is stiff, and I was wondering whether the ndf15 algorithm that you developed could be better than a Runge-Kutta method for these cases.

Sorry for asking so many questions at once.
Thank you in advance for any advice on this.

Regards,
Erandy Ramirez

Montepython stops after a few hundred steps & mpirun starts chains too fast

When I run montepython on the cluster (CC-IN2P3) with mpirun, the MCMC stops after a few hundred steps (without any error message). The only way (thanks to @dirian) to bypass the problem seems to be using the -r option, restarting the MCMC where it had stopped...

Did anyone stumble upon the same kind of issue?

Auxiliary question: is there a way to use an environment variable in order to identify each process spawned by mpirun? This is in order to use it as an argument to the "--chain-number" option.

For instance, with "mpirun -np 8 [...]" such a variable "$Process_ID" would be an integer from 1 to 8 (or generally a set of 8 different integers). Then, with the option "--chain-number $Process_ID", the 8 different chains would be named "blabla_$Process_ID.txt".

Thank you

Issue with n_dims when running MontePython with MultiNest

Hi,
When running MontePython with MultiNest, there seems to be a compiler-dependent issue with the handling of the n_dims variable when it is passed between pyMultiNest and MultiNest.
The value of n_dims is correctly initialised, but becomes random when it is passed later as an argument of the functions prior() and loglike().

As a temporary fix for this, one can hard-code the value of n_dims in the functions prior() and loglike() in the nested_sampling.py file, but this needs to be done for every run. Here is an example of our hard-coded fix assuming 6 parameters (what matters here is the dimension of the free parameter space, cosmological+nuisance; derived parameters do not count): between

def prior(cube, ndim, *args):

and the loop

    for i, name in zip(range(ndim), NS_param_names):
        cube[i] = data.mcmc_parameters[name]['prior']\
            .map_from_unit_interval(cube[i])

we added the line

   ndim=6

and the same just after

def loglike(cube, ndim, *args):

An issue has been opened about this in the pyMultiNest git as well (JohannesBuchner/PyMultiNest#72).

Interpreting new likelihood parameters

Line 145 of montepython/likelihood_class.py (public release v2.0.2) should read

regexp = re.match("%s.(.*)\s*=\s*(.*)" % self.name, line)

instead of

regexp = re.match("%s.(.*) = (.*)" % self.name, line)

in order to reflect different whitespace styles in user-written parameter files (e.g., "likelihood.name=value" or "likelihood.name{tab(s)}= value"). Maybe there are even more cases which need to be taken into account.
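
A small self-contained check of the two patterns (a sketch; 'JLA.settings' is just an illustrative likelihood attribute):

import re

name = 'JLA'
for line in ['JLA.settings=jla.dataset', 'JLA.settings\t=  jla.dataset']:
    old = re.match("%s.(.*) = (.*)" % name, line)
    new = re.match(r"%s.(.*)\s*=\s*(.*)" % name, line)
    print(bool(old), bool(new))   # the whitespace-tolerant form matches both styles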

Covariance matrix is not generated

Hi,

I would just like to ask whether there is an option of the 'info' command for generating the covariance matrix, since my naive usage of 'info' as described in the manual does not generate a .covmat file, but does generate plots, .bestfit, .h_info and .v_info files.

I used Monte Python 2.2 and class 2.4.5.

Thanks a lot for your time.

best,
