
execlim / isca


Idealized GCM from the University of Exeter

Home Page: https://execlim.github.io/IscaWebsite

License: GNU General Public License v3.0

Perl 0.20% HTML 6.12% Shell 0.11% Python 1.97% Fortran 75.82% C 6.55% C++ 7.85% Pawn 1.34% Groovy 0.01% Dockerfile 0.01% NASL 0.02% SourcePawn 0.01%
planetary-atmospheres atmospheric-science atmospheric-modelling geophysical-fluid-dynamics climate-model

isca's People

Contributors

alex-r-p, daw538, dennissergeev, eviatarbach, gkvallis, gregcolyer, jamesp, lqxyz, matthewjhenry, mckimb, mjucker, mp586, natgeo-wong, ntlewis, penmaher, pitmonticone, rosscastle, sit23, wanyingkang, wseviour



isca's Issues

Making Isca conda-installable

Given the success of running Isca with conda libraries (as first done by @spencerahill in #108), it looks like it's possible to make the model itself conda-installable. Even if there is a performance hit, as @sit23 pointed out (this remains to be tested), it would still be useful.

I made a separate environment on my Ubuntu machine by installing all the libraries from the conda-forge channel and Isca indeed seems to run fine, at least the Held-Suarez test.

The next necessary step to making Isca available via conda is continuous integration (CI) with automated tests. As a proof of concept, I connected my Isca fork to Travis CI. Travis uses the conda environment mentioned above to install everything and then runs a test in the tests/ directory using pytest.

If this is useful, I'll create a PR or a branch for this development.

Exposing MOS surface layer

Hi Isca people,

Congratulations on a great modelling setup and thank you very much for making it publicly available. I really appreciate it! You've made everything very clear and easy to use.

May I ask about the surface layer? I'm particularly interested in near-surface winds, at say 10 m. Since Monin-Obukhov is already being solved, presumably these would be easy to access somehow. I admit I am only studying the code now, and it might be something I can do myself; I just thought that, being very familiar with the codebase, you might know of an easy fix.

Thank you very much again!
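For reference, the quantity such a diagnostic would expose follows from Monin-Obukhov similarity theory; a schematic sketch of the log-law wind speed (illustration only, not Isca's code — `psi_m` is the integrated stability correction, zero in neutral conditions):

```python
import math

def wind_at_height(u_star, z, z0, psi_m=0.0, kappa=0.4):
    """Similarity-theory wind speed at height z above roughness length z0."""
    return (u_star / kappa) * (math.log(z / z0) - psi_m)

# Neutral case: friction velocity 0.3 m/s, 10 m height, z0 = 1 cm.
u10 = wind_at_height(0.3, 10.0, 0.01)
```

Since the model already computes the friction velocity and stability functions in its Monin-Obukhov solver, exposing a 10 m wind diagnostic should mostly be a matter of evaluating this relation and registering the field with the diag manager.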

P-level interpolator is Earth-centric

The script that interpolates data from sigma levels to pressure levels, Isca/postprocessing/plevel_interpolation/src/postprocessing/plevel/run_pressure_interp.F90, gets its gravitational constant, etc., from plev_constants.F90 in the same folder. These are compile-time parameters set to Earth values. For correct interpolation in planetary applications, they should be made input parameters.
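The dependence is easy to see in the hypsometric relation that any sigma-to-pressure interpolation implicitly relies on — a minimal sketch (not the Fortran code itself; the Mars values below are illustrative):

```python
import math

def isothermal_height(p, p0, T, g=9.80, Rd=287.04):
    """Height of pressure level p above reference level p0, isothermal layer.

    g and Rd default to Earth values; for another planet they must be
    supplied explicitly, which is exactly why hard-coded Earth constants
    break the interpolator in planetary applications.
    """
    return (Rd * T / g) * math.log(p0 / p)

z_earth = isothermal_height(500.0, 1000.0, 250.0)                  # Earth defaults
z_mars = isothermal_height(500.0, 1000.0, 250.0, g=3.71, Rd=189.0)  # illustrative Mars values
```

The same 1000-to-500 hPa layer is far deeper on a low-gravity planet, so Earth-centric constants silently misplace every interpolated level.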

horiz_interp_conserve_mod:no latitude index found

Hi all,

I have been running Isca on a machine that has recently had new nodes installed. Before the new nodes, all was fine. Now, when running, I get a fatal error returned from all PEs like the one below. It's hard to report this to the sysadmins without a specific request (I suspect something was missed when installing the new nodes, but I could easily be wrong). Before I go digging into the interpolation module that triggers the error, I just wanted to check whether you have seen this before and/or have an idea what the trigger might be.

Thank you very much indeed for any possible help in advance,

2019-01-28 10:23:49,905 - isca - DEBUG - FATAL from PE 0: horiz_interp_conserve_mod:no latitude index found: n,sph= 1 NaN
2019-01-28 10:23:49,905 - isca - DEBUG -
2019-01-28 10:23:49,905 - isca - DEBUG - application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0

Quick question on "JULIAN" Calendar Type

Given that the JULIAN calendar type means some years have 365 days and others 366, how do you save daily data for a given year only? Or does the nature of FMS make it impossible to keep the data from each year separate (i.e. nc files corresponding to year 2005 will contain some data from 2004, because 2004 is a leap year with 366 days)?
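For context, the JULIAN calendar in FMS uses the simple every-fourth-year leap rule, so year lengths alternate between 365 and 366 days — a one-line sketch (illustration only, not FMS code):

```python
def days_in_year(year):
    """FMS 'JULIAN' calendar rule: every fourth year is a leap year,
    with no century exception (unlike the Gregorian calendar)."""
    return 366 if year % 4 == 0 else 365
```

Any fixed-length output window (e.g. 365 days) will therefore drift across year boundaries, which is the behaviour the question describes.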

convection mods

Module List:

  • lscale_cond_mod
  • ras_mod
  • betts_miller_mod (and Q moist convection)
  • dry_convection_mod

Unclear Issue at End of Compiling held-suarez Case

After clearing up the previous issues with compiling the held-suarez test case, it seems to have finally reached the point where it officially compiles. However, the Python code seems to hit some issue shortly after that, which isn't clearly explained in the terminal output. I've pasted in the output starting from just before the end of the compilation. One part seems to suggest an issue with a git command, but I'm not really sure.

2019-04-01 14:10:20,025 - isca - INFO - mpiifort nc-config --libs spectral_dynamics.o mpp_data.o affinity.o gradient_c2l.o atmosphere.o fms_io.o ic_from_external_file.o edt.o time_interp_external.o every_step_diagnostics.o tracer_manager.o mg_drag.o mpp_memutils.o strat_cloud.o diag_util.o mosaic_util.o time_interp.o random_numbers.o mosaic.o memuse.o polvani_2004.o memutils.o jablonowski_2006.o threadloc.o horiz_interp_bicubic.o astronomy.o gaussian_topog.o vert_diff.o diag_axis.o implicit.o diag_manager.o qe_moist_convection.o two_stream_gray_rad.o transforms.o vert_coordinate.o mpp_utilities.o spherical.o mpp_pset.o fms.o test_mpp_io.o idealized_moist_phys.o create_xgrid.o interp.o diag_output.o horiz_interp_conserve.o mpp_parameter.o axis_utils.o surface_flux.o platform.o fv_advection.o rad_utilities.o mpp_domains.o cg_drag.o diag_table.o diag_grid.o diffusivity.o time_manager.o spherical_fourier.o vert_advection.o tridiagonal.o atmos_model.o mpp_io.o sat_vapor_pres_k.o stable_bl_turb.o polvani_2007.o horiz_interp_spherical.o damping_driver.o topography.o monin_obukhov_kernel.o spectral_damping.o horiz_interp.o monin_obukhov.o gradient.o spectral_initialize_fields.o topo_drag.o fm_util.o fft.o rayleigh_bottom_drag.o qflux.o entrain.o horiz_interp_bilinear.o diag_data.o grid.o matrix_invert.o get_cal_time.o read_mosaic.o betts_miller.o hs_forcing.o my25_turb.o constants.o tracer_type.o sat_vapor_pres.o test_mpp_pset.o test_fms_io.o spec_mpp.o shallow_conv.o field_manager.o interpolator.o grid_fourier.o vert_turb_driver.o lscale_cond.o dry_convection.o gauss_and_legendre.o nsclock.o leapfrog.o global_integral.o ras.o topog_regularization.o mpp.o fft99.o press_and_geopot.o MersenneTwister.o horiz_interp_type.o test_mpp_domains.o mixed_layer.o water_borrowing.o test_mpp.o spectral_init_cond.o -o held_suarez.x -L/u/local/intel/11.1/libs/netcdf/4.1.3-shared/lib -lnetcdff -lnetcdf -lmpi -shared-intel -traceback -nowarn
2019-04-01 14:10:22,000 - isca - WARNING - ipo: warning #11010: file format not recognized for /u/local/compilers/gcc/4.9.3/lib/libgcc_s.so
2019-04-01 14:10:22,316 - isca - INFO - ld: skipping incompatible /u/local/compilers/gcc/4.9.3/lib/libgcc_s.so when searching for -lgcc_s
2019-04-01 14:10:26,471 - isca - INFO - make: `held_suarez.x' is up to date.
2019-04-01 14:10:26,472 - isca - INFO - Compilation complete.
2019-04-01 14:10:26,475 - isca - DEBUG - Making directory '/u/flashscratch/m/mmckinne/isca_work/experiment/held_suarez_default'
2019-04-01 14:10:26,537 - isca - WARNING - Tried to remove run directory but it doesnt exist
2019-04-01 14:10:26,570 - isca - INFO - Emptied run directory '/u/flashscratch/m/mmckinne/isca_work/experiment/held_suarez_default/run'
Traceback (most recent call last):
File "held_suarez_test_case.py", line 107, in <module>
exp.run(1, num_cores=NCORES, use_restart=False)
File "/u/home/m/mmckinne/hab_proj/Isca/src/extra/python/isca/helpers.py", line 22, in _destructive
return fn(*args, **kwargs)
File "/u/home/m/mmckinne/hab_proj/Isca/src/extra/python/isca/helpers.py", line 38, in _useworkdir
return fn(*args, **kwargs)
File "/u/home/m/mmckinne/hab_proj/Isca/src/extra/python/isca/experiment.py", line 221, in run
self.codebase.write_source_control_status(P(self.rundir, 'git_hash_used.txt'))
File "/u/home/m/mmckinne/hab_proj/Isca/src/extra/python/isca/codebase.py", line 146, in write_source_control_status
source_status = self.git.status("-b", "--porcelain").stdout.decode('utf8')
File "/u/home/m/mmckinne/miniconda/envs/isca_env/lib/python3.7/site-packages/sh.py", line 1427, in __call__
return RunningCommand(cmd, call_args, stdin, stdout, stderr)
File "/u/home/m/mmckinne/miniconda/envs/isca_env/lib/python3.7/site-packages/sh.py", line 774, in __init__
self.wait()
File "/u/home/m/mmckinne/miniconda/envs/isca_env/lib/python3.7/site-packages/sh.py", line 792, in wait
self.handle_command_exit_code(exit_code)
File "/u/home/m/mmckinne/miniconda/envs/isca_env/lib/python3.7/site-packages/sh.py", line 815, in handle_command_exit_code
raise exc
sh.ErrorReturnCode_129:

RAN: /usr/bin/git --no-pager --git-dir=/u/flashscratch/m/mmckinne/isca_work/codebase/_u_home_m_mmckinne_hab_proj_Isca/code/.git --work-tree=/u/flashscratch/m/mmckinne/isca_work/codebase/_u_home_m_mmckinne_hab_proj_Isca/code status -b --porcelain

STDOUT:

STDERR:
error: unknown switch `b'
usage: git status [options] [--] ...

-v, --verbose         be verbose
-s, --short           show status concisely
--porcelain           show porcelain output format
-z, --null            terminate entries with NUL
-u, --untracked-files[=<mode>]
                      show untracked files, optional modes: all, normal, no. (Default: all)
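The `unknown switch 'b'` message suggests the system git predates support for combining `-b/--branch` with `--porcelain` (added around git 1.7.2, if memory serves — old cluster installs sometimes ship such versions). A hedged sketch of a version check one could run before invoking that command (the threshold and helper names are assumptions, not Isca code):

```python
import subprocess

def parse_git_version(version_string):
    """Turn 'git version 1.8.3.1' into a comparable tuple like (1, 8, 3)."""
    fields = version_string.strip().split()[-1].split(".")
    return tuple(int(f) for f in fields[:3] if f.isdigit())

def git_supports_status_branch(min_version=(1, 7, 2)):
    """True if the system git is new enough for `git status -b --porcelain`."""
    out = subprocess.run(["git", "--version"],
                         capture_output=True, text=True).stdout
    return parse_git_version(out) >= min_version
```

On clusters like the one in the traceback, loading a newer git module (or installing git into the conda env) is usually the simplest fix.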

Issue with socrates aerosol_model_pcf.f90 while compiling

While compiling the test case, I encountered the following error:
soc_error.txt

It seems like the model is expecting something else on that line of the file, which may be due to using a different version of socrates. If that is the case, would it be best to use the same version of socrates as the Isca developers, to avoid similar issues going forward?

RAS lacking CAPE and depth_change_conv

We have recently been trying the RAS convection scheme, but found that it could not output CAPE. Additionally, it seemed to lack a calculation for the "depth_change_conv" variable, which led to odd values later on when the bucket depth was calculated. Are these expected behaviors, or has something gone wrong on our end?

damping_driver_mod

Additional Modules

  • mg_drag (needs fixing)
  • cg_drag (needs tidying)
  • topo_drag (needs removing)

Successfully running Isca on HPC using conda builds of libraries; looking for feedback

In trying to port Isca to Columbia University's Terremoto HPC cluster, I discovered that Terremoto's builtin netcdf-c library was broken. So, while awaiting a fix from Terremoto's helpful sysadmins, I got the crazy idea of installing all of the needed libraries with conda in my local directory instead. And it worked! I've run both the Held Suarez and Frierson test cases successfully, the latter in full parallel on 8 nodes. This seems like it could be of interest and so I thought I'd share.

I've pasted the contents of the pertinent files below. Before that, my thoughts:

  1. This would seem to point toward a future wherein Isca is as easily installed on any (linux-64) HPC cluster as (something like) `conda install isca`: all of the dependencies, including the C and Fortran libraries, MPI, and netCDF, are installable via a combination of conda and pip. Yes/no? (Assuming this works on other machines.)
  2. At the same time, perhaps there are optimizations in the builtin c and other libraries, or other good reasons why using local, conda-installed libraries for such HPC tasks is undesirable?

Anyways, Isca is great, and thanks for that!


My conda env:

sah2249@bake:~$ conda list --export
# This file may be used to create an environment using:
# $ conda create --name <env> --file <this file>
# platform: linux-64
_libgcc_mutex=0.1=conda_forge
_openmp_mutex=4.5=0_gnu
binutils_impl_linux-64=2.33.1=h53a641e_8
binutils_linux-64=2.33.1=h9595d00_16
bzip2=1.0.8=h516909a_2
ca-certificates=2019.11.28=hecc5488_0
certifi=2019.11.28=py38_0
curl=7.68.0=hf8cf82a_0
expat=2.2.9=he1b5a44_2
f90nml=1.1.2=pypi_0
gcc_impl_linux-64=7.3.0=hd420e75_5
gcc_linux-64=7.3.0=h553295d_16
gettext=0.19.8.1=hc5be6a0_1002
gfortran_impl_linux-64=7.3.0=hdf63c60_5
gfortran_linux-64=7.3.0=h553295d_16
hdf4=4.2.13=hf30be14_1003
hdf5=1.10.5=mpi_mpich_ha7d0aea_1004
isca=0.2=dev_0
jinja2=2.11.1=py_0
jpeg=9c=h14c3975_1001
krb5=1.16.4=h2fd8d38_0
ld_impl_linux-64=2.33.1=h53a641e_8
libblas=3.8.0=14_openblas
libcblas=3.8.0=14_openblas
libcurl=7.68.0=hda55be3_0
libedit=3.1.20170329=hf8c457e_1001
libffi=3.2.1=he1b5a44_1006
libgcc-ng=9.2.0=h24d8f2e_2
libgfortran-ng=7.3.0=hdf63c60_5
libgomp=9.2.0=h24d8f2e_2
liblapack=3.8.0=14_openblas
libnetcdf=4.7.3=mpi_mpich_h755db7c_1
libopenblas=0.3.7=h5ec1e0e_6
libpng=1.6.37=hed695b0_0
libssh2=1.8.2=h22169c7_2
libstdcxx-ng=9.2.0=hdf63c60_2
libuuid=2.32.1=h14c3975_1000
libxcb=1.13=h14c3975_1002
markupsafe=1.1.1=py38h516909a_0
mpi=1.0=mpich
mpich=3.3.2=hc856adb_0
ncurses=6.1=hf484d3e_1002
ncview=2.1.7=h8ec25ab_3
netcdf-fortran=4.5.2=mpi_mpich_hd560429_3
numpy=1.18.1=py38h95a1406_0
openssl=1.1.1d=h516909a_0
pandas=1.0.1=py38hb3f55d8_0
pip=20.0.2=py_2
pkg-config=0.29.2=h516909a_1006
pthread-stubs=0.4=h14c3975_1001
python=3.8.1=h357f687_2
python-dateutil=2.8.1=py_0
pytz=2019.3=py_0
readline=8.0=hf8c457e_0
setuptools=45.2.0=py38_0
sh=1.12.14=py38_1001
six=1.14.0=py38_0
sqlite=3.30.1=hcee41ef_0
tk=8.6.10=hed695b0_0
tqdm=4.42.1=py_0
udunits2=2.2.27.6=h4e0c4b3_1001
wheel=0.34.2=py_1
xarray=0.15.0=py_0
xorg-kbproto=1.0.7=h14c3975_1002
xorg-libice=1.0.10=h516909a_0
xorg-libsm=1.2.3=h84519dc_1000
xorg-libx11=1.6.9=h516909a_0
xorg-libxau=1.0.9=h14c3975_0
xorg-libxaw=1.0.13=h14c3975_1002
xorg-libxdmcp=1.1.3=h516909a_0
xorg-libxext=1.3.4=h516909a_0
xorg-libxmu=1.1.3=h516909a_0
xorg-libxpm=3.5.13=h516909a_0
xorg-libxt=1.1.5=h516909a_1003
xorg-xextproto=7.3.0=h14c3975_1002
xorg-xproto=7.0.31=h14c3975_1007
xz=5.2.4=h14c3975_1001
zlib=1.2.11=h516909a_1006

My Isca test run script:

(isca_conda) sah2249@bake:~/testing/isca-testing/held-suarez$ cat run_isca_terremoto.sh
#!/bin/bash -l

#SBATCH --account=apam
#SBATCH --partition=free
#SBATCH --job-name=held_suarez_test_case
#SBATCH --time=1:00:00
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=16
#SBATCH --cpus-per-task=1
#SBATCH --output=slurm_%j.out

echo Running on host `hostname`
echo Time is `date`
echo Directory is `pwd`

module purge
source $HOME/.bashrc
source $GFDL_BASE/src/extra/env/terremoto
conda activate isca_conda

python $GFDL_BASE/exp/test_cases/held_suarez/held_suarez_test_case.py

My terremoto env file:

(isca_conda) [first-try !?]sah2249@bake:~/Isca/src/extra/env$ cat terremoto
echo loadmodules for terremoto machine at Columbia University

module purge
module load shared
module load slurm/17.11.8

# Need to source the `conda.sh` file in order to activate a conda env from
# within a script.  See https://github.com/conda/conda/issues/7980#issuecomment-441358406.
source ${HOME}/miniconda3/etc/profile.d/conda.sh
conda activate isca_conda

export F90=mpifort
export CC=mpicc
export GFDL_MKMF_TEMPLATE=terremoto-conda
export LD_LIBRARY_PATH=${HOME}/miniconda3/envs/isca_conda/lib:${LD_LIBRARY_PATH}
export CPPFLAGS=-I${HOME}/miniconda3/envs/isca_conda/include
export CFLAGS="-D__IFC ${CPPFLAGS}"

And my mkmf templates:

sah2249@bake:~/Isca/src/extra/python/isca/templates$ cat mkmf.template.terremoto-conda
# template for the Columbia University "Terremoto" machine using conda
# typical use with mkmf
# mkmf -t template.ifc -c" -Duse_libMPI -Duse_netCDF" path_names /usr/local/include
CPPFLAGS = -I${HOME}/miniconda3/envs/isca_conda/include
NETCDF_LIBS = -L${HOME}/miniconda3/envs/isca_conda/lib

# FFLAGS:
#  -cpp: Use the fortran preprocessor
#  -fcray-pointer: Cray pointers don't alias other variables.
#  -O2: Level 2 speed optimisations
#  -ffree-line-length-none -fno-range-check: Allow arbitrarily long lines
#  -fdefault-real-8: 8 byte reals (compatibility for some parts of GFDL code)
#  -fdefault-double-8: 8 byte doubles (compat. with RRTM)
FFLAGS = $(CPPFLAGS) $(NETCDF_LIBS) -cpp -fcray-pointer \
          -O2 -ffree-line-length-none -fno-range-check \
          -fdefault-real-8 -fdefault-double-8

LDFLAGS = $(NETCDF_LIBS) -lhdf5 -lhdf5_hl -lhdf5_fortran -lhdf5hl_fortran \
           -lnetcdff -lnetcdf -lmpi
CFLAGS = -D__IFC $(CPPFLAGS)

FC = $(F90)
LD = $(F90)

Running the model without 'module'

Hi,

I want to know if it would be possible to run the model without using the 'module' program. If so, which file should I tweak in order to do this?

Thank you.

Run MiMA with annual-mean insolation?

Hi,

I don't see a straightforward way of setting up the MiMA simulations with annual-mean insolation (and not insolation of a particular day of the year). Is there a workaround?

Cheers,

-Matthew

Possibly losing stratosphere inversion in experimental Isca runs

Hello, we've still been having issues with runs crashing when using higher values of the ES0 constant. We recently found that one possible reason for this could be a vertical expansion of the troposphere beyond the model's upper bound. To demonstrate this, I'm attaching a couple figures showing 1: a case with a clear inversion in the upper levels; and 2: one without any inversion. To note, there are a couple cases with no inversions that do not crash, but most do. Additionally, even in the cases that do not crash, the inversion is very high relative to our expected troposphere height. Is this just a normal property of Isca, or indicative of another issue on our end? Also, is the lack of an inversion/stratosphere actually a problem, or just a quirk of the model that we're misinterpreting as related to our issue?

(Two figures attached: 1 — a case with a clear inversion in the upper levels; 2 — a case without any inversion.)

Equinox_day inconsistency

equinox_day (the fraction of the year at which the autumn equinox occurs) defaults to 0.0 in two_stream_gray_nml but to 0.75 in rrtm_radiation_nml and socrates_nml.

Should the value in two_stream_gray_nml be set to the correct value of 0.75?
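One low-effort guard against this class of mismatch is to set the value explicitly in every radiation namelist and assert agreement before launching a run — a plain-Python sketch (namelist group names taken from this issue; the checking helper is hypothetical, not part of Isca):

```python
# Hypothetical pre-flight check: make the setting explicit in every
# radiation namelist group and verify they agree before running.
EQUINOX_DAY = 0.75

namelists = {
    'two_stream_gray_nml': {'equinox_day': EQUINOX_DAY},
    'rrtm_radiation_nml': {'equinox_day': EQUINOX_DAY},
    'socrates_nml': {'equinox_day': EQUINOX_DAY},
}

def consistent_setting(namelists, key):
    """True if every namelist group that defines `key` agrees on its value."""
    values = {nml[key] for nml in namelists.values() if key in nml}
    return len(values) <= 1
```

Fixing the default itself in two_stream_gray_nml would still be the cleaner long-term solution.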

time_manager_mod

Notes

  • Big Job!
  • Also including: time_interp, get_cal_time, calendar_calc

Issue in cg_drag.f90 file

Hello, I am attempting to run the held_suarez test case and ran into an issue when the process reached the cg_drag.f90 file. Specifically, the problem is with column_diagnostics_mod, which does not seem to exist anywhere I can find. The output showing the error is:

2019-03-18 15:38:12,094 - isca - INFO - /u/flashscratch/m/mmckinne/isca_work/codebase/_u_home_m_mmckinne_hab_proj_Isca/code/src/atmos_param/cg_drag/cg_drag.f90(20): error #7002: Error in opening the compiled module file. Check INCLUDE paths. [COLUMN_DIAGNOSTICS_MOD]
2019-03-18 15:38:12,095 - isca - INFO - use column_diagnostics_mod, only: column_diagnostics_init, &

I don't know if this module is supposed to be generated by something and was not, or if there is a file that should be there but isn't.

Ability to write to file in hour increments and smaller

I attempted to modify the diag.add_file line in the run script to output in 1-hour increments, i.e. `diag.add_file('atmos_daily', 1, 'hours', time_units='hours')`, and ran the model for a single day. It ran without crashing, but only output a file with a single data entry (rather than the 24 expected). Is it possible to do what I'm attempting, or is the smallest time increment 1 day?
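As a quick sanity check on what the diag manager should produce for a given file frequency, the expected record count is just the run length divided by the output interval — a plain-Python sketch, not Isca code:

```python
def expected_records(run_seconds, output_every_seconds):
    """Number of complete output windows that fit in a run."""
    return run_seconds // output_every_seconds

# A one-day run written every hour should yield 24 entries; a file with a
# single entry suggests the file frequency or field registration is off.
records = expected_records(24 * 3600, 3600)
```

If the count comes out right on paper but not in the file, the usual suspects are the per-field output settings (e.g. time averaging) rather than the model time step itself.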

compile error : mpi and netcdf

I am constantly getting the following message when I try to compile.

/usr/bin/ld: mppnccombine.o: undefined reference to symbol 'ncvarget'
/usr/lib/libnetcdf.so: error adding symbols: DSO missing from command line

I do have netcdf and netcdf-fortran properly installed.

Thank you.

Issue compiling socrates test case

I reinstalled Isca to make sure I had the full socrates-capable version, and for the most part seem to have things back in place. However when trying the socrates test case, socrates_aquaplanet.py, it quickly errors due to what appears to be a missing file. I have pasted the full output of the test case up to where it crashes below. The important line looks like "2019-08-09 22:09:28,495 - isca - INFO - ....ERROR opening file /u/flashscratch/m/mmckinne/isca_work/codebase/_u_home_m_mmckinne_hab_proj_Isca/code/src/atmos_param/socrates/src/trunk/src/um/out_nml.f90 of object out_nml.o: No such file or directory", where the model is trying to open a file I don't have in my socrates source code. I don't know if I have the wrong version of socrates or if there's some step I missed, but I haven't been able to find this file elsewhere, or other mentions of it or the /um/ directory in code files.

2019-08-09 22:09:26,990 - isca - INFO - RRTM compilation disabled.
2019-08-09 22:09:26,994 - isca - INFO - Socrates source code already in correct place. Continuing.
2019-08-09 22:09:27,117 - isca - INFO - Emptied run directory '/u/flashscratch/m/mmckinne/isca_work/experiment/soc_test_aquaplanet/run'
2019-08-09 22:09:27,151 - isca - INFO - Writing path_names to '/u/flashscratch/m/mmckinne/isca_work/codebase/_u_home_m_mmckinne_hab_proj_Isca/build/soc_isca/path_names'
2019-08-09 22:09:27,204 - isca - INFO - Running compiler
2019-08-09 22:09:27,230 - isca - INFO - loadmodules for emps-hoff machines
2019-08-09 22:09:27,301 - isca - INFO - The 'gcc/4.9.3' module is being loaded
2019-08-09 22:09:27,529 - isca - INFO - Unloading the conflicting module 'intel/18.0.3'
2019-08-09 22:09:27,809 - isca - INFO - /u/flashscratch/m/mmckinne/isca_work/codebase/_u_home_m_mmckinne_hab_proj_Isca/build/soc_isca/path_names
2019-08-09 22:09:28,495 - isca - INFO - ....ERROR opening file /u/flashscratch/m/mmckinne/isca_work/codebase/_u_home_m_mmckinne_hab_proj_Isca/code/src/atmos_param/socrates/src/trunk/src/um/out_nml.f90 of object out_nml.o: No such file or directory
2019-08-09 22:09:28,507 - isca - INFO - echo soc_isca.x does not exist.
2019-08-09 22:09:28,512 - isca - INFO - soc_isca.x does not exist.
2019-08-09 22:09:28,523 - isca - INFO - echo soc_isca.x does not exist.
2019-08-09 22:09:28,528 - isca - INFO - soc_isca.x does not exist.
2019-08-09 22:09:28,530 - isca - INFO - Compilation complete.
2019-08-09 22:09:28,594 - isca - INFO - Emptied run directory '/u/flashscratch/m/mmckinne/isca_work/experiment/soc_test_aquaplanet/run'
2019-08-09 22:09:29,256 - isca - INFO - Writing namelist to '/u/flashscratch/m/mmckinne/isca_work/experiment/soc_test_aquaplanet/run/input.nml'
2019-08-09 22:09:29,264 - isca - INFO - Writing field_table to '/u/flashscratch/m/mmckinne/isca_work/experiment/soc_test_aquaplanet/run/field_table'
2019-08-09 22:09:29,299 - isca - INFO - Writing diag_table to '/u/flashscratch/m/mmckinne/isca_work/experiment/soc_test_aquaplanet/run/diag_table'
2019-08-09 22:09:29,344 - isca - INFO - Running without restart file
2019-08-09 22:09:29,362 - isca - INFO - Beginning run 1
2019-08-09 22:09:29,380 - isca - INFO - process running as 31685
2019-08-09 22:09:29,396 - isca - DEBUG - loadmodules for emps-hoff machines
2019-08-09 22:09:29,471 - isca - DEBUG -
2019-08-09 22:09:29,471 - isca - DEBUG - The 'gcc/4.9.3' module is being loaded
2019-08-09 22:09:29,471 - isca - DEBUG -
2019-08-09 22:09:29,690 - isca - DEBUG - Unloading the conflicting module 'intel/18.0.3'
2019-08-09 22:09:29,690 - isca - DEBUG -
2019-08-09 22:09:29,896 - isca - DEBUG - cp: cannot stat `/u/flashscratch/m/mmckinne/isca_work/codebase/_u_home_m_mmckinne_hab_proj_Isca/build/soc_isca/soc_isca.x': No such file or directory

Pressure level interpolation - NETCDF ERROR

Hi, I tried to interpolate my output from the Held & Suarez test case (with slightly modified parameters) onto the standard pressure-level set described in the run_plevel.py script. However, I get an error when trying to do this after compiling the initial script. The error message I get is:
ERROR: required field does not exist: zsurf

I guess this is because the interpolator needs the geopotential at the surface, whereas my fields have the geopotential at every vertical level. I tried to work around this by adding the -0 option to the interpolation script to set it to zero, but this wasn't successful either: I got a message saying NETCDF ERROR with no further detail. I looked into the code to find where this message could come from, but I can't tell. Do you have any idea why this problem happens? Thank you for your help.

qflux_mod

Notes

  • and the scripts in .../python/scripts

Compile issues with Intel 19?

When trying to compile Isca (i.e., by running the Held Suarez test case following the readme), I'm running into a series of errors of the following form:

2020-01-17 19:53:29,870 - isca - INFO - /home/nfeldl/Isca_work/codebase/_home_nfeldl_Isca/code/src/atmos_spectral/init/ic_from_external_file.F90(34): error #6580: Name in only-list does not exist or is not accessible.   [GRID_DOMAIN]
2020-01-17 19:53:29,870 - isca - INFO - trans_spherical_to_grid, grid_domain, spectral_domain, get_grid_domain, &
2020-01-17 19:53:29,870 - isca - INFO - ----------------------------------------------------------^
2020-01-17 19:53:29,870 - isca - INFO - /home/nfeldl/Isca_work/codebase/_home_nfeldl_Isca/code/src/atmos_spectral/init/ic_from_external_file.F90(34): error #6580: Name in only-list does not exist or is not accessible.   [SPECTRAL_DOMAIN]
2020-01-17 19:53:29,870 - isca - INFO - trans_spherical_to_grid, grid_domain, spectral_domain, get_grid_domain, &
2020-01-17 19:53:29,870 - isca - INFO - -----------------------------------------------------------------------^
2020-01-17 19:53:29,872 - isca - INFO - /home/nfeldl/Isca_work/codebase/_home_nfeldl_Isca/code/src/atmos_spectral/init/ic_from_external_file.F90(124): error #6404: This name does not have a type, and must have an explicit type.   [GRID_DOMAIN]
2020-01-17 19:53:29,872 - isca - INFO - call read_data(file_name, u_name,   ug, domain=grid_domain)
2020-01-17 19:53:29,872 - isca - INFO - -----------------------------------------------^
2020-01-17 19:53:29,872 - isca - INFO - /home/nfeldl/Isca_work/codebase/_home_nfeldl_Isca/code/src/atmos_spectral/init/ic_from_external_file.F90(124): error #6285: There is no matching specific subroutine for this generic subroutine call.   [READ_DATA]
2020-01-17 19:53:29,872 - isca - INFO - call read_data(file_name, u_name,   ug, domain=grid_domain)
2020-01-17 19:53:29,872 - isca - INFO - -----^
2020-01-17 19:53:29,872 - isca - INFO - /home/nfeldl/Isca_work/codebase/_home_nfeldl_Isca/code/src/atmos_spectral/init/ic_from_external_file.F90(125): error #6285: There is no matching specific subroutine for this generic subroutine call.   [READ_DATA]
2020-01-17 19:53:29,872 - isca - INFO - call read_data(file_name, v_name,   vg, domain=grid_domain)
2020-01-17 19:53:29,872 - isca - INFO - -----^
2020-01-17 19:53:29,872 - isca - INFO - /home/nfeldl/Isca_work/codebase/_home_nfeldl_Isca/code/src/atmos_spectral/init/ic_from_external_file.F90(126): error #6285: There is no matching specific subroutine for this generic subroutine call.   [READ_DATA]
2020-01-17 19:53:29,872 - isca - INFO - call read_data(file_name, t_name,   tg, domain=grid_domain)
2020-01-17 19:53:29,872 - isca - INFO - -----^
2020-01-17 19:53:29,872 - isca - INFO - /home/nfeldl/Isca_work/codebase/_home_nfeldl_Isca/code/src/atmos_spectral/init/ic_from_external_file.F90(127): error #6285: There is no matching specific subroutine for this generic subroutine call.   [READ_DATA]
2020-01-17 19:53:29,872 - isca - INFO - call read_data(file_name, ps_name, psg, domain=grid_domain)
2020-01-17 19:53:29,872 - isca - INFO - -----^
2020-01-17 19:53:29,872 - isca - INFO - /home/nfeldl/Isca_work/codebase/_home_nfeldl_Isca/code/src/atmos_spectral/init/ic_from_external_file.F90(137): error #6285: There is no matching specific subroutine for this generic subroutine call.   [READ_DATA]
2020-01-17 19:53:29,872 - isca - INFO - call read_data(file_name, tr_name, grid_tracers(:,:,:,ntr), domain=grid_domain)
2020-01-17 19:53:29,872 - isca - INFO - -------^
2020-01-17 19:53:29,874 - isca - INFO - compilation aborted for /home/nfeldl/Isca_work/codebase/_home_nfeldl_Isca/code/src/atmos_spectral/init/ic_from_external_file.F90 (code 1)
2020-01-17 19:53:29,877 - isca - INFO - make: *** [ic_from_external_file.o] Error 1
2020-01-17 19:53:29,877 - isca - INFO - ERROR: mkmf failed for held_suarez.x

The error message is the same as this bug reported for Intel 18: https://software.intel.com/en-us/forums/intel-fortran-compiler/topic/759376. It seems as though the fortran compiler for some reason is unable to access the object file that's in the same directory. I'm using Intel 19.

Has anyone encountered this bug or found a way to move past it?

Missing namelist-writes to logfile

Continued from pull request #28

Namelist-writes still outstanding (some may not be needed; list may also be incomplete; sorted plausibly from most to least important):

qflux_nml
horiz_interp_spherical_nml
interpolator_nml
missingVar_nml
badType1_nml
badType2_nml
test_axis_utils_nml
test_horiz_interp_nml
test_nml

Error checking of namelists

Not all of the namelists are error-checked when they are read in. This means a namelist read can fail silently, with the default values being used without the user knowing.

This is an issue when the compiler option INTERNAL_FILE_NML is set, which we are using.

Examples of ones that aren't error checked:
https://github.com/sit23/Isca/blob/master/src/coupler/surface_flux.F90#L816
https://github.com/sit23/Isca/blob/master/src/atmos_spectral/driver/solo/idealized_moist_phys.F90#L306

Examples of ones that are error checked:
https://github.com/sit23/Isca/blob/master/src/atmos_spectral/model/spectral_dynamics.F90#L253

difficulty running realistic continents setup at higher resolution

Hi!

I am a relatively new user of Isca and have been trying to extend some of the provided test cases (all of which work in their original form; I understand there is no support available for porting the model) to scenarios which are more relevant to my work. I have recently been trying to extend the "realistic_continents_fixed_sst_test_case" provided with the model to a higher resolution (T85), with some other minor adjustments. Trying to start in this resolution from a cold start (use_restart=False) leads to a failure during model initialization, with the log file reporting the following: "compute_lambda: Iterative scheme for computing lambda may not work unless initial values of lambda_1 and lambda_2 are reduced." I couldn't find any namelist options for these lambda parameters, and I suspect this may be a symptom rather than the root cause of the crash. Since I am new, I suspect the problem is actually user error, but I'm having a hard time understanding what the problem is and was hoping this page could provide some assistance. I will provide a more detailed description of the steps I took below, as well as attaching python scripts, log files, and namelists. I appreciate any assistance you can provide!

To generate this run, I have used the land_generator_fn.py to create a land input at a T85 resolution. I used land_mode='continents' with 'all' and topo_mode='sauliere2012' with 'all'. The land input file is attached (land_t85.nc). I updated the provided namelist to run the RAS convection scheme with entrainment and with a slightly higher vertical top. The sea-ice concentration and SST input files were the same as the realistic_continents_fixed_sst_case, as was ozone. The model runs perfectly fine at T42, but T85 does not. I did not decrease the time step, but I did not think I would hit CFL conditions during model initialization, and the error did not seem suggestive of this. Thus it appears that something about the change in resolution is causing the problem. I have done more troubleshooting, including trying to run the T85 case from a T42 restart (which fails with a different error about ozone), and would be happy to provide more information if it would be helpful. Thank you again!
isca_failed_run.zip
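For reference, a minimal sketch of what a jump from T42 to T85 typically involves. The parameter names follow the usual FMS spectral_dynamics_nml convention, but the values below are my assumptions, not taken from the attached namelists, and the exact Isca python syntax for applying them may differ. One common gotcha when doubling horizontal resolution is that the timestep usually needs to come down with it (often halved), even if the initial crash does not look like a CFL violation.

```python
# Hypothetical T85 settings (spectral_dynamics_nml convention, assumed
# values): changing the truncation also changes the transform grid.
t85 = {
    'num_fourier':   85,    # triangular truncation T85
    'num_spherical': 86,    # num_fourier + 1
    'lon_max':       256,   # needs >= 3*num_fourier + 1 for an alias-free grid
    'lat_max':       128,   # half of lon_max for a Gaussian grid
}

# Consistency checks for a standard quadratic (alias-free) grid:
assert t85['num_spherical'] == t85['num_fourier'] + 1
assert t85['lon_max'] >= 3 * t85['num_fourier'] + 1
assert t85['lat_max'] * 2 == t85['lon_max']
```

If any of these grid/truncation pairings are inconsistent with each other (for example, a T85 truncation left on a 128x64 grid), the spectral initialization can fail in ways that look unrelated to the real mismatch.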

mkmf template instructions in readme

Hey Isca crew, I am new to using your model and I am so impressed with all of the python interfacing. I was able to get things going very easily and it's cool to see so much work to make things user friendly.

I wanted to make a small suggestion based on my attempts to get held_suarez running, as recommended in the README. I needed to make a customized template file to get this to run, but it took a little bit of digging around to figure out which template file is being called and how to modify it. I am now getting things working by modifying the file src/extra/python/isca/templates/mkmf.template.ia64. It also wasn't clear how the template files in bin/ are used, whether this is what I should modify or not.

Would you be able to add something to the README mentioning which mkmf template is called, along with your recommendations for how to change it?

Let me know what you think, and whether I'm modifying the correct file (or whether I should modify a file within bin/ and call it somehow). Thanks!

Run Isca - AttributeError: 'DiagTableFile' object has no attribute 'files'

Hi, I am trying to run the test1 case (the default run in the run_isca directory, as described in the README.md file there). The model runs through the first month without any error message being written; however, when it comes time to write the output files, I get an error message that I have never seen before. Here is what it says (I removed the normal lines that are not useful here):

2018-08-31 15:02:07,751 - isca - INFO - Emptied run directory '/scratch/aaudette/gfdl_work/experiment/test1/run'
[...]
2018-08-31 15:03:48,637 - isca - INFO - Run 1 complete
Traceback (most recent call last):
  File "./isca", line 51, in <module>
    exp.run(1, use_restart=False, num_cores=args.num_cores)
  File "/home/aaudette/Isca/src/extra/python/isca/helpers.py", line 22, in _destructive
    return fn(*args, **kwargs)
  File "/home/aaudette/Isca/src/extra/python/isca/helpers.py", line 38, in _useworkdir
    return fn(*args, **kwargs)
  File "/home/aaudette/Isca/src/extra/python/isca/experiment.py", line 293, in run
    for file in self.diag_table.files:
AttributeError: 'DiagTableFile' object has no attribute 'files'

Edit: all of the test cases I tried before this still run correctly.

Thank you for your help.
Alex
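For anyone hitting the same traceback: experiment.py iterates over self.diag_table.files at the end of each month, so the error means the object assigned as the experiment's diag table is a single-file object rather than a full table. Below is a minimal, self-contained sketch with stand-in classes (these are not Isca's actual implementations) showing why the AttributeError only appears when output is written, after the run itself has completed:

```python
class DiagTable:
    """Stand-in for a full diagnostics table: holds many output files."""
    def __init__(self):
        self.files = {}

    def add_file(self, name, freq):
        self.files[name] = {'freq': freq, 'fields': []}


class DiagTableFile:
    """Stand-in for a single output file: note there is no .files here."""
    def __init__(self, name):
        self.name = name


def finish_month(diag_table):
    # Mirrors the loop in experiment.py; it only runs once output is
    # written, which is why the month completes before the crash.
    return [name for name in diag_table.files]


diag = DiagTable()
diag.add_file('atmos_monthly', 30)
assert finish_month(diag) == ['atmos_monthly']

try:
    finish_month(DiagTableFile('atmos_monthly'))  # the buggy assignment
except AttributeError as err:
    assert 'files' in str(err)
```

So the first thing to check is how the diag table is constructed in the failing script: it should be a table object built up with add_file/add_field-style calls, as in the working test cases, not an individual file object assigned directly.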

Issues with implementing variable heat capacity calculation with bucket depth

I have attempted to make a small change to the heat capacity calculation in the mixed layer module. The change simply uses the actual bucket depth where it is shallower than the specified mixed-layer depth, with the land value as a minimum. However, when I attempt to run the model with this edit, it crashes immediately due to bad temperatures. Investigating the issue has given me little insight into what might be wrong. I am attaching the edited mixed layer file, with my edits marked with "!MMM" and located on line 524.
mixed_layer.txt
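Without running the attached file, one thing worth ruling out: if the bucket depth can approach zero, the clamping order matters. A heat capacity proportional to a near-zero depth makes the temperature tendency blow up, which would explain an immediate "bad temperatures" crash. A minimal numerical sketch of the intended clamped calculation (the constants and names here are illustrative, not Isca's):

```python
RHO_W = 1000.0   # density of water, kg m^-3 (illustrative)
CP_W = 4184.0    # specific heat of water, J kg^-1 K^-1 (illustrative)

def effective_heat_capacity(bucket_depth, mixed_layer_depth, land_min_depth):
    # Use the actual bucket depth where it is shallower than the
    # prescribed mixed layer, but never drop below the land minimum.
    depth = min(bucket_depth, mixed_layer_depth)
    depth = max(depth, land_min_depth)
    return RHO_W * CP_W * depth  # J m^-2 K^-1

# A nearly empty bucket is clamped up to the land minimum...
assert effective_heat_capacity(0.01, 20.0, 2.0) == RHO_W * CP_W * 2.0
# ...while a full bucket cannot exceed the mixed-layer depth.
assert effective_heat_capacity(50.0, 20.0, 2.0) == RHO_W * CP_W * 20.0
```

If the min/max are applied in the other order, or the land minimum is skipped for some grid cells, the heat capacity dividing the surface flux can collapse toward zero and produce exactly this kind of instant temperature blow-up.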

Getting strange precipitation patterns in Isca

Hello again,

We've recently started getting strange precipitation patterns in Isca, and it's starting to look like an issue with the model rather than with the new settings we've been trying. We initially saw it after adding a variable heat capacity option based on bucket depth over land grid cells. However, I recently reran an older experiment that uses the default static heat capacity calculation and still got strange precipitation patterns. To show what we're seeing, I'm attaching first the 1-year precipitation from a previous run using the default heat capacity, and then from the recent rerun of the same experiment (values are in mm/day). I'm not sure whether you will know what's going on, but I thought it best to at least ask.

[Attached figures: shc_y110, shc_y6]
