openmc_workshop's Issues

dose rate task 9 result conversion may be incorrect

Task 9 (dose rate on a surface) states:
"The cell tally has units of pSv cm² per source particle (p is pico). Therefore, the tally result must be divided by the surface area of the sphere to make the units into pSv, and then multiplied by the activity (in Bq) of the source to get pSv per second."

I believe it should not be the activity but the number of gammas emitted per second, so in the case of Co-60, where approximately two gamma rays are emitted per decay, it should be twice the activity.
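
A quick sketch of the suggested conversion (the numbers below are illustrative placeholders, not workshop values):

import math

tally_result = 4.2e-1      # pSv cm2 per source particle, from the cell tally
sphere_radius = 100        # cm, radius of the tally sphere
activity = 1e9             # Bq of the Co-60 source
gammas_per_decay = 2       # Co-60 emits roughly two gammas (1.17 and 1.33 MeV) per decay

surface_area = 4 * math.pi * sphere_radius**2                  # cm2
dose_per_particle = tally_result / surface_area                # pSv per source particle
dose_rate = dose_per_particle * activity * gammas_per_decay    # pSv per second
print(dose_rate)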

testing notebooks with the CI

The current CI is minimal and doesn't fully test the notebooks

Got some more help from @AGoose on this topic

There are some approaches you might take to test a notebook without the user seeing the tests:

  • Create a set of "blueprint" notebooks that contain test cells. Add "test" cell tags to these test cells, and then use nbconvert's TagRemovePreprocessor to generate a production-ready notebook without these cells (see the sketch after this list)
  • Create a set of "test" notebooks, which execute the production notebooks in the current namespace (e.g. using the %run magic)
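
For the first approach, a minimal sketch of stripping the tagged cells with nbconvert (the notebook filenames are placeholders):

from traitlets.config import Config
from nbconvert.exporters import NotebookExporter

c = Config()
c.TagRemovePreprocessor.remove_cell_tags = ("test",)
c.TagRemovePreprocessor.enabled = True
c.NotebookExporter.preprocessors = ["nbconvert.preprocessors.TagRemovePreprocessor"]

exporter = NotebookExporter(config=c)
notebook_body, _ = exporter.from_filename("blueprint_task_1.ipynb")  # placeholder filename
with open("task_1.ipynb", "w") as handle:
    handle.write(notebook_body)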

Concerning how to test statistical outputs, that's another problem entirely. One approach is to parametrise your test mechanism to allow a higher number of samples when running the tests, or to identify the distribution of the various quantities you observe and check that your result lies within that window. I suppose you might already be doing that!

two source_sampling.so files in docker image

When the parametric_plasma_source is compiled, a source_sampling.so file is created in the /build directory. However, another source_sampling.so file is also created in the parametric_plasma_source directory (presumably when the plasma source is pip installed). SOURCE_SAMPLING_PATH points to the wrong file: it should point to /build/source_sampling.so but instead points to /parametric_plasma_source/source_sampling.so.

Updating the CAD based simulations

Tasks 10 and 12 aim to provide examples of CAD based simulations.
Task 10 tries to explain simple facet-based geometry simulations and currently uses Trelis. This should be migrated to use the PPP.

Task 12 tries to explain how to perform unstructured mesh based simulations. It uses Trelis for the first, facet-based part of the geometry and also for the tet mesh part. This should be updated so that the PPP is used for the first stage.

This can be done on the master branch, as these are probably going to be some of the last tasks migrated across to the new Jupyter notebook based workflow.

Task 10 https://github.com/ukaea/openmc_workshop/tree/master/tasks/task_10
Task 12 https://github.com/ukaea/openmc_workshop/tree/master/tasks/task_12

CAD not showing on cloud based solution

In a recently created cloud-hosted version of the workshop, the .solid objects don't show up as 3D shapes in the Jupyter notebook.

Got some nice suggestions on how to fix this from @AGoose

The rendering failure of the solid objects is due to HTTPS/HTTP mixed content blocking. You can patch the rich display hook (for all of these solid types) to replace http with https (hacky, yes!) e.g.

_repr_html_ = type(my_shape.solid)._repr_html_

def repr_html(self):
    return _repr_html_(self).replace("http:", "https:")

type(my_shape.solid)._repr_html_ = repr_html

You can also get Jupyter Lab/Notebook interactive STL viewers, e.g. https://github.com/K3D-tools/K3D-jupyter

However, it looks like there is a Jupyter Lab extension for CadQuery: https://github.com/bernhard-42/jupyter-cadquery

These particular extensions currently only support JupyterLab 2.x.

Plotting neutron locations for parametric plasma source

Dan has been working on the parametric plasma source and has a branch that allows source point extraction:

https://github.com/DanShort12/parametric-plasma-source/tree/source_sampling_executable

This could help us update task 3, which doesn't work at the moment

It is possible to make a source and extract the points for plotting in the following manner

import os
import uuid
import h5py
import matplotlib.pyplot as plt
import math
from parametric_plasma_source import Plasma
import numpy as np
import pathlib

def make_plasma_plot(elongation,
                     minor_radius,
                     major_radius,
                     triangularity,
                     ion_density_pedistal,
                     ion_density_seperatrix,
                     ion_density_origin,
                     ion_temperature_pedistal,
                     ion_temperature_seperatrix,
                     ion_temperature_origin,
                     pedistal_radius,
                     ion_density_peaking_factor,
                     ion_temperature_peaking_factor,
                     shafranov_shift,
                     output_name='plot.png'):

    unique_id = str(uuid.uuid4())
    print('elongation', elongation)
    my_plasma = Plasma(elongation=elongation,
                    minor_radius=minor_radius,
                    major_radius=major_radius,
                    triangularity = triangularity)
    my_plasma.export_plasma_source(unique_id+'.so')
    print(unique_id)
    nps = 4000
    os.system('./source_generator -l '+unique_id+'.so -o '+unique_id+'.h5 -n '+str(nps))

    f = h5py.File(unique_id+'.h5initial_source.h5', 'r')

    print('h5 keys', list(f.keys()))

    dset = f['source_bank']

    print('number of particles', dset.shape)

    x_values = []
    y_values = []
    z_values = []
    r_values = []
    e_values = []

    for entry in dset:
        y_values.append(entry[0][0])
        x_values.append(entry[0][1])
        z_values.append(entry[0][2])
        r_values.append(math.sqrt(math.pow(entry[0][0],2)+math.pow(entry[0][1],2)))
        e_values.append(entry[2])

    p = pathlib.Path(output_name)
    p.parent.mkdir(parents=True, exist_ok=True)

    fig=plt.figure()
    ax=fig.add_axes([0,0,1,1])
    ax.scatter(r_values,z_values, c=e_values, s=1)
    # ax.scatter(grades_range, boys_grades, color='b')
    ax.set_xlabel('Horizontal radius (cm)')
    ax.set_xlim(0, 600)
    ax.set_ylabel('Vertical radius (cm)')
    ax.set_ylim(-600, 600)
    ax.set_title('scatter plot')
    plt.savefig(output_name)
    os.system('rm '+unique_id+'.so')
    os.system('rm '+unique_id+'.h5initial_source.h5')

number_of_frames = 30

for i, major_radius in enumerate(np.linspace(1., 1.5, number_of_frames)):
    make_plasma_plot(elongation=2.9,
                    minor_radius=1.118,
                    major_radius=major_radius,
                    triangularity = 0.55,
                    ion_density_pedistal = 1.09e+20,
                    ion_density_seperatrix = 3e+19,
                    ion_density_origin = 1.09e+20,
                    ion_temperature_pedistal = 6.09,
                    ion_temperature_seperatrix = 0.1,
                    ion_temperature_origin = 45.9,
                    pedistal_radius = 0.8,
                    ion_density_peaking_factor = 1,
                    ion_temperature_peaking_factor = 8.06,
                    shafranov_shift = 0.0,
                    output_name='major_radius/'+str(i).zfill(4)+'.png')

os.system("convert -delay 40 minor_radius/*.png minor_radius.gif")

Burn up

OpenMC is implementing a burn up feature for fixed source simulations.

openmc-dev/openmc#1628

Once this is merged in, we should make an example that shows TBR as a function of time. Li-6 burn-up should mean that TBR decreases as a function of time in regular blankets. Perhaps a layered solid blanket would be the best way to highlight this effect.
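
A rough sketch of what such an example might look like once the feature is available, using the fixed-source depletion API found in recent OpenMC versions (the chain file, dimensions and source rate below are placeholders):

import openmc
import openmc.deplete

# simple lithium sphere so the sketch is self-contained
lithium = openmc.Material(name='lithium')
lithium.add_element('Li', 1.0)
lithium.set_density('g/cm3', 0.534)
lithium.volume = 4 / 3 * 3.14159 * 100 ** 3   # depletable materials need a volume (cm3)
lithium.depletable = True

sphere = openmc.Sphere(r=100, boundary_type='vacuum')
cell = openmc.Cell(fill=lithium, region=-sphere)
geometry = openmc.Geometry([cell])

settings = openmc.Settings()
settings.run_mode = 'fixed source'
settings.batches = 10
settings.particles = 1000
settings.source = openmc.Source(energy=openmc.stats.Discrete([14.1e6], [1.0]))

model = openmc.Model(geometry=geometry, settings=settings)

# deplete with a constant neutron source rate over five one-year steps
operator = openmc.deplete.CoupledOperator(model, chain_file='chain_endfb71.xml',
                                          normalization_mode='source-rate')
timesteps = [365 * 24 * 60 * 60] * 5           # seconds
integrator = openmc.deplete.PredictorIntegrator(operator, timesteps,
                                                source_rates=[1e20] * 5)
integrator.integrate()

The TBR could then be tallied at each depletion step to show the decrease as the Li-6 is consumed.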

Parametric Plasma Source

The compile.sh script in the parametric plasma source compiles to create a source_sampling.so file.
The .so file is copied into the task_3 directory.
The source library is set as source._source_library = '.source/sampling.so' in the 3_plot_neutron_birth_locations_plasma.py script, as part of task 3.

Running this script does not produce the expected parametric plasma source output; instead, a point source emitting neutrons isotropically is produced.
source._source_library = '/path/to/source/file' can be set to an incorrect filepath without an error being raised.
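
One possible mitigation is to check that the file exists before assigning it, so a wrong path fails loudly; a minimal sketch (the path below is illustrative):

import os
import openmc

source = openmc.Source()
source_library_path = 'source_sampling.so'   # illustrative path to the compiled library
if not os.path.isfile(source_library_path):
    raise FileNotFoundError(source_library_path + ' not found; check where the .so was compiled')
source._source_library = source_library_path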

improve workshop for targeted audience

There are a few ways the workshop can be improved to maximise the learning of fusion neutronics

First, add fusion context and learning points to the tasks, perhaps in the form of a task conclusion.

Trim the tasks so that less time is spent making geometry.

Task 10 mbconvert error

There is a small typo in Task 10, part 2

os.system('mbconvert dagmc.h5m dagmc.vtk')

should say

os.system('/MOAB/bin/mbconvert dagmc.h5m dagmc.vtk')

This can be corrected in the notebook

Docker containers on Windows

When using a docker container on Windows, an error is sometimes raised when pulling the docker image:

"image operating system "linux" cannot be used on this platform

To solve this problem, the docker "container mode" must be changed to 'linux'

TODO: Add to instructions

task number 9 gp opt

from skopt import gp_minimize

# Gaussian process optimisation of the objective over two parameters with 20 evaluations
res = gp_minimize(make_materials_geometry_tallies,
                  [(0., 1.0),       # bounds for the first parameter
                   (0., 100.0)],    # bounds for the second parameter
                  n_calls=20)

images

Instead of bloating the repository by including the readme images, the images can be uploaded to issues and linked from the readme.

This is the same procedure as used in the paramak project

improved install videos

Adding some OS-specific install videos; here are some icons for the updated readme.

[Screenshots: Linux and Windows logos for the readme install-video icons]

reducing size of the dockerfile

It appears that, now we are including double-down in the DAGMC compile, the GitHub action runs out of space.

For example, these two GitHub Actions runs failed due to no space left on the device:
https://github.com/ukaea/openmc_workshop/actions/runs/518581916
https://github.com/ukaea/openmc_workshop/actions/runs/518476822

So I was wondering if we can do something like this in a few more places

rm -rf /DAGMC/DAGMC /DAGMC/build

This line (rm -rf /DAGMC/DAGMC /DAGMC/build) deletes some of the unnecessary files after the build.

Potentially we can delete some folders from these compiles

  • double-down
  • openmc
  • njoy
  • Embree
  • MOAB

Any tips from the experts always welcome @AI-Pranto @pshriwise

Task for angular distribution cross section plot

Currently we don't have a task for angular distribution scattering from different isotopes

The best image I could find is this one.

[Screenshot: example angular distribution plot]

Perhaps task 1 can be extended to include an angular distribution plot for different incident energies.

Perhaps code block 20 (In [20]:) on this page is helpful:

https://docs.openmc.org/en/stable/examples/nuclear-data.html

Reproducing this plot as a plotly ribbon plot would be great:

https://plotly.com/python/v3/ribbon-plots/
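
A rough sketch of pulling the angular distribution data out of a processed HDF5 data file with openmc.data (the filename is a placeholder, and it is assumed the elastic distributions are stored in tabulated form):

import matplotlib.pyplot as plt
import openmc.data

fe56 = openmc.data.IncidentNeutron.from_hdf5('Fe56.h5')   # placeholder filename
elastic = fe56.reactions[2]                                # MT=2, elastic scattering
angle_dist = elastic.products[0].distribution[0].angle     # AngleDistribution

# plot the tabulated mu distribution at a selection of incident energies
for i in range(0, len(angle_dist.energy), 20):
    mu = angle_dist.mu[i]                                  # assumed to be a Tabular distribution
    plt.plot(mu.x, mu.p, label=f'{angle_dist.energy[i] / 1e6:.2f} MeV')
plt.xlabel('Cosine of scattering angle')
plt.ylabel('Probability density')
plt.legend()
plt.savefig('fe56_elastic_angular_distribution.png')

The same energy, mu and probability arrays could then be fed into the plotly ribbon plot linked above.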

task links

It would be useful to have a link to the next task at the bottom of each task README so that users don't have to navigate back to the main repository page or task directory to access the next task.

Windows and Mac support

Getting Docker installed and getting the display server (for visuals) working is needed to bring the workshop to a larger audience.

It appears Xming (Windows) and XQuartz (Mac) will be needed for X11.

The docker run command will also be different.

task12 not working with trelis 17.1, with fix provided

I tested this out: Trelis 17.1 does not support the 'geometric sizing on' command used in openmc_workshop task 12.
#cubit.cmd('volume all scheme tetmesh proximity layers off geometric sizing on') # Trelis 16.5
cubit.cmd('volume all scheme tetmesh proximity layers off') # Trelis 17.1

With this change OpenMC can start and complete the task.

Is it safe to just remove 'geometric sizing on' for Trelis 16.5 as well? Shall we just remove it?

Moving from Master to main branch

Currently the workshop is a bit of a mess

We have a master branch with about 7 fully working tasks, which form the "core workshop tasks". There are also "optional tasks" in various degrees of completeness.
We also have a development branch that takes a completely new approach (Jupyter notebooks).

We should make a new main branch and migrate the development branch into it.

Once ready, the main branch should become the new default branch, and if all of the tasks are implemented then the current master branch can be deleted.

Moving the workshop to jupyter notebook

Having delivered this workshop a few times now, the main difficulty has been using X11 on Windows. This is easier on Linux, as X11 comes preinstalled; however, the majority of users are Windows based.

We have started a migration of the whole workshop to be built on the jupyter/minimal-notebook Docker image.

You can see from the two installation and getting started guides that this has significant advantages and is much easier for everyone
current install instructions https://github.com/ukaea/openmc_workshop/blob/master/README.md
new install instructions https://github.com/ukaea/openmc_workshop/blob/develop/README.md

The other bonus is that the instructions can be embedded into the notebook. We do have to take care with the run order of the notebook code blocks, as this can become a problem.

Downsides are:

  • The loss of VS Code syntax highlighting
  • ParaView and VTK files are difficult (though not impossible) to view in notebooks; we could do with some help adding VTK to the Dockerfile and getting it working in Jupyter notebooks

The tasks themselves need migrating across and the readme instructions for each task need to be moved into the notebook as well.

  • Task 1 - Cross section plotting - 25 minutes
  • Task 2 - Building and visualizing the model geometry - 25 minutes
  • Task 3 - Visualizing neutron tracks - 20 minutes
  • Task 4 - Finding neutron interactions with mesh tallies - 15 minutes
  • Task 5 - Finding the neutron and photon spectra - 15 minutes
  • Task 6 - Finding the tritium production - 15 minutes
  • Task 7 - Finding the neutron damage and stochastic volume calculation - 15 minutes
  • Task 8 - Survey breeder blanket designs for tritium production - 25 minutes
  • Task 9 - Optimize a breeder blanket for tritium production - 25 minutes
  • Task 10 - Using CAD geometry - 30 minutes
  • Task 11 - Options for making materials - 20 minutes
  • Task 12 - Unstructured mesh on CAD geometry - 20 minutes

Creation of example with nuclides heating in a homogenised material

Occasionally we need the heating on components in a blanket.

Ideally there would be a detailed 3D model with all the blanket components separately defined as cell volumes.

At an early stage it is common to make a homogenised material with the separate components (e.g. steel, coolant and breeder) mixed together into one material and assigned to one cell volume.

From this mixed material it can be useful to get a tally score (e.g. heating) on separate nuclides.

So we should add an example for this; I think the relevant OpenMC feature is Tally.nuclides = [].

https://docs.openmc.org/en/latest/pythonapi/generated/openmc.deplete.abc.NormalizationHelper.html#openmc.deplete.abc.NormalizationHelper.nuclides

A useful example is also here https://docs.openmc.org/en/latest/examples/tally-arithmetic.html

Adding layers to the model allows the score to be found per nuclide as a function of depth, which can also be interesting.
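
A minimal sketch of such a tally (the cell and nuclide names are illustrative, not from the workshop):

import openmc

blanket_cell = openmc.Cell()   # stand-in for the cell filled with the homogenised blanket material

heating_tally = openmc.Tally(name='nuclide_heating')
heating_tally.filters = [openmc.CellFilter(blanket_cell)]
heating_tally.scores = ['heating']
heating_tally.nuclides = ['Li6', 'Li7', 'Fe56']   # per-nuclide heating contributions
tallies = openmc.Tallies([heating_tally])
tallies.export_to_xml()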

simple docker run command

I think we can add the following to the Dockerfile and make the docker run command easier for users

ENV PORT 8888
CMD ["jupyter", "notebook", "--port=8888", "--no-browser", "--ip=0.0.0.0", "--allow-root"]

just testing this out in a few places first

task learning objectives

Some tasks don't have learning objectives for each part. Each notebook should have learning objectives at the end.

Task 8 - plot_interpolated_results

Using a radial basis function to obtain interpolated values only works correctly for random and adaptive sampling results.
Halton results give an "ill-conditioned matrix" warning, which produces a poor contour plot.
Grid results give a "matrix is singular" error, which prevents the HTML graph from being output.
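
One possible mitigation (not currently in the task) is to pass a small non-zero smooth value to scipy's Rbf, which regularises the interpolation matrix; a sketch with made-up sample data:

import numpy as np
from scipy.interpolate import Rbf

x = np.random.uniform(0, 1, 50)      # stand-in for the first sampled parameter
y = np.random.uniform(0, 100, 50)    # stand-in for the second sampled parameter
tbr = np.sin(3 * x) + y / 100        # stand-in for the simulated TBR values

rbf = Rbf(x, y, tbr, function='multiquadric', smooth=0.01)

xi, yi = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 100, 100))
zi = rbf(xi, yi)                     # interpolated grid for the contour plot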

MPI Configuration in Dockerfile

Not sure if building an MPI-enabled version of OpenMC works in the current iteration of the Dockerfile. The OpenMC documentation says the C++ compiler needs to be mpicxx, which isn't currently set in the Dockerfile. Additionally, the ENV FC=mpif90 line can probably be removed now that the codebase is no longer written in Fortran.

Parametric plasma source input units should be meters

It looks like length dimensions are being passed into the parametric plasma source in task 12 in centimeters:

plasma_params = {
    "elongation": 2.9,
    "ion_density_origin": 1.09e20,
    "ion_density_peaking_factor": 1,
    "ion_density_pedestal": 1.09e20,
    "ion_density_separatrix": 3e19,
    "ion_temperature_origin": 45.9,
    "ion_temperature_peaking_factor": 8.06,
    "ion_temperature_pedestal": 6.09,
    "ion_temperature_separatrix": 0.1,
    "major_radius": 1.9*100,
    "minor_radius": 1.118*100,
    "pedestal_radius": 0.8 *  1.118*100,
    "plasma_id": 1,
    "shafranov_shift": 0.44789,
    "triangularity": 0.270,
    "ion_temperature_beta": 6,
}

As described in the parametric plasma source repository, the input units should be meters, so I suggest the above code block should be:

plasma_params = {
    "elongation": 2.9,
    "ion_density_origin": 1.09e20,
    "ion_density_peaking_factor": 1,
    "ion_density_pedestal": 1.09e20,
    "ion_density_separatrix": 3e19,
    "ion_temperature_origin": 45.9,
    "ion_temperature_peaking_factor": 8.06,
    "ion_temperature_pedestal": 6.09,
    "ion_temperature_separatrix": 0.1,
    "major_radius": 1.9,
    "minor_radius": 1.118,
    "pedestal_radius": 0.8 *  1.118,
    "plasma_id": 1,
    "shafranov_shift": 0.44789,
    "triangularity": 0.270,
    "ion_temperature_beta": 6,
}

I believe this relates to makeclean/parametric-plasma-source#14, raised by @qingfengxia, as in this case particles will very likely be generated with starting points outside of the defined geometry. Some of the comments around the PR that introduced the serialisation (makeclean/parametric-plasma-source#8) might have caused some confusion here, but we settled on input length units of meters, with a conversion to cm in the OpenMC sampling to allow for portability.

Windows Docker not working

After typing the docker run -p 8888:8888 ukaea/openmcworkshop command in the terminal (PowerShell), Windows can sometimes report that no container is running.

[Screenshot: PowerShell reporting that no container is running]
