
diuca's Introduction

diuca's People

Contributors

adrienwehrle, giudgiud

Forkers

giudgiud

diuca's Issues

Reconcile AD and FV solves on `iceslab` problem

The iceslab problem is currently being solved both in AD and in FV. The AD implementation took some more effort due to OS-specific issues (e.g. floating-point errors and pivot problems).

The results from the two methods should be similar; let's make sure they are!
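A quick way to compare could be identical postprocessors written to CSV from both inputs; a minimal sketch (assuming the vel_x/vel_y aux variables exist in both setups):

[Postprocessors]
  [avg_vel_x]
    type = ElementAverageValue
    variable = vel_x
  []
  [max_vel_x]
    type = ElementExtremeValue
    variable = vel_x
    value_type = max
  []
[]

[Outputs]
  csv = true
[]

Diffing the two CSVs (or plotting them together) should quickly show whether the AD and FV solutions agree to within discretization error.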

Internal logic error on icestream problem

./diuca-opt -i problems/icestream_ad_3d.i

gives

Setting Up
  Initializing
    Finished Initializing Equation Systems                                               [  0.31 s] [  339 MB]
  Finished Initializing                                                                  [  0.32 s] [  339 MB]
Finished Setting Up                                                                      [  0.43 s] [  340 MB]
Framework Information:
MOOSE Version:           git commit 0b297bbb93 on 2024-02-15
LibMesh Version:         
PETSc Version:           3.20.3
SLEPc Version:           3.20.1
Current Time:            Tue Feb 20 15:49:40 2024
Executable Timestamp:    Mon Feb 19 20:00:14 2024

Parallelism:
  Num Processors:          1
  Num Threads:             1

Mesh: 
  Parallel Type:           replicated
  Mesh Dimension:          3
  Spatial Dimension:       3
  Nodes:                   18447
  Elems:                   2016
  Num Subdomains:          3

Nonlinear System:
  Num DOFs:                57959
  Num Local DOFs:          57959
  Variables:               "velocity" "p" 
  Finite Element Types:    "LAGRANGE_VEC" "LAGRANGE" 
  Approximation Orders:    "SECOND" "FIRST" 

Auxiliary System:
  Num DOFs:                7854
  Num Local DOFs:          7854
  Variables:               { "vel_x" "vel_y" "vel_z" } 
  Finite Element Types:    "LAGRANGE" 
  Approximation Orders:    "FIRST" 

Execution Information:
  Executioner:             Transient
  TimeStepper:             ConstantDT
  TimeIntegrator:          ImplicitEuler
  Solver Mode:             NEWTON
  PETSc Preconditioner:    svd 
  MOOSE Preconditioner:    SMP

[local_path]/envs/moose/libmesh/include/metaphysicl/dynamic_std_array_wrapper.h, line 125, compiled Feb 16 2024 at 17:15:43
libMesh terminating:
Error in MetaPhysicL internal logic
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=1
:
system msg for write_line failure : Bad file descriptor
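For what it's worth, this MetaPhysicL error coming from dynamic_std_array_wrapper.h typically shows up when the AD derivative storage runs out of room, which is easy to hit with second-order LAGRANGE_VEC variables in 3D (many coupled DOFs per element). If that is the cause here, a common fix is to reconfigure MOOSE with a larger AD container and rebuild, e.g. (the value 125 is only an example):

cd $MOOSE_DIR
./configure --with-derivative-size=125

and then rebuild MOOSE and the app.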

Ice slab converges on unexpected time unit

iceslab_ad_2d.i and the material ADIceMaterial.C used in this problem are in MPa·a. Hence the system should converge within a few years at most (the usual time scale for ice), e.g. dt=0.1 and time=10. However, dt=0.1 or even dt=10 gives very high steady-state relative differential norms (e.g. 1e-02), and it would take tens of thousands of steps to reach 1e-08.

Convergence can be reached (with final velocities of the right order of magnitude for an ice slab, and a low steady-state relative differential norm of 1e-08 on the first timestep) when using a dt of one year expressed in seconds, i.e. dt = 1 * 3600 * 24 * 365 = 31536000 (convergence reached at time=93)... although the system should be in years, not seconds.

With dt = 315360000 (10 years in seconds), it takes time=16 to reach convergence.

Dimensions, especially time dimensions, need to be checked and fixed if needed.

Maybe the velocity gradients are still computed in seconds although the viscosity is in years, which would explain why dt needs to be in seconds?
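As a sanity check on the orders of magnitude involved (assuming the viscosity really is meant in MPa·a while lengths are in m): 1 a = 3600 * 24 * 365 s = 3.1536e7 s, so 1 MPa·a = 1e6 Pa * 3.1536e7 s ≈ 3.15e13 Pa·s. If the velocity gradients are effectively per second while the viscosity is interpreted as if it were in Pa·s, the viscous term is off by roughly that factor, which would be consistent with the solver only behaving when dt is supplied in seconds.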

Add `FVSedimentMaterial`

We need this object to have a better representation of basal slip in case we use FV instead of FE in the final setup.

I'll write it shortly based on FVIceMaterial.

`icestream` FV doesn't converge (yet)

The icestream_fv_3d_SI.i problem is getting very close to the one I want to run for my project (which is exciting!). I'm currently working with the input file from #6.

Framework Information:
MOOSE Version:           git commit 0b297bbb93 on 2024-02-15
LibMesh Version:         
PETSc Version:           3.20.3
SLEPc Version:           3.20.1
Current Time:            Wed Feb 28 09:54:08 2024
Executable Timestamp:    Tue Feb 27 19:26:38 2024

Parallelism:
  Num Processors:          1
  Num Threads:             1

Mesh: 
  Parallel Type:           replicated
  Mesh Dimension:          3
  Spatial Dimension:       3
  Nodes:                   41769
  Elems:                   4712
  Num Subdomains:          3

Nonlinear System:
  Num DOFs:                18848
  Num Local DOFs:          18848
  Variables:               { "vel_x" "vel_y" "vel_z" "pressure" } 
  Finite Element Types:    "MONOMIAL" 
  Approximation Orders:    "CONSTANT" 

Execution Information:
  Executioner:             Transient
  TimeStepper:             ConstantDT
  TimeIntegrator:          ImplicitEuler
  Solver Mode:             Preconditioned JFNK
  PETSc Preconditioner:    lu 


Time Step 0, time = 0

Time Step 1, time = 3.1536e+06, dt = 3.1536e+06
Currently Executing
    Computing Initial Residual
      Computing Residual
        Finished Computing Rhie-Chow coefficients                                        [  1.12 s] [  461 MB]
      Finished Computing Residual                                                        [  1.64 s] [  528 MB]
    Finished Computing Initial Residual                                                  [  1.64 s] [  528 MB]
    |residual|_2 of individual variables:
                   vel_x:    2424.59
                   vel_y:    0
                   vel_z:    487678
                   pressure: 29484.2
 0 Nonlinear |R| = 4.885749e+05
    Computing Jacobian.....                                                              [ 33.77 s] [  626 MB]
  Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
                 PC failed due to FACTOR_NUMERIC_ZEROPIVOT 
Nonlinear solve did not converge due to DIVERGED_FNORM_NAN iterations 0
 Solve Did NOT Converge!
  Finished Solving                                                                       [ 49.94 s] [  626 MB]
Aborting as solve did not converge

Time Step 1, time = 1.5768e+06, dt = 1.5768e+06
    |residual|_2 of individual variables:
                   vel_x:    2424.59
                   vel_y:    0
                   vel_z:    487678
                   pressure: 29484.2
 0 Nonlinear |R| = 4.885749e+05
  Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
                 PC failed due to FACTOR_NUMERIC_ZEROPIVOT 
Nonlinear solve did not converge due to DIVERGED_FNORM_NAN iterations 0
 Solve Did NOT Converge!
  Finished Solving                                                                       [ 18.01 s] [  626 MB]
Aborting as solve did not converge

Time Step 1, time = 788400, dt = 788400
    |residual|_2 of individual variables:
                   vel_x:    2424.59
                   vel_y:    0
                   vel_z:    487678
                   pressure: 29484.2
 0 Nonlinear |R| = 4.885749e+05
  Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
                 PC failed due to FACTOR_NUMERIC_ZEROPIVOT 
Nonlinear solve did not converge due to DIVERGED_FNORM_NAN iterations 0
 Solve Did NOT Converge!
  Finished Solving                                                                       [ 21.07 s] [  626 MB]
Aborting as solve did not converge

Time Step 1, time = 394200, dt = 394200
    |residual|_2 of individual variables:
                   vel_x:    2424.59
                   vel_y:    0
                   vel_z:    487678
                   pressure: 29484.2
 0 Nonlinear |R| = 4.885749e+05
  Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
                 PC failed due to FACTOR_NUMERIC_ZEROPIVOT 
Nonlinear solve did not converge due to DIVERGED_FNORM_NAN iterations 0
 Solve Did NOT Converge!
  Finished Solving                                                                       [ 20.01 s] [  626 MB]
Aborting as solve did not converge

Time Step 1, time = 197100, dt = 197100
    |residual|_2 of individual variables:
                   vel_x:    2424.59
                   vel_y:    0
                   vel_z:    487678
                   pressure: 29484.2
 0 Nonlinear |R| = 4.885749e+05

At the moment it doesn't reach convergence with either LU or SVD. Let's try to understand why and give other strategies a chance.

I'll also investigate velocity scaling, since the mesh is of a different scale now (iceslab is a 1 km long, 100 m thick system, whereas here we have a glacier geometry closer to reality: 10 km long and at least 500 m thick).
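A couple of knobs that might be worth trying first, as a minimal sketch (standard MOOSE/PETSc parameters, not taken from the current input; the shift amount is a placeholder):

[Executioner]
  automatic_scaling = true
  petsc_options_iname = '-pc_type -pc_factor_shift_type -pc_factor_shift_amount'
  petsc_options_value = 'lu NONZERO 1e-10'
[]

The shift options are aimed at the FACTOR_NUMERIC_ZEROPIVOT failure above, and automatic_scaling at the very different magnitudes of the velocity and pressure residuals.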

Sediment/ice coupling not working in `icestream` FV

icestream_fv_3d_SI_ru_slip.i implements a sediment layer with controllable thickness and a constant viscosity computed from the sediment thickness and a slipperiness coefficient (a different formulation from the one I initially used, this one being much simpler). At the moment on main, the sediment layer is 50 m thick with an equivalent viscosity of 1e8 Pa·s. A null Dirichlet BC is set at the base of the sediments, so that the sediments are deformed by the glacier motion on top of them.

The velocity in the sediment layer (block_0) is wrong: vel_z is ~ -0.1 m.s-1, showing the sediments are clearly collapsing even though there is a null Dirichlet BC in z (and x and y) at their base... At the same time, vel_x and vel_y are 0 while they should be driven by the glacier motion in x...
See below vel_x and vel_z in the sediment layer.

[Screenshots from 2024-04-04: vel_x and vel_z in the sediment layer]

The glacier system solves nicely, as if there were no slip at the base, while the whole sediment layer is actually sinking extremely fast...
See below vel_x on the glacier block.

[Screenshot: vel_x on the glacier block]

Any idea why this might be happening, @GiudGiud? I thought it could come from the boundary penetration we discussed a few weeks ago, given the high overburden pressure of the glacier, whose viscosity is much higher than the sediments'. But in that case 1) the glacier would also sink down in reaction, which is not happening, and 2) the negative velocity in z is beyond any physical response to overburden pressure...

I tried different inlet and outlet BCs in the sediments but it always ends up in the same situation, or doesn't converge.
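To narrow this down, it might help to (1) double-check that the no-slip condition is really applied to all three velocity components on the sediment base sideset, and (2) track block-restricted extrema of vel_z over time. A minimal sketch (the 'sediment_base' sideset name is an assumption; swap INSFVNoSlipWallBC for whatever BC object the input currently uses):

[FVBCs]
  [base_no_slip_x]
    type = INSFVNoSlipWallBC
    variable = vel_x
    boundary = 'sediment_base'
    function = 0
  []
  # same blocks for vel_y and vel_z
[]

[Postprocessors]
  [min_vel_z_sediment]
    type = ElementExtremeValue
    variable = vel_z
    block = 0
    value_type = min
  []
[]

If min_vel_z_sediment is already strongly negative at the first timestep, the basal constraint is probably not being applied where we think it is.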

Solve did not converge on iceslab AD 2D SI

./diuca-opt -i problems/iceslab_ad_2d_SI.i

gives

Framework Information:
MOOSE Version:           git commit 0b297bbb93 on 2024-02-15
LibMesh Version:         
PETSc Version:           3.20.3
SLEPc Version:           3.20.1
Current Time:            Fri Feb 23 15:49:55 2024
Executable Timestamp:    Fri Feb 23 10:06:33 2024

Parallelism:
  Num Processors:          1
  Num Threads:             1

Mesh: 
  Parallel Type:           replicated
  Mesh Dimension:          2
  Spatial Dimension:       2
  Nodes:                   231
  Elems:                   50
  Num Subdomains:          1

Nonlinear System:
  Num DOFs:                528
  Num Local DOFs:          528
  Variables:               "velocity" "p" 
  Finite Element Types:    "LAGRANGE_VEC" "LAGRANGE" 
  Approximation Orders:    "SECOND" "FIRST" 

Auxiliary System:
  Num DOFs:                132
  Num Local DOFs:          132
  Variables:               { "vel_x" "vel_y" } 
  Finite Element Types:    "LAGRANGE" 
  Approximation Orders:    "FIRST" 

Execution Information:
  Executioner:             Transient
  TimeStepper:             ConstantDT
  TimeIntegrator:          ImplicitEuler
  Solver Mode:             NEWTON
  PETSc Preconditioner:    svd 
  MOOSE Preconditioner:    SMP


Time Step 0, time = 0

Time Step 1, time = 3.1536e+06, dt = 3.1536e+06
Nonlinear solve did not converge due to DIVERGED_FNORM_NAN iterations 0
 Solve Did NOT Converge!
Aborting as solve did not converge

Time Step 1, time = 1.5768e+06, dt = 1.5768e+06
Nonlinear solve did not converge due to DIVERGED_FNORM_NAN iterations 0
 Solve Did NOT Converge!
Aborting as solve did not converge

Time Step 1, time = 788400, dt = 788400
Nonlinear solve did not converge due to DIVERGED_FNORM_NAN iterations 0
 Solve Did NOT Converge!
Aborting as solve did not converge

and so on, down to the minimum timestep.

Yet @GiudGiud mentioned in #4 that it was converging on their machine (and I'm using that version of the input file). Were you converging on a different version of iceslab_ad_2d_SI.i?
If not, it may come down to a different version of PETSc/MOOSE...
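A NaN residual at iteration 0 usually points at the initial state rather than the time integration (a classic one being the effective viscosity blowing up at zero strain rate). Two things that might help pin it down, as a minimal sketch (the initial velocity value is purely illustrative):

[Debug]
  show_var_residual_norms = true
[]

[ICs]
  [vel_ic]
    type = VectorConstantIC
    variable = velocity
    x_value = 1e-6
    y_value = 0
  []
[]

If a small non-zero initial velocity makes the NaN go away, a minimum strain rate (regularization) in ADIceMaterial would be the more robust fix.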

Add sediment layer with variable thickness through MOOSE

Let's try to implement a sediment viscosity simpler than the (highly nonlinear) Drucker-Prager formulation currently in FVSedimentMaterialSI.

An implementation we've used in the past in my team is in terms of slipperiness, such that mu = Cslip / h, with Cslip a slipperiness coefficient and h the thickness of the sediment layer (from e.g. here).
That means the thickness of the sediment layer can vary, and I'd like to avoid creating one mesh per simulation. So I've been thinking I might be able to stitch a one-element layer at the base of the glacier directly within MOOSE, as a projection of the bed boundary, even if the bed is not flat. I'd have to assign the side sets according to the glacier above, but I feel like that might be possible.
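Independently of the meshing question, the slipperiness viscosity itself could be expressed directly in the input on the FV side, as a minimal sketch (assuming ParsedFunctorMaterial is available in this MOOSE version; the Cslip and h values are placeholders):

[FunctorMaterials]
  [sediment_mu]
    type = ParsedFunctorMaterial
    property_name = mu_sediment
    expression = 'Cslip / h'
    functor_symbols = 'Cslip h'
    functor_names = '1e9 50'
    block = 0
  []
[]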

I've been going through the mesh generator objects but couldn't find one that does what I'm describing here. Do you know if such an object exists, @GiudGiud?
If I don't manage to do it within MOOSE, I'll go back to generating a mesh for each simulation, but doing it within MOOSE would also make it possible to pass the sediment thickness automatically to both the mesh generator and the sediment object.
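I'm not aware of a generator that projects an arbitrary (non-flat) bed downward into a one-element layer, but for a flat bed one option is to generate a thin box and glue it on with StitchedMeshGenerator, as a minimal sketch (file names, dimensions and boundary names are placeholders):

[Mesh]
  [glacier]
    type = FileMeshGenerator
    file = glacier.e
  []
  [sediment]
    type = GeneratedMeshGenerator
    dim = 3
    nx = 50
    ny = 10
    nz = 1
    xmax = 10000
    ymax = 1000
    zmin = -50  # sediment thickness, could come from a common variable
    zmax = 0
  []
  [stitch]
    type = StitchedMeshGenerator
    inputs = 'glacier sediment'
    stitch_boundaries_pairs = 'bottom top'
  []
[]

Blocks and side sets would still need renaming afterwards (e.g. with RenameBlockGenerator / RenameBoundaryGenerator) so the sediment objects can be block-restricted.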

Use `SedimentMaterial` in `icestream`

At the moment the icestream base is ice (with a null Dirichlet on velocity). Once either/both of the icestream FV and FE setups converge, add SedimentMaterial, first with a constant velocity, then with a velocity as a function of the second invariant and the friction coefficient (already implemented in ADSedimentMaterial but kept constant for now).
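The block assignment itself could then stay minimal, as a sketch (block names are assumptions, and any ADSedimentMaterial parameters beyond the block restriction are omitted on purpose):

[Materials]
  [ice]
    type = ADIceMaterial
    block = 'ice'
  []
  [sediment]
    type = ADSedimentMaterial
    block = 'sediment'
  []
[]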

Missing parameter in `icestream` FV

For now, problems/icestream_fv_3d_SI.i as proposed in #6 gives

*** ERROR ***
[local_path]/projects/diuca/problems/icestream_fv_3d_SI.i:252: missing required parameter 'Functions/FunctorMaterials/type'
	Doc String: "A string representing the Moose Object that will be built by this Action"
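The 'Functions/FunctorMaterials/type' path in the error suggests the [FunctorMaterials] block ended up nested inside [Functions] (or that one of its sub-blocks is missing a type line). The expected top-level layout would be along these lines, as a minimal sketch (GenericFunctorMaterial and the property values are placeholders):

[FunctorMaterials]
  [props]
    type = GenericFunctorMaterial
    prop_names = 'rho mu'
    prop_values = '917 1e13'
  []
[]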

Segmentation fault on `icestream` FV with `FVSedimentMaterialSI`

Every time I use FVSedimentMaterialSI (variable viscosity), e.g. in https://github.com/AdrienWehrle/diuca/blob/PR_sedimentbase/problems/icestream_fv_3d_SI_ru_sediment.i, I get a segmentation fault that I find very hard to debug. I thought it could come from the viscosity being evaluated at time t-1, but getting rid of that doesn't fix the segmentation fault.

Framework Information:
MOOSE Version:           git commit 778cc52c3e on 2024-03-02
LibMesh Version:         
PETSc Version:           3.20.3
SLEPc Version:           3.20.1
Current Time:            Fri Mar 22 18:35:02 2024
Executable Timestamp:    Fri Mar 22 18:16:20 2024

Parallelism:
  Num Processors:          1
  Num Threads:             15

Mesh: 
  Parallel Type:           replicated
  Mesh Dimension:          3
  Spatial Dimension:       3
  Nodes:                   2618
  Elems:                   2016
  Num Subdomains:          3

Nonlinear System:
  Num DOFs:                8064
  Num Local DOFs:          8064
  Variables:               { "vel_x" "vel_y" "vel_z" "pressure" } 
  Finite Element Types:    "MONOMIAL" 
  Approximation Orders:    "CONSTANT" 

Execution Information:
  Executioner:             Transient
  TimeStepper:             ConstantDT
  TimeIntegrator:          ImplicitEuler
  Solver Mode:             Preconditioned JFNK
  PETSc Preconditioner:    lu 


Time Step 0, time = 0

Time Step 1, time = 2.3652e+06, dt = 2.3652e+06
Segmentation fault (core dumped)

Any idea, @GiudGiud?
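Two things that might help narrow it down in the meantime (assuming a debug executable can be built with METHOD=dbg): run on a single thread to rule out a threading issue in the functor material evaluation (the log above shows Num Threads: 15), and get a backtrace from the debug build, e.g.

./diuca-opt -i problems/icestream_fv_3d_SI_ru_sediment.i --n-threads=1
gdb --args ./diuca-dbg -i problems/icestream_fv_3d_SI_ru_sediment.i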
