Comments (8)
Also, on super-resolution: when I test my dataset, which is a 64x64 grid with 20 examples and 100 timesteps, I get an error at
test_a = test_a.reshape(ntest,S,S,1,T_in).repeat([1,1,1,T,1])
saying shape '[20, 256, 256, 1, 10]' is invalid for input of size 819200.
So is there any specification in the super-resolution code, e.g. for a 64x64 grid, which values should S and T take?
And can the model you have provided on Drive be used to evaluate results on any grid with any boundary conditions for Navier-Stokes?
from neuraloperator.
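For reference, the reshape fails because the element counts must match: with 20 examples, T_in = 10, and 64x64 data there are 20*64*64*10 = 819200 elements, so S must stay 64, not 256. A minimal sketch with random data standing in for the dataset:

```python
import torch

ntest, S, T_in, T = 20, 64, 10, 10       # S must match the stored data resolution
test_a = torch.randn(ntest, S, S, T_in)  # stand-in for the 64x64 dataset
assert test_a.numel() == 819200          # 20*64*64*10, the size in the error message

# With S=256 this reshape would need 20*256*256*10 elements and raise the reported error
test_a = test_a.reshape(ntest, S, S, 1, T_in).repeat([1, 1, 1, T, 1])
print(test_a.shape)  # torch.Size([20, 64, 64, 10, 10])
```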
- Yes, it could be used to model different boundary conditions. If the mesh is still 64x64, the same model should apply.
- For super-resolution, assume we train on 64x64 and want to evaluate at 256x256. There are two scenarios: (a) if your input is 256x256, you may directly apply the same model (trained on 64x64); (b) if your input is still 64x64, there is not much we can do: if the boundary/initial condition of the PDE is low-resolution, then the solution is under-determined at the higher resolution. You may use the code below to evaluate at a higher resolution, where the additional parameter 'size' will be 256.
def forward(self, x, size=None):
    if size is None:
        size = x.size(-1)
    batchsize = x.shape[0]
    # Compute Fourier coefficients up to a factor of e^(-something constant)
    x_ft = torch.fft.rfftn(x, dim=[2, 3])
    # Multiply the relevant (lowest) Fourier modes by the learned weights
    out_ft = torch.zeros(batchsize, self.out_channels, size, size // 2 + 1,
                         device=x.device, dtype=torch.cfloat)
    out_ft[:, :, :self.modes1, :self.modes2] = \
        compl_mul2d(x_ft[:, :, :self.modes1, :self.modes2], self.weights1)
    out_ft[:, :, -self.modes1:, :self.modes2] = \
        compl_mul2d(x_ft[:, :, -self.modes1:, :self.modes2], self.weights2)
    # Return to physical space at the requested resolution
    x = torch.fft.irfftn(out_ft, s=(size, size), dim=[2, 3])
    return x
But it is not much different to directly evaluate the solution at 64x64 and then interpolate.
from neuraloperator.
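The interpolation baseline mentioned above can be sketched as follows (assuming the model output is shaped (batch, channels, 64, 64); the shapes here are illustrative):

```python
import torch
import torch.nn.functional as F

# stand-in for a 64x64 model output: (batch, channels, height, width)
out_64 = torch.randn(4, 1, 64, 64)

# bicubic upsampling to 256x256 as the simple baseline alternative
out_256 = F.interpolate(out_64, size=(256, 256), mode="bicubic", align_corners=False)
print(out_256.shape)  # torch.Size([4, 1, 256, 256])
```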
Thanks, that was useful; I was getting an error in the eval.py script while testing on a 64x64 mesh.
How do you change boundary conditions in the pseudospectral method? I tried changing the stream-function value by setting it to zero at the points where I have to fix a wall, but I think it does not work. So how do you set a boundary condition there? And does setting a boundary condition also mean setting it in Fourier space, i.e. do we have to make changes there as well?
I have been comparing the results of FNO with a solver. Right now I just took periodic boundary conditions and got results with the OpenFOAM solver. Now I have to change the boundary condition, so I wanted to know whether that can be done.
I am also going to try this super-resolution and compare the results with OpenFOAM.
Sorry for the delayed response! To address boundaries we have two methods in mind:
- Fourier continuation: the domain is non-periodic, but we can create a larger periodic domain that contains the real domain. It is just a few lines of code; for example, if the input and output have shape (batch, x, y, channels):
  x_in = F.pad(x, (0,0,0,5,0,5), "constant", 0)  # embed in a larger domain
  out = model(x_in).reshape(batch_size, S+5, S+5, channel)
  out = out[:, :S, :S, :]  # take out the original domain
- Multiply the output by a mask, for example to enforce a zero boundary:
  x = torch.linspace(0, np.pi, 64+1)[:-1]
  mask = torch.outer(torch.sin(x), torch.sin(x))
  out = model(x_in) * mask.reshape(1, 64, 64, 1)
Thanks!
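A quick check of the masking idea: note that with `[:-1]` the grid excludes pi, so sin vanishes exactly only on the left/bottom edges. The variant below includes both endpoints so the mask is (numerically) zero on all four walls; this is a sketch, not the repository's code:

```python
import math
import torch

# include both endpoints so sin vanishes at 0 and at pi
x = torch.linspace(0, math.pi, 64)
mask = torch.outer(torch.sin(x), torch.sin(x))

# boundary rows and columns are zero up to float32 precision
assert mask[0].abs().max() < 1e-6 and mask[-1].abs().max() < 1e-6
assert mask[:, 0].abs().max() < 1e-6 and mask[:, -1].abs().max() < 1e-6
```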
The super-resolution script mentioned does not work for a 2-D mesh, as it is written for 3-D. What changes should be made to the script to get super-resolution on a 2-D mesh?
Sorry, I am not sure I understand the issue. What is working and what is not?
I trained a model on a 64x64 mesh using fourier_2d.py and wanted to evaluate it on a 256x256 mesh, but I got an error. So what changes need to be made to the super-resolution script? As far as I can tell, the super-resolution script is a 3-D one.
If the fourier_2d.py or fourier_2d_time.py model works on 64x64, I think you can directly apply it on 256x256. Just make sure the size parameter s is updated, and that your inputs (u0 / train_a / test_a) are 256x256.
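The reason the same weights transfer across resolutions is that the spectral layer only stores the lowest Fourier modes, while the FFT and inverse FFT adapt to whatever grid size comes in. A simplified, self-contained sketch (it keeps only the positive-frequency modes, unlike the full two-weight implementation above):

```python
import torch

class SpectralConv2d(torch.nn.Module):
    """Minimal resolution-agnostic spectral convolution (illustrative sketch)."""
    def __init__(self, channels, modes):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        # learned weights for the lowest `modes` x `modes` Fourier coefficients
        self.w = torch.nn.Parameter(
            scale * torch.randn(channels, channels, modes, modes, dtype=torch.cfloat))

    def forward(self, x):
        b, c, s, _ = x.shape
        x_ft = torch.fft.rfft2(x)                       # size-dependent transform
        out_ft = torch.zeros(b, c, s, s // 2 + 1, dtype=torch.cfloat)
        m = self.modes                                  # size-independent weights
        out_ft[:, :, :m, :m] = torch.einsum(
            "bixy,ioxy->boxy", x_ft[:, :, :m, :m], self.w)
        return torch.fft.irfft2(out_ft, s=(s, s))       # back to the input grid

layer = SpectralConv2d(channels=3, modes=12)
print(layer(torch.randn(2, 3, 64, 64)).shape)    # torch.Size([2, 3, 64, 64])
print(layer(torch.randn(2, 3, 256, 256)).shape)  # torch.Size([2, 3, 256, 256])
```

Because the parameter count depends only on `modes`, the same layer accepts a 64x64 or a 256x256 input unchanged.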