calebrob6 / land-cover
Code for training and testing deep learning based land cover models.
Hello @calebrob6 ,
Thanks for publishing your training code. I was going through your code and noticed something.
The color augmentation function assumes the array is shaped num_channels x height x width,
judging by the following line and the rest of the function:
land-cover/landcover/datagen.py
Lines 10 to 12 in e38f4f3
But it is called here with an array shaped height x width x num_channels:
land-cover/landcover/datagen.py
Lines 133 to 137 in e38f4f3
Is it really an error or did I miss something?
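If it helps, here is a minimal sketch of the mismatch (the function and array names are illustrative, not the repo's exact code): an augmentation written for channels-first arrays will still run on a channels-last array via broadcasting, but it silently scales rows instead of channels. Transposing before and after the call is one possible fix.

```python
import numpy as np

def color_aug_channels_first(img):
    # Expects (num_channels, height, width): scale each channel by a
    # different factor, as a per-channel color augmentation would.
    n_ch = img.shape[0]
    scales = np.linspace(0.9, 1.1, n_ch).reshape(n_ch, 1, 1)
    return img * scales

chw = np.ones((4, 8, 8))   # channels-first: behaves as intended
hwc = np.ones((8, 8, 4))   # channels-last: first axis is height, not channels

# The bug: this runs without error, but scales image rows, not channels.
buggy = color_aug_channels_first(hwc)

# One fix: transpose to (C, H, W) before augmenting, then transpose back.
fixed = color_aug_channels_first(hwc.transpose(2, 0, 1)).transpose(1, 2, 0)
```

With the transpose fix, `fixed` has each of the 4 channels scaled by its own factor; in the buggy call, the 8 rows of the image are scaled instead.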
Hello,
Thank you for providing the code for your land cover classification algorithm and, in particular, the ingenious super-resolution loss function. I have tried to download the Iowa NDR 2009 land cover dataset at 1 m resolution. Unfortunately, I was not able to find any downloadable map/TIFF file; they only provide interactive web tools (here and here), which are not useful for measuring accuracy and mIoU. I was wondering if you had a link to download the Iowa dataset.
Thanks,
Hi,
I have a question about variance in sr_loss. Could you please elaborate on why variance is calculated this way?
```python
# Mean var of predicted distribution:
var = K.sum(masked_probs * (1.0 - masked_probs), axis=(1, 2)) / (
    c_mask_size * c_mask_size
)  # (16x5) / (16,1) --> shape 16x5
```
Basically, the line above computes sum(x * (1 - x)) / n^2. However, since the mean (mu) was already computed on the previous line, we could use mu in the variance calculation. The textbook definition of variance is sum((x - mu)^2) / n, so I was thinking of something like this:
```python
mean = K.sum(masked_probs, axis=(1, 2)) / c_mask_size  # (16x5) / (16,1) --> shape 16x5
var = K.sum(K.pow(masked_probs - mean, 2) * c_mask, axis=(1, 2)) / c_mask_size  # (16x5) / (16,1) --> shape 16x5
```
Here, multiplying by `c_mask` zeroes out the irrelevant values outside the current class's mask.
So why not stick to the exact variance formula here? Am I missing something?
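For what it's worth, here is a small pure-Python sketch (my own illustration, not the repo's code) contrasting the two quantities. One possible reading of the existing line is that each pixel's prediction is treated as an independent Bernoulli variable with variance p(1 - p), in which case sum(p * (1 - p)) / n^2 is the variance of the block *mean* rather than the spatial variance of the probabilities, and the two are different quantities in general:

```python
# Toy per-pixel predicted probabilities for one class in a 4x4 block.
p = [0.9, 0.8, 0.85, 0.1, 0.2, 0.9, 0.7, 0.95,
     0.05, 0.6, 0.8, 0.9, 0.85, 0.1, 0.75, 0.9]
n = len(p)

# Quantity in sr_loss: sum(p_i * (1 - p_i)) / n^2.
# If each pixel is an independent Bernoulli(p_i) draw, this is the
# variance of the block mean label: Var(mean) = sum p_i(1 - p_i) / n^2.
var_of_mean = sum(x * (1.0 - x) for x in p) / n**2

# Textbook spatial variance of the probabilities: sum((p_i - mu)^2) / n.
mu = sum(p) / n
spatial_var = sum((x - mu) ** 2 for x in p) / n

print(var_of_mean, spatial_var)
```

Under that reading, the existing line is measuring prediction uncertainty at the block level rather than the spatial spread around mu, which may be why it doesn't use mu; only the author can confirm the intent.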
Thanks,
Hi,
When I ran the code again, a similar problem occurred. I'd appreciate any advice on how to solve it.
```
Traceback (most recent call last):
  File "C:/Users/57282/Desktop/land-cover-master/run.py", line 203, in <module>
    main()
  File "C:/Users/57282/Desktop/land-cover-master/run.py", line 145, in main
    train.run_experiment()
  File "C:\Users\57282\Desktop\land-cover-master\landcover\train_model_landcover.py", line 441, in run_experiment
    model = self.get_model()
  File "C:\Users\57282\Desktop\land-cover-master\landcover\train_model_landcover.py", line 378, in get_model
    model = models.unet(self.input_shape, self.classes, optimizer, self.loss)
  File "C:\Users\57282\Desktop\land-cover-master\landcover\models.py", line 115, in unet
    return make_model(i, o, optimizer, loss)
  File "C:\Users\57282\Desktop\land-cover-master\landcover\models.py", line 168, in make_model
    model.compile(
  File "E:\Anaconda\envs\land_cover\lib\site-packages\tensorflow\python\keras\engine\training.py", line 565, in compile
    self._validate_compile(optimizer, metrics, **kwargs)
  File "E:\Anaconda\envs\land_cover\lib\site-packages\tensorflow\python\keras\engine\training.py", line 2626, in _validate_compile
    raise ValueError(
ValueError: Variable (<tf.Variable 'conv2d/kernel:0' shape=(3, 3, 4, 64) dtype=float32, numpy=
array([[[[-0.05316697,  0.01695759,  0.06781096, ...,  0.04036037,
          -0.02615297, -0.05482   ],
         [-0.03512351,  0.00207546,  0.08749197, ..., -0.03793961,
          -0.0256898 , -0.02477491],
         [-0.02103255, -0.02145103, -0.09544748, ...,  0.08490782,
           0.00683988, -0.00993489],
         [ 0.07889044,  0.03161094, -0.02841607, ...,  0.07148486,
          -0.08556601,  0.01502845]],
        [[ 0.02166805, -0.0465791 ,  0.03786724, ..., -0.05684982,
          -0.02745763, -0.02497146],
         [-0.0321082 , -0.07753698,  0.08001576, ...,  0.03958925,
          -0.01430742,  0.07854632],
         [ 0.03598402, -0.009101  , -0.08477397, ...,  0.04529131,
          -0.0007516 ,  0.07106098],
         [-0.01125589,  0.02705219, -0.02472276, ...,  0.01507821,
          -0.086858
```
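For context, a hedged note (my own sketch, not the repo's code): in TF 2.x, `Model.compile` validates that the model's variables were created under the `tf.distribute` strategy scope that is active at compile time, and `_validate_compile` raises this `ValueError` when they were not. Common triggers are running the code under a TensorFlow version other than the one the repo targets, or creating the model outside the strategy scope used for compilation. A minimal pattern that keeps variable creation and compilation in the same scope (layer sizes here are illustrative, chosen to match the `(3, 3, 4, 64)` kernel in the traceback):

```python
import tensorflow as tf

# Create and compile the model under the same distribution strategy scope,
# so compile-time validation of variable ownership passes.
strategy = tf.distribute.get_strategy()  # the default strategy if none is configured
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(64, 3, padding="same", input_shape=(240, 240, 4)),
    ])
    model.compile(optimizer="adam", loss="categorical_crossentropy")
```

If the error persists, it would be worth checking that the installed TensorFlow version matches the one listed in the repo's environment file, since this validation path changed between releases.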