
Comments (7)

yu4u commented on July 28, 2024

Hi @hazxone,

I haven't tried training with 128x128 images.
How about changing the learning rate?
I'd like to try it when I have time...

hazxone commented on July 28, 2024

Hi @yu4u
Yup, I've tried changing the learning rate to 1e-3, 1e-2, and 1e-1. All end up with the same issue as above. Strangely, if I train with an image size of 64, the loss settles nicely at 3.25.
(I'm using the Adience and UTKFace datasets because I need to cover the age range 1-100; I noticed the pretrained weights don't detect age 20 and below.)

Since the network uses a pure CNN rather than face embeddings (like FaceNet), is it better to use a dataset with varied head poses or one where all faces have been properly aligned?

Ground truth: 26 F (attached result image: second left-64all)

Ground truth: 29 F (attached result image: lofa-64all-07)

yu4u commented on July 28, 2024

I'm afraid the current WideResNet model is not well suited to this resolution because it assumes CIFAR-10-sized inputs (32x32).
Could you check whether the following modification to wide_resnet.py resolves the issue?

before

        pool = AveragePooling2D(pool_size=(8, 8), strides=(1, 1), padding="same")(relu)
        flatten = Flatten()(pool)
        predictions_g = Dense(units=2, kernel_initializer=self._weight_init, use_bias=self._use_bias,
                              kernel_regularizer=l2(self._weight_decay), activation="softmax",
                              name="pred_gender")(flatten)
        predictions_a = Dense(units=101, kernel_initializer=self._weight_init, use_bias=self._use_bias,
                              kernel_regularizer=l2(self._weight_decay), activation="softmax",
                              name="pred_age")(flatten)

after

        flatten = GlobalAveragePooling2D()(relu)
        predictions_g = Dense(units=2, kernel_initializer=self._weight_init, use_bias=self._use_bias,
                              kernel_regularizer=l2(self._weight_decay), activation="softmax",
                              name="pred_gender")(flatten)
        predictions_a = Dense(units=101, kernel_initializer=self._weight_init, use_bias=self._use_bias,
                              kernel_regularizer=l2(self._weight_decay), activation="softmax",
                              name="pred_age")(flatten)

and import GlobalAveragePooling2D as

from keras.layers import Input, Activation, add, Dense, Flatten, Dropout, GlobalAveragePooling2D
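For reference, a quick way to confirm the modified network builds at the new resolution is to instantiate it at 128x128 and look at the summary. This is only a sketch: it assumes wide_resnet.py exposes a WideResNet class constructed as WideResNet(image_size, depth, k)(), as train.py uses it; adjust to match your copy of the repo.

    # Sanity check (assumed constructor signature, adjust to your wide_resnet.py):
    # build the modified model at 128x128 and inspect the output heads.
    from wide_resnet import WideResNet

    model = WideResNet(128, depth=16, k=8)()
    model.summary()  # with GlobalAveragePooling2D, pred_gender / pred_age should build for any input size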

hazxone commented on July 28, 2024

Hi,
An update on the GlobalAveragePooling2D modification with image size 128: still no success. Even though the loss drops to 4.73 after 17 epochs, it predicts 32 F for every face.

How do I add more downsampling stages to the WideResNet? Right now it goes 128 --> 64 --> 32 --> prediction.
I want 128 --> 64 --> 32 --> 16 --> prediction (one possible way is sketched after this comment).

This is the log file if you are interested.
log (2).zip
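One way to get the extra 32 --> 16 stage asked about above is to append another downsampling group just before the classifier. The sketch below is not the repo's code: it uses a single strided Conv2D block instead of a full group of residual blocks, purely to show where the extra stage would go; the width (64 * k) and the he_normal initializer are illustrative assumptions, and relu stands for the activation coming out of the last existing block in wide_resnet.py.

    # Hypothetical sketch, not the repo's exact code: one extra downsampling
    # stage so the feature maps go 128 -> 64 -> 32 -> 16 before the heads.
    # A proper WideResNet extension would insert another group of residual
    # blocks here; a single strided conv block just marks the spot.
    from keras.layers import Conv2D, BatchNormalization, Activation, GlobalAveragePooling2D

    k = 8  # widening factor, matching the rest of the network
    extra = Conv2D(64 * k, (3, 3), strides=(2, 2), padding="same",
                   kernel_initializer="he_normal", use_bias=False)(relu)  # halves 32x32 -> 16x16
    extra = BatchNormalization(axis=-1)(extra)
    extra = Activation("relu")(extra)

    flatten = GlobalAveragePooling2D()(extra)  # feed into pred_gender / pred_age as before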

nyck33 commented on July 28, 2024

@hazxone I'm trying to train on image size 128 for gender only, with a smaller model of depth 10 and width 4. I guess this is a bad idea? Btw, can you put the .mat somewhere for download? I keep getting a memory error when running create_db.py with image size 128.

Traceback (most recent call last):
  File "create_db.py", line 64, in <module>
    main()
  File "create_db.py", line 55, in main
    img = cv2.imread(root_path + str(full_path[i][0]))
cv2.error: OpenCV(4.1.1) /io/opencv/modules/core/src/alloc.cpp:72: error: (-4:Insufficient memory) Failed to allocate 750000 bytes in function 'OutOfMemoryError'

I was able to create wiki_db.mat at size 128.

So now I've seen this: https://www.mathworks.com/matlabcentral/answers/63010-concatenating-mat-files-into-a-single-file

I'm thinking of running create_db.py on half of the imdb dataset into one .mat, then the second half into another .mat, and concatenating them in MATLAB or Octave. Is there a better way?
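Doing the concatenation in Python with scipy is probably simpler than going through MATLAB or Octave. The sketch below is assumption-laden: the part file names are hypothetical, and it assumes both halves contain the keys that create_db.py writes (image, gender, age, db, img_size, min_score), so adjust the key list to whatever your files actually hold. It also still has to fit both halves plus the merged arrays in memory at once.

    # Hedged sketch: merge two halves produced by create_db.py without MATLAB.
    # File names and key list are assumptions; check them against your data.
    import numpy as np
    from scipy.io import loadmat, savemat

    a = loadmat("data/imdb_db_part1.mat")
    b = loadmat("data/imdb_db_part2.mat")

    merged = {
        "image": np.concatenate([a["image"], b["image"]], axis=0),    # (N, H, W, 3)
        "gender": np.concatenate([a["gender"], b["gender"]], axis=1),  # loadmat returns (1, N)
        "age": np.concatenate([a["age"], b["age"]], axis=1),
        "db": a["db"],
        "img_size": a["img_size"],
        "min_score": a["min_score"],
    }
    savemat("data/imdb_db.mat", merged, do_compression=True)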

nyck33 commented on July 28, 2024

But I got another memory error.

Traceback (most recent call last):
  File "create_db.py", line 106, in <module>
    main()
  File "create_db.py", line 65, in main
    output1 = {"image": np.array(out_imgs), "gender": np.array(out_genders), \
numpy.core._exceptions.MemoryError: Unable to allocate array with shape (88451, 128, 128, 3) and data type uint8

Even though I did the following:

sudo su
root@deeplearning-test:/home/ubuntu/pixelation/age-gender-estimation# echo 1 > /proc/sys/vm/overcommit_memory

and

sudo su
root@deeplearning-test:/home/ubuntu/pixelation/age-gender-estimation/data# echo "2" > /proc/sys/vm/overcommit_memory

nyck33 commented on July 28, 2024

I reduced the size to 64 but still got a memory error, although imdb_db.mat was created in the data folder.

python create_db.py --output data/imdb_db.mat --db imdb --img_size 64
100%|█████████████████████████████████| 460723/460723 [02:56<00:00, 2608.98it/s]
Traceback (most recent call last):
  File "create_db.py", line 108, in <module>
    main()
  File "create_db.py", line 68, in main
    scipy.io.savemat(output_path, output)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/scipy/io/matlab/mio.py", line 219, in savemat
    MW.put_variables(mdict)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/scipy/io/matlab/mio5.py", line 849, in put_variables
    self._matrix_writer.write_top(var, asbytes(name), is_global)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/scipy/io/matlab/mio5.py", line 590, in write_top
    self.write(arr)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/scipy/io/matlab/mio5.py", line 629, in write
    self.write_numeric(narr)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/scipy/io/matlab/mio5.py", line 655, in write_numeric
    self.write_element(arr)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/scipy/io/matlab/mio5.py", line 496, in write_element
    self.write_regular_element(arr, mdtype, byte_count)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/scipy/io/matlab/mio5.py", line 512, in write_regular_element
    self.write_bytes(arr)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/scipy/io/matlab/mio5.py", line 480, in write_bytes
    self.file_stream.write(arr.tostring(order='F'))
MemoryError
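This last failure happens inside scipy.io.savemat, which serializes the whole image array to a single bytes object (arr.tostring(order='F')) and so needs roughly another full copy of it in memory. A hedged workaround is to write an HDF5 file with h5py instead, which streams compressed chunks to disk; this is only a sketch, and the repo's loading code (which expects a .mat) would have to be adapted to read it back.

    # Hedged sketch: write the dataset as HDF5 instead of .mat to avoid the
    # extra full in-memory copy that savemat makes while serializing.
    import h5py
    import numpy as np

    def save_db_hdf5(path, images, genders, ages, img_size):
        # gzip-compressed datasets are written chunk by chunk by HDF5
        with h5py.File(path, "w") as f:
            f.create_dataset("image", data=np.asarray(images, dtype=np.uint8),
                             compression="gzip")
            f.create_dataset("gender", data=np.asarray(genders))
            f.create_dataset("age", data=np.asarray(ages))
            f.attrs["img_size"] = img_size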
