Comments (5)
In training stage three, your paper says:

> Finally, we train the full model with an initial learning rate of 0.02 and 0.002, respectively, for the weights in the backbone and DA-CSPN++.

But every iteration, the call to `adjust_learning_rate` will adjust all params (backbone and DA-CSPN++) with the same lr?
from penet_icra2021.
The stage-3 optimizer, which assigns a different learning rate to each parameter group, is defined in `main.py`:
```python
elif (args.network_model == 'pe'):
    model_bone_params = [
        p for _, p in model.backbone.named_parameters() if p.requires_grad
    ]
    model_new_params = [
        p for _, p in model.named_parameters() if p.requires_grad
    ]
    model_new_params = list(set(model_new_params) - set(model_bone_params))
    optimizer = torch.optim.Adam(
        [{'params': model_bone_params, 'lr': args.lr / 10},
         {'params': model_new_params}],
        lr=args.lr, weight_decay=args.weight_decay, betas=(0.9, 0.99))
```
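PyTorch keeps these per-group rates in `optimizer.param_groups`. As a quick sanity check, here is a minimal toy sketch (the two `Linear` modules stand in for the backbone and DA-CSPN++; they are not the repo's code):

```python
import torch

backbone = torch.nn.Linear(4, 4)   # stands in for model.backbone
head = torch.nn.Linear(4, 1)       # stands in for the DA-CSPN++ parameters

base_lr = 0.02  # plays the role of args.lr
optimizer = torch.optim.Adam(
    [{'params': backbone.parameters(), 'lr': base_lr / 10},  # 0.002
     {'params': head.parameters()}],                         # defaults to base_lr
    lr=base_lr, weight_decay=1e-6, betas=(0.9, 0.99))

print([g['lr'] for g in optimizer.param_groups])  # [0.002, 0.02]
```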
> In training stage three, your paper says: "Finally, we train the full model with an initial learning rate of 0.02 and 0.002, respectively, for the weights in the backbone and DA-CSPN++." But every iteration, the call to `adjust_learning_rate` will adjust all params (backbone and DA-CSPN++) with the same lr?
from penet_icra2021.
I know that. But in the `iterate` function, training runs:

```python
if mode == 'train':
    model.train()
    lr = helper.adjust_learning_rate(args.lr, optimizer, actual_epoch, args)
```

So as soon as training reaches this point, `adjust_learning_rate` is applied. And this function:
```python
def adjust_learning_rate(lr_init, optimizer, epoch, args):
    """Sets the learning rate to the initial LR decayed by 10 every 5 epochs"""
    #lr = lr_init * (0.5**(epoch // 5))
    #'''
    lr = lr_init
    if (args.network_model == 'pe' and args.freeze_backbone == False):
        if (epoch >= 10):
            lr = lr_init * 0.5
        if (epoch >= 20):
            lr = lr_init * 0.1
        if (epoch >= 30):
            lr = lr_init * 0.01
        if (epoch >= 40):
            lr = lr_init * 0.0005
        if (epoch >= 50):
            lr = lr_init * 0.00001
    else:
        if (epoch >= 10):
            lr = lr_init * 0.5
        if (epoch >= 15):
            lr = lr_init * 0.1
        if (epoch >= 25):
            lr = lr_init * 0.01
    #'''
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr
    return lr
```
It sets every parameter group to the same learning rate computed from `lr_init`, wiping out the 1/10 ratio from the optimizer definition.
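To see the effect in isolation, here is a minimal sketch (toy parameters, not the repo's code) of what that final loop does to the two groups:

```python
import torch

group_a = [torch.nn.Parameter(torch.zeros(1))]  # backbone-like group
group_b = [torch.nn.Parameter(torch.zeros(1))]  # DA-CSPN++-like group
optimizer = torch.optim.Adam(
    [{'params': group_a, 'lr': 0.002}, {'params': group_b}], lr=0.02)
print([g['lr'] for g in optimizer.param_groups])  # [0.002, 0.02]

# the final loop of adjust_learning_rate, e.g. after the epoch >= 10 branch:
lr = 0.02 * 0.5
for param_group in optimizer.param_groups:
    param_group['lr'] = lr
print([g['lr'] for g in optimizer.param_groups])  # [0.01, 0.01] -- ratio gone
```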
from penet_icra2021.
The optimizer has two parameter groups with different learning rates, as defined in `main.py`. But in `iterate`, the function `adjust_learning_rate` updates both groups simultaneously with the same learning rate.
from penet_icra2021.
I think you're right: the parameters are actually updated with the same learning rate, so it is indeed a mistake. The design of different learning rates comes from a common practice in some semantic segmentation networks, where the parameters of the pretrained backbone are updated with 1/10 of the learning rate. Now I don't know whether restoring it would change the results. Maybe you could try it.
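If anyone wants to experiment with it, one possible variant (a sketch, not part of the released code) is to apply the decay factor to each group's own initial rate instead of assigning one absolute value:

```python
def adjust_learning_rate_per_group(optimizer, epoch, args):
    """Hypothetical fix: decay every group by the same factor, so the
    per-group initial learning rates set in main.py keep their 1/10 ratio."""
    decay = 1.0
    if args.network_model == 'pe' and not args.freeze_backbone:
        milestones = [(10, 0.5), (20, 0.1), (30, 0.01), (40, 0.0005), (50, 0.00001)]
    else:
        milestones = [(10, 0.5), (15, 0.1), (25, 0.01)]
    for e, d in milestones:
        if epoch >= e:
            decay = d
    for param_group in optimizer.param_groups:
        # remember the group's original lr the first time this runs
        param_group.setdefault('initial_lr', param_group['lr'])
        param_group['lr'] = param_group['initial_lr'] * decay
    return [g['lr'] for g in optimizer.param_groups]
```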
from penet_icra2021.