
Comments (28)

xinntao commented on July 28, 2024

@JiaweiShiCV It didn't continue outputting anything after that?...

xinntao commented on July 28, 2024

I haven't run into this problem; I need more information:

  1. Is everything fine with four GPUs?
  2. Which PyTorch version?

SimKarras commented on July 28, 2024

@xinntao Thanks for your reply. I haven't been able to get four GPUs yet.
pytorch = '1.8.0+cu111'
Training ESRGAN on a single GPU in the same environment had no such problem.
It seems that only the log fails to appear: training proceeds as usual, the model is saved, and wandb works fine.
If you never saw this with two GPUs, it is probably caused by a mismatched environment.

SimKarras commented on July 28, 2024

To add: ESRGAN is trained on a single GPU to begin with, so it has no problem. But some other projects in BasicSR use four GPUs, and switching them to two GPUs triggers the same issue.

xinntao commented on July 28, 2024

This problem is indeed strange; I have not encountered it with PyTorch 1.8 and CUDA 10.2.

Does your run save a .log file?

SimKarras commented on July 28, 2024

@xinntao I just checked again: there is indeed no .log file, and no terminal output either.
(screenshot: 2021-07-07 11-23-06)
Everything else works normally.

SimKarras commented on July 28, 2024

@xinntao Hello, I have some ideas for improving the network and would like to discuss with you whether they are feasible. Could you give me a way to contact you? My email: [email protected]

SimKarras commented on July 28, 2024

With eight GPUs, the log works normally.

syfbme commented on July 28, 2024

same issue...

xinntao commented on July 28, 2024

@syfbme @JiaweiShiCV
I cannot reproduce this issue. Could you help me debug it?

It may be caused by the logging mechanism in BasicSR.

In the basicsr folder, see basicsr/utils/logger.py, lines 106-140 (the get_root_logger function).
Could you please add these print lines and post the outputs here? Thanks

def get_root_logger(logger_name='basicsr', log_level=logging.INFO, log_file=None):
    """Get the root logger.

    The logger will be initialized if it has not been initialized. By default a
    StreamHandler will be added. If `log_file` is specified, a FileHandler will
    also be added.

    Args:
        logger_name (str): root logger name. Default: 'basicsr'.
        log_file (str | None): The log filename. If specified, a FileHandler
            will be added to the root logger.
        log_level (int): The root logger level. Note that only the process of
            rank 0 is affected, while other processes will set the level to
            "Error" and be silent most of the time.

    Returns:
        logging.Logger: The root logger.
    """
    print('Enter get_root_logger')
    logger = logging.getLogger(logger_name)
    # if the logger has been initialized, just return it
    if logger.hasHandlers():
        return logger

    print('logger: add handlers')
    format_str = '%(asctime)s %(levelname)s: %(message)s'
    logging.basicConfig(format=format_str, level=log_level)
    rank, _ = get_dist_info()
    if rank != 0:
        logger.setLevel('ERROR')
    elif log_file is not None:
        file_handler = logging.FileHandler(log_file, 'w')
        file_handler.setFormatter(logging.Formatter(format_str))
        file_handler.setLevel(log_level)
        logger.addHandler(file_handler)
    print('logger: last return')
    return logger
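(Side note for anyone debugging along: logging.Logger.hasHandlers() returns True if the logger or any of its ancestors has a handler. So if some other code has already attached a handler to the root logger, the "if logger.hasHandlers()" early return above fires before any StreamHandler or FileHandler is ever added to the 'basicsr' logger. A minimal standalone sketch of that pitfall, not BasicSR code:)

import logging

# Unrelated code (another library, an earlier basicConfig call, ...) has
# already attached a handler to the ROOT logger:
logging.basicConfig(level=logging.INFO)

logger = logging.getLogger('basicsr')
print(logger.hasHandlers())  # True - found on the root logger via propagation
print(logger.handlers)       # [] - the 'basicsr' logger itself has no handlers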


syfbme commented on July 28, 2024

Hi @xinntao
It only outputs "Enter get_root_logger".

xinntao commented on July 28, 2024

@syfbme Thanks
It is strange...

Could you please modify this function as follows, and post the outputs?

def get_root_logger(logger_name='basicsr', log_level=logging.INFO, log_file=None):
    """Get the root logger.

    The logger will be initialized if it has not been initialized. By default a
    StreamHandler will be added. If `log_file` is specified, a FileHandler will
    also be added.

    Args:
        logger_name (str): root logger name. Default: 'basicsr'.
        log_file (str | None): The log filename. If specified, a FileHandler
            will be added to the root logger.
        log_level (int): The root logger level. Note that only the process of
            rank 0 is affected, while other processes will set the level to
            "Error" and be silent most of the time.

    Returns:
        logging.Logger: The root logger.
    """
    print('Enter get_root_logger')
    logger = logging.getLogger(logger_name)
    # if the logger has been initialized, just return it
    # if logger.hasHandlers():
    #    return logger
    if log_file is None:
        return logger

    print('logger: add handlers')
    format_str = '%(asctime)s %(levelname)s: %(message)s'
    logging.basicConfig(format=format_str, level=log_level)
    rank, _ = get_dist_info()
    if rank != 0:
        logger.setLevel('ERROR')
    elif log_file is not None:
        file_handler = logging.FileHandler(log_file, 'w')
        file_handler.setFormatter(logging.Formatter(format_str))
        file_handler.setLevel(log_level)
        logger.addHandler(file_handler)
    print('logger: last return')
    return logger


syfbme commented on July 28, 2024

Hi @xinntao
I only used 1 GPU to keep the display cleaner. Below is the output:
(screenshot of terminal output)
Only the first "Enter get_root_logger" is followed by "add handlers" and "last return".

xinntao commented on July 28, 2024

@syfbme If it prints "add handlers" and "last return", then the issue has been solved.

So you can see the screen outputs and also have a log file in the experiments folder, right?

xinntao commented on July 28, 2024

@JiaweiShiCV Are you still hitting the issue where there is no .log file and no terminal output?

SimKarras commented on July 28, 2024

@xinntao Eight GPUs and four GPUs both work fine for me now; with two GPUs there is probably still no output.

xinntao commented on July 28, 2024

@JiaweiShiCV
Could you help test the following fix on two GPUs (i.e., the case where no log is output)? (I cannot reproduce it here, so I cannot debug it.)

In the BasicSR folder, modify basicsr/utils/logger.py, lines 106-140, to:

def get_root_logger(logger_name='basicsr', log_level=logging.INFO, log_file=None):
    """Get the root logger.

    The logger will be initialized if it has not been initialized. By default a
    StreamHandler will be added. If `log_file` is specified, a FileHandler will
    also be added.

    Args:
        logger_name (str): root logger name. Default: 'basicsr'.
        log_file (str | None): The log filename. If specified, a FileHandler
            will be added to the root logger.
        log_level (int): The root logger level. Note that only the process of
            rank 0 is affected, while other processes will set the level to
            "Error" and be silent most of the time.

    Returns:
        logging.Logger: The root logger.
    """
    print('Enter get_root_logger')
    logger = logging.getLogger(logger_name)
    # if the logger has been initialized, just return it
    # if logger.hasHandlers():
    #    return logger
    if log_file is None:
        return logger

    print('logger: add handlers')
    format_str = '%(asctime)s %(levelname)s: %(message)s'
    logging.basicConfig(format=format_str, level=log_level)
    rank, _ = get_dist_info()
    if rank != 0:
        logger.setLevel('ERROR')
    elif log_file is not None:
        file_handler = logging.FileHandler(log_file, 'w')
        file_handler.setFormatter(logging.Formatter(format_str))
        file_handler.setLevel(log_level)
        logger.addHandler(file_handler)
    print('logger: last return')
    return logger

Thanks!

SimKarras commented on July 28, 2024

@xinntao OK.

syfbme commented on July 28, 2024

@syfbme If it prints "add handlers" and "last return", then the issue has been solved.

So you can see the screen outputs and also have a log file in the experiments folder, right?

Yes. Thanks~

SimKarras commented on July 28, 2024

@xinntao Terminal output with two GPUs:

(BasicSR) ➜  GFPGAN git:(master) ✗ python -m torch.distributed.launch --nproc_per_node 2 --master_port 8888 train.py -opt train_gfpgan_v1.yml --launcher pytorch
*****************************************
Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed. 
*****************************************
Path already exists. Rename it to /home/sjw/文档/SR/GFPGAN/experiments/train_GFPGANv1_512_2gpu_archived_20210712_132006
Path already exists. Rename it to /home/sjw/文档/SR/GFPGAN/tb_logger/train_GFPGANv1_512_2gpu_archived_20210712_132006
Enter get_root_logger
logger: add handlers
logger: last return
Enter get_root_logger
logger: add handlers
logger: last return
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_loggerEnter get_root_logger

Enter get_root_logger
Enter get_root_logger
Enter get_root_loggerEnter get_root_logger

Enter get_root_loggerEnter get_root_logger

Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
WARNING:basicsr:Current net - loaded net:
WARNING:basicsr:  bn1.num_batches_tracked
WARNING:basicsr:  bn4.num_batches_tracked
WARNING:basicsr:  bn5.num_batches_tracked
WARNING:basicsr:  layer1.0.bn0.num_batches_tracked
WARNING:basicsr:  layer1.0.bn1.num_batches_tracked
WARNING:basicsr:  layer1.0.bn2.num_batches_tracked
WARNING:basicsr:  layer1.1.bn0.num_batches_tracked
WARNING:basicsr:  layer1.1.bn1.num_batches_tracked
WARNING:basicsr:  layer1.1.bn2.num_batches_tracked
WARNING:basicsr:  layer2.0.bn0.num_batches_tracked
WARNING:basicsr:  layer2.0.bn1.num_batches_tracked
WARNING:basicsr:  layer2.0.bn2.num_batches_tracked
WARNING:basicsr:  layer2.0.downsample.1.num_batches_tracked
WARNING:basicsr:  layer2.1.bn0.num_batches_tracked
WARNING:basicsr:  layer2.1.bn1.num_batches_tracked
WARNING:basicsr:  layer2.1.bn2.num_batches_tracked
WARNING:basicsr:  layer3.0.bn0.num_batches_tracked
WARNING:basicsr:  layer3.0.bn1.num_batches_tracked
WARNING:basicsr:  layer3.0.bn2.num_batches_tracked
WARNING:basicsr:  layer3.0.downsample.1.num_batches_tracked
WARNING:basicsr:  layer3.1.bn0.num_batches_tracked
WARNING:basicsr:  layer3.1.bn1.num_batches_tracked
WARNING:basicsr:  layer3.1.bn2.num_batches_tracked
WARNING:basicsr:  layer4.0.bn0.num_batches_tracked
WARNING:basicsr:  layer4.0.bn1.num_batches_tracked
WARNING:basicsr:  layer4.0.bn2.num_batches_tracked
WARNING:basicsr:  layer4.0.downsample.1.num_batches_tracked
WARNING:basicsr:  layer4.1.bn0.num_batches_tracked
WARNING:basicsr:  layer4.1.bn1.num_batches_tracked
WARNING:basicsr:  layer4.1.bn2.num_batches_tracked
WARNING:basicsr:Loaded net - current net:
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
Enter get_root_logger
[W reducer.cpp:1050] Warning: find_unused_parameters=True was specified in DDP constructor, but did not find any unused parameters. This flag results in an extra traversal of the autograd graph every iteration, which can adversely affect performance. If your model indeed never has any unused parameters, consider turning this flag off. Note that this warning may be a false positive your model has flow control causing later iterations to have unused parameters. (function operator())
/home/sjw/anaconda3/envs/BasicSR/lib/python3.8/site-packages/torch/nn/functional.py:3499: UserWarning: The default behavior for interpolate/upsample with float scale_factor changed in 1.6.0 to align with other frameworks/libraries, and now uses scale_factor directly, instead of relying on the computed output size. If you wish to restore the old behavior, please set recompute_scale_factor=True. See the documentation of nn.Upsample for details. 
  warnings.warn(
[W reducer.cpp:1050] Warning: find_unused_parameters=True was specified in DDP constructor, but did not find any unused parameters. This flag results in an extra traversal of the autograd graph every iteration, which can adversely affect performance. If your model indeed never has any unused parameters, consider turning this flag off. Note that this warning may be a false positive your model has flow control causing later iterations to have unused parameters. (function operator())
/home/sjw/anaconda3/envs/BasicSR/lib/python3.8/site-packages/torch/nn/functional.py:3499: UserWarning: The default behavior for interpolate/upsample with float scale_factor changed in 1.6.0 to align with other frameworks/libraries, and now uses scale_factor directly, instead of relying on the computed output size. If you wish to restore the old behavior, please set recompute_scale_factor=True. See the documentation of nn.Upsample for details. 
  warnings.warn(

Contents of the log file:

2021-07-12 13:20:14,046 WARNING: Current net - loaded net:
2021-07-12 13:20:14,046 WARNING:   bn1.num_batches_tracked
2021-07-12 13:20:14,046 WARNING:   bn4.num_batches_tracked
2021-07-12 13:20:14,046 WARNING:   bn5.num_batches_tracked
2021-07-12 13:20:14,046 WARNING:   layer1.0.bn0.num_batches_tracked
2021-07-12 13:20:14,046 WARNING:   layer1.0.bn1.num_batches_tracked
2021-07-12 13:20:14,046 WARNING:   layer1.0.bn2.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer1.1.bn0.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer1.1.bn1.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer1.1.bn2.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer2.0.bn0.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer2.0.bn1.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer2.0.bn2.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer2.0.downsample.1.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer2.1.bn0.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer2.1.bn1.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer2.1.bn2.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer3.0.bn0.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer3.0.bn1.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer3.0.bn2.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer3.0.downsample.1.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer3.1.bn0.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer3.1.bn1.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer3.1.bn2.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer4.0.bn0.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer4.0.bn1.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer4.0.bn2.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer4.0.downsample.1.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer4.1.bn0.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer4.1.bn1.num_batches_tracked
2021-07-12 13:20:14,047 WARNING:   layer4.1.bn2.num_batches_tracked
2021-07-12 13:20:14,047 WARNING: Loaded net - current net:


xinntao commented on July 28, 2024

@syfbme Thanks for your feedback!

xinntao commented on July 28, 2024

@JiaweiShiCV
It seems this issue can be solved by the modification above!

SimKarras commented on July 28, 2024

@xinntao ... Isn't the output still only this much?

SimKarras commented on July 28, 2024

@JiaweiShiCV It didn't continue outputting anything after that?...

No......

xinntao commented on July 28, 2024

@JiaweiShiCV
This bug has been fixed in BasicSR: XPixelGroup/BasicSR@bf93f27
It should be OK now!
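Roughly, the fix replaces the hasHandlers() early return with an explicit record of which loggers have already been initialized, so a handler that other code attached to the root logger can no longer short-circuit the setup. A sketch of that pattern (illustrative only; initialized_logger is a name chosen here for illustration, see the commit for the exact code):

import logging

from basicsr.utils.dist_util import get_dist_info

initialized_logger = {}  # registry of loggers this function has configured


def get_root_logger(logger_name='basicsr', log_level=logging.INFO, log_file=None):
    logger = logging.getLogger(logger_name)
    # Explicit registry check: unlike logger.hasHandlers(), this cannot be
    # tripped by handlers attached to the root logger by other code.
    if logger_name in initialized_logger:
        return logger

    format_str = '%(asctime)s %(levelname)s: %(message)s'
    stream_handler = logging.StreamHandler()
    stream_handler.setFormatter(logging.Formatter(format_str))
    logger.addHandler(stream_handler)
    logger.propagate = False  # avoid duplicate lines via the root logger

    rank, _ = get_dist_info()
    if rank != 0:
        logger.setLevel('ERROR')  # keep non-zero ranks quiet
    elif log_file is not None:
        logger.setLevel(log_level)
        file_handler = logging.FileHandler(log_file, 'w')
        file_handler.setFormatter(logging.Formatter(format_str))
        file_handler.setLevel(log_level)
        logger.addHandler(file_handler)

    initialized_logger[logger_name] = True
    return logger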


SimKarras commented on July 28, 2024

@xinntao So reinstalling basicsr=1.3.3.5 will fix it, right?

xinntao commented on July 28, 2024

@xinntao So reinstalling basicsr=1.3.3.5 will fix it, right?

The fix is currently only on the master branch and is not in a released version yet; I will publish a new version, 1.3.3.6, now.
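Once 1.3.3.6 is published, upgrading should pick up the fix (assuming a pip-managed environment):

pip install basicsr --upgrade

Until then, installing from the master branch, which already contains the commit above, is an alternative:

pip install git+https://github.com/XPixelGroup/BasicSR.git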


SimKarras commented on July 28, 2024

ok!
