Gradient vanishing of G in the DCGAN example #822

Open
@zhan4817

Description

Hello,

I trained the DCGAN with the default hyper-parameter settings on the downloaded "img_align_celeba" dataset recommended in the tutorial. However, the results show strong gradient vanishing for G: while Loss_D keeps decreasing towards 0, Loss_G grows large (towards 100).

It seems that D is trained so well that it prevents G from training properly. I didn't modify the code at all. Do you know what is happening?
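For reference, one common mitigation would be to weaken D slightly, for example with one-sided label smoothing on the real labels and a lower learning rate for D than for G. Below is a minimal, self-contained sketch of a single training step with those two tweaks. It is not the tutorial's code: the tiny linear netG/netD, the smoothing value 0.9, the learning rates, and the batch size are all illustrative placeholders.

```python
# Sketch of one GAN training step with one-sided label smoothing and a
# lower discriminator learning rate, so D does not saturate and starve G
# of gradient. All modules and hyper-parameters here are placeholders.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
nz, batch_size = 100, 64

# Stand-ins for the tutorial's netG / netD (assumed names, toy architectures).
netG = nn.Sequential(nn.Linear(nz, 64 * 64), nn.Tanh()).to(device)
netD = nn.Sequential(nn.Linear(64 * 64, 1), nn.Sigmoid()).to(device)

criterion = nn.BCELoss()
# Giving D a smaller learning rate than G is one way to keep D from winning too fast.
optimizerD = torch.optim.Adam(netD.parameters(), lr=1e-4, betas=(0.5, 0.999))
optimizerG = torch.optim.Adam(netG.parameters(), lr=2e-4, betas=(0.5, 0.999))

real_images = torch.randn(batch_size, 64 * 64, device=device)  # placeholder batch

# --- update D ---
netD.zero_grad()
# One-sided label smoothing: real targets are 0.9 instead of 1.0.
real_label = torch.full((batch_size,), 0.9, device=device)
fake_label = torch.zeros(batch_size, device=device)

output_real = netD(real_images).view(-1)
lossD_real = criterion(output_real, real_label)

noise = torch.randn(batch_size, nz, device=device)
fake_images = netG(noise)
output_fake = netD(fake_images.detach()).view(-1)
lossD_fake = criterion(output_fake, fake_label)

lossD = lossD_real + lossD_fake
lossD.backward()
optimizerD.step()

# --- update G (non-saturating loss, target 1.0 for fake samples) ---
netG.zero_grad()
output = netD(fake_images).view(-1)
lossG = criterion(output, torch.ones(batch_size, device=device))
lossG.backward()
optimizerG.step()

print(f"Loss_D: {lossD.item():.4f}  Loss_G: {lossG.item():.4f}")
```

I haven't confirmed yet whether this fixes the collapse on celeba; I mention it only as the kind of change I would expect to help when Loss_D goes to 0.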

Thanks!
