Hi all,
I wonder if anyone has encountered gradient explosion after training for a few epochs?
I am trying to use the CAN losses on a sequence generation model.
But the generator's gradient just keeps becoming too large, because the generated sequences can still be classified too well.
For now, I keep training even though the gradients blow up every 2 epochs, and I don't see the output of my model being anywhere near creative.
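For context, the only mitigation I'm aware of is clipping the gradients by their global norm (e.g. `torch.nn.utils.clip_grad_norm_` in PyTorch) rather than letting them through unchecked. A minimal dependency-free sketch of that idea, with made-up gradient values just for illustration:

```python
import math

def clip_global_norm(grads, max_norm):
    """Rescale a list of per-parameter gradient vectors so their combined
    L2 norm does not exceed max_norm (the usual clip-by-global-norm trick)."""
    total = math.sqrt(sum(g * g for vec in grads for g in vec))
    if total > max_norm:
        scale = max_norm / total
        grads = [[g * scale for g in vec] for vec in grads]
    return grads, total

# Two hypothetical "parameter" gradient vectors with an exploding norm.
grads = [[3.0, 4.0], [0.0, 12.0]]   # global norm = sqrt(9 + 16 + 144) = 13
clipped, norm_before = clip_global_norm(grads, max_norm=1.0)
```

Clipping keeps each update bounded, but it doesn't explain *why* the CAN losses push the gradients this high in the first place, which is really what I'm asking about.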
Any thoughts on this are appreciated!
Thank you