Dropout randomly drops neuron activations, and it is usually applied only during training. I suspect the code here is meant for inference, where the dropout keep probability is typically set to 1.0.
"Following the set of these up-sampling blocks, dropout is applied and succeeded by a final convolutional layer yielding the prediction."
However, in the fcrn.py code, the value of `keep_prob` is always hard-coded to 1:

```python
.dropout(name='drop', keep_prob=1.)
```

FIX:

```python
.dropout(name='drop', keep_prob=self.keep_prob)
```
Is that correct?
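For context, here is a minimal NumPy sketch (not the repo's code) of inverted dropout showing why `keep_prob=1.0` at inference is equivalent to having no dropout at all, while a smaller value at training time drops and rescales activations:

```python
import numpy as np

def dropout(x, keep_prob, training):
    """Inverted dropout: scale kept units at train time so inference is a no-op."""
    if not training or keep_prob >= 1.0:
        # keep_prob = 1.0 (or inference mode) keeps every activation unchanged
        return x
    mask = np.random.rand(*x.shape) < keep_prob  # keep each unit with prob keep_prob
    return x * mask / keep_prob                  # rescale so expected value is preserved

x = np.ones((2, 3))
print(dropout(x, keep_prob=1.0, training=False))  # identical to x at inference
print(dropout(x, keep_prob=0.5, training=True))   # entries are 0.0 or 2.0
```

So hard-coding `keep_prob=1.` is harmless for an inference-only graph, but a trainable `self.keep_prob` (as in the proposed fix) would be needed to reproduce the dropout described in the paper during training.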