This is the same logic behind the rest of the code: the output of the activation
layer applied to the old x becomes the new definition of x. This new x then goes to the next layer,
an activation function is applied to that layer's output, the result becomes the
new definition of x, and so on.
So
x = x.view(-1, 12*12*64)
performs the same function as the flatten layer in the Keras example.
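To see this pattern end to end, here is a minimal sketch of what such a forward method could look like, assuming an MNIST-style 28 × 28 single-channel input and layer sizes chosen to match the 12 × 12 × 64 shape above; the layer names (conv1, conv2, fc1, fc2) are illustrative, not taken from the book's example.
import torch.nn as nn
import torch.nn.functional as F

class CNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 32, 3)    # 28x28x1 -> 26x26x32
        self.conv2 = nn.Conv2d(32, 64, 3)   # 26x26x32 -> 24x24x64
        self.fc1 = nn.Linear(12 * 12 * 64, 128)
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x):
        # each step: pass x through a layer, apply the activation,
        # and rebind the result to x
        x = F.relu(self.conv1(x))
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)  # 24x24x64 -> 12x12x64
        # flatten: the PyTorch equivalent of Keras's Flatten layer
        x = x.view(-1, 12 * 12 * 64)
        x = F.relu(self.fc1(x))
        return self.fc2(x)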
Now you can move on to training the model on your data (Figure 3-63).
model = CNN().to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)
total_step = len(train_loader)
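With the model, loss function, and optimizer in place, a typical PyTorch training loop over the batches in train_loader might look like the following sketch; num_epochs is an assumed variable set alongside learning_rate, and the names images and labels follow common PyTorch convention rather than anything shown above.
for epoch in range(num_epochs):
    for i, (images, labels) in enumerate(train_loader):
        # move the batch to the same device as the model
        images, labels = images.to(device), labels.to(device)

        # forward pass: compute predictions and loss
        outputs = model(images)
        loss = criterion(outputs, labels)

        # backward pass: reset gradients, backpropagate, update weights
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        # report progress every 100 batches
        if (i + 1) % 100 == 0:
            print(f'Epoch [{epoch + 1}/{num_epochs}], '
                  f'Step [{i + 1}/{total_step}], Loss: {loss.item():.4f}')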