Sparse Categorical Cross Entropy
keras.losses.sparse_categorical_crossentropy(y_true, y_pred)
Sparse categorical cross entropy is essentially the same as categorical cross entropy; the distinction between them is in how their true labels are formatted. For categorical cross entropy, the labels are one-hot encoded, whereas for sparse categorical cross entropy, the labels are plain integer class indices. For example, suppose your y_train was originally formatted as in Figure A-18, with six possible classes.
Figure A-18. An example of how y_train can be formatted. The value in each index is the class value that corresponds to the value at that index in x_train

Figure A-19. The y_train in Figure A-18 is converted into a one-hot encoded format
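As a concrete sketch of the two formats (the label values below are made up for illustration and are not the exact ones in the figures), integer labels and their one-hot counterpart might look like this:

import numpy as np

# Hypothetical "sparse" labels: each entry is the integer class (0-5) of the
# corresponding sample in x_train, in the spirit of Figure A-18.
y_train = np.array([2, 0, 5, 1])

# The same labels one-hot encoded, in the spirit of Figure A-19: each row is
# all zeros except for a 1 in the column of that sample's class.
y_train_one_hot = np.array([
    [0, 0, 1, 0, 0, 0],
    [1, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 1],
    [0, 1, 0, 0, 0, 0],
])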
You can call keras.utils.to_categorical(y_train, n_classes) with n_classes as 6 to convert y_train to that shown in Figure A-19.
So now your y_train looks like Figure A-20.
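Here is a minimal sketch of the conversion and of how the two losses line up, assuming tf.keras; the labels and the random y_pred below are made up and simply stand in for a model's softmax output. sparse_categorical_crossentropy on the integer labels should return the same per-sample values as categorical_crossentropy on the one-hot labels.

import numpy as np
from tensorflow import keras

n_classes = 6
y_train = np.array([2, 0, 5, 1])  # hypothetical integer class labels

# Convert to the one-hot format described above.
y_train_one_hot = keras.utils.to_categorical(y_train, n_classes)

# Random probability rows standing in for model predictions.
y_pred = np.random.rand(len(y_train), n_classes).astype("float32")
y_pred /= y_pred.sum(axis=1, keepdims=True)

sparse_loss = keras.losses.sparse_categorical_crossentropy(y_train, y_pred)
dense_loss = keras.losses.categorical_crossentropy(y_train_one_hot, y_pred)

print(sparse_loss.numpy())  # per-sample cross entropy from integer labels
print(dense_loss.numpy())   # same values, computed from the one-hot labels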
This type of truth label formatting (