LaTeX test

Cross Entropy Loss

We can write the cross-entropy loss as \(H(y, \hat{y}) = - \sum_{i} y_i \log(\hat{y}_i)\), where \(y\) is the true distribution and \(\hat{y}\) is the predicted distribution.
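
As a quick sanity check of the formula, here is a minimal NumPy sketch (the function name `cross_entropy` and the small epsilon clamp are illustrative choices, not part of the original post):

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Compute H(y, y_hat) = -sum_i y_i * log(y_hat_i)."""
    y_pred = np.clip(y_pred, eps, 1.0)  # clamp to avoid log(0)
    return -np.sum(y_true * np.log(y_pred))

# Example: one-hot target and a softmax-style prediction
y = np.array([0.0, 1.0, 0.0])
y_hat = np.array([0.1, 0.7, 0.2])
print(cross_entropy(y, y_hat))  # -log(0.7) ≈ 0.357
```

With a one-hot target, the sum reduces to the negative log-probability assigned to the true class, which is why the loss shrinks toward zero as \(\hat{y}\) concentrates on the correct label.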