
Faulty Loss calculation in nn_scratch_end.py #5

@tribp

Description


I think there is a fault and an inconsistency in how the loss is calculated:

- `self.L_train` is only calculated for a single point in the training set; it should be the average over all training points.
- `self.L_test` is calculated as the sum of losses over the test set, but is NOT divided by the number of test samples.

Proposed solution:

```python
class NeuralNetworkFromScratch:

    ...

    def train(self, ITERATIONS=100):

        ...

        # Remove these lines (loss is computed for one training point only):
        # L = np.sum(np.square(y_train_pred - y_train_true))
        # self.L_train.append(L)

        # Calculate the average loss on the train set
        L_sum = 0
        for i in range(len(self.X_train)):
            y_true = self.y_train[i]
            y_pred = self.forward_pass(self.X_train[i])
            L_sum += np.sum((y_pred - y_true) ** 2)

        Nr_train_samples = len(self.X_train)
        self.L_train.append(L_sum / Nr_train_samples)

        # Calculate the average loss on the test set
        L_sum = 0
        for i in range(len(self.X_test)):
            y_true = self.y_test[i]
            y_pred = self.forward_pass(self.X_test[i])
            L_sum += np.sum((y_pred - y_true) ** 2)

        Nr_test_samples = len(self.X_test)
        self.L_test.append(L_sum / Nr_test_samples)
```
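As a quick sanity check of why the division by the sample count matters, here is a small self-contained NumPy sketch (the arrays are hypothetical, purely for illustration): the summed squared error grows with the number of samples, while the mean stays comparable between sets of different sizes.

```python
import numpy as np

# Hypothetical predictions and targets, just to illustrate the difference
y_pred = np.array([0.9, 0.2, 0.8, 0.4])
y_true = np.array([1.0, 0.0, 1.0, 0.0])

sq_err = (y_pred - y_true) ** 2

loss_sum = np.sum(sq_err)    # grows with the number of samples
loss_mean = np.mean(sq_err)  # average per-sample loss (the proposed fix)

print(loss_sum)   # 0.25
print(loss_mean)  # 0.0625
```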
