diff --git a/recognition/README.md b/recognition/README.md
index 5c646231c..70b9c59dc 100644
--- a/recognition/README.md
+++ b/recognition/README.md
@@ -1,10 +1,90 @@
-# Recognition Tasks
-Various recognition tasks solved in deep learning frameworks.
-
-Tasks may include:
-* Image Segmentation
-* Object detection
-* Graph node classification
-* Image super resolution
-* Disease classification
-* Generative modelling with StyleGAN and Stable Diffusion
+# Multi-layer GCN Model (Kaia Santosha)
+
+This folder contains the implementation of a semi-supervised multi-layer GCN (Graph Convolutional Network) for Facebook's Large Page-Page Network dataset.
+Hyperlink to dataset: https://snap.stanford.edu/data/facebook-large-page-page-network.html
+
+
+
+## Instructions
+1. Download the dataset from the hyperlink above, unzip it, and place it in the multi-layer_GCN_model_s4696681 folder. This is required; otherwise the model won't be able to find the dataset.
+
+2. (Optional)
+Due to the extreme size of the dataset, the adjacency matrix computation takes a copious amount of system memory to execute. Hence, Rangpur's vgpu40 had to be used to run this part of the code, since even my 16GB of system memory was not enough. To help users who do not have sufficient memory, I tried providing a local copy of this computation as a NumPy file. However, the file is 3.9GB, which is far too large to upload to GitHub, so I have attached a shareable link to it instead. If you do not have sufficient memory on your system or virtual GPU, please download the file from this link: https://drive.google.com/file/d/1s1oZKkCDb9WA-IAjvSCuqsjWW0_goGvR/view?usp=sharing, and put it in the multi-layer_GCN_model_s4696681 folder.
+You will then have to modify the dataset.py script slightly by uncommenting the line where it imports the adjacency file and commenting out the line that calls the normalise_adjacency_matrix function.
+
+However, by default, the code will create the adjacency matrix each time you run the model; the forewarning above is purely for users who might have trouble running the code due to a lack of memory.
+
+3. All that needs to be done is to run the predict.py script, since it calls all the other required components from the different scripts. The output of running predict.py should be a log of the hyperparameter tuning and evaluation, the best hyperparameters, the best accuracy, and t-SNE plots of the data before and after it has been run through the model.
+
+## Overview
+Semi-supervised Graph Convolutional Networks (GCNs) for node classification leverage labeled and unlabeled data within a graph to classify nodes. They operate by learning a function that maps a node's features and its topological position within the graph to an output label. During training, the model uses the available labels in a limited subset of nodes to optimize the classification function, while also considering the graph structure and feature similarity among neighboring nodes. This approach allows the model to generalize and predict labels for unseen or unlabeled nodes in the graph, enhancing performance particularly when labeled data are scarce.
+
+The Facebook Large Page-Page Network dataset has nodes that represent official Facebook sites, with the edges being mutual likes between the sites. The nodes are classified into one of four categories: politicians, governmental organizations, television shows, and companies. By leveraging both node feature information (from site descriptions) and topological information (from mutual likes between pages), the GCN can exploit local graph structures and shared features to infer the category of a page. This classification allows for more efficient organization, retrieval, and understanding of the pages without manually labeling each one.
In essence, it enables the automatic categorization of Facebook pages based on both their content and their relationships with other pages.
+
+
+## Data Preprocessing
+The preprocessing ensures that the data is in a suitable format and structure to be used with a GCN. It handles irregularities in the dataset, normalizes crucial components, and prepares training/testing subsets. The data preprocessing steps are outlined below.
+
+Adjacency Matrix Creation:
+The function create_adjacency_matrix reads edge data from the CSV file musae_facebook_edges.csv to create an adjacency matrix.
+It initializes the matrix with zeros and fills it based on the relationships (edges) given in the CSV file.
+Diagonal entries are set to 1 since every node links to itself.
+
+Adjacency Matrix Normalization:
+The function normalise_adjacency_matrix normalizes the adjacency matrix using the inverse square root of the degree matrix D.
+
+Feature Vector Creation:
+The function create_feature_vectors reads node features from the JSON file musae_facebook_features.json.
+As the number of features per node is inconsistent, it creates an n-dimensional "bag of words" feature vector for each node, where each dimension corresponds to the presence (or absence) of a particular feature.
+
+Label Conversion:
+Labels given in the file musae_facebook_target.csv are in string format. The function convert_labels converts these string labels to integer format.
+It maps "politician" to 0, "government" to 1, "tvshow" to 2, and "company" to 3.
+
+Data Splitting and Tensor Creation:
+The function create_tensors prepares data for training and testing. This involves splitting node IDs into training and test sets and creating masks to select training and test nodes.
+All the data are then converted into PyTorch tensors and transferred to the GPU.
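The adjacency normalization step above can be sketched as follows. This is a minimal toy-graph illustration of symmetric normalization, D^(-1/2) A D^(-1/2), not the project's exact code; because the degree matrix D is diagonal, its inverse square root can be taken elementwise, which avoids a dense matrix square root:

```python
import numpy as np

def normalise_adjacency(adjacency: np.ndarray) -> np.ndarray:
    """Symmetric normalisation D^{-1/2} A D^{-1/2} (self-loops already on the diagonal)."""
    degrees = adjacency.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(degrees)  # D is diagonal, so invert elementwise
    return adjacency * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

# Toy 3-node graph: one edge between nodes 0 and 1, plus self-loops
A = np.array([[1, 1, 0],
              [1, 1, 0],
              [0, 0, 1]], dtype=float)
A_hat = normalise_adjacency(A)
```

The elementwise form gives the same result as inverting `sqrtm(D)` for a diagonal D, but with far less memory traffic, which matters at the ~22k-node scale of this dataset.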
+
+
+## Model Architecture
+The GCNLayer class represents a single layer of the GCN. It takes in node features and an adjacency matrix, performs matrix multiplication with a learnable weight, and propagates information through the graph using the adjacency matrix.
+The weights of this layer are initialized using the Xavier uniform initialization method.
+
+The GCN class represents the entire network, which consists of two consecutive GCNLayers:
+- the first one applying a ReLU activation function after its operation, followed by a dropout layer,
+- the second one without any activation.
+
+The final output of the GCN is the log softmax of the output of the second layer, and the intermediate embeddings from the first layer can also be retrieved using the get_embeddings method.
+
+
+## Training and Validation
+Evidence of training can be seen in the hyperparameter_tuning.txt file. This logs the output of the nested 10-fold cross-validation used to tune the hyperparameters and select the best model. Nested cross-validation was chosen because it not only tunes the hyperparameters but also evaluates each set of hyperparameters in an inner loop, so there is no need for a separate validation loop. Since the nested cross-validation trains the model for every combination of possible hyperparameters, a few discrete cases for each hyperparameter were chosen instead of a range of values. This significantly reduced the computation time of the nested validation. By logging the progress of this nested cross-validation, we gain insight into how the model reacts to different sets of hyperparameters, allowing us to understand the model better.
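The nested cross-validation described above can be sketched as follows. This is an illustration only, with a scikit-learn classifier standing in for the GCN, a hypothetical two-parameter grid, and smaller fold counts (5 outer, 3 inner) than the project's 10-fold setup:

```python
import numpy as np
from itertools import product
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_val_score

# Hypothetical discrete hyperparameter cases (cases, not ranges, as described above)
grid = {"C": [0.1, 1.0], "max_iter": [200, 400]}
X, y = make_classification(n_samples=200, n_features=10, random_state=42)

outer = KFold(n_splits=5, shuffle=True, random_state=42)
outer_scores = []
for train_idx, test_idx in outer.split(X):
    X_tr, y_tr = X[train_idx], y[train_idx]
    # Inner loop: pick the best hyperparameter case using only the training fold
    best_params, best_inner = None, -np.inf
    for values in product(*grid.values()):
        params = dict(zip(grid.keys(), values))
        inner = cross_val_score(LogisticRegression(**params), X_tr, y_tr,
                                cv=KFold(n_splits=3, shuffle=True, random_state=0)).mean()
        if inner > best_inner:
            best_inner, best_params = inner, params
    # Outer loop: evaluate the chosen case on the held-out fold
    model = LogisticRegression(**best_params).fit(X_tr, y_tr)
    outer_scores.append(model.score(X[test_idx], y[test_idx]))

print(f"Nested CV accuracy: {np.mean(outer_scores):.3f}")
```

Because the inner selection never sees the outer test fold, the averaged outer score is an unbiased estimate of the tuned model's performance, which is why no separate validation split is needed.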
+
+The outcome of this hyperparameter tuning showed that the best hyperparameters to use were the following:
+{'epochs': 300, 'hidden_features': 64, 'learning_rate': 0.01}
+
+Doing 10-fold cross-validation on the model with these hyperparameters yielded an accuracy of 91.046%.
+
+To gain more insight into the performance of the model, a t-SNE plot was created to visualise the data before and after being passed through the model. This is how the dataset looked before:
+![TSNE Before Model Training](multi-layer_GCN_model_s4696681/tsne_before.png)
+
+And this is the t-SNE visualisation after the data has been transformed by the trained model:
+![TSNE After Model Training](multi-layer_GCN_model_s4696681/tsne_after.png)
+
+As can be seen, the model has transformed the data into four distinct regions representing the four classes of the dataset. Though there are some outliers that fall in other classes' decision regions, the overwhelming majority of the datapoints have been clustered into the correct class region. This shows that my GCN model is effective at node classification.
+
+Improvements to the model:
+One possible improvement would be to reduce the number of outliers that are misclassified; despite the 91% accuracy, the t-SNE plot shows there is still room for improvement. One way to go about this is to experiment with the model's architecture. Though I did perform hyperparameter tuning, I did not use nested cross-validation to optimise the model architecture itself. More layers could be added, or the dropout rate tuned. There are many options for modifying the model's architecture, so the model may well be improved by exploring them.
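A t-SNE plot like the ones above can be produced with a sketch along these lines. Random data stands in for the learned embeddings here; in the real pipeline they would come from the model's get_embeddings method, and the output filename is illustrative:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the script also runs on a server
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

# Stand-in embeddings: in the real pipeline these come from the trained GCN
rng = np.random.default_rng(42)
embeddings = rng.normal(size=(200, 64))
labels = rng.integers(0, 4, size=200)  # four page categories

reduced = TSNE(n_components=2, random_state=42, perplexity=30).fit_transform(embeddings)

plt.figure(figsize=(10, 8))
for label, name in enumerate(["politician", "government", "tvshow", "company"]):
    idx = labels == label
    plt.scatter(reduced[idx, 0], reduced[idx, 1], s=10, label=name)
plt.legend()
plt.title("t-SNE of GCN embeddings")
plt.savefig("tsne_after.png")
```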
+
+## Library packages and dependencies
+My project utilised the following packages:
+torch 2.0.1+cu118
+torchaudio 2.0.2+cu118
+torchvision 0.15.2+cu118
+scikit-learn 1.3.2
+scipy 1.11.3
+matplotlib 3.8.0
+numpy 1.24.1
+
+
+
diff --git a/recognition/multi-layer_GCN_model_s4696681/README.md b/recognition/multi-layer_GCN_model_s4696681/README.md
new file mode 100644
index 000000000..70dbf7b7b
--- /dev/null
+++ b/recognition/multi-layer_GCN_model_s4696681/README.md
@@ -0,0 +1,90 @@
+# Multi-layer GCN Model (s4696681)
+
+This folder contains the implementation of a semi-supervised multi-layer GCN (Graph Convolutional Network) for Facebook's Large Page-Page Network dataset.
+Hyperlink to dataset: https://snap.stanford.edu/data/facebook-large-page-page-network.html
+
+
+
+## Instructions
+1. Download the dataset from the hyperlink above, unzip it, and place it in the multi-layer_GCN_model_s4696681 folder. This is required; otherwise the model won't be able to find the dataset.
+
+2. (Optional)
+Due to the extreme size of the dataset, the adjacency matrix computation takes a copious amount of system memory to execute. Hence, Rangpur's vgpu40 had to be used to run this part of the code, since even my 16GB of system memory was not enough. To help users who do not have sufficient memory, I tried providing a local copy of this computation as a NumPy file. However, the file is 3.9GB, which is far too large to upload to GitHub, so I have attached a shareable link to it instead. If you do not have sufficient memory on your system or virtual GPU, please download the file from this link: https://drive.google.com/file/d/1s1oZKkCDb9WA-IAjvSCuqsjWW0_goGvR/view?usp=sharing, and put it in the multi-layer_GCN_model_s4696681 folder.
+You will then have to modify the dataset.py script slightly by uncommenting the line where it imports the adjacency file and commenting out the line that calls the normalise_adjacency_matrix function.
+
+However, by default, the code will create the adjacency matrix each time you run the model; the forewarning above is purely for users who might have trouble running the code due to a lack of memory.
+
+3. All that needs to be done is to run the predict.py script, since it calls all the other required components from the different scripts. The output of running predict.py should be a log of the hyperparameter tuning and evaluation, the best hyperparameters, the best accuracy, and t-SNE plots of the data before and after it has been run through the model.
+
+## Overview
+Semi-supervised Graph Convolutional Networks (GCNs) for node classification leverage labeled and unlabeled data within a graph to classify nodes. They operate by learning a function that maps a node's features and its topological position within the graph to an output label. During training, the model uses the available labels in a limited subset of nodes to optimize the classification function, while also considering the graph structure and feature similarity among neighboring nodes. This approach allows the model to generalize and predict labels for unseen or unlabeled nodes in the graph, enhancing performance particularly when labeled data are scarce.
+
+The Facebook Large Page-Page Network dataset has nodes that represent official Facebook sites, with the edges being mutual likes between the sites. The nodes are classified into one of four categories: politicians, governmental organizations, television shows, and companies. By leveraging both node feature information (from site descriptions) and topological information (from mutual likes between pages), the GCN can exploit local graph structures and shared features to infer the category of a page. This classification allows for more efficient organization, retrieval, and understanding of the pages without manually labeling each one.
In essence, it enables the automatic categorization of Facebook pages based on both their content and their relationships with other pages.
+
+
+## Data Preprocessing
+The preprocessing ensures that the data is in a suitable format and structure to be used with a GCN. It handles irregularities in the dataset, normalizes crucial components, and prepares training/testing subsets. The data preprocessing steps are outlined below.
+
+Adjacency Matrix Creation:
+The function create_adjacency_matrix reads edge data from the CSV file musae_facebook_edges.csv to create an adjacency matrix.
+It initializes the matrix with zeros and fills it based on the relationships (edges) given in the CSV file.
+Diagonal entries are set to 1 since every node links to itself.
+
+Adjacency Matrix Normalization:
+The function normalise_adjacency_matrix normalizes the adjacency matrix using the inverse square root of the degree matrix D.
+
+Feature Vector Creation:
+The function create_feature_vectors reads node features from the JSON file musae_facebook_features.json.
+As the number of features per node is inconsistent, it creates an n-dimensional "bag of words" feature vector for each node, where each dimension corresponds to the presence (or absence) of a particular feature.
+
+Label Conversion:
+Labels given in the file musae_facebook_target.csv are in string format. The function convert_labels converts these string labels to integer format.
+It maps "politician" to 0, "government" to 1, "tvshow" to 2, and "company" to 3.
+
+Data Splitting and Tensor Creation:
+The function create_tensors prepares data for training and testing. This involves splitting node IDs into training and test sets and creating masks to select training and test nodes.
+All the data are then converted into PyTorch tensors and transferred to the GPU.
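The bag-of-words construction described above can be illustrated with a small sketch (a toy node-to-features dictionary stands in for the real JSON file):

```python
import numpy as np

# Toy node -> feature-id mapping mirroring the shape of musae_facebook_features.json
data = {0: [3, 7], 1: [7], 2: [3, 9, 11]}

# Vocabulary: every feature id seen on any node, sorted
all_features = sorted({f for feats in data.values() for f in feats})

# One row per node: 1 if the node has that vocabulary feature, else 0
vectors = np.array([[1 if f in feats else 0 for f in all_features]
                    for feats in data.values()])
```

Each node thus gets a fixed-length vector regardless of how many raw features it started with, which is what makes the matrix multiplication in the GCN possible.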
+
+
+## Model Architecture
+The GCNLayer class represents a single layer of the GCN. It takes in node features and an adjacency matrix, performs matrix multiplication with a learnable weight, and propagates information through the graph using the adjacency matrix.
+The weights of this layer are initialized using the Xavier uniform initialization method.
+
+The GCN class represents the entire network, which consists of two consecutive GCNLayers:
+- the first one applying a ReLU activation function after its operation, followed by a dropout layer,
+- the second one without any activation.
+
+The final output of the GCN is the log softmax of the output of the second layer, and the intermediate embeddings from the first layer can also be retrieved using the get_embeddings method.
+
+
+## Training and Validation
+Evidence of training can be seen in the hyperparameter_tuning.txt file. This logs the output of the nested 10-fold cross-validation used to tune the hyperparameters and select the best model. Nested cross-validation was chosen because it not only tunes the hyperparameters but also evaluates each set of hyperparameters in an inner loop, so there is no need for a separate validation loop. Since the nested cross-validation trains the model for every combination of possible hyperparameters, a few discrete cases for each hyperparameter were chosen instead of a range of values. This significantly reduced the computation time of the nested validation. By logging the progress of this nested cross-validation, we gain insight into how the model reacts to different sets of hyperparameters, allowing us to understand the model better.
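The two-layer forward pass described in the Model Architecture section can be sketched numerically. This is a NumPy illustration of the propagation rule H' = σ(ÂHW), with small random weights and a trivial (self-loops only) adjacency; it is not the PyTorch implementation in modules.py:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

rng = np.random.default_rng(0)
A_hat = np.eye(4)                       # trivially normalised adjacency (self-loops only)
H = rng.normal(size=(4, 8))             # node features
W1 = rng.normal(size=(8, 16)) * 0.1     # layer-1 weights (Xavier init in the real model)
W2 = rng.normal(size=(16, 4)) * 0.1     # layer-2 weights, 4 output classes

H1 = relu(A_hat @ H @ W1)               # first layer: propagate, then ReLU
logits = A_hat @ H1 @ W2                # second layer: no activation

# Numerically stable log-softmax over the class dimension
shifted = logits - logits.max(axis=1, keepdims=True)
log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
```

Dropout between the two layers (used in the real model during training) is omitted here for clarity.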
+
+The outcome of this hyperparameter tuning showed that the best hyperparameters to use were the following:
+{'epochs': 300, 'hidden_features': 64, 'learning_rate': 0.01}
+
+Doing 10-fold cross-validation on the model with these hyperparameters yielded an accuracy of 91.046%.
+
+To gain more insight into the performance of the model, a t-SNE plot was created to visualise the data before and after being passed through the model. This is how the dataset looked before:
+![TSNE Before Model Training](tsne_before.png)
+
+And this is the t-SNE visualisation after the data has been transformed by the trained model:
+![TSNE After Model Training](tsne_after.png)
+
+As can be seen, the model has transformed the data into four distinct regions representing the four classes of the dataset. Though there are some outliers that fall in other classes' decision regions, the overwhelming majority of the datapoints have been clustered into the correct class region. This shows that my GCN model is effective at node classification.
+
+Improvements to the model:
+One possible improvement would be to reduce the number of outliers that are misclassified; despite the 91% accuracy, the t-SNE plot shows there is still room for improvement. One way to go about this is to experiment with the model's architecture. Though I did perform hyperparameter tuning, I did not use nested cross-validation to optimise the model architecture itself. More layers could be added, or the dropout rate tuned. There are many options for modifying the model's architecture, so the model may well be improved by exploring them.
+
+## Library packages and dependencies
+My project utilised the following packages:
+torch 2.0.1+cu118
+torchaudio 2.0.2+cu118
+torchvision 0.15.2+cu118
+scikit-learn 1.3.2
+scipy 1.11.3
+matplotlib 3.8.0
+numpy 1.24.1
+
+
+
diff --git a/recognition/multi-layer_GCN_model_s4696681/dataset.py b/recognition/multi-layer_GCN_model_s4696681/dataset.py
new file mode 100644
index 000000000..4ee9f2712
--- /dev/null
+++ b/recognition/multi-layer_GCN_model_s4696681/dataset.py
@@ -0,0 +1,166 @@
+import csv
+import numpy as np
+from scipy.linalg import sqrtm
+import json
+from sklearn.model_selection import train_test_split
+from sklearn.manifold import TSNE
+import matplotlib.pyplot as plt
+import torch
+device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
+
+
+"""Create adjacency matrix from the musae_facebook_edges.csv file"""
+def create_adjacency_matrix():
+    data = []
+
+    # Read the CSV data from the musae_facebook_edges.csv file
+    with open('facebook_large/musae_facebook_edges.csv', 'r') as file:
+        reader = csv.reader(file, delimiter=',')
+        next(reader)
+        data = list(reader)
+
+    # Determine all unique nodes, sort them, then create a mapping from each node to its index
+    nodes = set()
+    for row in data:
+        nodes.add(row[0])
+        nodes.add(row[1])
+    nodes = sorted(list(nodes), key=int)
+    node_to_index = {node: idx for idx, node in enumerate(nodes)}
+
+    # Initialize adjacency matrix with zeros
+    matrix_size = len(nodes)
+    adjacency_matrix = np.zeros((matrix_size, matrix_size), dtype=int)
+
+    # Fill the matrix
+    for row in data:
+        i, j = node_to_index[row[0]], node_to_index[row[1]]
+        adjacency_matrix[i][j] = 1
+        # Because the links are symmetric
+        adjacency_matrix[j][i] = 1
+
+    # Every node links to itself, so set the diagonal to 1
+    np.fill_diagonal(adjacency_matrix, 1)
+    return adjacency_matrix
+
+"""Normalise adjacency matrix.
Where D is the degree matrix."""
+def normalise_adjacency_matrix(adjacency_matrix):
+    D = np.zeros_like(adjacency_matrix)
+    np.fill_diagonal(D, np.asarray(adjacency_matrix.sum(axis=1)).flatten())
+    D_invsqrt = np.linalg.inv(sqrtm(D))
+    adjacency_normed = D_invsqrt @ adjacency_matrix @ D_invsqrt
+    return adjacency_normed
+
+
+adjacency_normed = normalise_adjacency_matrix(create_adjacency_matrix())
+## Due to extremely high memory usage, it is recommended that systems with low memory uncomment
+## the line below and comment out the line above. Ensure the adjacency_normed.npy file is in the
+## multi-layer_GCN_model_s4696681 directory if you do this
+#adjacency_normed = np.load('adjacency_normed.npy')
+adjacency_normed_tensor = torch.FloatTensor(adjacency_normed).to(device)
+
+
+
+"""Since the feature .json file has an inconsistent number of features for each node,
+   we need to make them consistent for training.
+   Hence I create an n-dimensional bag-of-words feature vector for each node"""
+def create_feature_vectors():
+    # Load data from a JSON file
+    with open('facebook_large/musae_facebook_features.json', 'r') as file:
+        data = json.load(file)
+
+    # Convert string keys to integers
+    data = {int(k): v for k, v in data.items()}
+
+    # Get unique features
+    all_features = set()
+    for features in data.values():
+        all_features.update(features)
+    all_features = sorted(list(all_features))
+
+    # Create n-dimensional bag-of-words feature vector for each node
+    feature_list = []
+    for node, features in data.items():
+        vector = [node] + [1 if feature in features else 0 for feature in all_features]
+        feature_list.append(vector)
+
+    # Convert to 2D numpy array
+    feature_vectors = np.array(feature_list, dtype=int)
+    return feature_vectors
+
+feature_vectors = create_feature_vectors()
+# print(feature_vectors)
+
+
+
+"""I will create a mapper function for the class labels since they are in string format.
0 = politician, 1 = governmental organisations, 2 = television shows and 3 = companies"""
+def mapper(x):
+    if x == "politician":
+        y = 0
+    elif x == "government":
+        y = 1
+    elif x == "tvshow":
+        y = 2
+    elif x == "company":
+        y = 3
+    else:
+        raise ValueError(f"Unknown page_type: {x}")
+    return y
+
+"""Labels in musae_facebook_target.csv are in string format, thus we convert them to integer values so they can be used with the model"""
+def convert_labels():
+    processed_data = []
+
+    # Read the CSV data
+    with open('facebook_large/musae_facebook_target.csv', 'r', encoding='utf-8') as file:
+        reader = csv.DictReader(file)
+
+        for row in reader:
+            id_ = int(row['id'])
+            page_type = mapper(row['page_type'])
+            processed_data.append([id_, page_type])
+
+    # Convert the processed data into a numpy array
+    node_labels = np.array(processed_data, dtype=int)
+    return node_labels
+
+node_labels = convert_labels()
+# print(node_labels[:5])
+
+
+
+""" t-SNE plot created to visualize the initial, high-dimensional node features in a 2D space,
+    giving insights into their structure and relationships prior to any transformation by the GCN."""
+# Extract feature vectors for t-SNE
+X = feature_vectors[:, 1:]  # Exclude node IDs
+
+# Apply t-SNE to reduce dimensions
+tsne = TSNE(n_components=2, random_state=42)
+X_reduced = tsne.fit_transform(X)
+
+# Visualize
+plt.figure(figsize=(10, 8))
+for label in range(4):  # There are 4 classes
+    indices = np.where(node_labels[:, 1] == label)
+    plt.scatter(X_reduced[indices, 0], X_reduced[indices, 1], label=str(label))
+plt.legend()
+plt.title('t-SNE visualization of original feature vectors')
+plt.show()
+
+
+"""Takes the preprocessed data and creates tensors so they can be used in the GCN model"""
+def create_tensors():
+    node_ids = feature_vectors[:, 0]
+    train_ids, test_ids = train_test_split(node_ids, test_size=0.2, random_state=42)
+
+    train_mask = np.isin(node_ids, train_ids)
+    test_mask = np.isin(node_ids, test_ids)
+
+    all_features_tensor =
torch.FloatTensor(feature_vectors[:, 1:]).to(device)
+    train_labels_tensor = torch.LongTensor(node_labels[train_mask, 1]).to(device)
+    test_labels_tensor = torch.LongTensor(node_labels[test_mask, 1]).to(device)
+    train_tensor = torch.BoolTensor(train_mask).to(device)
+    test_tensor = torch.BoolTensor(test_mask).to(device)
+    return all_features_tensor, train_labels_tensor, test_labels_tensor, train_tensor, test_tensor, test_mask
+all_features_tensor, train_labels_tensor, test_labels_tensor, train_tensor, test_tensor, test_mask = create_tensors()
+
diff --git a/recognition/multi-layer_GCN_model_s4696681/modules.py b/recognition/multi-layer_GCN_model_s4696681/modules.py
new file mode 100644
index 000000000..719baf75e
--- /dev/null
+++ b/recognition/multi-layer_GCN_model_s4696681/modules.py
@@ -0,0 +1,37 @@
+import torch
+from torch.nn import Module, Parameter
+import torch.nn.functional as F
+import torch.nn as nn
+'''Updated GCN layer incorporating techniques learnt from the model expo CON session'''
+class GCNLayer(Module):
+    def __init__(self, in_features, out_features):
+        super(GCNLayer, self).__init__()
+        self.in_features = in_features
+        self.out_features = out_features
+        self.weight = Parameter(torch.FloatTensor(in_features, out_features))
+        torch.nn.init.xavier_uniform_(self.weight)
+
+    def forward(self, features, adj, active=True):
+        support = torch.mm(features, self.weight)
+        output = torch.spmm(adj, support)
+        if active:
+            output = F.relu(output)
+        return output
+
+'''GCN represents the entire Graph Convolutional Network, consisting of two GCNLayers'''
+class GCN(nn.Module):
+    def __init__(self, in_features, hidden_features, out_features):
+        super(GCN, self).__init__()
+        self.layer1 = GCNLayer(in_features, hidden_features)
+        self.layer2 = GCNLayer(hidden_features, out_features)
+
+    def forward(self, x, adj):
+        # Use the first GCNLayer with activation function (ReLU)
+        self.embeddings = self.layer1(x, adj, active=True)
+        h = F.dropout(self.embeddings, p=0.5,
training=self.training) + # Use the second GNNLayer without activation function + h = self.layer2(h, adj, active=False) + return F.log_softmax(h, dim=1) + + def get_embeddings(self): + return self.embeddings diff --git a/recognition/multi-layer_GCN_model_s4696681/new_gcd_model_architecture_hyperparameter_tuning.txt b/recognition/multi-layer_GCN_model_s4696681/new_gcd_model_architecture_hyperparameter_tuning.txt new file mode 100644 index 000000000..97d12cb64 --- /dev/null +++ b/recognition/multi-layer_GCN_model_s4696681/new_gcd_model_architecture_hyperparameter_tuning.txt @@ -0,0 +1,5672 @@ +Epoch 0, Loss: 1.3874791860580444 +Epoch 10, Loss: 1.2162939310073853 +Epoch 20, Loss: 1.0165807008743286 +Epoch 30, Loss: 0.8005154728889465 +Epoch 40, Loss: 0.6162823438644409 +Epoch 50, Loss: 0.48595067858695984 +Epoch 60, Loss: 0.4027647078037262 +Epoch 70, Loss: 0.34095102548599243 +Epoch 80, Loss: 0.29307955503463745 +Epoch 90, Loss: 0.2572976052761078 +Test accuracy: 91.19% +Epoch 0, Loss: 1.3863012790679932 +Epoch 10, Loss: 1.1938523054122925 +Epoch 20, Loss: 0.9815076589584351 +Epoch 30, Loss: 0.7840375900268555 +Epoch 40, Loss: 0.6141805648803711 +Epoch 50, Loss: 0.4804762303829193 +Epoch 60, Loss: 0.40345096588134766 +Epoch 70, Loss: 0.3406449556350708 +Epoch 80, Loss: 0.28815552592277527 +Epoch 90, Loss: 0.25292009115219116 +Test accuracy: 89.23% +Epoch 0, Loss: 1.3838392496109009 +Epoch 10, Loss: 1.2052582502365112 +Epoch 20, Loss: 0.9886958599090576 +Epoch 30, Loss: 0.7643296718597412 +Epoch 40, Loss: 0.6036236882209778 +Epoch 50, Loss: 0.48083335161209106 +Epoch 60, Loss: 0.3927817940711975 +Epoch 70, Loss: 0.33858293294906616 +Epoch 80, Loss: 0.29427075386047363 +Epoch 90, Loss: 0.26093143224716187 +Test accuracy: 87.00% +Epoch 0, Loss: 1.3837071657180786 +Epoch 10, Loss: 1.1851404905319214 +Epoch 20, Loss: 0.9470928311347961 +Epoch 30, Loss: 0.7415655851364136 +Epoch 40, Loss: 0.5721685290336609 +Epoch 50, Loss: 0.45628541707992554 +Epoch 60, Loss: 
0.37469369173049927 +Epoch 70, Loss: 0.3183935284614563 +Epoch 80, Loss: 0.27866631746292114 +Epoch 90, Loss: 0.24249966442584991 +Test accuracy: 89.68% +Epoch 0, Loss: 1.3876641988754272 +Epoch 10, Loss: 1.2218345403671265 +Epoch 20, Loss: 1.014304757118225 +Epoch 30, Loss: 0.812198281288147 +Epoch 40, Loss: 0.641708254814148 +Epoch 50, Loss: 0.518358051776886 +Epoch 60, Loss: 0.4270665645599365 +Epoch 70, Loss: 0.35841044783592224 +Epoch 80, Loss: 0.31107088923454285 +Epoch 90, Loss: 0.274529367685318 +Test accuracy: 89.01% +Epoch 0, Loss: 1.38552987575531 +Epoch 10, Loss: 1.2339013814926147 +Epoch 20, Loss: 1.0090506076812744 +Epoch 30, Loss: 0.7778310179710388 +Epoch 40, Loss: 0.6116573214530945 +Epoch 50, Loss: 0.4993742108345032 +Epoch 60, Loss: 0.4029735326766968 +Epoch 70, Loss: 0.3426104485988617 +Epoch 80, Loss: 0.290897399187088 +Epoch 90, Loss: 0.2562321126461029 +Test accuracy: 87.32% +Epoch 0, Loss: 1.3865412473678589 +Epoch 10, Loss: 1.2551299333572388 +Epoch 20, Loss: 1.0616459846496582 +Epoch 30, Loss: 0.8586402535438538 +Epoch 40, Loss: 0.6834439039230347 +Epoch 50, Loss: 0.532012939453125 +Epoch 60, Loss: 0.42068126797676086 +Epoch 70, Loss: 0.3390326499938965 +Epoch 80, Loss: 0.2789199650287628 +Epoch 90, Loss: 0.24095775187015533 +Test accuracy: 90.30% +Epoch 0, Loss: 1.3861390352249146 +Epoch 10, Loss: 1.215969443321228 +Epoch 20, Loss: 1.001417636871338 +Epoch 30, Loss: 0.7677152752876282 +Epoch 40, Loss: 0.5943169593811035 +Epoch 50, Loss: 0.47336170077323914 +Epoch 60, Loss: 0.3795381784439087 +Epoch 70, Loss: 0.32802721858024597 +Epoch 80, Loss: 0.28195154666900635 +Epoch 90, Loss: 0.24885517358779907 +Test accuracy: 89.68% +Epoch 0, Loss: 1.3851780891418457 +Epoch 10, Loss: 1.1737998723983765 +Epoch 20, Loss: 0.9396251440048218 +Epoch 30, Loss: 0.7297771573066711 +Epoch 40, Loss: 0.5691521167755127 +Epoch 50, Loss: 0.4495953619480133 +Epoch 60, Loss: 0.3721754550933838 +Epoch 70, Loss: 0.3168952167034149 +Epoch 80, Loss: 0.276653528213501 
+## Hyperparameter Tuning Log (excerpt)
+Running predict.py prints a loss every 10 epochs for each candidate hyperparameter configuration, followed by that run's test accuracy. Runs appear in groups of ten, i.e. ten repeats per configuration; later configurations train for 200 epochs (losses logged through epoch 190). A representative run:
+
+    Epoch 0, Loss: 1.3884178400039673
+    Epoch 10, Loss: 1.135044813156128
+    Epoch 20, Loss: 0.8365605473518372
+    Epoch 30, Loss: 0.5978533625602722
+    Epoch 40, Loss: 0.4534481465816498
+    Epoch 50, Loss: 0.35194700956344604
+    Epoch 60, Loss: 0.28771281242370605
+    Epoch 70, Loss: 0.24108123779296875
+    Epoch 80, Loss: 0.20719347894191742
+    Epoch 90, Loss: 0.1803445667028427
+    Test accuracy: 90.70%
+
+Test accuracies for the runs in this portion of the log, grouped as they appear:
+
+* 89.85%, 88.43% (final runs of an earlier group)
+* 71.07%, 70.54%, 69.92%, 71.47%, 71.92%, 71.78%, 72.72%, 70.63%, 72.72%, 70.32%
+* 50.38%, 32.89%, 41.39%, 47.00%, 29.82%, 34.40%, 33.56%, 41.08%, 45.53%, 48.64%
+* 90.70%, 89.81%, 89.81%, 90.03%, 89.27%, 90.61%, 89.68%, 89.14%, 89.72%, 91.28%
+* 76.72%, 76.68%, 75.70%, 76.41%, 77.57%, 76.95%, 76.86%, 76.41%, 76.59%, 76.10%
+* 35.02%, 47.04%, 37.83%, 51.76%, 44.28%, 43.08%, 44.95%, 53.63%, 42.81%, 50.56%
+* 91.05%, 90.34%, 90.03%, 90.25%, 90.12%, 91.46%, 90.97%, 91.28%, 89.85%, 89.94%
+* 78.64%, 79.08%, 79.39%, 79.71%, 80.46%, 78.28%, 78.46%, 79.22%, 79.66%, 79.84%
+* 46.02%, 50.69%, 48.69%, 45.30%, 48.38%, 52.78%, 43.79%, 45.26%, 51.09%, 50.69%
+* 90.92%, 90.12%, 89.68%, 91.68%, 90.30%, 90.70%, 91.46%, 91.37%, 90.79%, 90.39% (200-epoch runs)
+* 79.66%, … (200-epoch runs; log continues)
+
+The best test accuracy in this portion of the log is 91.68%, from a 200-epoch run.
170, Loss: 0.7717203497886658 +Epoch 180, Loss: 0.7461339235305786 +Epoch 190, Loss: 0.7216139435768127 +Test accuracy: 80.06% +Epoch 0, Loss: 1.3840022087097168 +Epoch 10, Loss: 1.3448028564453125 +Epoch 20, Loss: 1.3230996131896973 +Epoch 30, Loss: 1.2933882474899292 +Epoch 40, Loss: 1.2646549940109253 +Epoch 50, Loss: 1.23243248462677 +Epoch 60, Loss: 1.1986688375473022 +Epoch 70, Loss: 1.1578131914138794 +Epoch 80, Loss: 1.1125397682189941 +Epoch 90, Loss: 1.0731861591339111 +Epoch 100, Loss: 1.0320199728012085 +Epoch 110, Loss: 0.9908576011657715 +Epoch 120, Loss: 0.946101725101471 +Epoch 130, Loss: 0.9128577709197998 +Epoch 140, Loss: 0.8746651411056519 +Epoch 150, Loss: 0.8433385491371155 +Epoch 160, Loss: 0.8150950074195862 +Epoch 170, Loss: 0.7816669344902039 +Epoch 180, Loss: 0.7497097849845886 +Epoch 190, Loss: 0.7313574552536011 +Test accuracy: 78.82% +Epoch 0, Loss: 1.3829894065856934 +Epoch 10, Loss: 1.3423036336898804 +Epoch 20, Loss: 1.3189549446105957 +Epoch 30, Loss: 1.2908310890197754 +Epoch 40, Loss: 1.264752745628357 +Epoch 50, Loss: 1.2295252084732056 +Epoch 60, Loss: 1.1949001550674438 +Epoch 70, Loss: 1.1520395278930664 +Epoch 80, Loss: 1.113606572151184 +Epoch 90, Loss: 1.0757390260696411 +Epoch 100, Loss: 1.0301814079284668 +Epoch 110, Loss: 0.9840475916862488 +Epoch 120, Loss: 0.9556947946548462 +Epoch 130, Loss: 0.9129519462585449 +Epoch 140, Loss: 0.8837144374847412 +Epoch 150, Loss: 0.8446709513664246 +Epoch 160, Loss: 0.8103293776512146 +Epoch 170, Loss: 0.7859659194946289 +Epoch 180, Loss: 0.7572022676467896 +Epoch 190, Loss: 0.7314429879188538 +Test accuracy: 80.55% +Epoch 0, Loss: 1.3878381252288818 +Epoch 10, Loss: 1.3551374673843384 +Epoch 20, Loss: 1.3357840776443481 +Epoch 30, Loss: 1.3161566257476807 +Epoch 40, Loss: 1.2894169092178345 +Epoch 50, Loss: 1.2571295499801636 +Epoch 60, Loss: 1.2240997552871704 +Epoch 70, Loss: 1.1863369941711426 +Epoch 80, Loss: 1.1467390060424805 +Epoch 90, Loss: 1.1052913665771484 +Epoch 100, 
Loss: 1.0578585863113403 +Epoch 110, Loss: 1.0195461511611938 +Epoch 120, Loss: 0.9751941561698914 +Epoch 130, Loss: 0.9302388429641724 +Epoch 140, Loss: 0.8952741622924805 +Epoch 150, Loss: 0.8583669066429138 +Epoch 160, Loss: 0.8241497874259949 +Epoch 170, Loss: 0.7923064231872559 +Epoch 180, Loss: 0.7629522681236267 +Epoch 190, Loss: 0.7330613732337952 +Test accuracy: 80.73% +Epoch 0, Loss: 1.385694980621338 +Epoch 10, Loss: 1.3514981269836426 +Epoch 20, Loss: 1.3305045366287231 +Epoch 30, Loss: 1.308650016784668 +Epoch 40, Loss: 1.282137393951416 +Epoch 50, Loss: 1.2497831583023071 +Epoch 60, Loss: 1.214055061340332 +Epoch 70, Loss: 1.1735811233520508 +Epoch 80, Loss: 1.1309915781021118 +Epoch 90, Loss: 1.0890947580337524 +Epoch 100, Loss: 1.0424928665161133 +Epoch 110, Loss: 1.0011866092681885 +Epoch 120, Loss: 0.9629995822906494 +Epoch 130, Loss: 0.9193246364593506 +Epoch 140, Loss: 0.8821827173233032 +Epoch 150, Loss: 0.8561258912086487 +Epoch 160, Loss: 0.8167614936828613 +Epoch 170, Loss: 0.7886197566986084 +Epoch 180, Loss: 0.7640570998191833 +Epoch 190, Loss: 0.7296862006187439 +Test accuracy: 79.80% +Epoch 0, Loss: 1.3854422569274902 +Epoch 10, Loss: 1.3469995260238647 +Epoch 20, Loss: 1.3249506950378418 +Epoch 30, Loss: 1.3002870082855225 +Epoch 40, Loss: 1.2749977111816406 +Epoch 50, Loss: 1.2414205074310303 +Epoch 60, Loss: 1.2025728225708008 +Epoch 70, Loss: 1.1651010513305664 +Epoch 80, Loss: 1.121799349784851 +Epoch 90, Loss: 1.077892780303955 +Epoch 100, Loss: 1.0371428728103638 +Epoch 110, Loss: 0.9925066828727722 +Epoch 120, Loss: 0.9474669694900513 +Epoch 130, Loss: 0.9151764512062073 +Epoch 140, Loss: 0.8774574398994446 +Epoch 150, Loss: 0.8430833220481873 +Epoch 160, Loss: 0.8083740472793579 +Epoch 170, Loss: 0.7773230075836182 +Epoch 180, Loss: 0.7529248595237732 +Epoch 190, Loss: 0.7238496541976929 +Test accuracy: 80.20% +Epoch 0, Loss: 1.3885328769683838 +Epoch 10, Loss: 1.3569542169570923 +Epoch 20, Loss: 1.3323628902435303 +Epoch 30, 
Loss: 1.3102717399597168 +Epoch 40, Loss: 1.281054139137268 +Epoch 50, Loss: 1.2446801662445068 +Epoch 60, Loss: 1.2113666534423828 +Epoch 70, Loss: 1.1745755672454834 +Epoch 80, Loss: 1.1273744106292725 +Epoch 90, Loss: 1.0867705345153809 +Epoch 100, Loss: 1.0460838079452515 +Epoch 110, Loss: 1.0066015720367432 +Epoch 120, Loss: 0.9638383388519287 +Epoch 130, Loss: 0.9217207431793213 +Epoch 140, Loss: 0.8820804357528687 +Epoch 150, Loss: 0.8486581444740295 +Epoch 160, Loss: 0.8187488913536072 +Epoch 170, Loss: 0.7843850255012512 +Epoch 180, Loss: 0.7587867975234985 +Epoch 190, Loss: 0.7285638451576233 +Test accuracy: 80.42% +Epoch 0, Loss: 1.3888298273086548 +Epoch 10, Loss: 1.349142074584961 +Epoch 20, Loss: 1.3295172452926636 +Epoch 30, Loss: 1.3027675151824951 +Epoch 40, Loss: 1.2722049951553345 +Epoch 50, Loss: 1.2381116151809692 +Epoch 60, Loss: 1.1984800100326538 +Epoch 70, Loss: 1.1602890491485596 +Epoch 80, Loss: 1.1134777069091797 +Epoch 90, Loss: 1.072081208229065 +Epoch 100, Loss: 1.026482105255127 +Epoch 110, Loss: 0.9879459142684937 +Epoch 120, Loss: 0.9515528082847595 +Epoch 130, Loss: 0.9110531806945801 +Epoch 140, Loss: 0.8816195726394653 +Epoch 150, Loss: 0.8408957123756409 +Epoch 160, Loss: 0.8071659803390503 +Epoch 170, Loss: 0.7796837687492371 +Epoch 180, Loss: 0.757495641708374 +Epoch 190, Loss: 0.7332531809806824 +Test accuracy: 81.04% +Epoch 0, Loss: 1.3861900568008423 +Epoch 10, Loss: 1.3502171039581299 +Epoch 20, Loss: 1.324601173400879 +Epoch 30, Loss: 1.297955870628357 +Epoch 40, Loss: 1.2663753032684326 +Epoch 50, Loss: 1.2346218824386597 +Epoch 60, Loss: 1.1995902061462402 +Epoch 70, Loss: 1.1595855951309204 +Epoch 80, Loss: 1.116775631904602 +Epoch 90, Loss: 1.078464388847351 +Epoch 100, Loss: 1.033227801322937 +Epoch 110, Loss: 0.9970264434814453 +Epoch 120, Loss: 0.9485139846801758 +Epoch 130, Loss: 0.9133719801902771 +Epoch 140, Loss: 0.877660870552063 +Epoch 150, Loss: 0.8399858474731445 +Epoch 160, Loss: 0.8178150653839111 +Epoch 
170, Loss: 0.779315710067749 +Epoch 180, Loss: 0.7521874904632568 +Epoch 190, Loss: 0.7282838225364685 +Test accuracy: 80.91% +Epoch 0, Loss: 1.3834298849105835 +Epoch 10, Loss: 1.3786077499389648 +Epoch 20, Loss: 1.3726974725723267 +Epoch 30, Loss: 1.3672994375228882 +Epoch 40, Loss: 1.3628665208816528 +Epoch 50, Loss: 1.3577604293823242 +Epoch 60, Loss: 1.3540022373199463 +Epoch 70, Loss: 1.3495734930038452 +Epoch 80, Loss: 1.346564531326294 +Epoch 90, Loss: 1.3431766033172607 +Epoch 100, Loss: 1.338752269744873 +Epoch 110, Loss: 1.3360151052474976 +Epoch 120, Loss: 1.3335667848587036 +Epoch 130, Loss: 1.3292322158813477 +Epoch 140, Loss: 1.3261058330535889 +Epoch 150, Loss: 1.3222438097000122 +Epoch 160, Loss: 1.3188844919204712 +Epoch 170, Loss: 1.3134690523147583 +Epoch 180, Loss: 1.3115525245666504 +Epoch 190, Loss: 1.3074811697006226 +Test accuracy: 50.07% +Epoch 0, Loss: 1.3858492374420166 +Epoch 10, Loss: 1.381123661994934 +Epoch 20, Loss: 1.3774946928024292 +Epoch 30, Loss: 1.3745261430740356 +Epoch 40, Loss: 1.3711092472076416 +Epoch 50, Loss: 1.3678008317947388 +Epoch 60, Loss: 1.3647854328155518 +Epoch 70, Loss: 1.3621306419372559 +Epoch 80, Loss: 1.3580718040466309 +Epoch 90, Loss: 1.354932427406311 +Epoch 100, Loss: 1.3515959978103638 +Epoch 110, Loss: 1.3479121923446655 +Epoch 120, Loss: 1.3454749584197998 +Epoch 130, Loss: 1.341247797012329 +Epoch 140, Loss: 1.33810555934906 +Epoch 150, Loss: 1.3340187072753906 +Epoch 160, Loss: 1.332472324371338 +Epoch 170, Loss: 1.3292889595031738 +Epoch 180, Loss: 1.3257126808166504 +Epoch 190, Loss: 1.3240065574645996 +Test accuracy: 49.40% +Epoch 0, Loss: 1.386582612991333 +Epoch 10, Loss: 1.381704568862915 +Epoch 20, Loss: 1.377561330795288 +Epoch 30, Loss: 1.373821496963501 +Epoch 40, Loss: 1.3701598644256592 +Epoch 50, Loss: 1.3669910430908203 +Epoch 60, Loss: 1.3626593351364136 +Epoch 70, Loss: 1.3598490953445435 +Epoch 80, Loss: 1.3548367023468018 +Epoch 90, Loss: 1.3510254621505737 +Epoch 100, Loss: 
1.3490675687789917 +Epoch 110, Loss: 1.3449970483779907 +Epoch 120, Loss: 1.3396016359329224 +Epoch 130, Loss: 1.3354010581970215 +Epoch 140, Loss: 1.3340637683868408 +Epoch 150, Loss: 1.3299165964126587 +Epoch 160, Loss: 1.3250325918197632 +Epoch 170, Loss: 1.3221068382263184 +Epoch 180, Loss: 1.3196135759353638 +Epoch 190, Loss: 1.314622163772583 +Test accuracy: 53.00% +Epoch 0, Loss: 1.3872078657150269 +Epoch 10, Loss: 1.3814071416854858 +Epoch 20, Loss: 1.3765642642974854 +Epoch 30, Loss: 1.372018575668335 +Epoch 40, Loss: 1.3684371709823608 +Epoch 50, Loss: 1.364545226097107 +Epoch 60, Loss: 1.361412525177002 +Epoch 70, Loss: 1.3580574989318848 +Epoch 80, Loss: 1.3546924591064453 +Epoch 90, Loss: 1.3528884649276733 +Epoch 100, Loss: 1.349303960800171 +Epoch 110, Loss: 1.3467508554458618 +Epoch 120, Loss: 1.3430255651474 +Epoch 130, Loss: 1.340241551399231 +Epoch 140, Loss: 1.337739109992981 +Epoch 150, Loss: 1.3355997800827026 +Epoch 160, Loss: 1.3315258026123047 +Epoch 170, Loss: 1.327237606048584 +Epoch 180, Loss: 1.3254705667495728 +Epoch 190, Loss: 1.3229707479476929 +Test accuracy: 48.33% +Epoch 0, Loss: 1.3874837160110474 +Epoch 10, Loss: 1.3847274780273438 +Epoch 20, Loss: 1.382230281829834 +Epoch 30, Loss: 1.3791513442993164 +Epoch 40, Loss: 1.375836968421936 +Epoch 50, Loss: 1.3725261688232422 +Epoch 60, Loss: 1.369324803352356 +Epoch 70, Loss: 1.365664005279541 +Epoch 80, Loss: 1.3628770112991333 +Epoch 90, Loss: 1.3587584495544434 +Epoch 100, Loss: 1.3552815914154053 +Epoch 110, Loss: 1.3502986431121826 +Epoch 120, Loss: 1.3470585346221924 +Epoch 130, Loss: 1.344055414199829 +Epoch 140, Loss: 1.3394560813903809 +Epoch 150, Loss: 1.3362843990325928 +Epoch 160, Loss: 1.3326842784881592 +Epoch 170, Loss: 1.328147530555725 +Epoch 180, Loss: 1.3234772682189941 +Epoch 190, Loss: 1.3203976154327393 +Test accuracy: 56.12% +Epoch 0, Loss: 1.3851433992385864 +Epoch 10, Loss: 1.3804067373275757 +Epoch 20, Loss: 1.3762950897216797 +Epoch 30, Loss: 
1.371656894683838 +Epoch 40, Loss: 1.36725652217865 +Epoch 50, Loss: 1.3625829219818115 +Epoch 60, Loss: 1.3588781356811523 +Epoch 70, Loss: 1.354196548461914 +Epoch 80, Loss: 1.3497047424316406 +Epoch 90, Loss: 1.3462117910385132 +Epoch 100, Loss: 1.3424363136291504 +Epoch 110, Loss: 1.3378171920776367 +Epoch 120, Loss: 1.3330713510513306 +Epoch 130, Loss: 1.3304024934768677 +Epoch 140, Loss: 1.3265836238861084 +Epoch 150, Loss: 1.323792576789856 +Epoch 160, Loss: 1.3190438747406006 +Epoch 170, Loss: 1.31682550907135 +Epoch 180, Loss: 1.312470555305481 +Epoch 190, Loss: 1.3080705404281616 +Test accuracy: 54.34% +Epoch 0, Loss: 1.3858972787857056 +Epoch 10, Loss: 1.3814983367919922 +Epoch 20, Loss: 1.3781585693359375 +Epoch 30, Loss: 1.374424934387207 +Epoch 40, Loss: 1.3712987899780273 +Epoch 50, Loss: 1.3678841590881348 +Epoch 60, Loss: 1.3643417358398438 +Epoch 70, Loss: 1.3602674007415771 +Epoch 80, Loss: 1.3568757772445679 +Epoch 90, Loss: 1.3535377979278564 +Epoch 100, Loss: 1.3503010272979736 +Epoch 110, Loss: 1.3476371765136719 +Epoch 120, Loss: 1.343699336051941 +Epoch 130, Loss: 1.3400152921676636 +Epoch 140, Loss: 1.337143063545227 +Epoch 150, Loss: 1.3332083225250244 +Epoch 160, Loss: 1.3285545110702515 +Epoch 170, Loss: 1.3270971775054932 +Epoch 180, Loss: 1.3253607749938965 +Epoch 190, Loss: 1.3202251195907593 +Test accuracy: 50.91% +Epoch 0, Loss: 1.3875219821929932 +Epoch 10, Loss: 1.382086157798767 +Epoch 20, Loss: 1.376481056213379 +Epoch 30, Loss: 1.3708970546722412 +Epoch 40, Loss: 1.3657830953598022 +Epoch 50, Loss: 1.3610483407974243 +Epoch 60, Loss: 1.3575555086135864 +Epoch 70, Loss: 1.3533694744110107 +Epoch 80, Loss: 1.3493748903274536 +Epoch 90, Loss: 1.3468499183654785 +Epoch 100, Loss: 1.3425188064575195 +Epoch 110, Loss: 1.3388115167617798 +Epoch 120, Loss: 1.336415410041809 +Epoch 130, Loss: 1.333509087562561 +Epoch 140, Loss: 1.3294079303741455 +Epoch 150, Loss: 1.3259716033935547 +Epoch 160, Loss: 1.3236899375915527 +Epoch 170, 
Loss: 1.3202162981033325 +Epoch 180, Loss: 1.317355990409851 +Epoch 190, Loss: 1.3138492107391357 +Test accuracy: 43.61% +Epoch 0, Loss: 1.3856042623519897 +Epoch 10, Loss: 1.3792229890823364 +Epoch 20, Loss: 1.3730504512786865 +Epoch 30, Loss: 1.3671090602874756 +Epoch 40, Loss: 1.362501621246338 +Epoch 50, Loss: 1.3580021858215332 +Epoch 60, Loss: 1.3546229600906372 +Epoch 70, Loss: 1.3508973121643066 +Epoch 80, Loss: 1.34721040725708 +Epoch 90, Loss: 1.3437309265136719 +Epoch 100, Loss: 1.3394088745117188 +Epoch 110, Loss: 1.3371456861495972 +Epoch 120, Loss: 1.3348565101623535 +Epoch 130, Loss: 1.330619215965271 +Epoch 140, Loss: 1.3272908926010132 +Epoch 150, Loss: 1.3243690729141235 +Epoch 160, Loss: 1.3208260536193848 +Epoch 170, Loss: 1.3179019689559937 +Epoch 180, Loss: 1.3135383129119873 +Epoch 190, Loss: 1.3093475103378296 +Test accuracy: 47.71% +Epoch 0, Loss: 1.384310245513916 +Epoch 10, Loss: 1.3790605068206787 +Epoch 20, Loss: 1.3731285333633423 +Epoch 30, Loss: 1.3675602674484253 +Epoch 40, Loss: 1.3625084161758423 +Epoch 50, Loss: 1.3579069375991821 +Epoch 60, Loss: 1.3536250591278076 +Epoch 70, Loss: 1.3506474494934082 +Epoch 80, Loss: 1.3473280668258667 +Epoch 90, Loss: 1.3443682193756104 +Epoch 100, Loss: 1.3405240774154663 +Epoch 110, Loss: 1.337930679321289 +Epoch 120, Loss: 1.3345117568969727 +Epoch 130, Loss: 1.3316084146499634 +Epoch 140, Loss: 1.3270201683044434 +Epoch 150, Loss: 1.3266278505325317 +Epoch 160, Loss: 1.3205313682556152 +Epoch 170, Loss: 1.3173826932907104 +Epoch 180, Loss: 1.3144335746765137 +Epoch 190, Loss: 1.3106294870376587 +Test accuracy: 48.02% +Epoch 0, Loss: 1.3852676153182983 +Epoch 10, Loss: 1.1235383749008179 +Epoch 20, Loss: 0.8259861469268799 +Epoch 30, Loss: 0.5775792598724365 +Epoch 40, Loss: 0.4273291230201721 +Epoch 50, Loss: 0.3333800733089447 +Epoch 60, Loss: 0.2731674015522003 +Epoch 70, Loss: 0.2276098132133484 +Epoch 80, Loss: 0.1987191140651703 +Epoch 90, Loss: 0.17373871803283691 +Epoch 100, Loss: 
0.15091359615325928 +Epoch 110, Loss: 0.1401394009590149 +Epoch 120, Loss: 0.12399972975254059 +Epoch 130, Loss: 0.1152903214097023 +Epoch 140, Loss: 0.10649684071540833 +Epoch 150, Loss: 0.0995192751288414 +Epoch 160, Loss: 0.09200987219810486 +Epoch 170, Loss: 0.08664407581090927 +Epoch 180, Loss: 0.08040644973516464 +Epoch 190, Loss: 0.07452751696109772 +Test accuracy: 91.59% +Epoch 0, Loss: 1.3860143423080444 +Epoch 10, Loss: 1.1318789720535278 +Epoch 20, Loss: 0.8471381068229675 +Epoch 30, Loss: 0.6058401465415955 +Epoch 40, Loss: 0.4487481117248535 +Epoch 50, Loss: 0.3487546443939209 +Epoch 60, Loss: 0.27849963307380676 +Epoch 70, Loss: 0.22879871726036072 +Epoch 80, Loss: 0.1973971426486969 +Epoch 90, Loss: 0.16996027529239655 +Epoch 100, Loss: 0.151073157787323 +Epoch 110, Loss: 0.13529279828071594 +Epoch 120, Loss: 0.12272846698760986 +Epoch 130, Loss: 0.11239160597324371 +Epoch 140, Loss: 0.10238561034202576 +Epoch 150, Loss: 0.0949959009885788 +Epoch 160, Loss: 0.08882391452789307 +Epoch 170, Loss: 0.08291786909103394 +Epoch 180, Loss: 0.07757061719894409 +Epoch 190, Loss: 0.07370675355195999 +Test accuracy: 90.43% +Epoch 0, Loss: 1.389555811882019 +Epoch 10, Loss: 1.1929831504821777 +Epoch 20, Loss: 0.9047382473945618 +Epoch 30, Loss: 0.6486021876335144 +Epoch 40, Loss: 0.48068374395370483 +Epoch 50, Loss: 0.3722454011440277 +Epoch 60, Loss: 0.2979337275028229 +Epoch 70, Loss: 0.24793994426727295 +Epoch 80, Loss: 0.20823583006858826 +Epoch 90, Loss: 0.17864973843097687 +Epoch 100, Loss: 0.1579713076353073 +Epoch 110, Loss: 0.1404123604297638 +Epoch 120, Loss: 0.12773534655570984 +Epoch 130, Loss: 0.11824691295623779 +Epoch 140, Loss: 0.10621736943721771 +Epoch 150, Loss: 0.09847065061330795 +Epoch 160, Loss: 0.09067774564027786 +Epoch 170, Loss: 0.08354076743125916 +Epoch 180, Loss: 0.07897689938545227 +Epoch 190, Loss: 0.07250364869832993 +Test accuracy: 90.57% +Epoch 0, Loss: 1.3875247240066528 +Epoch 10, Loss: 1.1761634349822998 +Epoch 20, Loss: 
0.8922242522239685 +Epoch 30, Loss: 0.623166024684906 +Epoch 40, Loss: 0.4510935842990875 +Epoch 50, Loss: 0.3512320816516876 +Epoch 60, Loss: 0.2844964563846588 +Epoch 70, Loss: 0.23771624267101288 +Epoch 80, Loss: 0.20219933986663818 +Epoch 90, Loss: 0.1771765500307083 +Epoch 100, Loss: 0.15573649108409882 +Epoch 110, Loss: 0.14219704270362854 +Epoch 120, Loss: 0.12981660664081573 +Epoch 130, Loss: 0.11804593354463577 +Epoch 140, Loss: 0.10751118510961533 +Epoch 150, Loss: 0.10220444202423096 +Epoch 160, Loss: 0.09419989585876465 +Epoch 170, Loss: 0.08769866824150085 +Epoch 180, Loss: 0.08287876099348068 +Epoch 190, Loss: 0.07713571190834045 +Test accuracy: 90.43% +Epoch 0, Loss: 1.3896007537841797 +Epoch 10, Loss: 1.1542714834213257 +Epoch 20, Loss: 0.8730335235595703 +Epoch 30, Loss: 0.6418161392211914 +Epoch 40, Loss: 0.4839886724948883 +Epoch 50, Loss: 0.3835345208644867 +Epoch 60, Loss: 0.3089917302131653 +Epoch 70, Loss: 0.26010632514953613 +Epoch 80, Loss: 0.2263280600309372 +Epoch 90, Loss: 0.19318871200084686 +Epoch 100, Loss: 0.17015601694583893 +Epoch 110, Loss: 0.15200282633304596 +Epoch 120, Loss: 0.13593697547912598 +Epoch 130, Loss: 0.12476123124361038 +Epoch 140, Loss: 0.11716640740633011 +Epoch 150, Loss: 0.10583995282649994 +Epoch 160, Loss: 0.09939177334308624 +Epoch 170, Loss: 0.09213060140609741 +Epoch 180, Loss: 0.08659754693508148 +Epoch 190, Loss: 0.07929326593875885 +Test accuracy: 90.16% +Epoch 0, Loss: 1.3855197429656982 +Epoch 10, Loss: 1.1420737504959106 +Epoch 20, Loss: 0.8361662030220032 +Epoch 30, Loss: 0.5889331698417664 +Epoch 40, Loss: 0.4398934245109558 +Epoch 50, Loss: 0.34397631883621216 +Epoch 60, Loss: 0.2796443700790405 +Epoch 70, Loss: 0.2329866886138916 +Epoch 80, Loss: 0.1968473494052887 +Epoch 90, Loss: 0.1726512461900711 +Epoch 100, Loss: 0.15206049382686615 +Epoch 110, Loss: 0.135503351688385 +Epoch 120, Loss: 0.12326635420322418 +Epoch 130, Loss: 0.11199110746383667 +Epoch 140, Loss: 0.10466208308935165 +Epoch 150, 
Loss: 0.09471825510263443 +Epoch 160, Loss: 0.08752176910638809 +Epoch 170, Loss: 0.08125097304582596 +Epoch 180, Loss: 0.07610247284173965 +Epoch 190, Loss: 0.07225767523050308 +Test accuracy: 90.08% +Epoch 0, Loss: 1.3885730504989624 +Epoch 10, Loss: 1.1397532224655151 +Epoch 20, Loss: 0.8372939825057983 +Epoch 30, Loss: 0.6126987934112549 +Epoch 40, Loss: 0.4619505703449249 +Epoch 50, Loss: 0.36020371317863464 +Epoch 60, Loss: 0.2883266508579254 +Epoch 70, Loss: 0.24153827130794525 +Epoch 80, Loss: 0.2068122774362564 +Epoch 90, Loss: 0.17701847851276398 +Epoch 100, Loss: 0.15826596319675446 +Epoch 110, Loss: 0.14218908548355103 +Epoch 120, Loss: 0.12812475860118866 +Epoch 130, Loss: 0.11670345813035965 +Epoch 140, Loss: 0.1068887710571289 +Epoch 150, Loss: 0.10299661755561829 +Epoch 160, Loss: 0.09661057591438293 +Epoch 170, Loss: 0.08781326562166214 +Epoch 180, Loss: 0.08352194726467133 +Epoch 190, Loss: 0.0790332779288292 +Test accuracy: 91.32% +Epoch 0, Loss: 1.3874449729919434 +Epoch 10, Loss: 1.12087881565094 +Epoch 20, Loss: 0.7954813838005066 +Epoch 30, Loss: 0.5667380094528198 +Epoch 40, Loss: 0.4203792214393616 +Epoch 50, Loss: 0.3301628530025482 +Epoch 60, Loss: 0.2636251449584961 +Epoch 70, Loss: 0.22375911474227905 +Epoch 80, Loss: 0.19088831543922424 +Epoch 90, Loss: 0.16582196950912476 +Epoch 100, Loss: 0.15047933161258698 +Epoch 110, Loss: 0.1309100091457367 +Epoch 120, Loss: 0.12097223848104477 +Epoch 130, Loss: 0.10958444327116013 +Epoch 140, Loss: 0.10082252323627472 +Epoch 150, Loss: 0.09419980645179749 +Epoch 160, Loss: 0.08915136754512787 +Epoch 170, Loss: 0.08113588392734528 +Epoch 180, Loss: 0.07613814622163773 +Epoch 190, Loss: 0.07151404023170471 +Test accuracy: 91.59% +Epoch 0, Loss: 1.3854953050613403 +Epoch 10, Loss: 1.109050989151001 +Epoch 20, Loss: 0.7919424772262573 +Epoch 30, Loss: 0.5531733632087708 +Epoch 40, Loss: 0.4079773724079132 +Epoch 50, Loss: 0.31897929310798645 +Epoch 60, Loss: 0.2562122642993927 +Epoch 70, Loss: 
0.21412044763565063 +Epoch 80, Loss: 0.18427209556102753 +Epoch 90, Loss: 0.16037403047084808 +Epoch 100, Loss: 0.14553487300872803 +Epoch 110, Loss: 0.13124944269657135 +Epoch 120, Loss: 0.11670908331871033 +Epoch 130, Loss: 0.11029035598039627 +Epoch 140, Loss: 0.10152336955070496 +Epoch 150, Loss: 0.09541162103414536 +Epoch 160, Loss: 0.08829217404127121 +Epoch 170, Loss: 0.08198206126689911 +Epoch 180, Loss: 0.07777858525514603 +Epoch 190, Loss: 0.07362834364175797 +Test accuracy: 90.92% +Epoch 0, Loss: 1.3898873329162598 +Epoch 10, Loss: 1.1463756561279297 +Epoch 20, Loss: 0.863935649394989 +Epoch 30, Loss: 0.6352163553237915 +Epoch 40, Loss: 0.4827460050582886 +Epoch 50, Loss: 0.37419748306274414 +Epoch 60, Loss: 0.3017995357513428 +Epoch 70, Loss: 0.2481175661087036 +Epoch 80, Loss: 0.21120676398277283 +Epoch 90, Loss: 0.18260176479816437 +Epoch 100, Loss: 0.1594856232404709 +Epoch 110, Loss: 0.14258763194084167 +Epoch 120, Loss: 0.12955954670906067 +Epoch 130, Loss: 0.11913552135229111 +Epoch 140, Loss: 0.1063840314745903 +Epoch 150, Loss: 0.10080567747354507 +Epoch 160, Loss: 0.09048178046941757 +Epoch 170, Loss: 0.08444855362176895 +Epoch 180, Loss: 0.08113568276166916 +Epoch 190, Loss: 0.07593745738267899 +Test accuracy: 90.88% +Epoch 0, Loss: 1.3837720155715942 +Epoch 10, Loss: 1.3289794921875 +Epoch 20, Loss: 1.2910879850387573 +Epoch 30, Loss: 1.2447595596313477 +Epoch 40, Loss: 1.1898396015167236 +Epoch 50, Loss: 1.133426308631897 +Epoch 60, Loss: 1.0754903554916382 +Epoch 70, Loss: 1.014941692352295 +Epoch 80, Loss: 0.9580321907997131 +Epoch 90, Loss: 0.9003778696060181 +Epoch 100, Loss: 0.8504837155342102 +Epoch 110, Loss: 0.7995595335960388 +Epoch 120, Loss: 0.7585324645042419 +Epoch 130, Loss: 0.7219336628913879 +Epoch 140, Loss: 0.6849491000175476 +Epoch 150, Loss: 0.6507781744003296 +Epoch 160, Loss: 0.6196486949920654 +Epoch 170, Loss: 0.5905826687812805 +Epoch 180, Loss: 0.5645450353622437 +Epoch 190, Loss: 0.5439261794090271 +Test accuracy: 
84.51% +Epoch 0, Loss: 1.3857930898666382 +Epoch 10, Loss: 1.3389301300048828 +Epoch 20, Loss: 1.3051613569259644 +Epoch 30, Loss: 1.2605066299438477 +Epoch 40, Loss: 1.216085433959961 +Epoch 50, Loss: 1.1637063026428223 +Epoch 60, Loss: 1.1040657758712769 +Epoch 70, Loss: 1.045640230178833 +Epoch 80, Loss: 0.9856804609298706 +Epoch 90, Loss: 0.9280349016189575 +Epoch 100, Loss: 0.8764320015907288 +Epoch 110, Loss: 0.8276554346084595 +Epoch 120, Loss: 0.7788572907447815 +Epoch 130, Loss: 0.7402625679969788 +Epoch 140, Loss: 0.6984351873397827 +Epoch 150, Loss: 0.6676878333091736 +Epoch 160, Loss: 0.6290295124053955 +Epoch 170, Loss: 0.6052759885787964 +Epoch 180, Loss: 0.5773846507072449 +Epoch 190, Loss: 0.5497792959213257 +Test accuracy: 84.65% +Epoch 0, Loss: 1.3859270811080933 +Epoch 10, Loss: 1.334740161895752 +Epoch 20, Loss: 1.2987860441207886 +Epoch 30, Loss: 1.255715012550354 +Epoch 40, Loss: 1.205029010772705 +Epoch 50, Loss: 1.1500943899154663 +Epoch 60, Loss: 1.0888978242874146 +Epoch 70, Loss: 1.0262420177459717 +Epoch 80, Loss: 0.9654853343963623 +Epoch 90, Loss: 0.9050434231758118 +Epoch 100, Loss: 0.853352427482605 +Epoch 110, Loss: 0.804863452911377 +Epoch 120, Loss: 0.765449583530426 +Epoch 130, Loss: 0.723056435585022 +Epoch 140, Loss: 0.6851338744163513 +Epoch 150, Loss: 0.653718113899231 +Epoch 160, Loss: 0.6196248531341553 +Epoch 170, Loss: 0.5952385663986206 +Epoch 180, Loss: 0.5622754693031311 +Epoch 190, Loss: 0.5408639907836914 +Test accuracy: 82.51% +Epoch 0, Loss: 1.3832144737243652 +Epoch 10, Loss: 1.3367801904678345 +Epoch 20, Loss: 1.3027524948120117 +Epoch 30, Loss: 1.2583297491073608 +Epoch 40, Loss: 1.2080836296081543 +Epoch 50, Loss: 1.1515231132507324 +Epoch 60, Loss: 1.090341567993164 +Epoch 70, Loss: 1.0297391414642334 +Epoch 80, Loss: 0.965421199798584 +Epoch 90, Loss: 0.9107279181480408 +Epoch 100, Loss: 0.8541241884231567 +Epoch 110, Loss: 0.8091398477554321 +Epoch 120, Loss: 0.7651974558830261 +Epoch 130, Loss: 
0.7231466770172119 +Epoch 140, Loss: 0.6912961006164551 +Epoch 150, Loss: 0.6558460593223572 +Epoch 160, Loss: 0.618468165397644 +Epoch 170, Loss: 0.5938808917999268 +Epoch 180, Loss: 0.5683111548423767 +Epoch 190, Loss: 0.5419129133224487 +Test accuracy: 83.27% +Epoch 0, Loss: 1.3915581703186035 +Epoch 10, Loss: 1.3416988849639893 +Epoch 20, Loss: 1.305582046508789 +Epoch 30, Loss: 1.2691442966461182 +Epoch 40, Loss: 1.2219077348709106 +Epoch 50, Loss: 1.1711432933807373 +Epoch 60, Loss: 1.1184812784194946 +Epoch 70, Loss: 1.0603981018066406 +Epoch 80, Loss: 1.0003654956817627 +Epoch 90, Loss: 0.9422003030776978 +Epoch 100, Loss: 0.8875604271888733 +Epoch 110, Loss: 0.8313048481941223 +Epoch 120, Loss: 0.787736177444458 +Epoch 130, Loss: 0.7371302843093872 +Epoch 140, Loss: 0.7000241875648499 +Epoch 150, Loss: 0.6627399325370789 +Epoch 160, Loss: 0.6305631995201111 +Epoch 170, Loss: 0.6049786806106567 +Epoch 180, Loss: 0.5760247111320496 +Epoch 190, Loss: 0.5527182221412659 +Test accuracy: 82.47% +Epoch 0, Loss: 1.3844153881072998 +Epoch 10, Loss: 1.336543321609497 +Epoch 20, Loss: 1.306634783744812 +Epoch 30, Loss: 1.265894889831543 +Epoch 40, Loss: 1.2187975645065308 +Epoch 50, Loss: 1.1677793264389038 +Epoch 60, Loss: 1.1169379949569702 +Epoch 70, Loss: 1.0595823526382446 +Epoch 80, Loss: 1.0033711194992065 +Epoch 90, Loss: 0.9473984241485596 +Epoch 100, Loss: 0.8921090364456177 +Epoch 110, Loss: 0.8372591137886047 +Epoch 120, Loss: 0.7965505123138428 +Epoch 130, Loss: 0.7518143653869629 +Epoch 140, Loss: 0.7079082131385803 +Epoch 150, Loss: 0.675591230392456 +Epoch 160, Loss: 0.6471511721611023 +Epoch 170, Loss: 0.6090580821037292 +Epoch 180, Loss: 0.5854516625404358 +Epoch 190, Loss: 0.5562903881072998 +Test accuracy: 83.98% +Epoch 0, Loss: 1.3888128995895386 +Epoch 10, Loss: 1.3425064086914062 +Epoch 20, Loss: 1.3146915435791016 +Epoch 30, Loss: 1.2790589332580566 +Epoch 40, Loss: 1.2370893955230713 +Epoch 50, Loss: 1.185679316520691 +Epoch 60, Loss: 
1.1261266469955444 … (the remainder of the raw per-epoch log is summarised below)

Each tuning run logged its training loss every 10 epochs and ended with a test-accuracy line. In the order they appear in the log, the runs finished with the following test accuracies (the excerpt starts and ends mid-run; accuracies cluster by hyperparameter configuration):

* 82.06%, 82.82%, 81.84%, 82.87%
* 49.67%, 46.91%, 50.65%, 56.16%, 53.45%, 54.61%, 55.85%, 56.21%, 53.94%, 52.96%
* 91.05%, 91.50%, 90.83%, 91.05%, 90.83%, 90.65%, 91.19%, 90.21%, 90.88%, 90.43%
* 85.49%, 85.22%, 85.98%, 86.43%, 85.54%, 86.03%, 86.07%, 85.76%, 85.00%, 85.45%
* 63.06%, 60.39%, 61.42%, 64.44%, 62.84%, 60.17%, 63.73%, 60.93%, 63.55%, 61.10%
* 91.10%, 90.52%, 91.86%, 91.10%, 92.12%, 90.74%, 89.85%

Most runs trained for 200 epochs; the final group trained for 300 (one further run is cut off at the end of the excerpt). The best run, reaching 92.12% test accuracy, produced the following loss curve:

```
Epoch 0, Loss: 1.3892793655395508
Epoch 10, Loss: 1.226642370223999
Epoch 20, Loss: 1.03870689868927
Epoch 30, Loss: 0.8396116495132446
Epoch 40, Loss: 0.6538942456245422
Epoch 50, Loss: 0.5128770470619202
Epoch 60, Loss: 0.41663509607315063
Epoch 70, Loss: 0.3486708104610443
Epoch 80, Loss: 0.30234295129776
Epoch 90, Loss: 0.26363322138786316
Epoch 100, Loss: 0.2374897599220276
Epoch 110, Loss: 0.21449165046215057
Epoch 120, Loss: 0.20084628462791443
Epoch 130, Loss: 0.18432511389255524
Epoch 140, Loss: 0.17446742951869965
Epoch 150, Loss: 0.1626211553812027
Epoch 160, Loss: 0.1575726568698883
Epoch 170, Loss: 0.145149365067482
Epoch 180, Loss: 0.13922353088855743
Epoch 190, Loss: 0.12886236608028412
Epoch 200, Loss: 0.12798014283180237
Epoch 210, Loss: 0.12165345251560211
Epoch 220, Loss: 0.11611651629209518
Epoch 230, Loss: 0.11410072445869446
Epoch 240, Loss: 0.11254601180553436
Epoch 250, Loss: 0.10447009652853012
Epoch 260, Loss: 0.10100402683019638
Epoch 270, Loss: 0.09886582940816879
Epoch 280, Loss: 0.09505772590637207
Epoch 290, Loss: 0.09105154126882553
Test accuracy: 92.12%
```
280, Loss: 0.08629388362169266 +Epoch 290, Loss: 0.08315178751945496 +Test accuracy: 90.48% +Epoch 0, Loss: 1.3862870931625366 +Epoch 10, Loss: 1.2342278957366943 +Epoch 20, Loss: 1.0373247861862183 +Epoch 30, Loss: 0.7993283867835999 +Epoch 40, Loss: 0.6202364563941956 +Epoch 50, Loss: 0.49214407801628113 +Epoch 60, Loss: 0.40140068531036377 +Epoch 70, Loss: 0.33309105038642883 +Epoch 80, Loss: 0.29159092903137207 +Epoch 90, Loss: 0.25168806314468384 +Epoch 100, Loss: 0.22370746731758118 +Epoch 110, Loss: 0.2075512856245041 +Epoch 120, Loss: 0.1854449361562729 +Epoch 130, Loss: 0.16940081119537354 +Epoch 140, Loss: 0.1561892181634903 +Epoch 150, Loss: 0.14422571659088135 +Epoch 160, Loss: 0.1351204365491867 +Epoch 170, Loss: 0.12939785420894623 +Epoch 180, Loss: 0.12065058201551437 +Epoch 190, Loss: 0.11343569308519363 +Epoch 200, Loss: 0.10974454879760742 +Epoch 210, Loss: 0.10380764305591583 +Epoch 220, Loss: 0.097908616065979 +Epoch 230, Loss: 0.09576722979545593 +Epoch 240, Loss: 0.09526853263378143 +Epoch 250, Loss: 0.09005527943372726 +Epoch 260, Loss: 0.08644121885299683 +Epoch 270, Loss: 0.08570201694965363 +Epoch 280, Loss: 0.07990369945764542 +Epoch 290, Loss: 0.07618166506290436 +Test accuracy: 90.08% +Epoch 0, Loss: 1.387316346168518 +Epoch 10, Loss: 1.2122187614440918 +Epoch 20, Loss: 0.9914325475692749 +Epoch 30, Loss: 0.7636944651603699 +Epoch 40, Loss: 0.6057707071304321 +Epoch 50, Loss: 0.4868949055671692 +Epoch 60, Loss: 0.39776158332824707 +Epoch 70, Loss: 0.3304300904273987 +Epoch 80, Loss: 0.2892311215400696 +Epoch 90, Loss: 0.25266560912132263 +Epoch 100, Loss: 0.2276296764612198 +Epoch 110, Loss: 0.20485569536685944 +Epoch 120, Loss: 0.18289342522621155 +Epoch 130, Loss: 0.17115339636802673 +Epoch 140, Loss: 0.16174480319023132 +Epoch 150, Loss: 0.15122804045677185 +Epoch 160, Loss: 0.14135165512561798 +Epoch 170, Loss: 0.13167716562747955 +Epoch 180, Loss: 0.12488934397697449 +Epoch 190, Loss: 0.11851823329925537 +Epoch 200, Loss: 
0.11254134029150009 +Epoch 210, Loss: 0.10668793320655823 +Epoch 220, Loss: 0.10412193834781647 +Epoch 230, Loss: 0.09725244343280792 +Epoch 240, Loss: 0.09517308324575424 +Epoch 250, Loss: 0.09118730574846268 +Epoch 260, Loss: 0.09107114374637604 +Epoch 270, Loss: 0.087496317923069 +Epoch 280, Loss: 0.08219702541828156 +Epoch 290, Loss: 0.07890896499156952 +Test accuracy: 90.88% +Epoch 0, Loss: 1.3837000131607056 +Epoch 10, Loss: 1.3450069427490234 +Epoch 20, Loss: 1.3214143514633179 +Epoch 30, Loss: 1.2976911067962646 +Epoch 40, Loss: 1.2707180976867676 +Epoch 50, Loss: 1.2376104593276978 +Epoch 60, Loss: 1.2034401893615723 +Epoch 70, Loss: 1.1636877059936523 +Epoch 80, Loss: 1.1249959468841553 +Epoch 90, Loss: 1.0809686183929443 +Epoch 100, Loss: 1.0324466228485107 +Epoch 110, Loss: 0.9837509989738464 +Epoch 120, Loss: 0.9489062428474426 +Epoch 130, Loss: 0.9058153033256531 +Epoch 140, Loss: 0.8714920282363892 +Epoch 150, Loss: 0.837627649307251 +Epoch 160, Loss: 0.8005445599555969 +Epoch 170, Loss: 0.7753855586051941 +Epoch 180, Loss: 0.7451537251472473 +Epoch 190, Loss: 0.7205379009246826 +Epoch 200, Loss: 0.6949794292449951 +Epoch 210, Loss: 0.6750547885894775 +Epoch 220, Loss: 0.6475628614425659 +Epoch 230, Loss: 0.6329144239425659 +Epoch 240, Loss: 0.6152263283729553 +Epoch 250, Loss: 0.5901490449905396 +Epoch 260, Loss: 0.5782666802406311 +Epoch 270, Loss: 0.5607517957687378 +Epoch 280, Loss: 0.5453757047653198 +Epoch 290, Loss: 0.5255328416824341 +Test accuracy: 82.87% +Epoch 0, Loss: 1.3869495391845703 +Epoch 10, Loss: 1.3505408763885498 +Epoch 20, Loss: 1.3285775184631348 +Epoch 30, Loss: 1.298643708229065 +Epoch 40, Loss: 1.268754005432129 +Epoch 50, Loss: 1.2331992387771606 +Epoch 60, Loss: 1.193692922592163 +Epoch 70, Loss: 1.1446897983551025 +Epoch 80, Loss: 1.0998505353927612 +Epoch 90, Loss: 1.0678578615188599 +Epoch 100, Loss: 1.019456386566162 +Epoch 110, Loss: 0.9821844100952148 +Epoch 120, Loss: 0.9421236515045166 +Epoch 130, Loss: 
0.9064207673072815 +Epoch 140, Loss: 0.8668200969696045 +Epoch 150, Loss: 0.8329665064811707 +Epoch 160, Loss: 0.7996777296066284 +Epoch 170, Loss: 0.7716353535652161 +Epoch 180, Loss: 0.7428612112998962 +Epoch 190, Loss: 0.7176215052604675 +Epoch 200, Loss: 0.6965620517730713 +Epoch 210, Loss: 0.6767515540122986 +Epoch 220, Loss: 0.6537451148033142 +Epoch 230, Loss: 0.6338062286376953 +Epoch 240, Loss: 0.6101620197296143 +Epoch 250, Loss: 0.5943065285682678 +Epoch 260, Loss: 0.5750191807746887 +Epoch 270, Loss: 0.5598079562187195 +Epoch 280, Loss: 0.5419647097587585 +Epoch 290, Loss: 0.5314521193504333 +Test accuracy: 83.89% +Epoch 0, Loss: 1.3843512535095215 +Epoch 10, Loss: 1.34432053565979 +Epoch 20, Loss: 1.3223870992660522 +Epoch 30, Loss: 1.2960463762283325 +Epoch 40, Loss: 1.2638243436813354 +Epoch 50, Loss: 1.2278670072555542 +Epoch 60, Loss: 1.188334584236145 +Epoch 70, Loss: 1.1468312740325928 +Epoch 80, Loss: 1.1028122901916504 +Epoch 90, Loss: 1.0591522455215454 +Epoch 100, Loss: 1.0168992280960083 +Epoch 110, Loss: 0.971816897392273 +Epoch 120, Loss: 0.9368525147438049 +Epoch 130, Loss: 0.899430513381958 +Epoch 140, Loss: 0.8636306524276733 +Epoch 150, Loss: 0.8306112289428711 +Epoch 160, Loss: 0.7956889271736145 +Epoch 170, Loss: 0.7710946202278137 +Epoch 180, Loss: 0.7377041578292847 +Epoch 190, Loss: 0.7167219519615173 +Epoch 200, Loss: 0.688951849937439 +Epoch 210, Loss: 0.665184736251831 +Epoch 220, Loss: 0.6528394222259521 +Epoch 230, Loss: 0.6241171956062317 +Epoch 240, Loss: 0.6059131026268005 +Epoch 250, Loss: 0.5950939655303955 +Epoch 260, Loss: 0.5661689043045044 +Epoch 270, Loss: 0.562416136264801 +Epoch 280, Loss: 0.541741669178009 +Epoch 290, Loss: 0.5213267803192139 +Test accuracy: 81.66% +Epoch 0, Loss: 1.3860359191894531 +Epoch 10, Loss: 1.35441255569458 +Epoch 20, Loss: 1.3317333459854126 +Epoch 30, Loss: 1.304918885231018 +Epoch 40, Loss: 1.2765814065933228 +Epoch 50, Loss: 1.24139404296875 +Epoch 60, Loss: 1.1980385780334473 +Epoch 
70, Loss: 1.1569985151290894 +Epoch 80, Loss: 1.1206796169281006 +Epoch 90, Loss: 1.07859206199646 +Epoch 100, Loss: 1.0414320230484009 +Epoch 110, Loss: 1.0016334056854248 +Epoch 120, Loss: 0.9678021669387817 +Epoch 130, Loss: 0.9248175024986267 +Epoch 140, Loss: 0.8943506479263306 +Epoch 150, Loss: 0.8665137887001038 +Epoch 160, Loss: 0.8319879770278931 +Epoch 170, Loss: 0.7997222542762756 +Epoch 180, Loss: 0.7727878093719482 +Epoch 190, Loss: 0.7519466280937195 +Epoch 200, Loss: 0.7207363247871399 +Epoch 210, Loss: 0.6987189650535583 +Epoch 220, Loss: 0.6761896014213562 +Epoch 230, Loss: 0.6526851654052734 +Epoch 240, Loss: 0.6303660869598389 +Epoch 250, Loss: 0.6199550032615662 +Epoch 260, Loss: 0.5978082418441772 +Epoch 270, Loss: 0.581429660320282 +Epoch 280, Loss: 0.564760684967041 +Epoch 290, Loss: 0.5544058084487915 +Test accuracy: 84.20% +Epoch 0, Loss: 1.3870410919189453 +Epoch 10, Loss: 1.3562109470367432 +Epoch 20, Loss: 1.333160638809204 +Epoch 30, Loss: 1.310301661491394 +Epoch 40, Loss: 1.2826045751571655 +Epoch 50, Loss: 1.2478340864181519 +Epoch 60, Loss: 1.2093805074691772 +Epoch 70, Loss: 1.1687383651733398 +Epoch 80, Loss: 1.1217113733291626 +Epoch 90, Loss: 1.0737417936325073 +Epoch 100, Loss: 1.0349701642990112 +Epoch 110, Loss: 0.9954206943511963 +Epoch 120, Loss: 0.9524512887001038 +Epoch 130, Loss: 0.9142876267433167 +Epoch 140, Loss: 0.8720229268074036 +Epoch 150, Loss: 0.8390489816665649 +Epoch 160, Loss: 0.8056374788284302 +Epoch 170, Loss: 0.781989574432373 +Epoch 180, Loss: 0.750452995300293 +Epoch 190, Loss: 0.7299162745475769 +Epoch 200, Loss: 0.6987406611442566 +Epoch 210, Loss: 0.6739872097969055 +Epoch 220, Loss: 0.6501486897468567 +Epoch 230, Loss: 0.6283408999443054 +Epoch 240, Loss: 0.6115714907646179 +Epoch 250, Loss: 0.5954806804656982 +Epoch 260, Loss: 0.5763590931892395 +Epoch 270, Loss: 0.5599794983863831 +Epoch 280, Loss: 0.5465255975723267 +Epoch 290, Loss: 0.5347119569778442 +Test accuracy: 83.40% +Epoch 0, Loss: 
1.3870890140533447 +Epoch 10, Loss: 1.3504315614700317 +Epoch 20, Loss: 1.3328101634979248 +Epoch 30, Loss: 1.3122687339782715 +Epoch 40, Loss: 1.2889915704727173 +Epoch 50, Loss: 1.26115882396698 +Epoch 60, Loss: 1.2297093868255615 +Epoch 70, Loss: 1.193093180656433 +Epoch 80, Loss: 1.1496615409851074 +Epoch 90, Loss: 1.1038358211517334 +Epoch 100, Loss: 1.0607926845550537 +Epoch 110, Loss: 1.0165045261383057 +Epoch 120, Loss: 0.9738073945045471 +Epoch 130, Loss: 0.9272204041481018 +Epoch 140, Loss: 0.8966741561889648 +Epoch 150, Loss: 0.8592460751533508 +Epoch 160, Loss: 0.8310083746910095 +Epoch 170, Loss: 0.79038405418396 +Epoch 180, Loss: 0.7637011408805847 +Epoch 190, Loss: 0.7326359748840332 +Epoch 200, Loss: 0.7096948027610779 +Epoch 210, Loss: 0.6822751760482788 +Epoch 220, Loss: 0.6648087501525879 +Epoch 230, Loss: 0.6417484879493713 +Epoch 240, Loss: 0.6159791350364685 +Epoch 250, Loss: 0.6005628108978271 +Epoch 260, Loss: 0.5819315910339355 +Epoch 270, Loss: 0.5672863721847534 +Epoch 280, Loss: 0.5476911067962646 +Epoch 290, Loss: 0.534060537815094 +Test accuracy: 84.65% +Epoch 0, Loss: 1.3825629949569702 +Epoch 10, Loss: 1.3480082750320435 +Epoch 20, Loss: 1.322170615196228 +Epoch 30, Loss: 1.2984422445297241 +Epoch 40, Loss: 1.268635630607605 +Epoch 50, Loss: 1.2386341094970703 +Epoch 60, Loss: 1.2017602920532227 +Epoch 70, Loss: 1.168709635734558 +Epoch 80, Loss: 1.128632664680481 +Epoch 90, Loss: 1.0830233097076416 +Epoch 100, Loss: 1.0432664155960083 +Epoch 110, Loss: 1.0072956085205078 +Epoch 120, Loss: 0.9632247686386108 +Epoch 130, Loss: 0.9294906854629517 +Epoch 140, Loss: 0.8945077061653137 +Epoch 150, Loss: 0.8582006692886353 +Epoch 160, Loss: 0.8270878195762634 +Epoch 170, Loss: 0.7896695733070374 +Epoch 180, Loss: 0.7689781785011292 +Epoch 190, Loss: 0.7409091591835022 +Epoch 200, Loss: 0.7138811945915222 +Epoch 210, Loss: 0.686288058757782 +Epoch 220, Loss: 0.6656978726387024 +Epoch 230, Loss: 0.6427597999572754 +Epoch 240, Loss: 
0.6281012892723083 +Epoch 250, Loss: 0.6052899956703186 +Epoch 260, Loss: 0.5874230861663818 +Epoch 270, Loss: 0.5751945972442627 +Epoch 280, Loss: 0.5565559267997742 +Epoch 290, Loss: 0.5428131818771362 +Test accuracy: 83.09% +Epoch 0, Loss: 1.3870593309402466 +Epoch 10, Loss: 1.3575139045715332 +Epoch 20, Loss: 1.3341411352157593 +Epoch 30, Loss: 1.3106193542480469 +Epoch 40, Loss: 1.2876015901565552 +Epoch 50, Loss: 1.2611768245697021 +Epoch 60, Loss: 1.2266086339950562 +Epoch 70, Loss: 1.1904044151306152 +Epoch 80, Loss: 1.1516374349594116 +Epoch 90, Loss: 1.11191725730896 +Epoch 100, Loss: 1.0677062273025513 +Epoch 110, Loss: 1.0165493488311768 +Epoch 120, Loss: 0.9740809798240662 +Epoch 130, Loss: 0.9335641264915466 +Epoch 140, Loss: 0.8946393132209778 +Epoch 150, Loss: 0.8602162003517151 +Epoch 160, Loss: 0.8322759866714478 +Epoch 170, Loss: 0.8011489510536194 +Epoch 180, Loss: 0.7685794830322266 +Epoch 190, Loss: 0.7354718446731567 +Epoch 200, Loss: 0.7153007388114929 +Epoch 210, Loss: 0.6907358169555664 +Epoch 220, Loss: 0.662123441696167 +Epoch 230, Loss: 0.6445022821426392 +Epoch 240, Loss: 0.6267691850662231 +Epoch 250, Loss: 0.6038415431976318 +Epoch 260, Loss: 0.585682213306427 +Epoch 270, Loss: 0.5624982714653015 +Epoch 280, Loss: 0.5453526973724365 +Epoch 290, Loss: 0.5375065803527832 +Test accuracy: 83.18% +Epoch 0, Loss: 1.3887896537780762 +Epoch 10, Loss: 1.3562911748886108 +Epoch 20, Loss: 1.332891821861267 +Epoch 30, Loss: 1.3083548545837402 +Epoch 40, Loss: 1.2843345403671265 +Epoch 50, Loss: 1.2430511713027954 +Epoch 60, Loss: 1.2086951732635498 +Epoch 70, Loss: 1.1677545309066772 +Epoch 80, Loss: 1.127987265586853 +Epoch 90, Loss: 1.081824779510498 +Epoch 100, Loss: 1.0470149517059326 +Epoch 110, Loss: 0.9975528717041016 +Epoch 120, Loss: 0.9652526378631592 +Epoch 130, Loss: 0.9215637445449829 +Epoch 140, Loss: 0.8828836679458618 +Epoch 150, Loss: 0.8442851305007935 +Epoch 160, Loss: 0.8152588605880737 +Epoch 170, Loss: 0.7908186912536621 
+Epoch 180, Loss: 0.7527779340744019 +Epoch 190, Loss: 0.7311848402023315 +Epoch 200, Loss: 0.7092900276184082 +Epoch 210, Loss: 0.6824987530708313 +Epoch 220, Loss: 0.6608208417892456 +Epoch 230, Loss: 0.6351377367973328 +Epoch 240, Loss: 0.6193763613700867 +Epoch 250, Loss: 0.6017819046974182 +Epoch 260, Loss: 0.5801957249641418 +Epoch 270, Loss: 0.564392626285553 +Epoch 280, Loss: 0.5487404465675354 +Epoch 290, Loss: 0.5318535566329956 +Test accuracy: 81.98% +Epoch 0, Loss: 1.3878017663955688 +Epoch 10, Loss: 1.3549439907073975 +Epoch 20, Loss: 1.3270177841186523 +Epoch 30, Loss: 1.2970010042190552 +Epoch 40, Loss: 1.2638509273529053 +Epoch 50, Loss: 1.2305067777633667 +Epoch 60, Loss: 1.1910218000411987 +Epoch 70, Loss: 1.150898814201355 +Epoch 80, Loss: 1.1125332117080688 +Epoch 90, Loss: 1.0765913724899292 +Epoch 100, Loss: 1.0330607891082764 +Epoch 110, Loss: 0.9922946095466614 +Epoch 120, Loss: 0.9511122107505798 +Epoch 130, Loss: 0.9193059206008911 +Epoch 140, Loss: 0.886867880821228 +Epoch 150, Loss: 0.849267303943634 +Epoch 160, Loss: 0.8203638792037964 +Epoch 170, Loss: 0.7883878946304321 +Epoch 180, Loss: 0.7569189667701721 +Epoch 190, Loss: 0.7303574681282043 +Epoch 200, Loss: 0.7028898596763611 +Epoch 210, Loss: 0.6862272024154663 +Epoch 220, Loss: 0.6589040160179138 +Epoch 230, Loss: 0.6421058177947998 +Epoch 240, Loss: 0.6217731833457947 +Epoch 250, Loss: 0.6041455268859863 +Epoch 260, Loss: 0.582497239112854 +Epoch 270, Loss: 0.5686593651771545 +Epoch 280, Loss: 0.5494045615196228 +Epoch 290, Loss: 0.5396068096160889 +Test accuracy: 84.11% +Epoch 0, Loss: 1.3860204219818115 +Epoch 10, Loss: 1.3810770511627197 +Epoch 20, Loss: 1.3765814304351807 +Epoch 30, Loss: 1.37177574634552 +Epoch 40, Loss: 1.3676800727844238 +Epoch 50, Loss: 1.3635627031326294 +Epoch 60, Loss: 1.360260009765625 +Epoch 70, Loss: 1.3570548295974731 +Epoch 80, Loss: 1.3531712293624878 +Epoch 90, Loss: 1.349848985671997 +Epoch 100, Loss: 1.3464982509613037 +Epoch 110, Loss: 
1.3442620038986206 +Epoch 120, Loss: 1.3405455350875854 +Epoch 130, Loss: 1.3366835117340088 +Epoch 140, Loss: 1.3338465690612793 +Epoch 150, Loss: 1.329546570777893 +Epoch 160, Loss: 1.3256069421768188 +Epoch 170, Loss: 1.3231157064437866 +Epoch 180, Loss: 1.3203320503234863 +Epoch 190, Loss: 1.3171874284744263 +Epoch 200, Loss: 1.313531517982483 +Epoch 210, Loss: 1.3130701780319214 +Epoch 220, Loss: 1.306585669517517 +Epoch 230, Loss: 1.3043886423110962 +Epoch 240, Loss: 1.2979589700698853 +Epoch 250, Loss: 1.2976908683776855 +Epoch 260, Loss: 1.2929539680480957 +Epoch 270, Loss: 1.2896416187286377 +Epoch 280, Loss: 1.28461754322052 +Epoch 290, Loss: 1.2822070121765137 +Test accuracy: 58.57% +Epoch 0, Loss: 1.3876302242279053 +Epoch 10, Loss: 1.3840398788452148 +Epoch 20, Loss: 1.3810979127883911 +Epoch 30, Loss: 1.3783875703811646 +Epoch 40, Loss: 1.3756153583526611 +Epoch 50, Loss: 1.3726760149002075 +Epoch 60, Loss: 1.36960768699646 +Epoch 70, Loss: 1.366237998008728 +Epoch 80, Loss: 1.3631960153579712 +Epoch 90, Loss: 1.360069751739502 +Epoch 100, Loss: 1.357115387916565 +Epoch 110, Loss: 1.3529903888702393 +Epoch 120, Loss: 1.3500837087631226 +Epoch 130, Loss: 1.3476454019546509 +Epoch 140, Loss: 1.3441002368927002 +Epoch 150, Loss: 1.3399031162261963 +Epoch 160, Loss: 1.3373744487762451 +Epoch 170, Loss: 1.3337093591690063 +Epoch 180, Loss: 1.3301832675933838 +Epoch 190, Loss: 1.3267947435379028 +Epoch 200, Loss: 1.3231827020645142 +Epoch 210, Loss: 1.3207833766937256 +Epoch 220, Loss: 1.316102147102356 +Epoch 230, Loss: 1.3117393255233765 +Epoch 240, Loss: 1.3090673685073853 +Epoch 250, Loss: 1.304072380065918 +Epoch 260, Loss: 1.3000025749206543 +Epoch 270, Loss: 1.2963216304779053 +Epoch 280, Loss: 1.2929242849349976 +Epoch 290, Loss: 1.2875310182571411 +Test accuracy: 52.69% +Epoch 0, Loss: 1.388375997543335 +Epoch 10, Loss: 1.3834186792373657 +Epoch 20, Loss: 1.3789461851119995 +Epoch 30, Loss: 1.3744639158248901 +Epoch 40, Loss: 1.3709266185760498 
+Epoch 50, Loss: 1.3660991191864014 +Epoch 60, Loss: 1.3628144264221191 +Epoch 70, Loss: 1.3592878580093384 +Epoch 80, Loss: 1.3548208475112915 +Epoch 90, Loss: 1.3512840270996094 +Epoch 100, Loss: 1.3484305143356323 +Epoch 110, Loss: 1.3450007438659668 +Epoch 120, Loss: 1.342012882232666 +Epoch 130, Loss: 1.3377455472946167 +Epoch 140, Loss: 1.3352020978927612 +Epoch 150, Loss: 1.332090973854065 +Epoch 160, Loss: 1.3273117542266846 +Epoch 170, Loss: 1.3240915536880493 +Epoch 180, Loss: 1.3202354907989502 +Epoch 190, Loss: 1.3169883489608765 +Epoch 200, Loss: 1.3148716688156128 +Epoch 210, Loss: 1.3096846342086792 +Epoch 220, Loss: 1.3056830167770386 +Epoch 230, Loss: 1.3027969598770142 +Epoch 240, Loss: 1.2984205484390259 +Epoch 250, Loss: 1.2942291498184204 +Epoch 260, Loss: 1.2897597551345825 +Epoch 270, Loss: 1.288010597229004 +Epoch 280, Loss: 1.2842707633972168 +Epoch 290, Loss: 1.277930498123169 +Test accuracy: 63.68% +Epoch 0, Loss: 1.3878215551376343 +Epoch 10, Loss: 1.3813047409057617 +Epoch 20, Loss: 1.376753330230713 +Epoch 30, Loss: 1.372794270515442 +Epoch 40, Loss: 1.3697230815887451 +Epoch 50, Loss: 1.3656318187713623 +Epoch 60, Loss: 1.3625030517578125 +Epoch 70, Loss: 1.3600136041641235 +Epoch 80, Loss: 1.356787919998169 +Epoch 90, Loss: 1.3541585206985474 +Epoch 100, Loss: 1.3511261940002441 +Epoch 110, Loss: 1.34911048412323 +Epoch 120, Loss: 1.3458843231201172 +Epoch 130, Loss: 1.3426060676574707 +Epoch 140, Loss: 1.340308427810669 +Epoch 150, Loss: 1.3346984386444092 +Epoch 160, Loss: 1.3339484930038452 +Epoch 170, Loss: 1.328783631324768 +Epoch 180, Loss: 1.327082872390747 +Epoch 190, Loss: 1.323301911354065 +Epoch 200, Loss: 1.3173645734786987 +Epoch 210, Loss: 1.3148283958435059 +Epoch 220, Loss: 1.3114069700241089 +Epoch 230, Loss: 1.307835340499878 +Epoch 240, Loss: 1.3053808212280273 +Epoch 250, Loss: 1.299886703491211 +Epoch 260, Loss: 1.2952123880386353 +Epoch 270, Loss: 1.2919213771820068 +Epoch 280, Loss: 1.287732481956482 +Epoch 
290, Loss: 1.2859094142913818 +Test accuracy: 47.98% +Epoch 0, Loss: 1.3865996599197388 +Epoch 10, Loss: 1.3811302185058594 +Epoch 20, Loss: 1.3754385709762573 +Epoch 30, Loss: 1.370978593826294 +Epoch 40, Loss: 1.3669407367706299 +Epoch 50, Loss: 1.3632042407989502 +Epoch 60, Loss: 1.3597780466079712 +Epoch 70, Loss: 1.3555469512939453 +Epoch 80, Loss: 1.3520046472549438 +Epoch 90, Loss: 1.3491843938827515 +Epoch 100, Loss: 1.3451889753341675 +Epoch 110, Loss: 1.341902256011963 +Epoch 120, Loss: 1.3386015892028809 +Epoch 130, Loss: 1.3363330364227295 +Epoch 140, Loss: 1.3333133459091187 +Epoch 150, Loss: 1.330074429512024 +Epoch 160, Loss: 1.3266862630844116 +Epoch 170, Loss: 1.3234803676605225 +Epoch 180, Loss: 1.320418357849121 +Epoch 190, Loss: 1.315743327140808 +Epoch 200, Loss: 1.3141958713531494 +Epoch 210, Loss: 1.3113754987716675 +Epoch 220, Loss: 1.3059800863265991 +Epoch 230, Loss: 1.3041622638702393 +Epoch 240, Loss: 1.3004212379455566 +Epoch 250, Loss: 1.2969510555267334 +Epoch 260, Loss: 1.2941950559616089 +Epoch 270, Loss: 1.28859543800354 +Epoch 280, Loss: 1.2859150171279907 +Epoch 290, Loss: 1.2849655151367188 +Test accuracy: 53.54% +Epoch 0, Loss: 1.383008599281311 +Epoch 10, Loss: 1.3778828382492065 +Epoch 20, Loss: 1.3719532489776611 +Epoch 30, Loss: 1.3680949211120605 +Epoch 40, Loss: 1.3633227348327637 +Epoch 50, Loss: 1.3594404458999634 +Epoch 60, Loss: 1.355542778968811 +Epoch 70, Loss: 1.35146164894104 +Epoch 80, Loss: 1.3480088710784912 +Epoch 90, Loss: 1.3444938659667969 +Epoch 100, Loss: 1.3422154188156128 +Epoch 110, Loss: 1.3378349542617798 +Epoch 120, Loss: 1.3370071649551392 +Epoch 130, Loss: 1.3322170972824097 +Epoch 140, Loss: 1.3282140493392944 +Epoch 150, Loss: 1.326562523841858 +Epoch 160, Loss: 1.3214467763900757 +Epoch 170, Loss: 1.318106770515442 +Epoch 180, Loss: 1.3149652481079102 +Epoch 190, Loss: 1.3123081922531128 +Epoch 200, Loss: 1.310478925704956 +Epoch 210, Loss: 1.305731177330017 +Epoch 220, Loss: 1.3037285804748535 
+Epoch 230, Loss: 1.2995325326919556 +Epoch 240, Loss: 1.2956300973892212 +Epoch 250, Loss: 1.2917739152908325 +Epoch 260, Loss: 1.2869155406951904 +Epoch 270, Loss: 1.28378164768219 +Epoch 280, Loss: 1.2806137800216675 +Epoch 290, Loss: 1.276098370552063 +Test accuracy: 54.34% +Epoch 0, Loss: 1.3868939876556396 +Epoch 10, Loss: 1.3821265697479248 +Epoch 20, Loss: 1.3787051439285278 +Epoch 30, Loss: 1.3749016523361206 +Epoch 40, Loss: 1.3717279434204102 +Epoch 50, Loss: 1.3680840730667114 +Epoch 60, Loss: 1.364630937576294 +Epoch 70, Loss: 1.3614221811294556 +Epoch 80, Loss: 1.3576093912124634 +Epoch 90, Loss: 1.3553229570388794 +Epoch 100, Loss: 1.3519561290740967 +Epoch 110, Loss: 1.3477013111114502 +Epoch 120, Loss: 1.3455326557159424 +Epoch 130, Loss: 1.341213345527649 +Epoch 140, Loss: 1.338721513748169 +Epoch 150, Loss: 1.3357679843902588 +Epoch 160, Loss: 1.3316899538040161 +Epoch 170, Loss: 1.3290629386901855 +Epoch 180, Loss: 1.326202392578125 +Epoch 190, Loss: 1.3214274644851685 +Epoch 200, Loss: 1.318648099899292 +Epoch 210, Loss: 1.3138388395309448 +Epoch 220, Loss: 1.3097796440124512 +Epoch 230, Loss: 1.3046091794967651 +Epoch 240, Loss: 1.3028801679611206 +Epoch 250, Loss: 1.2985955476760864 +Epoch 260, Loss: 1.295220136642456 +Epoch 270, Loss: 1.2874373197555542 +Epoch 280, Loss: 1.2864562273025513 +Epoch 290, Loss: 1.2827236652374268 +Test accuracy: 59.95% +Epoch 0, Loss: 1.3897395133972168 +Epoch 10, Loss: 1.3831273317337036 +Epoch 20, Loss: 1.3774288892745972 +Epoch 30, Loss: 1.37310791015625 +Epoch 40, Loss: 1.3687816858291626 +Epoch 50, Loss: 1.364052414894104 +Epoch 60, Loss: 1.3610093593597412 +Epoch 70, Loss: 1.3569296598434448 +Epoch 80, Loss: 1.354098916053772 +Epoch 90, Loss: 1.3500776290893555 +Epoch 100, Loss: 1.3468769788742065 +Epoch 110, Loss: 1.344498634338379 +Epoch 120, Loss: 1.3411201238632202 +Epoch 130, Loss: 1.337016224861145 +Epoch 140, Loss: 1.334381103515625 +Epoch 150, Loss: 1.3303455114364624 +Epoch 160, Loss: 
1.3256700038909912 +Epoch 170, Loss: 1.3244203329086304 +Epoch 180, Loss: 1.3210896253585815 +Epoch 190, Loss: 1.3179112672805786 +Epoch 200, Loss: 1.3133366107940674 +Epoch 210, Loss: 1.308472752571106 +Epoch 220, Loss: 1.3057785034179688 +Epoch 230, Loss: 1.303171157836914 +Epoch 240, Loss: 1.297404408454895 +Epoch 250, Loss: 1.2946358919143677 +Epoch 260, Loss: 1.291102647781372 +Epoch 270, Loss: 1.286341905593872 +Epoch 280, Loss: 1.2822668552398682 +Epoch 290, Loss: 1.277361512184143 +Test accuracy: 55.32% +Epoch 0, Loss: 1.383609414100647 +Epoch 10, Loss: 1.3793374300003052 +Epoch 20, Loss: 1.374520182609558 +Epoch 30, Loss: 1.3695518970489502 +Epoch 40, Loss: 1.3654422760009766 +Epoch 50, Loss: 1.3618900775909424 +Epoch 60, Loss: 1.3583059310913086 +Epoch 70, Loss: 1.3541463613510132 +Epoch 80, Loss: 1.3491114377975464 +Epoch 90, Loss: 1.3467992544174194 +Epoch 100, Loss: 1.3446518182754517 +Epoch 110, Loss: 1.3403595685958862 +Epoch 120, Loss: 1.337619423866272 +Epoch 130, Loss: 1.3348047733306885 +Epoch 140, Loss: 1.331421971321106 +Epoch 150, Loss: 1.3272945880889893 +Epoch 160, Loss: 1.3256428241729736 +Epoch 170, Loss: 1.3215841054916382 +Epoch 180, Loss: 1.3195011615753174 +Epoch 190, Loss: 1.3142938613891602 +Epoch 200, Loss: 1.3107173442840576 +Epoch 210, Loss: 1.307761788368225 +Epoch 220, Loss: 1.3051049709320068 +Epoch 230, Loss: 1.300930142402649 +Epoch 240, Loss: 1.2977126836776733 +Epoch 250, Loss: 1.294411301612854 +Epoch 260, Loss: 1.291669487953186 +Epoch 270, Loss: 1.2867408990859985 +Epoch 280, Loss: 1.2828277349472046 +Epoch 290, Loss: 1.2791887521743774 +Test accuracy: 51.31% +Epoch 0, Loss: 1.3845064640045166 +Epoch 10, Loss: 1.3791172504425049 +Epoch 20, Loss: 1.3741930723190308 +Epoch 30, Loss: 1.3703105449676514 +Epoch 40, Loss: 1.3650988340377808 +Epoch 50, Loss: 1.361285924911499 +Epoch 60, Loss: 1.3563262224197388 +Epoch 70, Loss: 1.3532179594039917 +Epoch 80, Loss: 1.349591612815857 +Epoch 90, Loss: 1.3459409475326538 +Epoch 100, 
Loss: 1.3430657386779785 +Epoch 110, Loss: 1.3387337923049927 +Epoch 120, Loss: 1.3358733654022217 +Epoch 130, Loss: 1.331010103225708 +Epoch 140, Loss: 1.3285369873046875 +Epoch 150, Loss: 1.3246419429779053 +Epoch 160, Loss: 1.3216708898544312 +Epoch 170, Loss: 1.3188166618347168 +Epoch 180, Loss: 1.311835765838623 +Epoch 190, Loss: 1.3105868101119995 +Epoch 200, Loss: 1.307084560394287 +Epoch 210, Loss: 1.3028147220611572 +Epoch 220, Loss: 1.2988924980163574 +Epoch 230, Loss: 1.294691562652588 +Epoch 240, Loss: 1.292283296585083 +Epoch 250, Loss: 1.2863672971725464 +Epoch 260, Loss: 1.2824753522872925 +Epoch 270, Loss: 1.279393196105957 +Epoch 280, Loss: 1.2734166383743286 +Epoch 290, Loss: 1.2711966037750244 +Test accuracy: 63.60% +Epoch 0, Loss: 1.3871971368789673 +Epoch 10, Loss: 1.1526117324829102 +Epoch 20, Loss: 0.8640612959861755 +Epoch 30, Loss: 0.6181985139846802 +Epoch 40, Loss: 0.4536436200141907 +Epoch 50, Loss: 0.35374388098716736 +Epoch 60, Loss: 0.28500187397003174 +Epoch 70, Loss: 0.23517481982707977 +Epoch 80, Loss: 0.19959959387779236 +Epoch 90, Loss: 0.17244620621204376 +Epoch 100, Loss: 0.15189719200134277 +Epoch 110, Loss: 0.1343308389186859 +Epoch 120, Loss: 0.12428209185600281 +Epoch 130, Loss: 0.11196254193782806 +Epoch 140, Loss: 0.10329057276248932 +Epoch 150, Loss: 0.09624220430850983 +Epoch 160, Loss: 0.09054305404424667 +Epoch 170, Loss: 0.0869353711605072 +Epoch 180, Loss: 0.07751189917325974 +Epoch 190, Loss: 0.0750923603773117 +Epoch 200, Loss: 0.06889095902442932 +Epoch 210, Loss: 0.06398362666368484 +Epoch 220, Loss: 0.06195851415395737 +Epoch 230, Loss: 0.06073031574487686 +Epoch 240, Loss: 0.055980950593948364 +Epoch 250, Loss: 0.054683804512023926 +Epoch 260, Loss: 0.051234159618616104 +Epoch 270, Loss: 0.051059186458587646 +Epoch 280, Loss: 0.04829072952270508 +Epoch 290, Loss: 0.04500966891646385 +Test accuracy: 90.16% +Epoch 0, Loss: 1.3854700326919556 +Epoch 10, Loss: 1.142483115196228 +Epoch 20, Loss: 0.8481318354606628 
+Below is a condensed summary of the hyperparameter tuning log produced by predict.py. Each run trains the GCN for 300 epochs, printing the training loss every 10 epochs and the test accuracy at the end of the run. One full run is reproduced here as a representative example:
+
+```
+Epoch 0, Loss: 1.3859806060791016
+Epoch 10, Loss: 1.1242411136627197
+Epoch 20, Loss: 0.8110311031341553
+Epoch 30, Loss: 0.5692870020866394
+Epoch 40, Loss: 0.4209630489349365
+Epoch 50, Loss: 0.3248538374900818
+Epoch 60, Loss: 0.26113781332969666
+Epoch 70, Loss: 0.21650387346744537
+Epoch 80, Loss: 0.18610221147537231
+Epoch 90, Loss: 0.1668386608362198
+Epoch 100, Loss: 0.14597225189208984
+Epoch 110, Loss: 0.1318976879119873
+Epoch 120, Loss: 0.11854227632284164
+Epoch 130, Loss: 0.10960233211517334
+Epoch 140, Loss: 0.1033986285328865
+Epoch 150, Loss: 0.09377503395080566
+Epoch 160, Loss: 0.0883617103099823
+Epoch 170, Loss: 0.08267529308795929
+Epoch 180, Loss: 0.07724378257989883
+Epoch 190, Loss: 0.07083918899297714
+Epoch 200, Loss: 0.06793215125799179
+Epoch 210, Loss: 0.06422117352485657
+Epoch 220, Loss: 0.062037307769060135
+Epoch 230, Loss: 0.05862110108137131
+Epoch 240, Loss: 0.053924866020679474
+Epoch 250, Loss: 0.05180886760354042
+Epoch 260, Loss: 0.04815519228577614
+Epoch 270, Loss: 0.04715685546398163
+Epoch 280, Loss: 0.04497271403670311
+Epoch 290, Loss: 0.04268067330121994
+Test accuracy: 91.14%
+```
+
+The final test accuracy of every run in this portion of the log is tabulated below, in order. The runs fall into blocks with clearly similar convergence behaviour, presumably corresponding to the different hyperparameter configurations being tuned:
+
+| Runs | Test accuracies (%) |
+| --- | --- |
+| 1-9 | 91.28, 91.14, 90.83, 91.05, 90.92, 91.59, 91.37, 91.01, 91.10 |
+| 10-19 | 85.71, 86.03, 85.89, 85.40, 85.63, 86.03, 87.09, 86.87, 86.47, 86.20 |
+| 20-29 | 60.66, 63.86, 63.46, 65.73, 60.88, 63.11, 59.77, 63.42, 63.37, 66.13 |
+| 30-35 | 89.94, 90.16, 90.97, 91.59, 89.27, 90.57 |
+
+The best test accuracy among these runs is 91.59%. The raw log continues with the remaining runs:
+
+Epoch 0, Loss: 1.3871558904647827 +Epoch 10, Loss: 1.047757863998413 +Epoch 20, Loss: 0.6660506725311279 +Epoch 30, Loss: 0.4432315528392792 +Epoch 40, Loss: 0.31964024901390076 +Epoch 50, Loss: 0.24003298580646515 +Epoch 60, Loss: 0.19147908687591553 +Epoch 70, Loss: 0.15789635479450226 +Epoch 80, Loss: 0.13196001946926117 +Epoch 90, Loss: 0.11424954235553741 +Epoch 100, Loss: 
0.09994752705097198 +Epoch 110, Loss: 0.09078388661146164 +Epoch 120, Loss: 0.08044452965259552 +Epoch 130, Loss: 0.07296669483184814 +Epoch 140, Loss: 0.06730815768241882 +Epoch 150, Loss: 0.06034279242157936 +Epoch 160, Loss: 0.0566176101565361 +Epoch 170, Loss: 0.05265028402209282 +Epoch 180, Loss: 0.04775981977581978 +Epoch 190, Loss: 0.04513464495539665 +Epoch 200, Loss: 0.041416607797145844 +Epoch 210, Loss: 0.04010189324617386 +Epoch 220, Loss: 0.03683558478951454 +Epoch 230, Loss: 0.03430496156215668 +Epoch 240, Loss: 0.03156527504324913 +Epoch 250, Loss: 0.030238548293709755 +Epoch 260, Loss: 0.02915024384856224 +Epoch 270, Loss: 0.026880210265517235 +Epoch 280, Loss: 0.026081807911396027 +Epoch 290, Loss: 0.023943206295371056 +Test accuracy: 91.14% +Epoch 0, Loss: 1.386413335800171 +Epoch 10, Loss: 1.048764705657959 +Epoch 20, Loss: 0.6843260526657104 +Epoch 30, Loss: 0.46039268374443054 +Epoch 40, Loss: 0.32699844241142273 +Epoch 50, Loss: 0.24869318306446075 +Epoch 60, Loss: 0.19406282901763916 +Epoch 70, Loss: 0.16181665658950806 +Epoch 80, Loss: 0.1376464068889618 +Epoch 90, Loss: 0.11747945845127106 +Epoch 100, Loss: 0.10445190221071243 +Epoch 110, Loss: 0.09270777553319931 +Epoch 120, Loss: 0.08477930724620819 +Epoch 130, Loss: 0.07784077525138855 +Epoch 140, Loss: 0.06940216571092606 +Epoch 150, Loss: 0.06323472410440445 +Epoch 160, Loss: 0.05836314335465431 +Epoch 170, Loss: 0.054405830800533295 +Epoch 180, Loss: 0.05071453005075455 +Epoch 190, Loss: 0.04591118171811104 +Epoch 200, Loss: 0.04307187721133232 +Epoch 210, Loss: 0.03997289761900902 +Epoch 220, Loss: 0.03783107548952103 +Epoch 230, Loss: 0.03589954972267151 +Epoch 240, Loss: 0.031642649322748184 +Epoch 250, Loss: 0.03053920902311802 +Epoch 260, Loss: 0.03029511868953705 +Epoch 270, Loss: 0.02805154025554657 +Epoch 280, Loss: 0.025976194068789482 +Epoch 290, Loss: 0.025346120819449425 +Test accuracy: 91.63% +Epoch 0, Loss: 1.3834385871887207 +Epoch 10, Loss: 1.067949891090393 +Epoch 20, 
Loss: 0.6924917697906494 +Epoch 30, Loss: 0.44859543442726135 +Epoch 40, Loss: 0.31720036268234253 +Epoch 50, Loss: 0.24212992191314697 +Epoch 60, Loss: 0.19584137201309204 +Epoch 70, Loss: 0.16107097268104553 +Epoch 80, Loss: 0.13758118450641632 +Epoch 90, Loss: 0.1179446130990982 +Epoch 100, Loss: 0.10369428247213364 +Epoch 110, Loss: 0.09283469617366791 +Epoch 120, Loss: 0.0831422358751297 +Epoch 130, Loss: 0.07428879290819168 +Epoch 140, Loss: 0.0676073431968689 +Epoch 150, Loss: 0.06248991936445236 +Epoch 160, Loss: 0.05665847286581993 +Epoch 170, Loss: 0.052359774708747864 +Epoch 180, Loss: 0.04915008321404457 +Epoch 190, Loss: 0.04647364839911461 +Epoch 200, Loss: 0.0414678119122982 +Epoch 210, Loss: 0.03947015851736069 +Epoch 220, Loss: 0.03781025484204292 +Epoch 230, Loss: 0.03484208881855011 +Epoch 240, Loss: 0.032503534108400345 +Epoch 250, Loss: 0.029529675841331482 +Epoch 260, Loss: 0.02908073551952839 +Epoch 270, Loss: 0.027112390846014023 +Epoch 280, Loss: 0.025462275370955467 +Epoch 290, Loss: 0.023507624864578247 +Test accuracy: 90.97% +Epoch 0, Loss: 1.3886669874191284 +Epoch 10, Loss: 1.0352568626403809 +Epoch 20, Loss: 0.6569599509239197 +Epoch 30, Loss: 0.44058194756507874 +Epoch 40, Loss: 0.3156588673591614 +Epoch 50, Loss: 0.23933663964271545 +Epoch 60, Loss: 0.19251661002635956 +Epoch 70, Loss: 0.15509356558322906 +Epoch 80, Loss: 0.13165895640850067 +Epoch 90, Loss: 0.11541791260242462 +Epoch 100, Loss: 0.10022888332605362 +Epoch 110, Loss: 0.08981309831142426 +Epoch 120, Loss: 0.08195148408412933 +Epoch 130, Loss: 0.07381998747587204 +Epoch 140, Loss: 0.06855233013629913 +Epoch 150, Loss: 0.06178320571780205 +Epoch 160, Loss: 0.057345714420080185 +Epoch 170, Loss: 0.052385929971933365 +Epoch 180, Loss: 0.04967733100056648 +Epoch 190, Loss: 0.0456743985414505 +Epoch 200, Loss: 0.04094957187771797 +Epoch 210, Loss: 0.039331212639808655 +Epoch 220, Loss: 0.03772267326712608 +Epoch 230, Loss: 0.03665687516331673 +Epoch 240, Loss: 
0.03380919247865677 +Epoch 250, Loss: 0.03189712390303612 +Epoch 260, Loss: 0.03115144744515419 +Epoch 270, Loss: 0.029183916747570038 +Epoch 280, Loss: 0.026530131697654724 +Epoch 290, Loss: 0.024518776684999466 +Test accuracy: 91.28% +Epoch 0, Loss: 1.385849118232727 +Epoch 10, Loss: 1.3265222311019897 +Epoch 20, Loss: 1.2804220914840698 +Epoch 30, Loss: 1.217090368270874 +Epoch 40, Loss: 1.1472423076629639 +Epoch 50, Loss: 1.0661448240280151 +Epoch 60, Loss: 0.9873985052108765 +Epoch 70, Loss: 0.9079370498657227 +Epoch 80, Loss: 0.836845338344574 +Epoch 90, Loss: 0.769946813583374 +Epoch 100, Loss: 0.7046175599098206 +Epoch 110, Loss: 0.6552135944366455 +Epoch 120, Loss: 0.6092166304588318 +Epoch 130, Loss: 0.567923367023468 +Epoch 140, Loss: 0.5307979583740234 +Epoch 150, Loss: 0.497842937707901 +Epoch 160, Loss: 0.46959102153778076 +Epoch 170, Loss: 0.44535455107688904 +Epoch 180, Loss: 0.42228564620018005 +Epoch 190, Loss: 0.40384694933891296 +Epoch 200, Loss: 0.3784506916999817 +Epoch 210, Loss: 0.36353880167007446 +Epoch 220, Loss: 0.34822753071784973 +Epoch 230, Loss: 0.33366093039512634 +Epoch 240, Loss: 0.3206911087036133 +Epoch 250, Loss: 0.3073151707649231 +Epoch 260, Loss: 0.29586759209632874 +Epoch 270, Loss: 0.28530389070510864 +Epoch 280, Loss: 0.2739761173725128 +Epoch 290, Loss: 0.2650948464870453 +Test accuracy: 87.14% +Epoch 0, Loss: 1.3858861923217773 +Epoch 10, Loss: 1.3249680995941162 +Epoch 20, Loss: 1.2719627618789673 +Epoch 30, Loss: 1.2107160091400146 +Epoch 40, Loss: 1.1379495859146118 +Epoch 50, Loss: 1.0599664449691772 +Epoch 60, Loss: 0.9795934557914734 +Epoch 70, Loss: 0.8991076350212097 +Epoch 80, Loss: 0.828214168548584 +Epoch 90, Loss: 0.762455940246582 +Epoch 100, Loss: 0.7042278051376343 +Epoch 110, Loss: 0.6504243016242981 +Epoch 120, Loss: 0.6010153889656067 +Epoch 130, Loss: 0.5611854791641235 +Epoch 140, Loss: 0.5259332060813904 +Epoch 150, Loss: 0.5019460916519165 +Epoch 160, Loss: 0.46549081802368164 +Epoch 170, Loss: 
0.4431762099266052 +Epoch 180, Loss: 0.4185296297073364 +Epoch 190, Loss: 0.40060579776763916 +Epoch 200, Loss: 0.3813239634037018 +Epoch 210, Loss: 0.36035746335983276 +Epoch 220, Loss: 0.34493857622146606 +Epoch 230, Loss: 0.3274811804294586 +Epoch 240, Loss: 0.3154754638671875 +Epoch 250, Loss: 0.30601146817207336 +Epoch 260, Loss: 0.2942091226577759 +Epoch 270, Loss: 0.2825600802898407 +Epoch 280, Loss: 0.27349263429641724 +Epoch 290, Loss: 0.26169097423553467 +Test accuracy: 89.19% +Epoch 0, Loss: 1.3892686367034912 +Epoch 10, Loss: 1.3264918327331543 +Epoch 20, Loss: 1.2821309566497803 +Epoch 30, Loss: 1.2241543531417847 +Epoch 40, Loss: 1.1527628898620605 +Epoch 50, Loss: 1.0798721313476562 +Epoch 60, Loss: 0.9996446967124939 +Epoch 70, Loss: 0.9248473048210144 +Epoch 80, Loss: 0.8480497002601624 +Epoch 90, Loss: 0.7818760871887207 +Epoch 100, Loss: 0.718585729598999 +Epoch 110, Loss: 0.6661965847015381 +Epoch 120, Loss: 0.6194986701011658 +Epoch 130, Loss: 0.5799052119255066 +Epoch 140, Loss: 0.539110541343689 +Epoch 150, Loss: 0.5114725232124329 +Epoch 160, Loss: 0.475462943315506 +Epoch 170, Loss: 0.4530255198478699 +Epoch 180, Loss: 0.42731788754463196 +Epoch 190, Loss: 0.4062214493751526 +Epoch 200, Loss: 0.3846113383769989 +Epoch 210, Loss: 0.36739209294319153 +Epoch 220, Loss: 0.35021981596946716 +Epoch 230, Loss: 0.3359534740447998 +Epoch 240, Loss: 0.32147854566574097 +Epoch 250, Loss: 0.3079267740249634 +Epoch 260, Loss: 0.29711124300956726 +Epoch 270, Loss: 0.28602662682533264 +Epoch 280, Loss: 0.27718454599380493 +Epoch 290, Loss: 0.2660936415195465 +Test accuracy: 88.65% +Epoch 0, Loss: 1.384501338005066 +Epoch 10, Loss: 1.3229933977127075 +Epoch 20, Loss: 1.2700822353363037 +Epoch 30, Loss: 1.2034366130828857 +Epoch 40, Loss: 1.1290794610977173 +Epoch 50, Loss: 1.0495617389678955 +Epoch 60, Loss: 0.9705623388290405 +Epoch 70, Loss: 0.8948508501052856 +Epoch 80, Loss: 0.8174646496772766 +Epoch 90, Loss: 0.7531384825706482 +Epoch 100, Loss: 
0.691925048828125 +Epoch 110, Loss: 0.6413865685462952 +Epoch 120, Loss: 0.5971706509590149 +Epoch 130, Loss: 0.5585820078849792 +Epoch 140, Loss: 0.5222283005714417 +Epoch 150, Loss: 0.491690456867218 +Epoch 160, Loss: 0.46587130427360535 +Epoch 170, Loss: 0.4378415644168854 +Epoch 180, Loss: 0.4138462543487549 +Epoch 190, Loss: 0.39656391739845276 +Epoch 200, Loss: 0.3765210211277008 +Epoch 210, Loss: 0.3586532175540924 +Epoch 220, Loss: 0.3435365855693817 +Epoch 230, Loss: 0.32702216506004333 +Epoch 240, Loss: 0.3174561560153961 +Epoch 250, Loss: 0.30165717005729675 +Epoch 260, Loss: 0.2932646572589874 +Epoch 270, Loss: 0.2813974618911743 +Epoch 280, Loss: 0.2715732157230377 +Epoch 290, Loss: 0.25849977135658264 +Test accuracy: 87.54% +Epoch 0, Loss: 1.3851324319839478 +Epoch 10, Loss: 1.320092797279358 +Epoch 20, Loss: 1.268241286277771 +Epoch 30, Loss: 1.2031739950180054 +Epoch 40, Loss: 1.1322429180145264 +Epoch 50, Loss: 1.054099678993225 +Epoch 60, Loss: 0.9737205505371094 +Epoch 70, Loss: 0.8967822194099426 +Epoch 80, Loss: 0.8236565589904785 +Epoch 90, Loss: 0.753260612487793 +Epoch 100, Loss: 0.6939094662666321 +Epoch 110, Loss: 0.6459782719612122 +Epoch 120, Loss: 0.6024181246757507 +Epoch 130, Loss: 0.5634509325027466 +Epoch 140, Loss: 0.5252828001976013 +Epoch 150, Loss: 0.4919945001602173 +Epoch 160, Loss: 0.46577879786491394 +Epoch 170, Loss: 0.44149452447891235 +Epoch 180, Loss: 0.4166182279586792 +Epoch 190, Loss: 0.3980611264705658 +Epoch 200, Loss: 0.375694215297699 +Epoch 210, Loss: 0.3593606948852539 +Epoch 220, Loss: 0.34356945753097534 +Epoch 230, Loss: 0.3312913179397583 +Epoch 240, Loss: 0.3176392912864685 +Epoch 250, Loss: 0.30341872572898865 +Epoch 260, Loss: 0.29315653443336487 +Epoch 270, Loss: 0.2826539874076843 +Epoch 280, Loss: 0.2717828154563904 +Epoch 290, Loss: 0.2626603841781616 +Test accuracy: 88.21% +Epoch 0, Loss: 1.3871229887008667 +Epoch 10, Loss: 1.3265448808670044 +Epoch 20, Loss: 1.2789490222930908 +Epoch 30, Loss: 
1.2218691110610962 +Epoch 40, Loss: 1.1504466533660889 +Epoch 50, Loss: 1.0722618103027344 +Epoch 60, Loss: 0.9902675151824951 +Epoch 70, Loss: 0.9098939299583435 +Epoch 80, Loss: 0.8327316641807556 +Epoch 90, Loss: 0.7642946839332581 +Epoch 100, Loss: 0.7034876346588135 +Epoch 110, Loss: 0.6530368328094482 +Epoch 120, Loss: 0.6062459945678711 +Epoch 130, Loss: 0.5642675757408142 +Epoch 140, Loss: 0.5285374522209167 +Epoch 150, Loss: 0.49618154764175415 +Epoch 160, Loss: 0.46786952018737793 +Epoch 170, Loss: 0.44183313846588135 +Epoch 180, Loss: 0.4205295145511627 +Epoch 190, Loss: 0.39491599798202515 +Epoch 200, Loss: 0.37906551361083984 +Epoch 210, Loss: 0.360034316778183 +Epoch 220, Loss: 0.3454146683216095 +Epoch 230, Loss: 0.3279151916503906 +Epoch 240, Loss: 0.31599244475364685 +Epoch 250, Loss: 0.3041403591632843 +Epoch 260, Loss: 0.2919367849826813 +Epoch 270, Loss: 0.28357306122779846 +Epoch 280, Loss: 0.2697754502296448 +Epoch 290, Loss: 0.261039137840271 +Test accuracy: 87.27% +Epoch 0, Loss: 1.3852839469909668 +Epoch 10, Loss: 1.3253865242004395 +Epoch 20, Loss: 1.2769137620925903 +Epoch 30, Loss: 1.2148140668869019 +Epoch 40, Loss: 1.142877221107483 +Epoch 50, Loss: 1.0658490657806396 +Epoch 60, Loss: 0.977547824382782 +Epoch 70, Loss: 0.9026222825050354 +Epoch 80, Loss: 0.8267560005187988 +Epoch 90, Loss: 0.7591168284416199 +Epoch 100, Loss: 0.6957808136940002 +Epoch 110, Loss: 0.6463800668716431 +Epoch 120, Loss: 0.5964411497116089 +Epoch 130, Loss: 0.558790385723114 +Epoch 140, Loss: 0.5214003920555115 +Epoch 150, Loss: 0.4893108308315277 +Epoch 160, Loss: 0.46289941668510437 +Epoch 170, Loss: 0.4353120028972626 +Epoch 180, Loss: 0.41312113404273987 +Epoch 190, Loss: 0.39406055212020874 +Epoch 200, Loss: 0.373324453830719 +Epoch 210, Loss: 0.3556101322174072 +Epoch 220, Loss: 0.34260475635528564 +Epoch 230, Loss: 0.32477256655693054 +Epoch 240, Loss: 0.31148046255111694 +Epoch 250, Loss: 0.30114755034446716 +Epoch 260, Loss: 0.29087668657302856 
+Epoch 270, Loss: 0.2775155305862427 +Epoch 280, Loss: 0.2687210142612457 +Epoch 290, Loss: 0.26033589243888855 +Test accuracy: 87.54% +Epoch 0, Loss: 1.3856257200241089 +Epoch 10, Loss: 1.3256466388702393 +Epoch 20, Loss: 1.2776832580566406 +Epoch 30, Loss: 1.2156682014465332 +Epoch 40, Loss: 1.1452374458312988 +Epoch 50, Loss: 1.0661565065383911 +Epoch 60, Loss: 0.9796164631843567 +Epoch 70, Loss: 0.8962565660476685 +Epoch 80, Loss: 0.825164258480072 +Epoch 90, Loss: 0.7586708068847656 +Epoch 100, Loss: 0.7005423307418823 +Epoch 110, Loss: 0.6493676900863647 +Epoch 120, Loss: 0.6051300764083862 +Epoch 130, Loss: 0.5641951560974121 +Epoch 140, Loss: 0.5275656580924988 +Epoch 150, Loss: 0.4936482608318329 +Epoch 160, Loss: 0.4664749801158905 +Epoch 170, Loss: 0.44268688559532166 +Epoch 180, Loss: 0.41798052191734314 +Epoch 190, Loss: 0.39927762746810913 +Epoch 200, Loss: 0.3765777349472046 +Epoch 210, Loss: 0.3638908565044403 +Epoch 220, Loss: 0.3448803126811981 +Epoch 230, Loss: 0.33119720220565796 +Epoch 240, Loss: 0.31752443313598633 +Epoch 250, Loss: 0.3070186972618103 +Epoch 260, Loss: 0.2946450114250183 +Epoch 270, Loss: 0.2832857072353363 +Epoch 280, Loss: 0.272739976644516 +Epoch 290, Loss: 0.2629532814025879 +Test accuracy: 87.36% +Epoch 0, Loss: 1.388873815536499 +Epoch 10, Loss: 1.3277572393417358 +Epoch 20, Loss: 1.2837543487548828 +Epoch 30, Loss: 1.2224948406219482 +Epoch 40, Loss: 1.152023196220398 +Epoch 50, Loss: 1.0736078023910522 +Epoch 60, Loss: 0.9936288595199585 +Epoch 70, Loss: 0.9127224683761597 +Epoch 80, Loss: 0.8396973609924316 +Epoch 90, Loss: 0.7721556425094604 +Epoch 100, Loss: 0.7081357836723328 +Epoch 110, Loss: 0.6608964800834656 +Epoch 120, Loss: 0.6130333542823792 +Epoch 130, Loss: 0.574974536895752 +Epoch 140, Loss: 0.5355361104011536 +Epoch 150, Loss: 0.5035005211830139 +Epoch 160, Loss: 0.47504085302352905 +Epoch 170, Loss: 0.4513353705406189 +Epoch 180, Loss: 0.42843931913375854 +Epoch 190, Loss: 0.40580978989601135 +Epoch 
200, Loss: 0.3851432800292969 +Epoch 210, Loss: 0.36732885241508484 +Epoch 220, Loss: 0.3518797755241394 +Epoch 230, Loss: 0.3371000289916992 +Epoch 240, Loss: 0.3231557607650757 +Epoch 250, Loss: 0.3081547021865845 +Epoch 260, Loss: 0.29808923602104187 +Epoch 270, Loss: 0.2861308157444 +Epoch 280, Loss: 0.276825875043869 +Epoch 290, Loss: 0.2662350535392761 +Test accuracy: 88.47% +Epoch 0, Loss: 1.3878858089447021 +Epoch 10, Loss: 1.3272334337234497 +Epoch 20, Loss: 1.279119610786438 +Epoch 30, Loss: 1.2191959619522095 +Epoch 40, Loss: 1.1467050313949585 +Epoch 50, Loss: 1.0679579973220825 +Epoch 60, Loss: 0.9867734313011169 +Epoch 70, Loss: 0.9046793580055237 +Epoch 80, Loss: 0.8329952955245972 +Epoch 90, Loss: 0.7644380927085876 +Epoch 100, Loss: 0.7035682201385498 +Epoch 110, Loss: 0.6532503366470337 +Epoch 120, Loss: 0.610675036907196 +Epoch 130, Loss: 0.5674054622650146 +Epoch 140, Loss: 0.5281076431274414 +Epoch 150, Loss: 0.49621036648750305 +Epoch 160, Loss: 0.46644073724746704 +Epoch 170, Loss: 0.44498276710510254 +Epoch 180, Loss: 0.4174004793167114 +Epoch 190, Loss: 0.3989889919757843 +Epoch 200, Loss: 0.3785349428653717 +Epoch 210, Loss: 0.36147597432136536 +Epoch 220, Loss: 0.34295958280563354 +Epoch 230, Loss: 0.3311804533004761 +Epoch 240, Loss: 0.31710538268089294 +Epoch 250, Loss: 0.302785187959671 +Epoch 260, Loss: 0.29433363676071167 +Epoch 270, Loss: 0.2809721827507019 +Epoch 280, Loss: 0.2724763751029968 +Epoch 290, Loss: 0.2607510983943939 +Test accuracy: 88.16% +Epoch 0, Loss: 1.3867794275283813 +Epoch 10, Loss: 1.376668930053711 +Epoch 20, Loss: 1.3683172464370728 +Epoch 30, Loss: 1.3602169752120972 +Epoch 40, Loss: 1.3527424335479736 +Epoch 50, Loss: 1.3471602201461792 +Epoch 60, Loss: 1.340323805809021 +Epoch 70, Loss: 1.3348414897918701 +Epoch 80, Loss: 1.3302451372146606 +Epoch 90, Loss: 1.3233281373977661 +Epoch 100, Loss: 1.3181182146072388 +Epoch 110, Loss: 1.3124592304229736 +Epoch 120, Loss: 1.3046841621398926 +Epoch 130, Loss: 
1.2988942861557007 +Epoch 140, Loss: 1.2913388013839722 +Epoch 150, Loss: 1.2844595909118652 +Epoch 160, Loss: 1.2761948108673096 +Epoch 170, Loss: 1.2686182260513306 +Epoch 180, Loss: 1.261560320854187 +Epoch 190, Loss: 1.2534818649291992 +Epoch 200, Loss: 1.2461392879486084 +Epoch 210, Loss: 1.2372725009918213 +Epoch 220, Loss: 1.2303367853164673 +Epoch 230, Loss: 1.2214299440383911 +Epoch 240, Loss: 1.2129168510437012 +Epoch 250, Loss: 1.2038817405700684 +Epoch 260, Loss: 1.1978306770324707 +Epoch 270, Loss: 1.1883095502853394 +Epoch 280, Loss: 1.1791048049926758 +Epoch 290, Loss: 1.1709686517715454 +Test accuracy: 67.42% +Epoch 0, Loss: 1.385494589805603 +Epoch 10, Loss: 1.3750554323196411 +Epoch 20, Loss: 1.3655152320861816 +Epoch 30, Loss: 1.3574988842010498 +Epoch 40, Loss: 1.3489928245544434 +Epoch 50, Loss: 1.3419299125671387 +Epoch 60, Loss: 1.3359766006469727 +Epoch 70, Loss: 1.3301030397415161 +Epoch 80, Loss: 1.3234121799468994 +Epoch 90, Loss: 1.3172566890716553 +Epoch 100, Loss: 1.310295581817627 +Epoch 110, Loss: 1.3028886318206787 +Epoch 120, Loss: 1.2973763942718506 +Epoch 130, Loss: 1.289670705795288 +Epoch 140, Loss: 1.282389760017395 +Epoch 150, Loss: 1.274135947227478 +Epoch 160, Loss: 1.266817569732666 +Epoch 170, Loss: 1.2591602802276611 +Epoch 180, Loss: 1.2514657974243164 +Epoch 190, Loss: 1.2427494525909424 +Epoch 200, Loss: 1.2337229251861572 +Epoch 210, Loss: 1.2259761095046997 +Epoch 220, Loss: 1.2183630466461182 +Epoch 230, Loss: 1.2081905603408813 +Epoch 240, Loss: 1.1985019445419312 +Epoch 250, Loss: 1.192220687866211 +Epoch 260, Loss: 1.1823307275772095 +Epoch 270, Loss: 1.1737788915634155 +Epoch 280, Loss: 1.1642489433288574 +Epoch 290, Loss: 1.1546896696090698 +Test accuracy: 66.80% +Epoch 0, Loss: 1.3875349760055542 +Epoch 10, Loss: 1.3778610229492188 +Epoch 20, Loss: 1.3692495822906494 +Epoch 30, Loss: 1.3611432313919067 +Epoch 40, Loss: 1.353784203529358 +Epoch 50, Loss: 1.3467525243759155 +Epoch 60, Loss: 1.3397636413574219 
+Epoch 70, Loss: 1.3344160318374634 +Epoch 80, Loss: 1.326788067817688 +Epoch 90, Loss: 1.32123601436615 +Epoch 100, Loss: 1.3155707120895386 +Epoch 110, Loss: 1.3098602294921875 +Epoch 120, Loss: 1.303164005279541 +Epoch 130, Loss: 1.296690583229065 +Epoch 140, Loss: 1.2892265319824219 +Epoch 150, Loss: 1.2824459075927734 +Epoch 160, Loss: 1.2750836610794067 +Epoch 170, Loss: 1.2684162855148315 +Epoch 180, Loss: 1.260772705078125 +Epoch 190, Loss: 1.2519028186798096 +Epoch 200, Loss: 1.244397759437561 +Epoch 210, Loss: 1.2363299131393433 +Epoch 220, Loss: 1.227518916130066 +Epoch 230, Loss: 1.2184327840805054 +Epoch 240, Loss: 1.2121552228927612 +Epoch 250, Loss: 1.2028404474258423 +Epoch 260, Loss: 1.194047451019287 +Epoch 270, Loss: 1.1869583129882812 +Epoch 280, Loss: 1.1751352548599243 +Epoch 290, Loss: 1.1663058996200562 +Test accuracy: 65.60% +Epoch 0, Loss: 1.3868252038955688 +Epoch 10, Loss: 1.3745886087417603 +Epoch 20, Loss: 1.3646256923675537 +Epoch 30, Loss: 1.3555570840835571 +Epoch 40, Loss: 1.3472402095794678 +Epoch 50, Loss: 1.3404853343963623 +Epoch 60, Loss: 1.3331972360610962 +Epoch 70, Loss: 1.326676607131958 +Epoch 80, Loss: 1.3200161457061768 +Epoch 90, Loss: 1.3127110004425049 +Epoch 100, Loss: 1.3060848712921143 +Epoch 110, Loss: 1.2996827363967896 +Epoch 120, Loss: 1.2910174131393433 +Epoch 130, Loss: 1.2847975492477417 +Epoch 140, Loss: 1.2768230438232422 +Epoch 150, Loss: 1.2697203159332275 +Epoch 160, Loss: 1.2609909772872925 +Epoch 170, Loss: 1.253592610359192 +Epoch 180, Loss: 1.2451789379119873 +Epoch 190, Loss: 1.2364780902862549 +Epoch 200, Loss: 1.2279103994369507 +Epoch 210, Loss: 1.2204900979995728 +Epoch 220, Loss: 1.2109463214874268 +Epoch 230, Loss: 1.203569769859314 +Epoch 240, Loss: 1.1947566270828247 +Epoch 250, Loss: 1.1830898523330688 +Epoch 260, Loss: 1.1763516664505005 +Epoch 270, Loss: 1.167993426322937 +Epoch 280, Loss: 1.158808946609497 +Epoch 290, Loss: 1.150697112083435 +Test accuracy: 66.36% +Epoch 0, Loss: 
1.3881527185440063 +Epoch 10, Loss: 1.3774017095565796 +Epoch 20, Loss: 1.3679587841033936 +Epoch 30, Loss: 1.3597453832626343 +Epoch 40, Loss: 1.3521815538406372 +Epoch 50, Loss: 1.345231294631958 +Epoch 60, Loss: 1.3389146327972412 +Epoch 70, Loss: 1.332497477531433 +Epoch 80, Loss: 1.3270409107208252 +Epoch 90, Loss: 1.3209913969039917 +Epoch 100, Loss: 1.3150359392166138 +Epoch 110, Loss: 1.3083736896514893 +Epoch 120, Loss: 1.3003836870193481 +Epoch 130, Loss: 1.2937498092651367 +Epoch 140, Loss: 1.2867255210876465 +Epoch 150, Loss: 1.2793020009994507 +Epoch 160, Loss: 1.2716150283813477 +Epoch 170, Loss: 1.2648067474365234 +Epoch 180, Loss: 1.2556291818618774 +Epoch 190, Loss: 1.2481666803359985 +Epoch 200, Loss: 1.2384443283081055 +Epoch 210, Loss: 1.2326347827911377 +Epoch 220, Loss: 1.222812533378601 +Epoch 230, Loss: 1.214564561843872 +Epoch 240, Loss: 1.206665277481079 +Epoch 250, Loss: 1.1979613304138184 +Epoch 260, Loss: 1.1904103755950928 +Epoch 270, Loss: 1.1805342435836792 +Epoch 280, Loss: 1.1736000776290894 +Epoch 290, Loss: 1.1630973815917969 +Test accuracy: 69.78% +Epoch 0, Loss: 1.3903766870498657 +Epoch 10, Loss: 1.3787014484405518 +Epoch 20, Loss: 1.3699434995651245 +Epoch 30, Loss: 1.3617647886276245 +Epoch 40, Loss: 1.3538963794708252 +Epoch 50, Loss: 1.3481515645980835 +Epoch 60, Loss: 1.3414578437805176 +Epoch 70, Loss: 1.3366855382919312 +Epoch 80, Loss: 1.329655647277832 +Epoch 90, Loss: 1.324479341506958 +Epoch 100, Loss: 1.3185538053512573 +Epoch 110, Loss: 1.3129019737243652 +Epoch 120, Loss: 1.3044853210449219 +Epoch 130, Loss: 1.2989052534103394 +Epoch 140, Loss: 1.2929084300994873 +Epoch 150, Loss: 1.2839481830596924 +Epoch 160, Loss: 1.2777001857757568 +Epoch 170, Loss: 1.2693812847137451 +Epoch 180, Loss: 1.2617483139038086 +Epoch 190, Loss: 1.2545360326766968 +Epoch 200, Loss: 1.246327519416809 +Epoch 210, Loss: 1.2367959022521973 +Epoch 220, Loss: 1.2292941808700562 +Epoch 230, Loss: 1.2203010320663452 +Epoch 240, Loss: 
1.2102550268173218 +Epoch 250, Loss: 1.2041304111480713 +Epoch 260, Loss: 1.195600152015686 +Epoch 270, Loss: 1.186301827430725 +Epoch 280, Loss: 1.1765146255493164 +Epoch 290, Loss: 1.1679247617721558 +Test accuracy: 68.00% +Epoch 0, Loss: 1.3888648748397827 +Epoch 10, Loss: 1.37844979763031 +Epoch 20, Loss: 1.3701868057250977 +Epoch 30, Loss: 1.3616294860839844 +Epoch 40, Loss: 1.3539979457855225 +Epoch 50, Loss: 1.3466511964797974 +Epoch 60, Loss: 1.3398425579071045 +Epoch 70, Loss: 1.3336836099624634 +Epoch 80, Loss: 1.3264548778533936 +Epoch 90, Loss: 1.320734977722168 +Epoch 100, Loss: 1.3143736124038696 +Epoch 110, Loss: 1.3072158098220825 +Epoch 120, Loss: 1.3025093078613281 +Epoch 130, Loss: 1.2948111295700073 +Epoch 140, Loss: 1.2889394760131836 +Epoch 150, Loss: 1.2815979719161987 +Epoch 160, Loss: 1.2732720375061035 +Epoch 170, Loss: 1.265243649482727 +Epoch 180, Loss: 1.2577362060546875 +Epoch 190, Loss: 1.2508089542388916 +Epoch 200, Loss: 1.2424347400665283 +Epoch 210, Loss: 1.23204505443573 +Epoch 220, Loss: 1.224858283996582 +Epoch 230, Loss: 1.2149536609649658 +Epoch 240, Loss: 1.2063817977905273 +Epoch 250, Loss: 1.195885419845581 +Epoch 260, Loss: 1.189403772354126 +Epoch 270, Loss: 1.1797205209732056 +Epoch 280, Loss: 1.1700397729873657 +Epoch 290, Loss: 1.1620997190475464 +Test accuracy: 67.33% +Epoch 0, Loss: 1.3865242004394531 +Epoch 10, Loss: 1.3752286434173584 +Epoch 20, Loss: 1.365169644355774 +Epoch 30, Loss: 1.3568527698516846 +Epoch 40, Loss: 1.3497159481048584 +Epoch 50, Loss: 1.3433306217193604 +Epoch 60, Loss: 1.3366672992706299 +Epoch 70, Loss: 1.3313441276550293 +Epoch 80, Loss: 1.325051188468933 +Epoch 90, Loss: 1.3185020685195923 +Epoch 100, Loss: 1.3125321865081787 +Epoch 110, Loss: 1.3057215213775635 +Epoch 120, Loss: 1.299203872680664 +Epoch 130, Loss: 1.2920863628387451 +Epoch 140, Loss: 1.2854795455932617 +Epoch 150, Loss: 1.2788721323013306 +Epoch 160, Loss: 1.2699272632598877 +Epoch 170, Loss: 1.263468623161316 +Epoch 
180, Loss: 1.2553989887237549 +Epoch 190, Loss: 1.2480347156524658 +Epoch 200, Loss: 1.2396881580352783 +Epoch 210, Loss: 1.2325763702392578 +Epoch 220, Loss: 1.2215306758880615 +Epoch 230, Loss: 1.2134813070297241 +Epoch 240, Loss: 1.2057102918624878 +Epoch 250, Loss: 1.197303295135498 +Epoch 260, Loss: 1.1869391202926636 +Epoch 270, Loss: 1.1774864196777344 +Epoch 280, Loss: 1.168349027633667 +Epoch 290, Loss: 1.1604793071746826 +Test accuracy: 65.95% +Epoch 0, Loss: 1.384558916091919 +Epoch 10, Loss: 1.3741422891616821 +Epoch 20, Loss: 1.3645217418670654 +Epoch 30, Loss: 1.3560175895690918 +Epoch 40, Loss: 1.3491029739379883 +Epoch 50, Loss: 1.3425869941711426 +Epoch 60, Loss: 1.3366934061050415 +Epoch 70, Loss: 1.3306400775909424 +Epoch 80, Loss: 1.3250110149383545 +Epoch 90, Loss: 1.3190463781356812 +Epoch 100, Loss: 1.3127683401107788 +Epoch 110, Loss: 1.306785225868225 +Epoch 120, Loss: 1.2991727590560913 +Epoch 130, Loss: 1.2921305894851685 +Epoch 140, Loss: 1.2852638959884644 +Epoch 150, Loss: 1.279348373413086 +Epoch 160, Loss: 1.2713042497634888 +Epoch 170, Loss: 1.2642701864242554 +Epoch 180, Loss: 1.2549188137054443 +Epoch 190, Loss: 1.2473379373550415 +Epoch 200, Loss: 1.2397106885910034 +Epoch 210, Loss: 1.2311936616897583 +Epoch 220, Loss: 1.2233930826187134 +Epoch 230, Loss: 1.2148624658584595 +Epoch 240, Loss: 1.2053266763687134 +Epoch 250, Loss: 1.1984001398086548 +Epoch 260, Loss: 1.1892021894454956 +Epoch 270, Loss: 1.1785937547683716 +Epoch 280, Loss: 1.170796275138855 +Epoch 290, Loss: 1.1631505489349365 +Test accuracy: 67.47% +Epoch 0, Loss: 1.386955738067627 +Epoch 10, Loss: 1.375248670578003 +Epoch 20, Loss: 1.3653608560562134 +Epoch 30, Loss: 1.3561418056488037 +Epoch 40, Loss: 1.3486380577087402 +Epoch 50, Loss: 1.341729998588562 +Epoch 60, Loss: 1.3353455066680908 +Epoch 70, Loss: 1.3288049697875977 +Epoch 80, Loss: 1.322670340538025 +Epoch 90, Loss: 1.3161109685897827 +Epoch 100, Loss: 1.3098984956741333 +Epoch 110, Loss: 
1.302764892578125 +Epoch 120, Loss: 1.2952282428741455 +Epoch 130, Loss: 1.2896045446395874 +Epoch 140, Loss: 1.2808841466903687 +Epoch 150, Loss: 1.2728548049926758 +Epoch 160, Loss: 1.2653076648712158 +Epoch 170, Loss: 1.2589432001113892 +Epoch 180, Loss: 1.2505241632461548 +Epoch 190, Loss: 1.242138385772705 +Epoch 200, Loss: 1.2352077960968018 +Epoch 210, Loss: 1.2268190383911133 +Epoch 220, Loss: 1.2183279991149902 +Epoch 230, Loss: 1.20862877368927 +Epoch 240, Loss: 1.1997631788253784 +Epoch 250, Loss: 1.1920052766799927 +Epoch 260, Loss: 1.1831735372543335 +Epoch 270, Loss: 1.1740572452545166 +Epoch 280, Loss: 1.166300892829895 +Epoch 290, Loss: 1.1556040048599243 +Test accuracy: 70.18% +Best parameters: {'epochs': 300, 'hidden_features': 64, 'learning_rate': 0.01} +Best accuracy: 91.0458388963062 \ No newline at end of file diff --git a/recognition/multi-layer_GCN_model_s4696681/predict.py b/recognition/multi-layer_GCN_model_s4696681/predict.py new file mode 100644 index 000000000..0d24cf8c6 --- /dev/null +++ b/recognition/multi-layer_GCN_model_s4696681/predict.py @@ -0,0 +1,39 @@ +import train +import torch +import dataset as dataset +from sklearn.manifold import TSNE +import matplotlib.pyplot as plt +import modules +import numpy as np +from scipy.linalg import sqrtm + +def main(): + # Train the model using train.py and get the best model, get evaluation of model and the parameters of the best model + model, best_parameters, best_accuracy = train.train_model() + + print(f"Best Model Parameters: {best_parameters}") + print(f"Test accuracy with best model: {best_accuracy:.2f}%") + torch.save(model.state_dict(), "trained_model.pth") + + + # Visualisation TSNE after model is trained + model.eval() + with torch.no_grad(): + # Do a forward pass to compute embeddings + _ = model(dataset.all_features_tensor, dataset.adjacency_normed_tensor) + embeddings = model.get_embeddings().cpu().numpy() + + tsne = TSNE(n_components=2, random_state=99) + embeddings_2d = 
tsne.fit_transform(embeddings) + + number_of_classes = 4 + + plt.figure(figsize=(10, 8)) + for label in range(number_of_classes): + indices = np.where(dataset.node_labels[:, 1] == label) + plt.scatter(embeddings_2d[indices, 0], embeddings_2d[indices, 1], label=str(label), s=5) + plt.legend() + plt.title('t-SNE visualization of GCN embeddings') + plt.show() +if __name__ == "__main__": + main() diff --git a/recognition/multi-layer_GCN_model_s4696681/train.py b/recognition/multi-layer_GCN_model_s4696681/train.py new file mode 100644 index 000000000..155e94120 --- /dev/null +++ b/recognition/multi-layer_GCN_model_s4696681/train.py @@ -0,0 +1,107 @@ +import modules as modules +import dataset as dataset +import torch +import torch.nn.functional as F +from sklearn.model_selection import KFold, ParameterGrid +import numpy as np + +device = torch.device("cuda" if torch.cuda.is_available() else "cpu") + +"""Initialise initial model variables""" +feature_vectors = dataset.create_feature_vectors() +node_labels = dataset.convert_labels() +all_features_tensor, train_labels_tensor, test_labels_tensor, train_tensor, test_tensor, test_mask = dataset.create_tensors() +adjacency_normed_tensor = torch.FloatTensor(dataset.adjacency_normed).to(device) + + +"""Creates the tensors for each split of the nested cross validaton""" +def create_new_tensors(train_indices, test_indices): + node_ids = feature_vectors[:, 0] + + train_mask = np.isin(node_ids, train_indices) + test_mask = np.isin(node_ids, test_indices) + + train_labels_tensor = torch.LongTensor(node_labels[train_mask, 1]).to(device) + test_labels_tensor = torch.LongTensor(node_labels[test_mask, 1]).to(device) + + train_tensor = torch.BoolTensor(train_mask).to(device) + test_tensor = torch.BoolTensor(test_mask).to(device) + + train_mask_tensor = torch.BoolTensor(train_mask).to(device) + test_mask_tensor = torch.BoolTensor(test_mask).to(device) + return train_tensor, test_tensor, train_labels_tensor, test_labels_tensor, 
train_mask_tensor, test_mask_tensor + +out_features = 4 +param_grid = { + 'learning_rate': [0.01, 0.001, 0.0001], + 'epochs': [100, 200, 300], + 'hidden_features': [32, 64, 128] +} +parameter_combinations = list(ParameterGrid(param_grid)) + +"""runs train and evaluate for a certain combination of hyperparameters and train test split + This is called by the train_model() function which conducts the nested cross validation""" +def train_and_evaluate(model, train_tensor1, test_tensor1, train_labels_tensor1, test_labels_tensor1, train_mask_tensor1, test_mask_tensor1, parameters): + learning_rate = parameters['learning_rate'] + epochs = parameters['epochs'] + hidden_features = parameters['hidden_features'] + optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate) + + # Training of model + model.train() + for epoch in range(epochs): + optimizer.zero_grad() + out = model(all_features_tensor, adjacency_normed_tensor) + loss = F.nll_loss(out[train_tensor1], train_labels_tensor1) + + loss.backward() + optimizer.step() + + if epoch % 10 == 0: + print(f"Epoch {epoch}, Loss: {loss.item()}") + + # Evaluation of model + model.eval() + with torch.no_grad(): + test_out = model(all_features_tensor, adjacency_normed_tensor) + pred = test_out[test_mask_tensor1].argmax(dim=1) + correct = (pred == test_labels_tensor1).sum().item() + acc = correct / test_labels_tensor1.size(0) + print(f"Test accuracy: {acc * 100:.2f}%") + test_accuracy = acc*100 + + return test_accuracy + +"""Runs 10-fold nested cross validation on GCN model to train model with different + hyperparameters, evaluates each iteration of the model and returns best model and its hyperparameters and accuracy""" +def train_model(): + k_folds = 10 + kf = KFold(n_splits=k_folds, shuffle=True) + + best_parameters = None + best_accuracy = -1 + best_model = None + + for parameters in parameter_combinations: + fold_accuracies = [] + + for train_indices, test_indices in kf.split(all_features_tensor): + model = 
modules.GCN(in_features=all_features_tensor.shape[1], hidden_features=parameters['hidden_features'], out_features=out_features).to(device) + train_tensor1, test_tensor1, train_labels_tensor1, test_labels_tensor1, train_mask_tensor1, test_mask_tensor1 = create_new_tensors(train_indices, test_indices) + accuracy = train_and_evaluate(model, train_tensor1, test_tensor1, train_labels_tensor1, test_labels_tensor1, train_mask_tensor1, test_mask_tensor1, parameters) + fold_accuracies.append(accuracy) + + mean_accuracy = sum(fold_accuracies) / len(fold_accuracies) + + if mean_accuracy > best_accuracy: + best_parameters = parameters + best_accuracy = mean_accuracy + best_model = model + + return best_model, best_parameters, best_accuracy + + +if __name__ == "__main__": + model, best_parameters, best_accuracy = train_model() + print(f"Best parameters: {best_parameters}") + print(f"Best accuracy: {best_accuracy}") diff --git a/recognition/multi-layer_GCN_model_s4696681/tsne_after.png b/recognition/multi-layer_GCN_model_s4696681/tsne_after.png new file mode 100644 index 000000000..75607454a Binary files /dev/null and b/recognition/multi-layer_GCN_model_s4696681/tsne_after.png differ diff --git a/recognition/multi-layer_GCN_model_s4696681/tsne_before.png b/recognition/multi-layer_GCN_model_s4696681/tsne_before.png new file mode 100644 index 000000000..9cd9b497e Binary files /dev/null and b/recognition/multi-layer_GCN_model_s4696681/tsne_before.png differ
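`dataset.py` is referenced by `train.py` (`dataset.adjacency_normed`) and by the README's instructions but is not part of this diff. The normalisation step it performs is, by the README's description, the standard symmetric GCN normalisation Â = D^(-1/2)(A + I)D^(-1/2); the memory pressure the README warns about comes from materialising the full dense N×N adjacency matrix. The sketch below shows that normalisation with sparse matrices instead, which keeps memory proportional to the number of edges. The function name `normalise_adjacency` and the `(source, target)` edge-array format are assumptions for illustration, not the repository's actual code.

```python
import numpy as np
import scipy.sparse as sp

def normalise_adjacency(edges, num_nodes):
    """Symmetric GCN normalisation D^-1/2 (A + I) D^-1/2, kept sparse throughout."""
    # Build a sparse adjacency matrix from (source, target) edge pairs
    rows, cols = edges[:, 0], edges[:, 1]
    data = np.ones(len(rows), dtype=np.float32)
    adj = sp.coo_matrix((data, (rows, cols)), shape=(num_nodes, num_nodes))
    # Symmetrise (mutual likes are undirected) and add self-loops
    adj = adj.maximum(adj.T) + sp.identity(num_nodes, dtype=np.float32, format="coo")
    # Self-loops guarantee every degree is >= 1, so the inverse square root is safe
    deg = np.asarray(adj.sum(axis=1)).flatten()
    d_inv_sqrt = sp.diags(1.0 / np.sqrt(deg))
    return (d_inv_sqrt @ adj @ d_inv_sqrt).tocsr()
```

A sparse result would still need to be densified (or converted to a `torch.sparse` tensor) before the `torch.FloatTensor(...)` call in `train.py`, so this only addresses the intermediate computation, not the final tensor.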
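`modules.py` is also outside this diff, so the GCN architecture itself is not visible here. Purely for context, a minimal two-layer GCN consistent with how the other scripts invoke it — `model(all_features_tensor, adjacency_normed_tensor)` returning log-probabilities for `F.nll_loss` in `train.py`, and a `get_embeddings()` hook used by `predict.py` — could look like the following sketch. The layer structure and names are assumptions, not the repository's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GCNLayer(nn.Module):
    """One graph convolution: H' = A_hat @ H @ W (A_hat is the normalised adjacency)."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x, adj_normed):
        return adj_normed @ self.linear(x)

class GCN(nn.Module):
    """Two-layer GCN with log-softmax output, matching the F.nll_loss in train.py."""
    def __init__(self, in_features, hidden_features, out_features):
        super().__init__()
        self.gc1 = GCNLayer(in_features, hidden_features)
        self.gc2 = GCNLayer(hidden_features, out_features)
        self.embeddings = None

    def forward(self, x, adj_normed):
        h = F.relu(self.gc1(x, adj_normed))
        # Cache the hidden representation for t-SNE, cf. get_embeddings() in predict.py
        self.embeddings = h.detach()
        return F.log_softmax(self.gc2(h, adj_normed), dim=1)

    def get_embeddings(self):
        return self.embeddings
```

With this shape, `out[train_tensor1]` in `train.py` selects the log-probability rows of the training nodes, which is exactly what `F.nll_loss` expects.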