No. | Activation function | Epochs | No. of layers | Dropout | Training loss | Training accuracy | Test loss | Test accuracy |
---|---|---|---|---|---|---|---|---|
1 | ReLU | 5 | 18 | 0.25 | 0.3356 | 0.8668 | 0.3266 | 0.8654 |
2 | ReLU | 20 | 18 | 0.25 | 0.0891 | 0.9562 | 0.2104 | 0.9314 |
3 | ReLU | 50 | 18 | 0.25 | 0.0526 | 0.9851 | 0.6082 | 0.9111 |
4 | ReLU | 5 | 24 | 0.25 | 0.3510 | 0.8524 | 0.3188 | 0.8705 |
5 | ReLU | 20 | 24 | 0.25 | 0.0798 | 0.9251 | 0.1958 | 0.9314 |
6 | ReLU | 50 | 24 | 0.25 | 0.0628 | 0.9982 | 0.4614 | 0.8908 |
7 | ReLU | 5 | 18 | 0.5 | 0.3752 | 0.8226 | 0.3225 | 0.8705 |
8 | ReLU | 20 | 18 | 0.5 | 0.2151 | 0.9198 | 0.2514 | 0.9213 |
9 | Sigmoid | 20 | 18 | 0.25 | 0.7456 | 0.4659 | 0.7285 | 0.4785 |
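The sketch below shows how one row of the table might be reproduced, assuming a Keras-style stack of fully connected layers with dropout after each hidden layer. Only the activation, epoch count, number of layers, and dropout rate come from the table; the layer width, input dimension, optimizer, loss function, single-sigmoid output head, and the synthetic placeholder data are all assumptions made for illustration.

```python
import numpy as np
import tensorflow as tf

def build_model(activation="relu", n_layers=18, dropout=0.25,
                input_dim=32, width=64):
    """Dense+Dropout stack matching one table configuration.

    width and input_dim are illustrative assumptions; the paper
    specifies only activation, epochs, layer count, and dropout.
    """
    model = tf.keras.Sequential([tf.keras.layers.Input(shape=(input_dim,))])
    for _ in range(n_layers):
        model.add(tf.keras.layers.Dense(width, activation=activation))
        model.add(tf.keras.layers.Dropout(dropout))
    # Binary classification head is assumed from the accuracy values.
    model.add(tf.keras.layers.Dense(1, activation="sigmoid"))
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Placeholder random data standing in for the unspecified train/test split.
rng = np.random.default_rng(0)
x_train, y_train = rng.normal(size=(1000, 32)), rng.integers(0, 2, 1000)
x_test,  y_test  = rng.normal(size=(200, 32)),  rng.integers(0, 2, 200)

# Configuration of row 2: ReLU, 20 epochs, 18 layers, dropout 0.25.
model = build_model(activation="relu", n_layers=18, dropout=0.25)
model.fit(x_train, y_train, epochs=20, verbose=0)
test_loss, test_acc = model.evaluate(x_test, y_test, verbose=0)
print(f"test loss {test_loss:.4f}, test accuracy {test_acc:.4f}")
```

With real data in place of the random placeholders, varying `activation`, `n_layers`, `dropout`, and `epochs` over the values in the table would regenerate the full sweep.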