Early perceptron models and multilayer neural networks had only one to four layers, with network parameters on the order of tens of thousands. With the development of deep learning and the improvement of computing power, models such as AlexNet (8 layers), VGG16 (16 layers), GoogLeNet (22 layers), ResNet50 (50 layers), and DenseNet121 (121 layers) were proposed successively, while the size of the input images also gradually grew from 28 × 28 to 224 × 224, to 299 × 299, and even larger. These changes pushed the total number of network parameters to the level of tens of millions, as shown in Figure 1-13.
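As a quick illustration of these parameter counts, the sketch below builds several of the architectures mentioned above with `tf.keras.applications` and prints their sizes. This is an assumption-laden example, not part of the original text: `weights=None` is used so the architectures are constructed without downloading pretrained weights, and InceptionV3 stands in for GoogLeNet (its direct successor in the Inception family), since GoogLeNet itself is not included in `tf.keras.applications`.

```python
# Illustrative sketch: count the parameters of several classic CNNs.
# weights=None builds each architecture without downloading weights.
import tensorflow as tf

models = {
    "VGG16 (16 layers, 224x224 input)": tf.keras.applications.VGG16(weights=None),
    "ResNet50 (50 layers, 224x224 input)": tf.keras.applications.ResNet50(weights=None),
    "InceptionV3 (299x299 input)": tf.keras.applications.InceptionV3(weights=None),
    "DenseNet121 (121 layers, 224x224 input)": tf.keras.applications.DenseNet121(weights=None),
}

for name, model in models.items():
    # count_params() sums the sizes of all trainable and
    # non-trainable variables in the model.
    print(f"{name}: {model.count_params():,} parameters")
```

Running this confirms the trend described above: VGG16 alone has well over a hundred million parameters, while the deeper but more parameter-efficient ResNet50 and DenseNet121 still land in the tens-of-millions and millions range.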
Increasing the network scale enhances the capacity of the neural network correspondingly, so that the network can learn more complex data patterns and model performance improves accordingly. On the other hand, a larger network also means that more training data and more computational power are needed to avoid overfitting.