The Rise of Convolutional Neural Networks and Deep Learning

### The Rise of Convolutional Neural Networks and Deep Learning
#### 1. Convolutional Neural Network Basics
Convolutional neural networks (CNNs) excel at processing multidimensional data such as images. Consider a concrete convolutional layer: it has 16 filters, each a 3×3 square. Keras initializes the filter values automatically, just as it initializes the weights of a fully connected network, and the layer's activation function is typically ReLU.
The following code builds a convolutional neural network for the CIFAR-10 dataset:
```python
import numpy as np
from keras.models import Sequential
from keras.layers import Conv2D, Dropout, Dense
from keras.layers import BatchNormalization, Flatten
from keras.optimizers import Adam
from keras.utils import to_categorical
from keras.datasets import cifar10
(X_train_raw, Y_train_raw), (X_test_raw, Y_test_raw) = cifar10.load_data()
X_train = X_train_raw / 255
X_test_all = X_test_raw / 255
X_validation, X_test = np.split(X_test_all, 2)
Y_train = to_categorical(Y_train_raw)
Y_validation, Y_test = np.split(to_categorical(Y_test_raw), 2)
model = Sequential()
model.add(Conv2D(16, (3, 3), activation='relu', input_shape=(32, 32, 3)))
model.add(BatchNormalization())
model.add(Dropout(0.5))
model.add(Conv2D(32, (3, 3), activation='relu'))
model.add(BatchNormalization())
model.add(Dropout(0.5))
model.add(Flatten())
model.add(Dense(1000, activation='relu'))
model.add(BatchNormalization())
model.add(Dropout(0.5))
model.add(Dense(512, activation='relu'))
model.add(BatchNormalization())
model.add(Dropout(0.5))
model.add(Dense(10, activation='softmax'))
model.compile(loss='categorical_crossentropy',
              optimizer=Adam(),
              metrics=['accuracy'])
history = model.fit(X_train, Y_train,
                    validation_data=(X_validation, Y_validation),
                    epochs=10, batch_size=32)  # illustrative values
```
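As a sanity check on the first layer described above, a 16-filter 3×3 convolution applied with no padding and stride 1 ("valid" convolution) to a 32×32×3 CIFAR-10 image yields a 30×30×16 output and (3·3·3 + 1)·16 = 448 trainable parameters. A minimal sketch of that arithmetic (the helper names here are illustrative, not part of Keras):

```python
def conv2d_output_shape(h, w, kernel, filters, stride=1, padding=0):
    """Spatial output size of a convolution: floor((n + 2p - k) / s) + 1 per axis."""
    out_h = (h + 2 * padding - kernel) // stride + 1
    out_w = (w + 2 * padding - kernel) // stride + 1
    return out_h, out_w, filters

def conv2d_param_count(kernel, in_channels, filters):
    """Each filter has kernel*kernel*in_channels weights plus one bias."""
    return (kernel * kernel * in_channels + 1) * filters

print(conv2d_output_shape(32, 32, kernel=3, filters=16))        # (30, 30, 16)
print(conv2d_param_count(kernel=3, in_channels=3, filters=16))  # 448
```

These numbers match what `model.summary()` reports for the first `Conv2D` layer once the model has been built.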