Deep Learning: Backpropagation Algorithm [2026/03/02 21:37] (current version) – 张叶安
==== Fill-in-the-Blank Questions ====

6. In gradient descent, the parameter update rule is $\theta_{t+1} = \theta_t - \eta$ ______.

7. In backpropagation, the quantity $\boldsymbol{\delta}^{(l)} = \frac{\partial \mathcal{L}}{\partial \mathbf{z}^{(l)}}$ is called the ______ term.

8. The derivative of the sigmoid function can be written as $\sigma'(z) = $ ______.

9. In momentum optimization, the momentum coefficient $\gamma$ is typically set to ______.

10. The two main forms of gradient clipping are ______ clipping and ______ clipping.
==== Computation Problems ====
1. **Answer: B**

Explanation: Backpropagation applies the chain rule to compute gradients efficiently.

2. **Answer: C**

Explanation: ReLU has zero gradient on the negative interval, so the "dying ReLU" problem can still occur.

3. **Answer: A**

Explanation: $\mathbf{m}_t$ is the first moment (the mean of the gradients) and $\mathbf{v}_t$ is the second moment (the variance of the gradients).

4. **Answer: C**

Explanation: ReLU mainly mitigates vanishing gradients; it has little effect on exploding gradients.

5. **Answer: B**

Explanation: Mini-batch sizes are typically 32–512, balancing computational efficiency against gradient accuracy.
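Two of the facts used in these explanations, the sigmoid derivative identity and ReLU's zero gradient on the negative interval, can be sanity-checked numerically (a minimal sketch; the function names are illustrative):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_prime(z):
    # Analytic identity: sigma'(z) = sigma(z) * (1 - sigma(z))
    s = sigmoid(z)
    return s * (1.0 - s)

def relu_grad(z):
    # Zero gradient for z <= 0: a neuron stuck there receives no updates ("dying ReLU")
    return 1.0 if z > 0 else 0.0

# Compare the identity against a central finite difference at z = -0.1
eps = 1e-6
numeric = (sigmoid(-0.1 + eps) - sigmoid(-0.1 - eps)) / (2 * eps)
```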
==== Fill-in-the-Blank Answers ====
11. **Solution**:

**Forward pass**:

$$z = 0.5 \times 1 + (-0.3) \times 2 = 0.5 - 0.6 = -0.1$$

$$\hat{y} = \sigma(-0.1) = \frac{1}{1 + e^{0.1}} \approx \frac{1}{1.105} \approx 0.475$$

**Backward pass** (with label $y = 1$, so the $(1-y)$ term vanishes):

$$\frac{\partial \mathcal{L}}{\partial \hat{y}} = -\frac{y}{\hat{y}} + \frac{1-y}{1-\hat{y}} = -\frac{1}{0.475} \approx -2.105$$

$$\frac{\partial \hat{y}}{\partial z} = \sigma(-0.1)(1-\sigma(-0.1)) = 0.475 \times 0.525 \approx 0.249$$

$$\frac{\partial z}{\partial w_1} = x_1 = 1, \quad \frac{\partial z}{\partial w_2} = x_2 = 2$$

$$\frac{\partial \mathcal{L}}{\partial w_1} = -2.105 \times 0.249 \times 1 \approx -0.524$$

$$\frac{\partial \mathcal{L}}{\partial w_2} = -2.105 \times 0.249 \times 2 \approx -1.048$$
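The hand computation in Problem 11 can be replayed in a few lines of Python (a sketch; it assumes label $y = 1$ with binary cross-entropy loss, which is what makes the $(1-y)$ term of the derivative vanish):

```python
import math

w1, w2, x1, x2, y = 0.5, -0.3, 1.0, 2.0, 1.0   # y = 1 assumed, as in the derivative above

# Forward pass
z = w1 * x1 + w2 * x2                  # -0.1
y_hat = 1.0 / (1.0 + math.exp(-z))     # sigma(-0.1) ~ 0.475

# Backward pass via the chain rule
dL_dyhat = -y / y_hat + (1 - y) / (1 - y_hat)   # ~ -2.105
dyhat_dz = y_hat * (1 - y_hat)                  # ~ 0.249
dL_dw1 = dL_dyhat * dyhat_dz * x1               # ~ -0.525
dL_dw2 = dL_dyhat * dyhat_dz * x2               # ~ -1.050
```

Note the product collapses exactly to $\partial \mathcal{L} / \partial z = \hat{y} - y \approx -0.525$, so the sheet's $-0.524$ and $-1.048$ reflect rounding the intermediate factors to three decimals.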
12. **Solution**:

**First moment**:

$$\mathbf{m}_t = 0.9 \times [0.1, 0.1] + 0.1 \times [0.2, -0.3]$$

$$= [0.09, 0.09] + [0.02, -0.03] = [0.11, 0.06]$$

**Second moment**:

$$\mathbf{v}_t = 0.999 \times [0.01, 0.01] + 0.001 \times [0.04, 0.09]$$

$$= [0.00999, 0.00999] + [0.00004, 0.00009] = [0.01003, 0.01008]$$

**Bias correction** (at step $t = 10$):

$$1 - \beta_1^t = 1 - 0.9^{10} = 1 - 0.349 = 0.651$$

$$1 - \beta_2^t = 1 - 0.999^{10} = 1 - 0.990 = 0.010$$

$$\hat{\mathbf{m}}_t = \frac{[0.11, 0.06]}{0.651} \approx [0.169, 0.092]$$

$$\hat{\mathbf{v}}_t = \frac{[0.01003, 0.01008]}{0.010} \approx [1.003, 1.008]$$
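Problem 12's moment updates can likewise be checked in code. This is a sketch of the standard Adam moment formulas only (no parameter step); note the sheet rounds $1 - \beta_2^t$ to $0.010$, whereas the unrounded value is $\approx 0.00996$, which shifts $\hat{\mathbf{v}}_t$ slightly:

```python
beta1, beta2, t = 0.9, 0.999, 10
m_prev, v_prev, g = [0.1, 0.1], [0.01, 0.01], [0.2, -0.3]

# Moment updates: m <- b1*m + (1-b1)*g,  v <- b2*v + (1-b2)*g^2
m = [beta1 * mp + (1 - beta1) * gi for mp, gi in zip(m_prev, g)]        # [0.11, 0.06]
v = [beta2 * vp + (1 - beta2) * gi ** 2 for vp, gi in zip(v_prev, g)]   # [0.01003, 0.01008]

# Bias correction with exact denominators
m_hat = [mi / (1 - beta1 ** t) for mi in m]   # ~ [0.169, 0.092]
v_hat = [vi / (1 - beta2 ** t) for vi in v]   # ~ [1.0075, 1.0125]; sheet gets [1.003, 1.008]
```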
| - | --- | ||
| - | **本章完** | ||