[Andrew Ng Deep Learning] Complete Notes [Updating...]

Published: 2023-01-04

1.1 Welcome

  1. Neural Networks and Deep Learning.
  2. Improving Deep Neural Networks: Hyper-parameter tuning, Regularization and Optimization.
  3. Structuring your Machine Learning project.
  4. Convolutional Neural Networks.
  5. Natural Language Processing: Building sequence models.

1.2 What Is a Neural Network?

1.3 Supervised Learning with Neural Networks


1.4 Why Is Deep Learning Taking Off?

Data & Computation & Algorithms

1.5 About This Course

  • Introduction
  • Basics of Neural Network Programming
  • One hidden layer Neural Networks
  • Deep Neural Networks

1.6 Course Resources


2.1 Binary Classification

2.2 Logistic Regression

$\hat{y} = \sigma(w^T x + b)$, using the sigmoid function $\sigma(z) = \frac{1}{1+e^{-z}}$.
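The prediction above can be sketched in NumPy as follows (a minimal illustration; the function names `sigmoid` and `predict` and the toy inputs are my own, not from the course):

```python
import numpy as np

def sigmoid(z):
    # sigma(z) = 1 / (1 + e^{-z})
    return 1.0 / (1.0 + np.exp(-z))

def predict(w, b, x):
    # y_hat = sigma(w^T x + b), a probability in (0, 1)
    return sigmoid(np.dot(w, x) + b)

w = np.array([0.5, -0.5])   # weights (illustrative values)
b = 0.0                     # bias
x = np.array([1.0, 2.0])    # a single input example
print(predict(w, b, x))
```

Note that `sigmoid(0)` is exactly 0.5, so the decision boundary `y_hat > 0.5` corresponds to `w^T x + b > 0`.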

2.3 Logistic Regression Loss Function

The squared-error loss $\mathcal{L}(\hat{y},y) = \frac{1}{2} (\hat{y}-y)^2$ is not used, because it makes the optimization problem non-convex for logistic regression. Instead we use the cross-entropy loss:
$\mathcal{L}(\hat{y},y) = -\left(y \log \hat{y} + (1-y) \log (1-\hat{y})\right)$

2.4 Gradient Descent

$J(w, b) = \frac{1}{m} \sum_{i=1}^{m} \mathcal{L}(\hat{y}^{(i)}, y^{(i)}) = -\frac{1}{m} \sum_{i=1}^{m} \left( y^{(i)} \log \hat{y}^{(i)} + (1-y^{(i)}) \log (1-\hat{y}^{(i)}) \right)$

$w := w - \alpha \frac{dJ}{dw}$; in code, $dw$ denotes $\frac{dJ}{dw}$.
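The update rule above can be sketched as a full training loop for logistic regression, vectorized over all m examples (a minimal sketch with illustrative toy data; it uses the result, derived later in 2.9, that $dz = \hat{y} - y$):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, alpha=0.1, iters=1000):
    # X: (n_features, m), y: (m,) with labels in {0, 1}
    n, m = X.shape
    w = np.zeros(n)
    b = 0.0
    for _ in range(iters):
        y_hat = sigmoid(np.dot(w, X) + b)  # forward pass over all m examples
        dz = y_hat - y                     # dL/dz for every example
        dw = np.dot(X, dz) / m             # dJ/dw
        db = np.sum(dz) / m                # dJ/db
        w -= alpha * dw                    # w := w - alpha * dJ/dw
        b -= alpha * db                    # b := b - alpha * dJ/db
    return w, b

# toy data: the label is 1 exactly when the first feature is positive
X = np.array([[1.0, 2.0, -1.0, -2.0],
              [0.5, -0.5, 0.5, -0.5]])
y = np.array([1, 1, 0, 0])
w, b = train(X, y)
```

After training on this separable toy set, thresholding the predictions at 0.5 recovers the labels.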

2.5 Derivatives

2.6 More Derivative Examples

2.7 Computation Graphs

2.8 Derivatives with a Computation Graph

2.9 Gradient Descent for Logistic Regression

2.10 Gradient Descent on m Examples

2.11 Vectorization

import numpy as np
import time

a = np.array([1, 2, 3, 4])  # a simple NumPy array

a = np.random.rand(10000000)
b = np.random.rand(10000000)

# vectorized dot product
tic = time.time()
c = np.dot(a, b)
toc = time.time()
print("vectorized version: " + str(1000 * (toc - tic)) + " ms")

# explicit for loop
c = 0
tic = time.time()
for i in range(10000000):
    c += a[i] * b[i]
toc = time.time()
print("for loop version: " + str(1000 * (toc - tic)) + " ms")

Sample output (timings vary by machine):

vectorized version: 17.6 ms
for loop version: 4313.9 ms


References:

https://www.coursera.org/deeplearning-ai
