Ziyodullayeva Dilnoza
Gradient Descent




Outline

  • Motivation
  • The gradient descent algorithm
  • Problems and alternatives
  • Stochastic gradient descent
  • Parallel gradient descent
  • HOGWILD!

Motivation

  • If the function is convex, it is good for finding the global minimum/maximum
  • If the function is not convex, it is good for finding a local minimum/maximum
  • It is used to optimize many models in machine learning

  • It is applied in combination with:
  • Neural networks
  • Linear regression
  • Logistic regression
  • The backpropagation algorithm
  • Support vector machines
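Before the formal treatment, the idea behind gradient descent can be sketched in a few lines: repeatedly step against the gradient until the function stops decreasing. The learning rate 0.1, the starting point 5.0, and the step count below are illustrative choices, not values from these slides.

```python
# Minimal sketch of gradient descent minimizing the convex
# function f(x) = x**2, whose derivative is f'(x) = 2*x.
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Move x against the gradient for a fixed number of steps."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

x_min = gradient_descent(grad=lambda x: 2 * x, x0=5.0)
print(x_min)  # very close to 0, the global minimum of x**2
```

Because x² is convex, this converges to the global minimum; on a non-convex function the same loop would only find a local one.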

An example function

Quickest ever review of multivariate calculus

  • Derivative
  • Partial Derivative
  • Gradient Vector

Derivative

  • Slope of the tangent line

f(x) = x²
f′(x) = df/dx = 2x
f″(x) = d²f/dx² = 2
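The derivative above can be spot-checked numerically with a central finite difference; the step size h and the sample point x = 3 are illustrative choices, not from the slides.

```python
# Sketch: numerically checking that the derivative of
# f(x) = x**2 is f'(x) = 2*x, via a central difference.
def f(x):
    return x ** 2

def numerical_derivative(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

print(numerical_derivative(f, 3.0))  # approximately 6.0 = 2*3
```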

Partial Derivative – Multivariate Functions


For multivariate functions (e.g., two variables) we need partial derivatives
– one per dimension. Examples of multivariate functions:
f(x, y) = x² + y²
f(x, y) = cos²(x) + y²
f(x, y) = cos²(x) + cos²(y)
Convex!
f(x, y) = −x² − y²
Concave!
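A quick way to build intuition for the convex/concave labels is the midpoint inequality: a convex function satisfies f((a+b)/2) ≤ (f(a)+f(b))/2, and a concave one reverses it. The two sample points below are an illustrative spot check, not a proof (convexity requires the inequality for all pairs).

```python
# Sketch: midpoint-convexity spot check for
# f(x, y) = x**2 + y**2 (convex) and -x**2 - y**2 (concave).
def f_convex(x, y):
    return x ** 2 + y ** 2

def f_concave(x, y):
    return -x ** 2 - y ** 2

a, b = (1.0, 2.0), (3.0, -1.0)       # arbitrary sample points
mid = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

# Convex: value at the midpoint lies below the chord.
print(f_convex(*mid) <= (f_convex(*a) + f_convex(*b)) / 2)   # True
# Concave: value at the midpoint lies above the chord.
print(f_concave(*mid) >= (f_concave(*a) + f_concave(*b)) / 2)  # True
```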

Partial Derivative – Cont’d


To visualize the partial derivative for each of the dimensions x and y, we can imagine a plane that “cuts” our surface along the two dimensions, and once again we get the slope of the tangent line.
surface: f(x, y) = 9 − x² − y²
plane: y = 1
cut: f(x, 1) = 8 − x²
slope / derivative of cut: f′(x) = −2x
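The cut and its slope can be checked numerically: fix y = 1, treat the result as a one-variable function, and differentiate it with a finite difference. The step size h and the evaluation point x = 2 are illustrative choices.

```python
# Sketch: cutting the surface f(x, y) = 9 - x**2 - y**2 with the
# plane y = 1, then checking the slope of the cut numerically.
def surface(x, y):
    return 9 - x ** 2 - y ** 2

def cut(x):
    # The cut along y = 1: f(x, 1) = 8 - x**2
    return surface(x, 1.0)

h = 1e-6
x = 2.0
slope = (cut(x + h) - cut(x - h)) / (2 * h)
print(slope)  # approximately -4.0, matching f'(x) = -2*x at x = 2
```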

Partial Derivative – Cont’d 2


If we partially differentiate a function with respect to x, we pretend y is a constant c (and vice versa):
f(x, y) = 9 − x² − y²
f(x, y) = 9 − x² − c²
f_x = ∂f/∂x = −2x
f(x, y) = 9 − c² − y²
f_y = ∂f/∂y = −2y
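Both partial derivatives can be approximated the same way as the ordinary derivative, perturbing one variable at a time while holding the other fixed. The step size h and the sample point (1.5, −0.5) are illustrative choices.

```python
# Sketch: central-difference partial derivatives of
# f(x, y) = 9 - x**2 - y**2, compared with the analytic
# results df/dx = -2*x and df/dy = -2*y.
def f(x, y):
    return 9 - x ** 2 - y ** 2

def partial_x(f, x, y, h=1e-6):
    # Perturb x only; y is held constant.
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def partial_y(f, x, y, h=1e-6):
    # Perturb y only; x is held constant.
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

x, y = 1.5, -0.5
print(partial_x(f, x, y))  # approximately -3.0 = -2*x
print(partial_y(f, x, y))  # approximately  1.0 = -2*y
```

Stacking these two numbers gives the gradient vector, the object gradient descent follows.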