For Chapter 5, I made up two exercise problems to assign to myself.
The first is to take the logic gates built from perceptrons in Chapter 2, which I wrote about in my June 25 entry, and train them with backpropagation. In that entry I solved the problem with numerical differentiation.
First, I express the Python script that computes the loss function of the perceptron logic gate as a computational graph. For now, only the forward pass is shown. In O'Reilly's 『ゼロから作るDeep Learning ―Pythonで学ぶディープラーニングの理論と実装』 (hereafter "the text"), computational graphs appear only in Chapter 5 and Appendix A; as a learner, I wish they had been used a bit earlier.
The backward-pass computation for the Sigmoid layer is explained in careful detail on pp. 143-146 of the text. Here I show only a figure modeled on the completed diagram of Figure 5-20 on p. 145.
The text then simplifies the expression further, arriving at a formula of the form y(1 − y) (p. 146).
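As a quick sanity check of my own (not from the text), the y(1 − y) rule can be compared against a central-difference numerical derivative; the sigmoid definition below matches the one in the text's common.functions:

```python
import numpy as np

def sigmoid(x):
    # same definition as common.functions.sigmoid in the text's repository
    return 1.0 / (1.0 + np.exp(-x))

# Backward rule from the text: dy/dx = y * (1 - y), where y = sigmoid(x)
x = 0.3
y = sigmoid(x)
analytic = y * (1.0 - y)

# Central-difference numerical derivative for comparison
h = 1e-4
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
print(analytic, numeric)  # the two values should agree closely
```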
Following that example, I compute the backward-pass formula for the sum-of-squared-error loss. It does not appear in the text, but it can easily be derived by adapting the worked example z = t**2, t = x + y on pp. 130-132. In doing so I noticed that in the June 25 entry I had forgotten to multiply by 0.5; including the factor 0.5 actually makes the backward-pass formula simpler.
The conclusion is that the formula takes the form x − t.
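The same kind of numerical check (again my own, not from the text) confirms the x − t rule for the 0.5·(x − t)² loss:

```python
# Loss L = 0.5 * (x - t)**2; the backward rule derived above says dL/dx = x - t
x, t = 0.8, 1.0
analytic = x - t

# Central-difference numerical derivative for comparison
h = 1e-4
loss = lambda v: 0.5 * (v - t) ** 2
numeric = (loss(x + h) - loss(x - h)) / (2 * h)
print(analytic, numeric)  # both should be -0.2 up to rounding
```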
Writing these into the first computational graph, the backward pass of the perceptron logic gate should come out as in the figure below.
Based on this figure, I rewrite the class "Perceptrn" shown in the June 25 entry. The changes are: adding a method "backward" that computes the backward pass; giving the weights W the constant initial values 0.5, 0.5, 0.5 instead of drawing them from a Gaussian distribution, so the results are easier to observe; and, as mentioned above, adding the factor 0.5 to the sum-of-squared-error loss, which I had forgotten.
You should be able to check it by copying and pasting from the screen into an Anaconda Prompt. To run it, though, the current directory must be one of ch01 through ch08 (a note that readers of 『ゼロから作るDeep Learning』 will not need).
import sys, os
sys.path.append(os.pardir)  # so that the text's common/ package can be found
import numpy as np
from common.functions import sigmoid

class Perceptrn:
  def __init__(self):
    # Constant initial weights (w1, w2, bias) instead of Gaussian ones,
    # to make the results easier to observe
    self.W = np.array([0.5, 0.5, 0.5])
    self.dW = np.zeros(3)
    self.z1 = None

  def predict(self, x):
    # Weighted sum of the inputs; W[2] serves as the bias
    return x[0] * self.W[0] + x[1] * self.W[1] + self.W[2]

  def loss(self, x, t):
    # Squared error with the factor 0.5, applied to the sigmoid output
    self.z1 = sigmoid(self.predict(x))
    return 0.5 * (self.z1 - t) ** 2

  def backward(self, x, t):
    # Backward pass: the sigmoid layer contributes z1*(1 - z1),
    # the squared-error loss contributes (z1 - t)
    dZ1 = (1 - self.z1) * self.z1
    dZ2 = (self.z1 - t) * dZ1
    self.dW[0] = dZ2 * x[0]
    self.dW[1] = dZ2 * x[1]
    self.dW[2] = dZ2
    return self.dW
To check the behavior of this class, make the following preparations.
from common.gradient import numerical_gradient
pct = Perceptrn()
print("W = " + str(pct.W))
x, t = (np.array([0, 0]), 0)
f = lambda w: pct.loss(x, t)
The function "numerical_gradient" is imported so that the result can be compared with numerical differentiation.
x is the input data and t is the teacher data.
print("p = " + str(pct.predict(x)))
print("l = " + str(pct.loss(x, t)))
print("dW(n.g.): " + str(numerical_gradient(f, pct.W)))
print("dW(b.p.): " + str(pct.backward(x, t)))
"p" is the predicted value, "l" is the loss, "dW(n.g.)" is the gradient obtained by numerical differentiation, and "dW(b.p.)" is the gradient obtained by backpropagation.
A screenshot of an actual run is shown below.
I pasted in and ran the four lines above while changing the combination of x and t in various ways, and checked the results. The figure below shows the result for the OR gate.
Since the values obtained by numerical differentiation and by backpropagation agree to five decimal places, I am probably not making any major mistake.
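As a further experiment of my own (not part of the gradient check above), the same gradients can drive a plain gradient-descent loop to actually learn the OR gate. The sigmoid is inlined here so the snippet stands alone, and the learning rate and epoch count are arbitrary choices:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Same structure as the Perceptrn class above, inlined for self-containment
W = np.array([0.5, 0.5, 0.5])  # w1, w2, bias
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]  # OR gate

lr = 0.5  # arbitrary learning rate
for epoch in range(3000):
    for x, t in data:
        z = sigmoid(x[0] * W[0] + x[1] * W[1] + W[2])
        d = (z - t) * z * (1 - z)  # backprop through loss and sigmoid
        W -= lr * np.array([d * x[0], d * x[1], d])

# After training, the perceptron should reproduce the OR truth table
for x, t in data:
    z = sigmoid(x[0] * W[0] + x[1] * W[1] + W[2])
    print(x, t, round(z, 3))
```

With the constant initial weights 0.5, 0.5, 0.5, the untrained perceptron classifies (0, 0) wrongly; the loop should push the bias W[2] negative and fix that.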
That said, this is a script I wrote with a just-make-it-work attitude, and as I wrote in 「?」, the text's author Saito-san, or anyone accustomed to programming in Python, would probably never write it in such a sloppy style.
ゼロから作るDeep Learning ―Pythonで学ぶディープラーニングの理論と実装
- Author: 斎藤康毅
- Publisher: O'Reilly Japan
- Release date: 2016/09/24
- Format: paperback (softcover)