      Item | Content
      Which course this assignment belongs to | Artificial Intelligence Practice 2019
      Where the assignment requirements are | Assignment requirements
      My goal in this course | Learn about artificial intelligence and improve my programming skills
      How this assignment helps me reach that goal | Learn to use activation functions and get a first grasp of solving classification problems
      Assignment text | http://www.rzrgm.cn/-myq123/p/10669933.html

      I. Code and Results

      import numpy as np
      import matplotlib.pyplot as plt


      def Read_LOGIC_Data(logic):
          # training data for the logic gate: X is 2x4 (inputs, one sample per column), Y is 1x4 (labels)
          if logic == 'AND':
              X = np.array([0,0,1,1,0,1,0,1]).reshape(2,4)
              Y = np.array([0,0,0,1]).reshape(1,4)
          elif logic == 'OR':
              X = np.array([0,0,1,1,0,1,0,1]).reshape(2,4)
              Y = np.array([0,1,1,1]).reshape(1,4)
          else:
              raise ValueError("unknown logic gate, please input 'AND' or 'OR'")
          return X,Y
      
      
      def Sigmoid(x):
          s = 1/(1+np.exp(-x))
          return s


      # forward pass
      def ForwardCalculationBatch(W, B, batch_X):
          Z = np.dot(W, batch_X) + B
          A = Sigmoid(Z)
          return A
      
      
      
      # backward pass
      def BackPropagationBatch(batch_X, batch_Y, A):
          m = batch_X.shape[1]
          dZ = A - batch_Y
          # sum dZ along the columns, i.e. over all samples in the batch
          dB = dZ.sum(axis=1, keepdims=True)/m
          dW = np.dot(dZ, batch_X.T)/m
          return dW, dB
      
      
      
      # update the weight parameters
      def UpdateWeights(W, B, dW, dB, eta):
          W = W - eta * dW
          B = B - eta * dB
          return W, B


      # compute the loss value
      def CheckLoss(W, B, X, Y):
          m = X.shape[1]
          A = ForwardCalculationBatch(W,B,X)

          p4 = np.multiply(1-Y, np.log(1-A))
          p5 = np.multiply(Y, np.log(A))

          LOSS = np.sum(-(p4 + p5))  # binary cross-entropy
          loss = LOSS / m
          return loss
      
      
      
      # initialize the weights
      def InitialWeights(num_input, num_output, method):
          if method == "zero":
              # all zeros
              W = np.zeros((num_output, num_input))
          elif method == "norm":
              # standard normal initialization
              W = np.random.normal(size=(num_output, num_input))
          elif method == "xavier":
              # Xavier uniform initialization
              W = np.random.uniform(
                  -np.sqrt(6/(num_input+num_output)),
                  np.sqrt(6/(num_input+num_output)),
                  size=(num_output,num_input))

          B = np.zeros((num_output, 1))
          return W,B
      
      
      
      
      
      def train(X, Y, ForwardCalculationBatch, CheckLoss):
          num_example = X.shape[1]
          num_feature = X.shape[0]
          num_category = Y.shape[0]
          # hyperparameters
          eta = 0.5
          max_epoch = 10000
          # W(num_category, num_feature), B(num_category, 1)
          W, B = InitialWeights(num_feature, num_category, "zero")
          # track the loss to decide the stop condition
          loss = 5        # initial loss (anything larger than the threshold)
          error = 2e-3    # stop condition

          # plain SGD: one weight update per sample
          for epoch in range(max_epoch):
              for i in range(num_example):
                  # get the x and y values for one sample
                  x = X[:,i].reshape(num_feature,1)
                  y = Y[:,i].reshape(1,1)
                  # forward pass for this sample
                  batch_a = ForwardCalculationBatch(W, B, x)
                  # gradients of w and b
                  dW, dB = BackPropagationBatch(x, y, batch_a)
                  # update w,b
                  W, B = UpdateWeights(W, B, dW, dB, eta)
              # end for
              # loss over the full training set after this epoch
              loss = CheckLoss(W,B,X,Y)
              print(epoch,i,loss,W,B)
              if loss < error:
                  break
          # end for

          return W,B
      
      
      
      def ShowResult(W,B,X,Y,title):
          # decision boundary: w1*x1 + w2*x2 + b = 0, i.e. x2 = -(w1/w2)*x1 - b/w2
          w = -W[0,0]/W[0,1]
          b = -B[0,0]/W[0,1]
          x = np.array([0,1])
          y = w * x + b
          plt.plot(x,y)

          for i in range(X.shape[1]):
              if Y[0,i] == 0:
                  plt.scatter(X[0,i],X[1,i],marker="o",c='b',s=64)
              else:
                  plt.scatter(X[0,i],X[1,i],marker="^",c='r',s=64)
          plt.axis([-0.1,1.1,-0.1,1.1])
          plt.title(title)
          plt.show()
      
      
      
      
      
      
      
      def Test(W,B,logic):
          n1 = input("input number one:")
          x1 = float(n1)
          n2 = input("input number two:")
          x2 = float(n2)
          a = ForwardCalculationBatch(W, B, np.array([x1,x2]).reshape(2,1))
          print(a)
          # expected output from the true logic gate (valid for 0/1 inputs)
          if logic == 'AND':
              y = x1 and x2
          else:
              y = x1 or x2

          if np.abs(a-y) < 1e-2:
              print("True")
          else:
              print("False")
      
      
      
      
      
      if __name__ == '__main__':
          # read data
          logic = input("'OR' or 'AND':")
          X,Y = Read_LOGIC_Data(logic)

          W, B = train(X, Y, ForwardCalculationBatch, CheckLoss)

          print("w=",W)
          print("b=",B)
          ShowResult(W,B,X,Y,logic)

          # test interactively (interrupt to quit)
          while True:
              Test(W,B,logic)
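The step `dZ = A - batch_Y` in the backward pass is the well-known simplification that falls out when the sigmoid derivative is combined with the cross-entropy loss. As a sanity check, the analytic gradient `dW = (A - Y) X^T / m` can be compared against a finite-difference estimate of the loss; the sketch below is standalone test code (the data and starting weights are arbitrary illustrative values, not part of the assignment):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def loss(W, B, X, Y):
    # binary cross-entropy averaged over the m samples (columns of X)
    A = sigmoid(np.dot(W, X) + B)
    m = X.shape[1]
    return np.sum(-(Y * np.log(A) + (1 - Y) * np.log(1 - A))) / m

# AND-gate data in the same layout as above, with arbitrary starting weights
X = np.array([[0, 0, 1, 1], [0, 1, 0, 1]])
Y = np.array([[0, 0, 0, 1]])
W = np.array([[0.3, -0.2]])
B = np.array([[0.1]])

# analytic gradient: dW = (A - Y) X^T / m
A = sigmoid(np.dot(W, X) + B)
dW = np.dot(A - Y, X.T) / X.shape[1]

# central finite difference on W[0,0]
eps = 1e-6
Wp = W.copy(); Wp[0, 0] += eps
Wm = W.copy(); Wm[0, 0] -= eps
num = (loss(Wp, B, X, Y) - loss(Wm, B, X, Y)) / (2 * eps)

print(abs(num - dW[0, 0]))  # difference should be tiny (rounding error only)
```

If the two numbers disagree by more than rounding error, the backward pass has a bug; this kind of gradient check is a cheap way to validate hand-derived gradients.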
      

      Logic OR gate: w=[11.74573383, 11.747490636], b=-5.41268583

      Logic AND gate: w=[11.76694002, 11.76546912], b=-17.81530488
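As a quick sanity check, plugging the reported OR-gate parameters back into the sigmoid reproduces the full truth table; the snippet below is a standalone verification (it copies the weights printed above rather than retraining):

```python
import numpy as np

# reported OR-gate parameters (copied from the results above)
W = np.array([[11.74573383, 11.747490636]])
B = np.array([[-5.41268583]])

# all four input combinations, one sample per column
X = np.array([[0, 0, 1, 1],
              [0, 1, 0, 1]])

# forward pass: sigmoid(W X + B)
A = 1.0 / (1.0 + np.exp(-(np.dot(W, X) + B)))

print(np.round(A))  # thresholded outputs should match OR: [[0. 1. 1. 1.]]
```

The raw activations are about 0.004 for (0,0) and above 0.998 for the other three inputs, so thresholding at 0.5 recovers the OR gate exactly.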

      posted on 2019-04-08 13:21 by myq123, read (352), comments (0)
