Abstract:
1. Activation functions. Sigmoid: \[g(z) = \frac{1}{1 + e^{-z}}\] Used for binary classification (logistic regression), i.e. when the output layer is binary. ReLU is the most common choice for hidden layers.
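The two activations mentioned above can be sketched in a few lines of NumPy; this is a minimal illustration, not the post's full code:

```python
import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + e^{-z}): squashes any real z into (0, 1),
    # which is why it suits binary-classification output layers
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # ReLU: max(0, z) elementwise; the most common hidden-layer activation
    return np.maximum(0.0, z)

print(sigmoid(0.0))                      # 0.5
print(relu(np.array([-3.0, 2.0])))       # [0. 2.]
```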
posted @ 2025-07-29 19:54
鐵鼠
Abstract:
1. How to build layers manually (1x3 matrix): x = np.array([200.0, 17.0]); layer_1 = Dense(units=3, activation='sigmoid'); a1 = layer_1(x); layer_2 = Dense(u…
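What a Keras `Dense` layer computes can be reproduced manually with NumPy; the weights below are placeholder assumptions for illustration only:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dense(a_in, W, b):
    # One dense layer: a_out = g(a_in @ W + b),
    # where each column of W holds one unit's weights
    return sigmoid(a_in @ W + b)

x = np.array([200.0, 17.0])   # 2 input features
W1 = np.zeros((2, 3))         # placeholder weights: 2 inputs -> 3 units (assumption)
b1 = np.zeros(3)
a1 = dense(x, W1, b1)         # activations of layer 1, shape (3,)
print(a1.shape)               # (3,)
```

With real trained weights in place of the zeros, this matches `Dense(units=3, activation='sigmoid')` applied to `x`.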
posted @ 2025-07-29 15:29
鐵鼠
