소프트맥스함수,softmax_function


----
MKLINK
[[softmax]]? [[소프트맥스,softmax]]? ... Google:Softmax
[[softmax_loss]] { [[소프트맥스,softmax]] [[소프트맥스함수,softmax_function]] [[손실,loss]] [[손실함수,loss_function]] }
[[softmax_regression]] { curr see https://wikidocs.net/35476 ... [[회귀,regression]] }
[[정규화,normalization]] ... softmax is said to normalize its input into the 0~1 range, with the output values always summing to 1 (a minimal sketch follows this list).
[[디리클레_분포,Dirichlet_distribution]]
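To illustrate the normalization note above, here is a minimal sketch; Python/NumPy and the names in it are illustrative assumptions, since this page prescribes no implementation:

{{{
import numpy as np

def softmax(z):
    """Map a vector of K real values to a probability distribution."""
    e = np.exp(z - np.max(z))  # shifting by the max is a common overflow guard
    return e / e.sum()         # divide by the total so the outputs sum to 1

scores = np.array([2.0, 1.0, 0.1])
probs = softmax(scores)
print(probs)        # each entry lies in (0, 1)
print(probs.sum())  # 1.0 (up to floating-point rounding)
}}}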

For classification:
[[활성화함수,activation_function]]



The softmax (activation) function maps a vector of $K$ real values
to a probability distribution over $K$ possible outcomes,
i.e., a vector of $K$ values in $[0, 1]$ that sum to 1.

It is often used in the output layer of a neural network.

That is, the elements of the output vector sum to 1.

$o_i = \frac{e^{z_i}}{\sum\nolimits_{j=1}^K e^{z_j}}$
(Kwak, Slide 2, p37)
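
The sketch above shifts every input by its maximum before exponentiating; this is safe because softmax is invariant to adding a constant $c$ to all inputs, a standard identity (not part of the cited slide):

$o_i = \frac{e^{z_i - c}}{\sum\nolimits_{j=1}^K e^{z_j - c}} = \frac{e^{-c}\,e^{z_i}}{e^{-c}\sum\nolimits_{j=1}^K e^{z_j}} = \frac{e^{z_i}}{\sum\nolimits_{j=1}^K e^{z_j}}$

Choosing $c = \max_j z_j$ keeps every exponent at most 0, so $e^{z_i - c}$ cannot overflow.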

AKA softargmax function