Machine Learning
Activation Functions
sigmoid
$$ \tag{expression} sigmoid(x) = \frac{1}{1 + e^{-x}} $$
$$ \tag{derivative} sigmoid'(x) = sigmoid(x) \cdot (1 - sigmoid(x)) $$
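A minimal NumPy sketch of the expression and its derivative above; the function names `sigmoid` and `sigmoid_grad` are illustrative, not TensorFlow's implementation.

```python
import numpy as np

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + e^{-x})
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))
    s = sigmoid(x)
    return s * (1.0 - s)

print(sigmoid(0.0))       # 0.5
print(sigmoid_grad(0.0))  # 0.25
```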
softmax
$$ \tag{expression} softmax(x)_{i} = \frac{e^{x_{i}}}{\sum_{j=1}^{n}{e^{x_{j}}}} $$
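A minimal NumPy sketch of the expression above; subtracting `max(x)` before exponentiating is an added numerical-stability trick, not part of the formula itself.

```python
import numpy as np

def softmax(x):
    # softmax(x)_i = e^{x_i} / sum_j e^{x_j}
    e = np.exp(x - np.max(x))  # shift by max(x) for numerical stability
    return e / np.sum(e)

print(softmax(np.array([1.0, 2.0, 3.0])))  # elements sum to 1
```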
tanh
$$ \tag{expression} tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}} $$
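A minimal NumPy sketch written directly from the definition above (NumPy also ships a built-in `np.tanh`, used here only as a cross-check).

```python
import numpy as np

def tanh(x):
    # tanh(x) = (e^x - e^{-x}) / (e^x + e^{-x})
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

print(tanh(0.0))                             # 0.0
print(np.allclose(tanh(1.5), np.tanh(1.5)))  # True
```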
TensorFlow
Loss Functions (Losses), usage sketch below
tf.keras.losses.MeanSquaredError
tf.keras.losses.BinaryCrossentropy
tf.keras.losses.CategoricalCrossentropy
tf.keras.losses.SparseCategoricalCrossentropy
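A short sketch of how these loss classes are instantiated and called; the label and prediction values are made-up examples.

```python
import tensorflow as tf

# Regression loss: mean of squared differences.
mse = tf.keras.losses.MeanSquaredError()
print(mse([0.0, 1.0], [0.1, 0.9]).numpy())

# Binary classification with predicted probabilities in [0, 1].
bce = tf.keras.losses.BinaryCrossentropy()
print(bce([0.0, 1.0], [0.1, 0.9]).numpy())

# CategoricalCrossentropy expects one-hot labels,
# SparseCategoricalCrossentropy expects integer class indices.
cce = tf.keras.losses.CategoricalCrossentropy()
print(cce([[0.0, 1.0, 0.0]], [[0.1, 0.8, 0.1]]).numpy())

scce = tf.keras.losses.SparseCategoricalCrossentropy()
print(scce([1], [[0.1, 0.8, 0.1]]).numpy())
```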
Activation Functions (Activations), usage sketch below
tf.keras.activations.sigmoid
tf.keras.activations.softmax
tf.keras.activations.tanh
tf.keras.activations.relu
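A short sketch applying these activations to an example tensor; softmax is given a batch of one row because it normalizes along the last axis of an at-least-2-D input.

```python
import tensorflow as tf

x = tf.constant([-1.0, 0.0, 1.0])

print(tf.keras.activations.sigmoid(x).numpy())
print(tf.keras.activations.tanh(x).numpy())
print(tf.keras.activations.relu(x).numpy())

# softmax normalizes along the last axis, so pass a (batch, classes) tensor.
print(tf.keras.activations.softmax(tf.reshape(x, (1, 3))).numpy())
```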
Neural Network Layers (Layers), usage sketch below
tf.keras.layers.Input
tf.keras.layers.Dense
tf.keras.layers.Conv2D
tf.keras.layers.MaxPool2D
tf.keras.layers.Dropout
tf.keras.layers.Flatten
tf.keras.layers.LSTM
tf.keras.layers.Activation
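A minimal sketch wiring these layers into a small CNN classifier, plus a separate LSTM model for sequence input; the input shapes, layer sizes, and optimizer are arbitrary choices for illustration.

```python
import tensorflow as tf

# Image classifier: Conv2D -> MaxPool2D -> Flatten -> Dense head.
cnn = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, kernel_size=3, activation="relu"),
    tf.keras.layers.MaxPool2D(pool_size=2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64),
    tf.keras.layers.Activation("relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(10, activation="softmax"),
])
cnn.compile(optimizer="adam",
            loss=tf.keras.losses.SparseCategoricalCrossentropy(),
            metrics=["accuracy"])
cnn.summary()

# Sequence model: LSTM over variable-length sequences of 8 features.
rnn = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(None, 8)),
    tf.keras.layers.LSTM(16),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
rnn.summary()
```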