- GradientTape
It records the intermediate operations of a function on the tape, one by one. In other words, every computational operation is saved on the tape so it can be replayed backwards later.
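A minimal sketch of this record-and-replay idea (the variable names x and y here are illustrative, not from the post):

import tensorflow as tf

x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x * x                   # this multiplication is recorded on the tape
dy_dx = tape.gradient(y, x)     # replay the tape in reverse: dy/dx = 2x
print(dy_dx.numpy())            # prints 6.0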
- Reverse mode automatic differentiation
TensorFlow calculates the derivative of the loss with respect to x using reverse-mode automatic differentiation:
dx = tape.gradient(loss, x)
It backpropagates the derivative of the loss with respect to x and repeats the task of updating the value of x with that gradient, learning its way to the answer for x.
Starting at x = 10.0, the answer x = 4.0 is reached by the fourth iteration (the loop below runs six times, so the last iterations simply stay at the answer).
import tensorflow as tf

a = tf.constant(2.0)    # a and y are not shown in the original snippet;
y = tf.constant(8.0)    # these values are inferred from the printed output below
x = tf.Variable(10.0)   # start at x = 10.0

def train_func():
    with tf.GradientTape() as tape:
        loss = tf.math.abs(a * x - y)   # loss = |a*x - y|, recorded on the tape
    dx = tape.gradient(loss, x)         # d(loss)/dx by reverse-mode AD
    print('x={}, dx={:.2f}'.format(x.numpy(), dx.numpy()))
    x.assign(x - dx)                    # update x with the gradient

for i in range(6):
    train_func()
>>>
x=10.0, dx=2.00
x=8.0, dx=2.00
x=6.0, dx=2.00
x=4.0, dx=0.00
x=4.0, dx=0.00
x=4.0, dx=0.00
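Why the output looks like this: with the assumed a = 2 and y = 8, the loss is |2x - 8| and its gradient is d(loss)/dx = 2 * sign(2x - 8). For any x > 4 the gradient is 2, so each update x ← x - dx lowers x by exactly 2: 10 → 8 → 6 → 4. At x = 4 the loss is 0 and TensorFlow returns a gradient of 0 for abs at 0, so x stops changing.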