Deep Learning/Tensorflow

GradientTape

Naranjito 2023. 12. 12. 19:20
  • GradientTape

It records the intermediate operations of a function on the tape one by one. In other words, every computational operation is saved on the tape so that it can later be replayed backwards to compute gradients.
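A minimal sketch of what the tape records (a simple squaring function chosen here for illustration, not part of the original post):

```python
import tensorflow as tf

x = tf.Variable(3.0)

with tf.GradientTape() as tape:
    y = x * x  # this multiplication is recorded on the tape

# replay the tape backwards to get dy/dx = 2x = 6 at x = 3
dy_dx = tape.gradient(y, x)
print(dy_dx.numpy())  # 6.0
```

Operations on a `tf.Variable` are tracked automatically; plain tensors must be registered with `tape.watch()`.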

 

  • Reverse mode automatic differentiation 

TensorFlow calculates the derivative of the loss with respect to x using reverse-mode automatic differentiation:

dx = tape.gradient(loss, x)
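To illustrate the same call on a non-variable tensor (the polynomial here is an assumed example, not from the original post), a constant must be watched explicitly before the forward pass is recorded:

```python
import tensorflow as tf

x = tf.constant(5.0)

with tf.GradientTape() as tape:
    tape.watch(x)          # constants are not tracked unless watched
    loss = x ** 2 + 3 * x  # forward pass recorded on the tape

# reverse pass: d(loss)/dx = 2x + 3 = 13 at x = 5
dx = tape.gradient(loss, x)
print(dx.numpy())  # 13.0
```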

 

Backpropagate the derivative of the loss with respect to x,

repeat the task of updating the value of x,

and the training converges to the answer for x.



Start at x = 10.0 and repeat 6 times; the answer x = 4.0 is found after three updates.

import tensorflow as tf

# a, x, y are not defined in the snippet; these values are inferred
# from the printed output below: loss = |2x - 8|, which is zero at x = 4
a = tf.constant(2.0)
y = tf.constant(8.0)
x = tf.Variable(10.0)

def train_func():
    with tf.GradientTape() as tape:
        loss = tf.math.abs(a * x - y)

    # d(loss)/dx = 2 * sign(2x - 8): +2 for x > 4, 0 at x = 4
    dx = tape.gradient(loss, x)
    print('x={}, dx={:.2f}'.format(x.numpy(), dx.numpy()))

    x.assign(x - dx)  # gradient-descent update

for i in range(6):
    train_func()
    
>>>
x=10.0, dx=2.00
x=8.0, dx=2.00
x=6.0, dx=2.00
x=4.0, dx=0.00
x=4.0, dx=0.00
x=4.0, dx=0.00

 

https://rfriend.tistory.com/556