Training steps:

1. Initialize the weight and bias with `requires_grad=True` (if set to `False`, the model weights will not be updated during fine-tuning).
2. Set the optimizer and the learning rate.
3. Define the hypothesis (the model's prediction).
4. Compute the cost from the hypothesis and the targets.
5. Reset the optimizer's accumulated gradients: `zero_grad()`.
6. `backward()`: differentiate the cost function to get the gradient of each parameter via backpropagation.
7. `step()`: apply the learning rate to the gradients and update `w` and `b` (see the sketch after this list). optimizer(di..
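The loop below is a minimal sketch of these seven steps for a toy 1-D linear regression; the data values, learning rate (`lr=0.01`), and epoch count are illustrative assumptions, not values from the original notes.

```python
import torch
import torch.optim as optim

# Toy dataset (illustrative): y = 2x
x_train = torch.FloatTensor([[1], [2], [3]])
y_train = torch.FloatTensor([[2], [4], [6]])

# 1. Initialize weight and bias with requires_grad=True so they can be updated.
w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

# 2. Set the optimizer and learning rate.
optimizer = optim.SGD([w, b], lr=0.01)

for epoch in range(1000):
    # 3. Hypothesis: the model's prediction.
    hypothesis = x_train * w + b

    # 4. Cost: mean squared error.
    cost = torch.mean((hypothesis - y_train) ** 2)

    # 5. Reset accumulated gradients before backpropagation.
    optimizer.zero_grad()

    # 6. backward(): differentiate the cost to get gradients for w and b.
    cost.backward()

    # 7. step(): apply the learning rate to the gradients and update w and b.
    optimizer.step()
```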