¡Hola, Mundo!

GradientTape

GradientTape records the intermediate operations of a computation on a tape one by one; in other words, each computational operation is saved to the tape. TensorFlow then uses reverse-mode automatic differentiation to compute the derivative of the loss with respect to x with dx = tape.gradient(loss, x). Backpropagate the derivative of the loss with respect to x, repeat the tas..
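Below is a minimal sketch of the pattern described in the excerpt, assuming a simple quadratic loss for illustration; the names x, loss, and dx mirror the text above.

    import tensorflow as tf

    # A trainable variable is watched by the tape automatically.
    x = tf.Variable(3.0)

    with tf.GradientTape() as tape:
        # Every operation on x inside this block is recorded on the tape.
        loss = x ** 2  # illustrative quadratic loss

    # Reverse-mode automatic differentiation: the tape is replayed
    # backwards to obtain d(loss)/dx.
    dx = tape.gradient(loss, x)
    print(dx)  # 6.0, since d(x^2)/dx = 2x = 6 at x = 3

In a training loop, the same pattern is repeated each step: record the forward pass on the tape, call tape.gradient to backpropagate, and apply the resulting gradients to the variables.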

Deep Learning/Tensorflow 2023.12.12