¡Hola, Mundo!


ReLU function

Activation function

An activation function takes the output of a neuron and decides whether that neuron is going to fire or not; in other words, it answers "should this neuron 'fire' or not?" Common activation functions:

  • Step function: hardly used in practice.
  • Sigmoid function: used in binary classification, usually in the output layer; prone to the vanishing-gradient problem.
  • Hyperbolic tangent (tanh) function.
  • ReLU (Rectified Linear Unit) function: f(x) = max(0, x). I..
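
These definitions are easy to compare numerically. Below is a minimal plain-NumPy sketch of the four activations listed above; the helper names (step, sigmoid, tanh, relu) are just illustrative, not taken from the post.

import numpy as np

def step(x):
    # Step function: 1 for positive input, else 0. Hardly used today.
    return (x > 0).astype(float)

def sigmoid(x):
    # Sigmoid: squashes input to (0, 1); common in binary-classification
    # output layers, but its small gradients can vanish in deep networks.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Hyperbolic tangent: squashes input to (-1, 1), zero-centered.
    return np.tanh(x)

def relu(x):
    # ReLU: f(x) = max(0, x); cheap to compute and does not saturate for x > 0.
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
for name, f in [("step", step), ("sigmoid", sigmoid),
                ("tanh", tanh), ("relu", relu)]:
    print(f"{name:8s}", np.round(f(x), 3))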

Deep Learning 2022.03.17
