¡Hola, Mundo!

Posts tagged "dropout" (1)

Ways to Avoid Model Overfitting: Dropout, Gradient Clipping

Data Augmentation: When the amount of data is small, the model can easily memorize specific patterns or noise, which increases the probability of overfitting. As the amount of data increases, the model instead learns the common patterns in the data, which helps prevent overfitting. Reduce model complexity: Reduce the number of hidden layers or parameters. For example, a complex, overfitting neural network that..

Deep Learning 2021.04.08
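
As a companion to the excerpt above, here is a minimal PyTorch sketch of the two techniques named in the post title: a dropout layer inside a small classifier, and gradient-norm clipping in the training step. The network shape, layer sizes, dropout probability, and the max_norm value are illustrative assumptions, not values taken from the original post.

```python
import torch
import torch.nn as nn

# Minimal sketch (assumed architecture, not from the original post):
# a small MLP with a dropout layer between the hidden and output layers.
class MLP(nn.Module):
    def __init__(self, in_dim=20, hidden=64, out_dim=2, p=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Dropout(p),   # randomly zeroes hidden activations during training
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x):
        return self.net(x)

model = MLP()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()

x = torch.randn(32, 20)              # dummy batch of 32 samples
y = torch.randint(0, 2, (32,))       # dummy binary labels

model.train()                        # dropout is active only in train mode
optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()

# Gradient clipping: rescale all gradients so their global L2 norm is at most 1.0
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
optimizer.step()

model.eval()                         # switch dropout off for evaluation
```

Note that `model.eval()` disables dropout so predictions are deterministic at test time, and clipping only rescales gradients when their norm exceeds `max_norm`, so well-behaved updates are left unchanged.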