- Word2Vec
Steps
1. Convert each word to a one-hot vector
2. Get W (lookup table) : V * M (V : vocabulary size, M : hyperparameter, usually a power of 2)
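The steps above can be sketched as follows. This is a minimal illustration with a hypothetical toy vocabulary (V = 4, M = 8): multiplying a one-hot vector by the lookup table W simply selects the corresponding row of W.

```python
import numpy as np

# Hypothetical toy sizes: V = vocabulary size, M = embedding dim (a power of 2)
V, M = 4, 8
rng = np.random.default_rng(0)
W = rng.normal(size=(V, M))  # lookup table W : V x M

# Step 1: one-hot vector for the word with index 2
one_hot = np.zeros(V)
one_hot[2] = 1.0

# Step 2: one-hot @ W picks out row 2 of W -- the word's embedding
embedding = one_hot @ W
assert np.allclose(embedding, W[2])
print(embedding.shape)  # (8,)
```

This is why the matrix W is called a lookup table: the matrix multiplication with a one-hot vector is equivalent to a row lookup.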
- Neural network embedding
Steps
1. Convert each word to an integer index
2. Get embedding table (lookup table) : num_embeddings * embedding_dim (num_embeddings : vocabulary size, embedding_dim : hyperparameter)
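In PyTorch these steps map directly onto `nn.Embedding`. A minimal sketch, with a hypothetical vocabulary of 4 words and embedding_dim of 8; the integer-to-word mapping here is made up for illustration:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# num_embeddings = vocabulary size, embedding_dim = hyperparameter
table = nn.Embedding(num_embeddings=4, embedding_dim=8)

# Step 1: words converted to integer indices (hypothetical mapping)
indices = torch.tensor([2, 0, 3])

# Step 2: indexing the table returns the corresponding rows
vectors = table(indices)
print(vectors.shape)  # torch.Size([3, 8])
```

Unlike the explicit one-hot multiplication, `nn.Embedding` skips the matrix product and indexes the table directly, which is faster and uses no extra memory for one-hot vectors; the table's rows are learned by backpropagation like any other weights.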
Reference : https://wikidocs.net/60854