
PyTorch Learning Notes: A First Look at word_embedding


import torch
import torch.nn as nn
from torch.autograd import Variable

word2id = {'hello': 0, 'world': 1}
# the vocabulary has 2 words, and each word gets a 5-dimensional embedding
embeds = nn.Embedding(2, 5)
# wrap the index tensor in a Variable so it can be fed to nn.Embedding
hello_idx = torch.LongTensor([word2id['hello']])
print(hello_idx)
hello_idx = Variable(hello_idx)
print(hello_idx)
# look up the initial word embedding of 'hello'
hello_embed = embeds(hello_idx)
print(hello_embed)
'''
 0
[torch.LongTensor of size 1]

Variable containing:
 0
[torch.LongTensor of size 1]

Variable containing:
 1.1842  0.6819 -0.8768 -1.5130  0.7650
[torch.FloatTensor of size 1x5]
'''
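The Variable wrapper is only needed on older PyTorch releases; since PyTorch 0.4, tensors carry autograd state themselves, so index tensors can be passed to nn.Embedding directly. Below is a minimal sketch of the same lookup on a current PyTorch version, plus a batched lookup of both words; the variable names batch_idx and batch_embed are my own, not from the original note.

import torch
import torch.nn as nn

word2id = {'hello': 0, 'world': 1}
# 2 words in the vocabulary, 5 dimensions per embedding vector
embeds = nn.Embedding(2, 5)

# single-word lookup: no Variable wrapper is required on PyTorch >= 0.4
hello_idx = torch.tensor([word2id['hello']], dtype=torch.long)
hello_embed = embeds(hello_idx)
print(hello_embed)           # shape (1, 5); values are random before training

# batched lookup: pass several indices at once
batch_idx = torch.tensor([word2id['hello'], word2id['world']], dtype=torch.long)
batch_embed = embeds(batch_idx)
print(batch_embed.shape)     # torch.Size([2, 5])

Because the embedding weights are initialized randomly, the printed numbers will differ from the output shown above; the embeddings only become meaningful after training.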
