
【Error】: Trying to backward through the graph a second time, but the buffers have already been freed

Error:

RuntimeError: Trying to backward through the graph a second time, but the buffers have already been freed. Specify retain_graph=True when calling backward the first time.
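The error is easy to reproduce in isolation (a minimal sketch unrelated to the encoder/decoder code below): calling `backward()` a second time on the same graph without `retain_graph=True` on the first call.

```python
import torch

# Minimal reproduction: backward through the same graph twice.
x = torch.ones(2, requires_grad=True)
y = (x * 3).sum()   # forward pass builds the autograd graph
y.backward()        # first backward frees the graph's intermediate buffers
try:
    y.backward()    # second backward through the freed graph
except RuntimeError as e:
    print("RuntimeError:", e)
```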

Original code:

        optimizer.zero_grad()  # word loss
        features = encoder(src)
        loss = decoder(features, trg)
        loss.backward()  # frees the graph's intermediate buffers
        grad_norm = torch.nn.utils.clip_grad_norm_(decoder.parameters(), opt.grad_clip)
        optimizer.step()

        optimizer.zero_grad()  # pos loss
        # features = encoder(src)  # commented out: `features` from the first pass is reused

        loss = decoder.forward_pos(features, trg_pos)
        loss.backward()  # second backward through the already-freed graph -> RuntimeError
        grad_norm = torch.nn.utils.clip_grad_norm_(decoder.parameters(), opt.grad_clip)
        optimizer.step()

The code was changed to:

        optimizer.zero_grad()  # word loss
        features = encoder(src)
        loss = decoder(features, trg)
        loss.backward()
        grad_norm = torch.nn.utils.clip_grad_norm_(decoder.parameters(), opt.grad_clip)
        optimizer.step()

        optimizer.zero_grad()  # pos loss
        features = encoder(src)  # re-run the forward pass to build a fresh graph

        loss = decoder.forward_pos(features, trg_pos)
        loss.backward()  # backward through the new graph, no error
        grad_norm = torch.nn.utils.clip_grad_norm_(decoder.parameters(), opt.grad_clip)
        optimizer.step()

The error disappears. The cause was that the second `loss.backward()` tried to backpropagate through the graph built by the first `encoder(src)` call, but the first `loss.backward()` had already freed that graph's buffers. Re-running `features = encoder(src)` builds a fresh graph for the second backward pass.
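An alternative fix, which the error message itself suggests, is to keep the graph alive by passing `retain_graph=True` to the first `backward()` instead of re-running the encoder (a sketch; note that retaining the graph costs extra memory, and gradients from the two calls accumulate in `.grad`):

```python
import torch

x = torch.ones(2, requires_grad=True)
y = (x * 3).sum()
y.backward(retain_graph=True)  # keep buffers so the graph can be reused
y.backward()                   # second backward now succeeds
print(x.grad)                  # gradients accumulate across the two calls
```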

Reference: https://blog.csdn.net/u010829672/article/details/79538853