I want to use the output variables of a neural network as inputs to another function, but I get this error: 'Only Tensors created explicitly by the user (graph leaves) support the deepcopy protocol at the moment'. The output variables require gradients.
I tried converting the output variables to numpy values, but then backpropagation does not work, because numpy values are treated as variables that do not need gradients.
import control

output = model(SOC[13])
# Three output values of the NN
Rs = output[0]
R1 = output[1]
C1 = output[2]
# Using these variables in another function
num = [Rs * R1 * C1, R1 + Rs]
den = [C1 * R1, 1]
G = control.tf(num, den)
It should work, but it gives the following error:
14 num=[Rs*R1*C1,R1+Rs]
15 den=[C1*R1,1]
---> 16 G = control.tf(num,den)
~\Anaconda3\lib\site-packages\control\xferfcn.py in __init__(self, *args)
106
107 """
--> 108 args = deepcopy(args)
109 if len(args) == 2:
110 # The user provided a numerator and a denominator.
~\Anaconda3\lib\site-packages\torch\tensor.py in __deepcopy__(self, memo)
16 def __deepcopy__(self, memo):
17 if not self.is_leaf:
---> 18 raise RuntimeError("Only Tensors created explicitly by the user "
19 "(graph leaves) support the deepcopy protocol at the moment")
20 if id(self) in memo:
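The same error can be reproduced without control, using any non-leaf tensor (a minimal sketch; the names w and y here are made up for illustration):
import copy
import torch

w = torch.ones(3, requires_grad=True)  # leaf tensor created explicitly by the user
y = w * 2                              # non-leaf tensor produced by an operation
copy.deepcopy(w)                       # works: w is a graph leaf
copy.deepcopy(y)                       # raises RuntimeError: Only Tensors created explicitly
                                       # by the user (graph leaves) support the deepcopy protocol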
I ran into a similar problem once. In short, the error is caused by deepcopy, which does not support non-leaf nodes. You can print Rs, R1 and C1 to check whether they are leaf nodes.
If they are leaf nodes, printing them shows "requires_grad=True" and no grad_fn such as "grad_fn=<SliceBackward>" or "grad_fn=<CopySlices>". A non-leaf node has a grad_fn, which is used to propagate gradients.
#---------------------------------------------------------------------------------
>>>import torch
>>>q = torch.nn.Parameter(torch.Tensor(3,3))
>>>q
Parameter containing:
tensor([[8.7551e-37, 0.0000e+00, 0.0000e+00],
[0.0000e+00, 0.0000e+00, 0.0000e+00],
[0.0000e+00, 0.0000e+00, 0.0000e+00]], requires_grad=True)
# q is a leaf node
>>>p = q[0,:]
>>>p
tensor([8.7551e-37, 0.0000e+00, 0.0000e+00], grad_fn=<SliceBackward>)
# p is a non-leaf node
>>>q[0,0] = 0
>>>q
Parameter containing:
tensor([[0., 0., 0.],
[0., 0., 0.],
[0., 0., 0.]], grad_fn=<CopySlices>)
# if a slice of q is assigned in place, q becomes a non-leaf node, and deepcopy no longer works on it
#-----------------------------------------------------------------------------
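Applied to the question, the indexed outputs can be checked the same way: indexing output[0] is itself an autograd operation, so Rs, R1 and C1 are non-leaf tensors (a sketch reusing the names from the question; the exact grad_fn name may differ by PyTorch version):
output = model(SOC[13])
Rs = output[0]
print(Rs.is_leaf)   # False: Rs was produced by indexing the model output
print(Rs.grad_fn)   # e.g. <SelectBackward ...>, so deepcopy will refuse it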
In PyTorch, you can detach a tensor from the computation graph with the .detach() method:
new_tensor = old_tensor.detach()
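Applied to the code in the question, detaching the three outputs (and converting them to plain Python floats) lets control.tf deepcopy its arguments. Note that this cuts the values out of the autograd graph, so gradients will not flow back through control.tf (a sketch reusing the names from the question):
import control

output = model(SOC[13])
# detach from the graph and convert to plain Python floats
Rs = output[0].detach().item()
R1 = output[1].detach().item()
C1 = output[2].detach().item()

num = [Rs * R1 * C1, R1 + Rs]
den = [C1 * R1, 1]
G = control.tf(num, den)  # deepcopy now succeeds because the inputs are plain floats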