How to solve the runtime error "Only Tensors created explicitly by the user (graph leaves) support the deepcopy protocol at the moment"

I want to use the output variables of a neural network as inputs to another function, but I get the error "Only Tensors created explicitly by the user (graph leaves) support the deepcopy protocol at the moment". The output variables require gradients.

I tried converting the output variables to numpy values, but then backpropagation does not work, because the numpy values are treated as variables that do not need gradients.

output = model(SOC[13])

# Three output values of the NN
Rs = output[0]
R1 = output[1]
C1 = output[2]

# Using these variables in another function
num = [Rs*R1*C1, R1+Rs]
den = [C1*R1, 1]
G = control.tf(num, den)

It should work, but it gives this error:

     14             num=[Rs*R1*C1,R1+Rs]
     15             den=[C1*R1,1]
---> 16             G = control.tf(num,den)
~\Anaconda3\lib\site-packages\control\xferfcn.py in __init__(self, *args)
    106 
    107         """
--> 108         args = deepcopy(args)
    109         if len(args) == 2:
    110             # The user provided a numerator and a denominator.
~\Anaconda3\lib\site-packages\torch\tensor.py in __deepcopy__(self, memo)
     16     def __deepcopy__(self, memo):
     17         if not self.is_leaf:
---> 18             raise RuntimeError("Only Tensors created explicitly by the user "
     19                                "(graph leaves) support the deepcopy protocol at the moment")
     20         if id(self) in memo:
asked by mazhar abbas


2 Answers

I ran into a similar problem once. In short, the error is caused by deepcopy, which only works on leaf nodes. You can print Rs, R1 and C1 to check whether they are leaf nodes.

If they are leaf nodes, they print with requires_grad=True and without a grad_fn such as grad_fn=<SliceBackward> or grad_fn=<CopySlices>. Non-leaf nodes carry a grad_fn, which is what autograd uses to propagate gradients back through them.

#---------------------------------------------------------------------------------
>>>import torch
>>>q = torch.nn.Parameter(torch.Tensor(3,3))
>>>q
Parameter containing:
tensor([[8.7551e-37, 0.0000e+00, 0.0000e+00],
        [0.0000e+00, 0.0000e+00, 0.0000e+00],
        [0.0000e+00, 0.0000e+00, 0.0000e+00]], requires_grad=True)
# q is a leaf node
>>>p = q[0,:]
>>>p
tensor([8.7551e-37, 0.0000e+00, 0.0000e+00], grad_fn=<SliceBackward>)
# p is a non-leaf node (it was created by slicing q)
>>>q[0,0] = 0
>>>q
Parameter containing:
tensor([[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]], grad_fn=<CopySlices>)
# after an in-place slice assignment, q is no longer a leaf node, so deepcopy no longer works on it
#-----------------------------------------------------------------------------
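
Applied to the question, indexing the network output (e.g. output[0]) produces non-leaf tensors with a grad_fn, which is exactly why the deepcopy inside control.tf fails. A minimal sketch of the check, assuming the variable names from the question and a stand-in model:

import torch

# Stand-in for model(SOC[13]) from the question: any module whose output
# requires grad reproduces the situation.
model = torch.nn.Linear(1, 3)
output = model(torch.ones(1, 1))[0]

Rs, R1, C1 = output[0], output[1], output[2]

for name, t in [("Rs", Rs), ("R1", R1), ("C1", C1)]:
    print(name, "is_leaf:", t.is_leaf, "grad_fn:", t.grad_fn)
# Expected: is_leaf is False and grad_fn is something like <SelectBackward0>,
# so copy.deepcopy(t) would raise the error from the question.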
answered by DELY


In PyTorch, you can use the tensor's .detach() method, which returns a new leaf tensor that is cut off from the computation graph:

new_tensor = tensor.detach()
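
Applied to the question's code, a hedged sketch: detach the three outputs and convert them to plain Python floats before building the transfer function. Note that this cuts Rs, R1 and C1 out of the autograd graph, so no gradients will flow back through control.tf (the same limitation the asker saw with numpy values).

# Sketch, assuming Rs, R1, C1 are the 0-d tensors from the question and
# that the python-control package is imported as `control`.
Rs_v = Rs.detach().item()   # .item() turns a 0-d tensor into a Python float
R1_v = R1.detach().item()
C1_v = C1.detach().item()

num = [Rs_v * R1_v * C1_v, R1_v + Rs_v]
den = [C1_v * R1_v, 1]
G = control.tf(num, den)    # deepcopy now succeeds on plain floats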
answered by pristine