
How to move YoloV8 model onto GPU?

Tags:

yolo

I am creating a YOLOv8 model and loading some pre-trained weights. I then want to use that model to run inference on some images; however, I want to specify that the inference should run on the GPU. Is it possible to do this when creating the YOLO model?

I am loading the model like this:

model = YOLO("yolov8n.pt") 

but when I pass in a device like so:

model = YOLO("yolov8n.pt", device='gpu') 

I get an unexpected argument error:

TypeError: __init__() got an unexpected keyword argument 'device'
asked Oct 15 '25 by 219CID

1 Answer

To move a YOLO model to the GPU, use the PyTorch .to() method, like so:

model = YOLO("yolov8n.pt") 
model.to('cuda')

Some useful docs are here.

You can also specify the device explicitly on each prediction call. See the docs here:

model.predict(source, save=True, imgsz=320, conf=0.5, device='cuda')
answered Oct 17 '25 by 219CID


