Fix logger error in DeepStream 6.0 / 6.0.1 + Change output classes format + Fixes
@@ -43,30 +43,6 @@ Generate the ONNX model file (example for DAMO-YOLO-S*)
python3 export_damoyolo.py -w damoyolo_tinynasL25_S_477.pth -c configs/damoyolo_tinynasL25_S.py --dynamic
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6)

```
--dynamic
```

**NOTE**: To use implicit batch-size (example for batch-size = 4)

```
--batch 4
```

**NOTE**: If you are using DeepStream 5.1, remove the `--dynamic` arg and use opset 11 or lower. The default opset is 11.

```
--opset 11
```

**NOTE**: To change the inference size (default: 640)

```
@@ -88,6 +64,30 @@ or
-s 1280 1280
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6.0)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6.1)

```
--dynamic
```

**NOTE**: To use implicit batch-size (example for batch-size = 4)

```
--batch 4
```

**NOTE**: If you are using DeepStream 5.1, remove the `--dynamic` arg and use opset 11 or lower. The default opset is 11.

```
--opset 11
```
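**NOTE**: Illustrative only (not from the upstream docs): assuming the export script accepts these flags together, a DeepStream 6.0 export with a simplified model, implicit batch-size 4 and 1280x1280 input could look like

```
# sketch: combines the flags documented above; adjust to your DeepStream version
python3 export_damoyolo.py -w damoyolo_tinynasL25_S_477.pth -c configs/damoyolo_tinynasL25_S.py --simplify --batch 4 -s 1280 1280
```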
#### 5. Copy generated files

Copy the generated ONNX model file and labels.txt file (if generated) to the `DeepStream-Yolo` folder.
@@ -41,13 +41,13 @@ pip3 install onnx onnxsim onnxruntime
python3 export_ppyoloe.py -w ppyoloe_plus_crn_s_80e_coco.pdparams -c configs/ppyoloe/ppyoloe_plus_crn_s_80e_coco.yml --dynamic
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6)
**NOTE**: To simplify the ONNX model (DeepStream >= 6.0)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6)
**NOTE**: To use dynamic batch-size (DeepStream >= 6.1)

```
--dynamic
@@ -46,30 +46,6 @@ Generate the ONNX model file (example for YOLO-NAS S)
python3 export_yolonas.py -m yolo_nas_s -w yolo_nas_s_coco.pth --dynamic
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6)

```
--dynamic
```

**NOTE**: To use implicit batch-size (example for batch-size = 4)

```
--batch 4
```

**NOTE**: If you are using DeepStream 5.1, remove the `--dynamic` arg and use opset 12 or lower. The default opset is 14.

```
--opset 12
```

**NOTE**: Model names

```
@@ -88,6 +64,18 @@ or
-m yolo_nas_l
```

**NOTE**: Number of classes (example for 80 classes)

```
-n 80
```

or

```
--classes 80
```

**NOTE**: To change the inference size (default: 640)

```
@@ -109,6 +97,30 @@ or
-s 1280 1280
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6.0)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6.1)

```
--dynamic
```

**NOTE**: To use implicit batch-size (example for batch-size = 4)

```
--batch 4
```

**NOTE**: If you are using DeepStream 5.1, remove the `--dynamic` arg and use opset 12 or lower. The default opset is 14.

```
--opset 12
```
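**NOTE**: Illustrative only: assuming the flags above can be combined in one call, a DeepStream >= 6.1 export of YOLO-NAS S with 80 classes, a simplified model and dynamic batch-size could look like

```
# sketch: assumes the documented flags stack in a single invocation
python3 export_yolonas.py -m yolo_nas_s -w yolo_nas_s_coco.pth -n 80 --simplify --dynamic
```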
#### 5. Copy generated file

Copy the generated ONNX model file to the `DeepStream-Yolo` folder.
@@ -55,37 +55,13 @@ Generate the ONNX model file
python3 export_yolor.py -w yolor-p6.pt --dynamic
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6)

```
--dynamic
```

**NOTE**: To use implicit batch-size (example for batch-size = 4)

```
--batch 4
```

**NOTE**: If you are using DeepStream 5.1, remove the `--dynamic` arg and use opset 12 or lower. The default opset is 12.

```
--opset 12
```

**NOTE**: To convert a P6 model

```
--p6
```

**NOTE**: To change the inference size (default: 640)
**NOTE**: To change the inference size (default: 640 / 1280 for `--p6` models)

```
-s SIZE
@@ -106,6 +82,30 @@ or
-s 1280 1280
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6.0)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6.1)

```
--dynamic
```

**NOTE**: To use implicit batch-size (example for batch-size = 4)

```
--batch 4
```

**NOTE**: If you are using DeepStream 5.1, remove the `--dynamic` arg and use opset 12 or lower. The default opset is 12.

```
--opset 12
```
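**NOTE**: Illustrative only: assuming the flags above can be combined in one call, a DeepStream 6.0 export of the P6 weights with a simplified model and implicit batch-size 4 could look like

```
# sketch: --p6 is passed because yolor-p6.pt is a P6 model; adjust for your weights
python3 export_yolor.py -w yolor-p6.pt --p6 --simplify --batch 4
```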
#### 5. Copy generated files

Copy the generated ONNX model file and labels.txt file (if generated) to the `DeepStream-Yolo` folder.
@@ -46,13 +46,13 @@ Generate the ONNX model file (example for YOLOX-s)
python3 export_yolox.py -w yolox_s.pth -c exps/default/yolox_s.py --dynamic
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6)
**NOTE**: To simplify the ONNX model (DeepStream >= 6.0)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6)
**NOTE**: To use dynamic batch-size (DeepStream >= 6.1)

```
--dynamic
@@ -47,37 +47,13 @@ Generate the ONNX model file (example for YOLOv5s)
python3 export_yoloV5.py -w yolov5s.pt --dynamic
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6)

```
--dynamic
```

**NOTE**: To use implicit batch-size (example for batch-size = 4)

```
--batch 4
```

**NOTE**: If you are using DeepStream 5.1, remove the `--dynamic` arg and use opset 12 or lower. The default opset is 17.

```
--opset 12
```

**NOTE**: To convert a P6 model

```
--p6
```

**NOTE**: To change the inference size (default: 640)
**NOTE**: To change the inference size (default: 640 / 1280 for `--p6` models)

```
-s SIZE
@@ -98,6 +74,30 @@ or
-s 1280 1280
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6.0)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6.1)

```
--dynamic
```

**NOTE**: To use implicit batch-size (example for batch-size = 4)

```
--batch 4
```

**NOTE**: If you are using DeepStream 5.1, remove the `--dynamic` arg and use opset 12 or lower. The default opset is 17.

```
--opset 12
```
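**NOTE**: Illustrative only: assuming the flags above can be combined in one call, a DeepStream 5.1 export (no `--dynamic`, opset lowered to 12) with implicit batch-size 4 could look like

```
# sketch: DeepStream 5.1 setup, so --dynamic is omitted and --opset 12 is used
python3 export_yoloV5.py -w yolov5s.pt --opset 12 --batch 4
```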
#### 5. Copy generated files

Copy the generated ONNX model file and labels.txt file (if generated) to the `DeepStream-Yolo` folder.
@@ -47,37 +47,13 @@ Generate the ONNX model file (example for YOLOv6-S 4.0)
python3 export_yoloV6.py -w yolov6s.pt --dynamic
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6)

```
--dynamic
```

**NOTE**: To use implicit batch-size (example for batch-size = 4)

```
--batch 4
```

**NOTE**: If you are using DeepStream 5.1, remove the `--dynamic` arg and use opset 12 or lower. The default opset is 13.

```
--opset 12
```

**NOTE**: To convert a P6 model

```
--p6
```

**NOTE**: To change the inference size (default: 640)
**NOTE**: To change the inference size (default: 640 / 1280 for `--p6` models)

```
-s SIZE
@@ -98,6 +74,30 @@ or
-s 1280 1280
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6.0)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6.1)

```
--dynamic
```

**NOTE**: To use implicit batch-size (example for batch-size = 4)

```
--batch 4
```

**NOTE**: If you are using DeepStream 5.1, remove the `--dynamic` arg and use opset 12 or lower. The default opset is 13.

```
--opset 12
```
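**NOTE**: Illustrative only: assuming the flags above can be combined in one call, a DeepStream >= 6.1 export with a simplified model, dynamic batch-size and 1280x1280 input could look like

```
# sketch: assumes the documented flags stack in a single invocation
python3 export_yoloV6.py -w yolov6s.pt --simplify --dynamic -s 1280 1280
```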
#### 5. Copy generated file

Copy the generated ONNX model file to the `DeepStream-Yolo` folder.
@@ -49,37 +49,13 @@ Generate the ONNX model file (example for YOLOv7)
python3 export_yoloV7.py -w yolov7.pt --dynamic
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6)

```
--dynamic
```

**NOTE**: To use implicit batch-size (example for batch-size = 4)

```
--batch 4
```

**NOTE**: If you are using DeepStream 5.1, remove the `--dynamic` arg and use opset 12 or lower. The default opset is 12.

```
--opset 12
```

**NOTE**: To convert a P6 model

```
--p6
```

**NOTE**: To change the inference size (default: 640)
**NOTE**: To change the inference size (default: 640 / 1280 for `--p6` models)

```
-s SIZE
@@ -100,6 +76,30 @@ or
-s 1280 1280
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6.0)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6.1)

```
--dynamic
```

**NOTE**: To use implicit batch-size (example for batch-size = 4)

```
--batch 4
```

**NOTE**: If you are using DeepStream 5.1, remove the `--dynamic` arg and use opset 12 or lower. The default opset is 12.

```
--opset 12
```
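**NOTE**: Illustrative only: assuming the flags above can be combined in one call, a DeepStream 6.0 export with a simplified model and implicit batch-size 4 could look like

```
# sketch: implicit batch-size is used here because dynamic batch-size needs DeepStream >= 6.1
python3 export_yoloV7.py -w yolov7.pt --simplify --batch 4
```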
#### 6. Copy generated files

Copy the generated ONNX model file and labels.txt file (if generated) to the `DeepStream-Yolo` folder.
@@ -46,30 +46,6 @@ Generate the ONNX model file (example for YOLOv8s)
python3 export_yoloV8.py -w yolov8s.pt --dynamic
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6)

```
--dynamic
```

**NOTE**: To use implicit batch-size (example for batch-size = 4)

```
--batch 4
```

**NOTE**: If you are using DeepStream 5.1, remove the `--dynamic` arg and use opset 12 or lower. The default opset is 16.

```
--opset 12
```

**NOTE**: To change the inference size (default: 640)

```
@@ -91,6 +67,30 @@ or
-s 1280 1280
```

**NOTE**: To simplify the ONNX model (DeepStream >= 6.0)

```
--simplify
```

**NOTE**: To use dynamic batch-size (DeepStream >= 6.1)

```
--dynamic
```

**NOTE**: To use implicit batch-size (example for batch-size = 4)

```
--batch 4
```

**NOTE**: If you are using DeepStream 5.1, remove the `--dynamic` arg and use opset 12 or lower. The default opset is 16.

```
--opset 12
```
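**NOTE**: Illustrative only: assuming the flags above can be combined in one call, a DeepStream >= 6.1 export with a simplified model and dynamic batch-size could look like

```
# sketch: assumes the documented flags stack in a single invocation
python3 export_yoloV8.py -w yolov8s.pt --simplify --dynamic
```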
#### 5. Copy generated files

Copy the generated ONNX model file and labels.txt file (if generated) to the `DeepStream-Yolo` folder.