<img src="https://user-images.githubusercontent.com/33729709/222878237-fb9e902e-79ef-4393-9bb6-e1bc9b3a77b3.gif" width="120" height="40" />
# YOLOv8-3D (tracker)
## Description:
YOLOv8-3D is a lightweight and user-friendly library designed for efficient 2D and 3D bounding-box object detection in Advanced Driver Assistance Systems (ADAS). With its intuitive API and comprehensive feature set, YOLOv8-3D makes it straightforward to integrate object detection capabilities into your ADAS projects.
<div align="center">
<a href="https://github.com/bharath5673/YOLOv8-3D">
<img src="assets/demo.gif" alt="YOLOv8-3D" width="1000"/>
</a>
<p>
By making 3D perception easy to understand and integrate, this API helps systems make more informed decisions and operate effectively in complex, real-world environments.
</p>
</div>
### Key Features:
- Simplified API: YOLOv8-3D provides a straightforward API that allows you to quickly implement object detection in your ADAS system.
- 2D and 3D Bounding Boxes: Detect both 2D and 3D bounding boxes for accurate spatial understanding of objects.
- Efficient Processing: Leverage optimized algorithms for fast and reliable object detection performance.
- Flexible Integration: YOLOv8-3D is designed to integrate seamlessly with existing ADAS systems and frameworks.
- Comprehensive Documentation: Extensive documentation and examples ensure that you can get started quickly and easily.
- Scalable: Scale your detection capabilities by deploying YOLOv8-3D across a variety of scenarios and environments.
- State-of-the-Art Backbones: Supports well-proven, state-of-the-art CNN models with an easy training setup.
- 3D Augmentations: Includes data augmentations tailored for 3D bounding-box training.
- Supported Architectures: The API supports ResNets, VGG nets, MobileNets, EfficientNets, and a MOT tracker.
## Special features
<div align="center">
<img src="assets/Screenshot from 2023-10-18 16-17-03.png" width="350"/> <img src="assets/mobilenetv2_results_plot.png" width="600"/>
<p>
Augmentations for better training, automated training backups, and results plots
</p>
</div>
## Get started
### Prerequisites
- Python 3.x (tested on 3.10, Ubuntu 22.04)
- OpenCV
- Tensorflow
- PyTorch
- NumPy
- KITTI 3d dataset [download from here](https://www.cvlibs.net/datasets/kitti/eval_object.php?obj_benchmark=3d)
- [Visualize KITTI Objects in Camera, Point Cloud and BEV in Videos](https://github.com/HengLan/Visualize-KITTI-Objects-in-Videos)
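KITTI stores one annotated object per line in its label files. As a quick orientation, here is a minimal parser sketch; the field order follows the official KITTI object development kit, and the sample line below is illustrative, not taken from a real KITTI file:

```python
def parse_kitti_label(line):
    """Parse one line of a KITTI object label file into a dict.

    Field order per the KITTI object devkit: type, truncated, occluded,
    alpha, 2D bbox (4 values), dimensions h/w/l, location x/y/z, rotation_y.
    """
    parts = line.strip().split(" ")
    return {
        "type": parts[0],
        "truncated": float(parts[1]),
        "occluded": int(parts[2]),
        "alpha": float(parts[3]),
        "bbox_2d": [float(v) for v in parts[4:8]],      # left, top, right, bottom (pixels)
        "dimensions": [float(v) for v in parts[8:11]],  # height, width, length (metres)
        "location": [float(v) for v in parts[11:14]],   # x, y, z in camera coordinates
        "rotation_y": float(parts[14]),
    }

# Illustrative sample line (values are made up)
sample = "Car 0.00 0 -1.57 614.24 181.78 727.31 284.77 1.57 1.73 4.15 1.00 1.75 13.22 -1.62"
print(parse_kitti_label(sample)["dimensions"])
```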
### Installation
1. Clone this repository.
2. Install the required dependencies.
### Run
#### For training
```bash
conda create -n test1 python=3.10 -y
conda activate test1
pip install tensorflow
```
For more detailed tensorflow gpu installation instructions and options, refer to [this documentation](https://www.tensorflow.org/install/pip).
```python
# Select the backbone model in train.py:
# select_model = 'resnet50'
# select_model ='resnet101'
# select_model = 'resnet152'
# select_model = 'vgg11'
# select_model = 'vgg16'
# select_model = 'vgg19'
# select_model = 'efficientnetb0'
# select_model = 'efficientnetb5'
select_model = 'mobilenetv2'
```
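The `select_model` string presumably switches the CNN backbone used by `train.py`. A sketch of how such a dispatch could be written follows; the dictionary keys mirror the options above, but the factory functions are stand-ins, not the repository's actual builders (which would construct Keras/TensorFlow models):

```python
# Hypothetical dispatch table from select_model strings to backbone builders.
# The lambdas return descriptive strings so the pattern is runnable on its own.
def build_backbone(select_model):
    backbones = {
        "resnet50": lambda: "ResNet50 backbone",
        "resnet101": lambda: "ResNet101 backbone",
        "vgg16": lambda: "VGG16 backbone",
        "efficientnetb0": lambda: "EfficientNetB0 backbone",
        "mobilenetv2": lambda: "MobileNetV2 backbone",
    }
    try:
        return backbones[select_model]()
    except KeyError:
        raise ValueError(
            f"Unknown model {select_model!r}; choose one of {sorted(backbones)}"
        )

print(build_backbone("mobilenetv2"))  # → MobileNetV2 backbone
```

Using a lookup table keeps the list of supported backbones in one place and gives a clear error for unsupported names.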
```bash
# [INFO] Set the number of iterations of train.py in run_train.sh.
# Training info is automatically backed up every 20 epochs.
bash run_train.sh
```
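The every-20-epochs backup mentioned above amounts to periodic checkpointing inside the training loop. A toy sketch of that pattern (the loop body and checkpoint list are illustrative, not the repository's actual code):

```python
def train(num_epochs, backup_every=20):
    """Toy training loop that records a checkpoint every `backup_every` epochs."""
    checkpoints = []
    for epoch in range(1, num_epochs + 1):
        # ... one epoch of forward/backward passes would run here ...
        if epoch % backup_every == 0:
            checkpoints.append(epoch)  # stand-in for saving weights/history to disk
    return checkpoints

print(train(100))  # → [20, 40, 60, 80, 100]
```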
A fresh environment is [recommended](https://github.com/bharath5673/YOLOv8-3D/issues/1#issuecomment-1770855065) for running inference on CPU only:
#### For testing
```bash
conda create -n test2 python=3.10 -y
conda activate test2
pip install tensorflow ultralytics
```
```bash
python demo.py
```
<br>
<img src="https://media0.giphy.com/media/J19OSJKmqCyP7Mfjt1/giphy.gif" width="80" height="30" />
## Realtime BEV plot
<div align="center">
<a href="https://github.com/bharath5673/YOLOv8-3D">
<img src="assets/demo_bev.gif" alt="YOLOv8-3D" width="1000"/>
</a>
</div>
Set the following flags in `demo.py` to enable the BEV plot and tracking:
```python
BEV_plot = True
TracK = True
```
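With the BEV plot enabled, each detected 3D box centre is presumably projected from camera coordinates onto a top-down grid. A minimal ground-plane-to-pixel conversion is sketched below; the grid size and resolution are assumptions for illustration, not values from `demo.py`:

```python
def to_bev_pixel(x, z, bev_width=600, bev_height=600, pixels_per_metre=10):
    """Map a camera-frame ground-plane point (x right, z forward, in metres)
    to pixel coordinates on a top-down BEV image, ego vehicle at bottom centre."""
    px = int(bev_width / 2 + x * pixels_per_metre)
    py = int(bev_height - z * pixels_per_metre)  # image y grows downward
    return px, py

# A car 13.2 m ahead and 1 m to the right of the camera:
print(to_bev_pixel(1.0, 13.2))  # → (310, 468)
```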
### Contributing
Contributions are welcome! If you find any issues or have suggestions for improvements, please open an issue or submit a pull request.
### Acknowledgements
<details><summary> <b>Expand</b> </summary>
* [https://github.com/AlexeyAB/darknet](https://github.com/AlexeyAB/darknet)
* [https://github.com/WongKinYiu/yolor](https://github.com/WongKinYiu/yolor)
* [https://github.com/WongKinYiu/PyTorch_YOLOv4](https://github.com/WongKinYiu/PyTorch_YOLOv4)
* [https://github.com/WongKinYiu/ScaledYOLOv4](https://github.com/WongKinYiu/ScaledYOLOv4)
* [https://github.com/Megvii-BaseDetection/YOLOX](https://github.com/Megvii-BaseDetection/YOLOX)
* [https://github.com/ultralytics/yolov3](https://github.com/ultralytics/yolov3)
* [https://github.com/ultralytics/yolov5](https://github.com/ultralytics/yolov5)
* [https://github.com/DingXiaoH/RepVGG](https://github.com/DingXiaoH/RepVGG)
* [https://github.com/JUGGHM/OREPA_CVPR2022](https://github.com/JUGGHM/OREPA_CVPR2022)
* [https://github.com/TexasInstruments/edgeai-yolov5/tree/yolo-pose](https://github.com/TexasInstruments/edgeai-yolov5/tree/yolo-pose)
* [https://www.cvlibs.net/datasets/kitti/eval_object.php?obj_benchmark=3d](https://www.cvlibs.net/datasets/kitti/eval_object.php?obj_benchmark=3d)
* [https://opencv.org/](https://opencv.org/)
* [https://github.com/ultralytics/ultralytics](https://github.com/ultralytics/ultralytics)
* [https://github.com/lzccccc/3d-bounding-box-estimation-for-autonomous-driving](https://github.com/lzccccc/3d-bounding-box-estimation-for-autonomous-driving)
* [https://github.com/lzccccc/SMOKE](https://github.com/lzccccc/SMOKE)
* [https://github.com/abhi1kumar/DEVIANT.git](https://github.com/abhi1kumar/DEVIANT.git)
</details>
