<div align="center">
<p>
<a align="left" href="https://ultralytics.com/yolov5" target="_blank">
<img width="850" src="https://github.com/ultralytics/yolov5/releases/download/v1.0/splash.jpg"></a>
</p>
<br>
<div>
<a href="https://github.com/ultralytics/yolov5/actions"><img src="https://github.com/ultralytics/yolov5/workflows/CI%20CPU%20testing/badge.svg" alt="CI CPU testing"></a>
<a href="https://zenodo.org/badge/latestdoi/264818686"><img src="https://zenodo.org/badge/264818686.svg" alt="YOLOv5 Citation"></a>
<a href="https://hub.docker.com/r/ultralytics/yolov5"><img src="https://img.shields.io/docker/pulls/ultralytics/yolov5?logo=docker" alt="Docker Pulls"></a>
<br>
<a href="https://colab.research.google.com/github/ultralytics/yolov5/blob/master/tutorial.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a>
<a href="https://www.kaggle.com/ultralytics/yolov5"><img src="https://kaggle.com/static/images/open-in-kaggle.svg" alt="Open In Kaggle"></a>
<a href="https://join.slack.com/t/ultralytics/shared_invite/zt-w29ei8bp-jczz7QYUmDtgo6r6KcMIAg"><img src="https://img.shields.io/badge/Slack-Join_Forum-blue.svg?logo=slack" alt="Join Forum"></a>
</div>
<br>
<p>
YOLOv5 🚀 is a family of object detection architectures and models pretrained on the COCO dataset, and represents <a href="https://ultralytics.com">Ultralytics</a>
open-source research into future vision AI methods, incorporating lessons learned and best practices evolved over thousands of hours of research and development.
</p>
<div align="center">
<a href="https://github.com/ultralytics">
<img src="https://github.com/ultralytics/yolov5/releases/download/v1.0/logo-social-github.png" width="2%"/>
</a>
<img width="2%" />
<a href="https://www.linkedin.com/company/ultralytics">
<img src="https://github.com/ultralytics/yolov5/releases/download/v1.0/logo-social-linkedin.png" width="2%"/>
</a>
<img width="2%" />
<a href="https://twitter.com/ultralytics">
<img src="https://github.com/ultralytics/yolov5/releases/download/v1.0/logo-social-twitter.png" width="2%"/>
</a>
<img width="2%" />
<a href="https://www.producthunt.com/@glenn_jocher">
<img src="https://github.com/ultralytics/yolov5/releases/download/v1.0/logo-social-producthunt.png" width="2%"/>
</a>
<img width="2%" />
<a href="https://youtube.com/ultralytics">
<img src="https://github.com/ultralytics/yolov5/releases/download/v1.0/logo-social-youtube.png" width="2%"/>
</a>
<img width="2%" />
<a href="https://www.facebook.com/ultralytics">
<img src="https://github.com/ultralytics/yolov5/releases/download/v1.0/logo-social-facebook.png" width="2%"/>
</a>
<img width="2%" />
<a href="https://www.instagram.com/ultralytics/">
<img src="https://github.com/ultralytics/yolov5/releases/download/v1.0/logo-social-instagram.png" width="2%"/>
</a>
</div>
<!--
<a align="center" href="https://ultralytics.com/yolov5" target="_blank">
<img width="800" src="https://github.com/ultralytics/yolov5/releases/download/v1.0/banner-api.png"></a>
-->
</div>
## <div align="center">Documentation</div>
See the [YOLOv5 Docs](https://docs.ultralytics.com) for full documentation on training, testing and deployment.
## <div align="center">Quick Start Examples</div>
<details open>
<summary>Install</summary>
Clone repo and install [requirements.txt](https://github.com/ultralytics/yolov5/blob/master/requirements.txt) in a
[**Python>=3.7.0**](https://www.python.org/) environment, including
[**PyTorch>=1.7**](https://pytorch.org/get-started/locally/).
```bash
git clone https://github.com/ultralytics/yolov5 # clone
cd yolov5
pip install -r requirements.txt # install
```
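Before installing, it can help to confirm the environment meets the stated floor. A minimal sketch of such a check (the PyTorch probe is guarded because torch may not be installed yet):

```python
import sys

# README floor: Python>=3.7.0
assert sys.version_info >= (3, 7), "YOLOv5 requires Python>=3.7.0"

# PyTorch>=1.7 is also required; probe it only if it is already installed.
try:
    import torch
    print("torch", torch.__version__, "CUDA available:", torch.cuda.is_available())
except ImportError:
    print("torch not installed yet - run: pip install -r requirements.txt")
```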
</details>
<details open>
<summary>Inference</summary>
Inference with YOLOv5 and [PyTorch Hub](https://github.com/ultralytics/yolov5/issues/36).
[Models](https://github.com/ultralytics/yolov5/tree/master/models) download automatically from the latest
YOLOv5 [release](https://github.com/ultralytics/yolov5/releases).
```python
import torch
# Model
model = torch.hub.load('ultralytics/yolov5', 'yolov5s') # or yolov5m, yolov5l, yolov5x, custom
# Images
img = 'https://ultralytics.com/images/zidane.jpg' # or file, Path, PIL, OpenCV, numpy, list
# Inference
results = model(img)
# Results
results.print() # or .show(), .save(), .crop(), .pandas(), etc.
```
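The `results` object exposes boxes as corner coordinates (one `x1, y1, x2, y2, conf, cls` row per detection in `results.xyxy`), while the network itself predicts center/size boxes. A minimal sketch of the corner conversion that repo helpers such as `xywh2xyxy` perform:

```python
def xywh2xyxy(box):
    """Convert [x_center, y_center, width, height] to [x1, y1, x2, y2]."""
    x, y, w, h = box
    return [x - w / 2, y - h / 2, x + w / 2, y + h / 2]

# A 20x10 box centered at (50, 50):
print(xywh2xyxy([50, 50, 20, 10]))  # → [40.0, 45.0, 60.0, 55.0]
```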
</details>
<details>
<summary>Inference with detect.py</summary>
`detect.py` runs inference on a variety of sources, downloading [models](https://github.com/ultralytics/yolov5/tree/master/models) automatically from
the latest YOLOv5 [release](https://github.com/ultralytics/yolov5/releases) and saving results to `runs/detect`.
```bash
python detect.py --source 0 # webcam
img.jpg # image
vid.mp4 # video
path/ # directory
path/*.jpg # glob
'https://youtu.be/Zgi9g1ksQHc' # YouTube
'rtsp://example.com/media.mp4' # RTSP, RTMP, HTTP stream
```
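The source types above are distinguished by the shape of the argument alone. This is not detect.py's actual dispatch code, just an illustrative sketch of how such a string can be categorized:

```python
def classify_source(source: str) -> str:
    """Illustrative only: categorize a --source style argument."""
    if source.isdigit():
        return "webcam"     # e.g. 0
    if source.lower().startswith(("rtsp://", "rtmp://", "http://", "https://")):
        return "stream"     # RTSP/RTMP/HTTP streams and YouTube URLs
    if "*" in source:
        return "glob"       # e.g. path/*.jpg
    if source.endswith("/"):
        return "directory"  # e.g. path/
    return "file"           # e.g. img.jpg, vid.mp4

print(classify_source("https://youtu.be/Zgi9g1ksQHc"))  # → stream
```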
</details>
<details>
<summary>Training</summary>
The commands below reproduce YOLOv5 [COCO](https://github.com/ultralytics/yolov5/blob/master/data/scripts/get_coco.sh)
results. [Models](https://github.com/ultralytics/yolov5/tree/master/models)
and [datasets](https://github.com/ultralytics/yolov5/tree/master/data) download automatically from the latest
YOLOv5 [release](https://github.com/ultralytics/yolov5/releases). Training times for YOLOv5n/s/m/l/x are
1/2/4/6/8 days on a single V100 GPU ([Multi-GPU](https://github.com/ultralytics/yolov5/issues/475) setups are proportionally faster). Use the
largest `--batch-size` possible, or pass `--batch-size -1` for
YOLOv5 [AutoBatch](https://github.com/ultralytics/yolov5/pull/5092). Batch sizes shown for V100-16GB.
```bash
python train.py --data coco.yaml --cfg yolov5n.yaml --weights '' --batch-size 128
yolov5s 64
yolov5m 40
yolov5l 24
yolov5x 16
```
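As a rough sanity check on these settings, iterations per epoch at the listed batch sizes can be computed. The ~118k image count for COCO train2017 is an assumption here, not stated above:

```python
import math

COCO_TRAIN2017_IMAGES = 118_287  # assumed dataset size, not from this README

for name, bs in [("yolov5n", 128), ("yolov5s", 64), ("yolov5m", 40),
                 ("yolov5l", 24), ("yolov5x", 16)]:
    print(f"{name}: {math.ceil(COCO_TRAIN2017_IMAGES / bs)} iterations/epoch")
```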
<img width="800" src="https://user-images.githubusercontent.com/26833433/90222759-949d8800-ddc1-11ea-9fa1-1c97eed2b963.png">
</details>
<details open>
<summary>Tutorials</summary>
* [Train Custom Data](https://github.com/ultralytics/yolov5/wiki/Train-Custom-Data) 🚀 RECOMMENDED
* [Tips for Best Training Results](https://github.com/ultralytics/yolov5/wiki/Tips-for-Best-Training-Results) ☘️ RECOMMENDED
* [Weights & Biases Logging](https://github.com/ultralytics/yolov5/issues/1289) 🌟 NEW
* [Roboflow for Datasets, Labeling, and Active Learning](https://github.com/ultralytics/yolov5/issues/4975) 🌟 NEW
* [Multi-GPU Training](https://github.com/ultralytics/yolov5/issues/475)
* [PyTorch Hub](https://github.com/ultralytics/yolov5/issues/36) ⭐ NEW
* [TFLite, ONNX, CoreML, TensorRT Export](https://github.com/ultralytics/yolov5/issues/251) 🚀
* [Test-Time Augmentation (TTA)](https://github.com/ultralytics/yolov5/issues/303)
* [Model Ensembling](https://github.com/ultralytics/yolov5/issues/318)
* [Model Pruning/Sparsity](https://github.com/ultralytics/yolov5/issues/304)
* [Hyperparameter Evolution](https://github.com/ultralytics/yolov5/issues/607)
* [Transfer Learning with Frozen Layers](https://github.com/ultralytics/yolov5/issues/1314) ⭐ NEW
* [TensorRT Deployment](https://github.com/wang-xinyu/tensorrtx)
</details>
## <div align="center">Environments</div>
Get started in seconds with our verified environments. Click each icon below for details.
<div align="center">
<a href="https://colab.research.google.com/github/ultralytics/yolov5/blob/master/tutorial.ipynb">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a>
</div>
## <div align="center">About This Package</div>
Source code for a gait-recognition-based, multi-target, cross-camera tracking and detection system built on YOLOv5 (an undergraduate AI graduation project; thesis topic: research on multi-target cross-camera tracking algorithms based on gait recognition). Core algorithm: the YOLOv5 + DeepSORT framework for object detection and tracking, combined with the GaitSet algorithm for gait recognition.
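The detection → tracking → recognition flow described above can be sketched as a skeleton. Every function below is a hypothetical stand-in (the real system wires YOLOv5, DeepSORT, and GaitSet models into these slots), shown only to make the data flow concrete:

```python
def detect_people(frame):
    """YOLOv5 stand-in: return person boxes (x1, y1, x2, y2) for one frame."""
    return [(10, 20, 60, 180)]  # one dummy detection

def update_tracks(tracks, boxes, frame_idx):
    """DeepSORT stand-in: naively key each box by its index so IDs persist."""
    for i, box in enumerate(boxes):
        tracks.setdefault(i, []).append((frame_idx, box))
    return tracks

def gait_feature(track):
    """GaitSet stand-in: collapse a box sequence into one feature value."""
    heights = [y2 - y1 for _, (x1, y1, x2, y2) in track]
    return sum(heights) / len(heights)

tracks = {}
for frame_idx in range(3):  # three dummy frames from one camera
    tracks = update_tracks(tracks, detect_people(frame_idx), frame_idx)

features = {tid: gait_feature(t) for tid, t in tracks.items()}
print(features)  # → {0: 160.0}
```

In the actual system, the per-track features would be compared across cameras to re-identify the same person by gait; here they are just averaged box heights.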