High-performance neural network inference framework
ncnn is a high-performance neural network inference computing framework optimized for mobile platforms. Deployment on mobile phones has been a core consideration from the beginning of its design. ncnn has no third-party dependencies, is cross-platform, and runs faster than all other known open-source frameworks on mobile phone CPUs. With ncnn, developers can easily deploy deep learning models to mobile platforms, creating intelligent apps and bringing artificial intelligence to your fingertips. ncnn is currently used in many Tencent applications, such as QQ, Qzone, WeChat, and Pitu.
$ pkg install ncnn

Origin: misc/ncnn
Size: 48.7 MiB
License: BSD3CLAUSE
Maintainer: yuri@FreeBSD.org
Dependencies: 2 packages
Required by: 0 packages