MegEngine

English | 中文

MegEngine is a fast, scalable, and easy-to-use deep learning framework with automatic differentiation.


Installation

NOTE: MegEngine currently supports Python installation on 64-bit Linux, 64-bit Windows, and macOS 10.14+ (CPU only), with Python 3.5 to 3.8. On Windows 10 you can either install the Linux distribution through the Windows Subsystem for Linux (WSL) or install the Windows distribution directly. Many other platforms are supported for inference.

Binaries

To install the pre-built binaries via pip wheels:

python3 -m pip install megengine -f https://megengine.org.cn/whl/mge.html
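
After installation, a quick sanity check from the command line confirms that the package imports cleanly and prints its version:

python3 -c "import megengine; print(megengine.__version__)"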

Building from Source

Prerequisites

Most of MegEngine's dependencies are located in the third_party directory and can be prepared by executing:

./third_party/prepare.sh
./third_party/install-mkl.sh

Some dependencies, however, need to be installed manually; a quick way to check for them is sketched after the list:

  • CUDA(>=10.1), cuDNN(>=7.6) are required when building MegEngine with CUDA support.
  • TensorRT(>=5.1.5) is required when building with TensorRT support.
  • LLVM/Clang(>=6.0) is required when building with Halide JIT support.
  • Python(>=3.5) and numpy are required to build Python modules.

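A rough way to check the manually installed prerequisites (paths and exact commands vary by system, so treat this as a sketch rather than an authoritative check; cuDNN and TensorRT versions are typically read from the headers shipped with those libraries):

nvcc --version        # CUDA toolkit, should report >=10.1
clang --version       # LLVM/Clang, should report >=6.0 (only needed for Halide JIT)
python3 --version     # Python, should report >=3.5
python3 -c "import numpy; print(numpy.__version__)"
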
Build

MegEngine uses CMake as the build tool. We provide the following scripts to facilitate building.

  • host_build.sh builds MegEngine that runs on the same host machine (i.e., no cross compiling). The following command displays the usage:
    scripts/cmake-build/host_build.sh -h
    
  • cross_build_android_arm_inference.sh builds MegEngine for DNN inference on Android-ARM platforms. The following command displays the usage:
    scripts/cmake-build/cross_build_android_arm_inference.sh -h
    
  • cross_build_linux_arm_inference.sh builds MegEngine for DNN inference on Linux-ARM platforms. The following command displays the usage:
    scripts/cmake-build/cross_build_linux_arm_inference.sh -h
    
  • cross_build_ios_arm_inference.sh builds MegEngine for DNN inference on iOS (iPhone/iPad) platforms. The following command displays the usage:
    scripts/cmake-build/cross_build_ios_arm_inference.sh -h
    

Please refer to BUILD_README.md for more details.
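
As an illustration, a default host build is simply:

scripts/cmake-build/host_build.sh

If you prefer to invoke CMake directly, the sketch below shows the general shape; the MGE_WITH_CUDA option name is an assumption about this project's CMake configuration, so confirm the actual option names in BUILD_README.md:

mkdir -p build && cd build
cmake .. -DMGE_WITH_CUDA=ON    # assumed option name; check BUILD_README.md
make -j$(nproc)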

How to Contribute

We strive to build an open and friendly community. We aim to power humanity with AI.

How to Contact Us

Resources

  • MegEngine website: https://megengine.org.cn/

License

MegEngine is licensed under the Apache License, Version 2.0.

Copyright (c) 2014-2021 Megvii Inc. All rights reserved.
