
View in Chinese

What Is MindSpore Lite

MindSpore Lite provides lightweight AI inference acceleration for a range of hardware devices, enabling intelligent applications and offering developers an end-to-end solution. It delivers a developer-friendly, efficient, and flexible deployment experience for algorithm engineers and data scientists, helping the AI software and hardware application ecosystem thrive. Going forward, MindSpore Lite will continue to work with the MindSpore AI community to further enrich this ecosystem.

For more details, please check out the MindSpore Lite Architecture Guide.

Example

MindSpore Lite delivers double the inference performance for AIGC, speech, and CV models, and has been commercially deployed in Huawei's flagship smartphones. As shown in the figures below, MindSpore Lite supports CV algorithms such as image style transfer and image segmentation.

[Figures: MindIR inference cases, showing the original image and rendered result for image segmentation, and the original image and rendered result for image style transfer]

Quick Start

  1. Compile

    MindSpore Lite can be compiled for multiple hardware backends:

    • For server-side devices, users can build dynamic libraries and Python wheel packages by setting compilation options such as MSLITE_ENABLE_CLOUD_INFERENCE, enabling inference on Ascend and CPU hardware. For a detailed compilation tutorial, please refer to the official MindSpore Lite website.

    • For device-side and edge devices, dynamic libraries can be built with different cross-compilation toolchains. For a detailed compilation tutorial, please refer to the official MindSpore Lite website.

  2. Model conversion

    MindSpore Lite supports converting models serialized by various AI frameworks, such as MindSpore, ONNX, and TensorFlow, into the MindSpore Lite IR format. To achieve more efficient inference, models can be converted to either the .ms or the .mindir format, where:

    • The .mindir model is used for inference on server-side devices and is more compatible with the model structures exported by the MindSpore training framework. It is mainly suited to Ascend cards and x86/Arm CPU hardware. For detailed conversion methods, please refer to the Conversion Tool Tutorial.

    • The .ms model is mainly used for inference on device-side and edge devices, and is mainly suited to terminal hardware such as the Kirin NPU and Arm CPUs. To keep the model file small, the .ms model is serialized and deserialized with FlatBuffers. For detailed instructions on using the conversion tool, please refer to the Conversion Tool Tutorial; see the conversion sketch below.
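
    The example below is a minimal Python conversion sketch. It assumes the mindspore_lite wheel with converter support is installed; the file names are placeholders, and the exact Converter interface differs slightly between MindSpore Lite releases, so treat it as an illustration rather than a drop-in script.

    ```python
    import mindspore_lite as mslite

    # Hypothetical file names: convert an ONNX model into MindSpore Lite format.
    # The no-argument Converter plus convert() call follows the 2.x-style Python API;
    # older releases pass these arguments to the Converter constructor instead.
    converter = mslite.Converter()
    converter.convert(fmk_type=mslite.FmkType.ONNX,
                      model_file="mobilenet_v2.onnx",
                      output_file="mobilenet_v2")
    ```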

  3. Model inference

    MindSpore Lite provides Python, C++, and Java APIs, together with complete usage examples for each; a minimal Python inference sketch is shown below.
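
    The sketch below assumes the mindspore_lite wheel is installed and that a converted mobilenet_v2.mindir file (a placeholder name) is available; the C++ and Java APIs follow the same build-then-predict flow on their respective platforms.

    ```python
    import numpy as np
    import mindspore_lite as mslite

    # Configure the runtime context; "cpu" keeps the sketch hardware-independent.
    context = mslite.Context()
    context.target = ["cpu"]

    # Load the converted model (placeholder file name) and compile it for the context.
    model = mslite.Model()
    model.build_from_file("mobilenet_v2.mindir", mslite.ModelType.MINDIR, context)

    # Fill the first input with random data of the expected shape and run inference.
    inputs = model.get_inputs()
    inputs[0].set_data_from_numpy(
        np.random.rand(*inputs[0].shape).astype(np.float32))
    outputs = model.predict(inputs)
    print(outputs[0].get_data_to_numpy().shape)
    ```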

Technical Solution

MindSpore Lite Features

[Figure: MindSpore Lite architecture]

  1. Device-cloud one-stop inference deployment

    • Provides an end-to-end workflow covering model conversion, optimization, deployment, and inference.

    • A unified IR enables integrated device-cloud AI applications.

  2. Lightweight

    • Provides model compression, which reduces model size and can also improve inference performance.

    • Provides MindSpore Lite Micro, an ultra-lightweight inference solution that meets deployment requirements in highly constrained environments such as smart watches and headphones.

  3. High-performance

    • The built-in high-performance kernel library NNACL supports efficient inference on backends such as CPU, NNRt, and Ascend, maximizing hardware utilization while minimizing inference latency and power consumption.

    • Hand-written assembly implementations improve the performance of key kernel operators on CPU, GPU, and NPU.

  4. Versatility

    • Supports deployment on multiple hardware targets, such as server-side Ascend and CPU.

    • Supports HarmonyOS and Android mobile operating systems.

Further Understanding of MindSpore Lite

If you wish to learn more about MindSpore Lite and start using it, please refer to the following content:

API and documentation

  1. API documentation

  2. MindSpore Lite official website documentation

Key Features

Communication and Feedback

  • Welcome to Gitee Issues: submit questions, reports, and suggestions;

  • Welcome to Community Forum: engage in technical and problem-solving exchanges;

  • Welcome to the SIGs: participate in discussions and exchanges, and help manage and improve the workflow;

Related Communities
