# mindnlp

**Repository Path**: mindspore-lab/mindnlp

## Basic Information

- **Project Name**: mindnlp
- **Description**: MindNLP is an open source NLP library based on MindSpore.
- **Primary Language**: Unknown
- **License**: Apache-2.0
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 38
- **Forks**: 21
- **Created**: 2022-11-15
- **Last Updated**: 2026-03-08

## Categories & Tags

**Categories**: nature-language
**Tags**: None

## README


MindNLP

Run HuggingFace Models on MindSpore with Zero Code Changes

The easiest way to use 200,000+ HuggingFace models on Ascend NPU, GPU, and CPU




---

## 🎯 What is MindNLP?

**MindNLP** bridges the gap between HuggingFace's massive model ecosystem and MindSpore's hardware acceleration. With just `import mindnlp`, you can run any HuggingFace model on **Ascend NPU**, **NVIDIA GPU**, or **CPU** - no code changes required.

```python
import mindnlp  # That's it! HuggingFace now runs on MindSpore

from transformers import pipeline

pipe = pipeline("text-generation", model="Qwen/Qwen2-0.5B")
print(pipe("Hello, I am")[0]["generated_text"])
```

## ⚑ Quick Start

### Text Generation with LLMs

```python
import mindspore
import mindnlp
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="Qwen/Qwen3-8B",
    ms_dtype=mindspore.bfloat16,
    device_map="auto"
)
messages = [{"role": "user", "content": "Write a haiku about coding"}]
print(pipe(messages, max_new_tokens=100)[0]["generated_text"][-1]["content"])
```

### Image Generation with Stable Diffusion

```python
import mindspore
import mindnlp
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "stable-diffusion-v1-5/stable-diffusion-v1-5",
    ms_dtype=mindspore.float16
)
image = pipe("A sunset over mountains, oil painting style").images[0]
image.save("sunset.png")
```

### BERT for Text Classification

```python
import mindnlp
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

inputs = tokenizer("MindNLP is awesome!", return_tensors="pt")
outputs = model(**inputs)
```

## ✨ Features
### πŸ€— Full HuggingFace Compatibility

- **200,000+ models** from HuggingFace Hub
- **Transformers** - All model architectures
- **Diffusers** - Stable Diffusion, SDXL, ControlNet
- **Zero code changes** - Just `import mindnlp`

### πŸš€ Hardware Acceleration

- **Ascend NPU** - Full support for Huawei AI chips
- **NVIDIA GPU** - CUDA acceleration
- **CPU** - Optimized CPU execution
- **Multi-device** - Automatic device placement
### πŸ”§ Advanced Capabilities

- **Mixed precision** - FP16/BF16 training & inference
- **Quantization** - INT8/INT4 with BitsAndBytes
- **Distributed** - Multi-GPU/NPU training
- **PEFT/LoRA** - Parameter-efficient fine-tuning

### πŸ“¦ Easy Integration

- **PyTorch-compatible API** via mindtorch
- **Safetensors** support for fast loading
- **Model Hub mirrors** for faster downloads
- **Comprehensive documentation**
## πŸ§ͺ Mindtorch NPU Debugging

Mindtorch NPU ops are asynchronous by default. Call `torch.npu.synchronize()` when you need to block until pending results are ready. For debugging, set `ACL_LAUNCH_BLOCKING=1` to force per-op synchronization.

## πŸ“¦ Installation

```bash
# From PyPI (recommended)
pip install mindnlp

# From source (latest features)
pip install git+https://github.com/mindspore-lab/mindnlp.git
```
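If you prefer to enable the per-op blocking mode from the NPU debugging section above inside a script rather than the shell, one hedged approach (assuming the Ascend runtime reads the variable once at process start) is:

```python
import os

# Set before importing mindnlp/mindtorch: the runtime reads
# ACL_LAUNCH_BLOCKING at startup, so it must already be "1"
# when the first NPU op launches.
os.environ["ACL_LAUNCH_BLOCKING"] = "1"

# import mindnlp  # import only after the variable is set
```

Remember to drop this in production runs, since per-op synchronization serializes kernel launches and hurts throughput.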
### πŸ“‹ Version Compatibility

| MindNLP | MindSpore | Python |
|---------|-----------|--------|
| 0.6.x | β‰₯2.7.1 | 3.10-3.11 |
| 0.5.x | 2.5.0-2.7.0 | 3.10-3.11 |
| 0.4.x | 2.2.x-2.5.0 | 3.9-3.11 |
## πŸ’‘ Why MindNLP?

| Feature | MindNLP | PyTorch + HF | TensorFlow + HF |
|---------|---------|--------------|-----------------|
| HuggingFace Models | βœ… 200K+ | βœ… 200K+ | ⚠️ Limited |
| Ascend NPU Support | βœ… Native | ❌ | ❌ |
| Zero Code Migration | βœ… | - | ❌ |
| Unified API | βœ… | βœ… | ❌ |
| Chinese Model Support | βœ… Excellent | βœ… Good | ⚠️ Limited |

### πŸ† Key Advantages

1. **Instant Migration**: Your existing HuggingFace code works immediately
2. **Ascend Optimization**: Native support for Huawei NPU hardware
3. **Production Ready**: Battle-tested in enterprise deployments
4. **Active Community**: Regular updates and responsive support

## πŸ—ΊοΈ Supported Models

MindNLP supports **all models** from HuggingFace Transformers and Diffusers. Here are some popular ones:

| Category | Models |
|----------|--------|
| **LLMs** | Qwen, Llama, ChatGLM, Mistral, Phi, Gemma, BLOOM, Falcon |
| **Vision** | ViT, CLIP, Swin, ConvNeXt, SAM, BLIP |
| **Audio** | Whisper, Wav2Vec2, HuBERT, MusicGen |
| **Diffusion** | Stable Diffusion, SDXL, ControlNet |
| **Multimodal** | LLaVA, Qwen-VL, ALIGN |

πŸ‘‰ [View all supported models](https://mindnlp.cqu.ai/supported_models)

## πŸ“š Resources

- πŸ“– [Documentation](https://mindnlp.cqu.ai)
- πŸš€ [Quick Start Guide](https://mindnlp.cqu.ai/quick_start)
- πŸ“ [Tutorials](https://mindnlp.cqu.ai/tutorials/quick_start)
- πŸ’¬ [GitHub Discussions](https://github.com/mindspore-lab/mindnlp/discussions)
- πŸ› [Issue Tracker](https://github.com/mindspore-lab/mindnlp/issues)

## 🀝 Contributing

We welcome contributions! See our [Contributing Guide](https://mindnlp.cqu.ai/contribute) for details.

```bash
# Clone and install for development
git clone https://github.com/mindspore-lab/mindnlp.git
cd mindnlp
pip install -e ".[dev]"
```

## πŸ‘₯ Community

Join the **MindSpore NLP SIG** (Special Interest Group) for discussions, events, and collaboration:

QQ Group


**If you find MindNLP useful, please consider giving it a star ⭐ - it helps the project grow!**

## πŸ“„ License

MindNLP is released under the [Apache 2.0 License](LICENSE).

## πŸ“– Citation

```bibtex
@misc{mindnlp2022,
    title={MindNLP: Easy-to-use and High-performance NLP and LLM Framework Based on MindSpore},
    author={MindNLP Contributors},
    howpublished={\url{https://github.com/mindspore-lab/mindnlp}},
    year={2022}
}
```

---

Made with ❀️ by the MindSpore Lab team