# mindnlp

**Repository Path**: mindspore-lab/mindnlp

## Basic Information

- **Project Name**: mindnlp
- **Description**: MindNLP is an open source NLP library based on MindSpore.
- **Primary Language**: Unknown
- **License**: Apache-2.0
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 38
- **Forks**: 21
- **Created**: 2022-11-15
- **Last Updated**: 2026-03-08

## Categories & Tags

**Categories**: nature-language
**Tags**: None

## README
# Run HuggingFace Models on MindSpore with Zero Code Changes
The easiest way to use 200,000+ HuggingFace models on Ascend NPU, GPU, and CPU
Quick Start • Features • Installation • Why MindNLP • Documentation
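The links above mention Installation, but that section is not included here. A typical setup (assuming the package is published on PyPI as `mindnlp`, with MindSpore installed separately and matched to your hardware) looks like:

```shell
# Install MindNLP from PyPI
pip install mindnlp

# MindSpore itself is installed separately, matched to your hardware
# (Ascend NPU / NVIDIA GPU / CPU); see the MindSpore install guide
# for the device-specific package. The CPU build is:
pip install mindspore
```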
---

## 🎯 What is MindNLP?

**MindNLP** bridges the gap between HuggingFace's massive model ecosystem and MindSpore's hardware acceleration. With just `import mindnlp`, you can run any HuggingFace model on **Ascend NPU**, **NVIDIA GPU**, or **CPU** - no code changes required.

```python
import mindnlp  # That's it! HuggingFace now runs on MindSpore
from transformers import pipeline

pipe = pipeline("text-generation", model="Qwen/Qwen2-0.5B")
print(pipe("Hello, I am")[0]["generated_text"])
```

## ⚡ Quick Start

### Text Generation with LLMs

```python
import mindspore
import mindnlp
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="Qwen/Qwen3-8B",
    ms_dtype=mindspore.bfloat16,
    device_map="auto"
)

messages = [{"role": "user", "content": "Write a haiku about coding"}]
print(pipe(messages, max_new_tokens=100)[0]["generated_text"][-1]["content"])
```

### Image Generation with Stable Diffusion

```python
import mindspore
import mindnlp
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "stable-diffusion-v1-5/stable-diffusion-v1-5",
    ms_dtype=mindspore.float16
)

image = pipe("A sunset over mountains, oil painting style").images[0]
image.save("sunset.png")
```

### BERT for Text Classification

```python
import mindnlp
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

inputs = tokenizer("MindNLP is awesome!", return_tensors="pt")
outputs = model(**inputs)
```

## ✨ Features

### 🤗 Full HuggingFace Compatibility
- **200,000+ models** from HuggingFace Hub
- **Transformers** - All model architectures
- **Diffusers** - Stable Diffusion, SDXL, ControlNet
- **Zero code changes** - Just `import mindnlp`

### 🚀 Hardware Acceleration
- **Ascend NPU** - Full support for Huawei AI chips
- **NVIDIA GPU** - CUDA acceleration
- **CPU** - Optimized CPU execution
- **Multi-device** - Automatic device placement

### 🔧 Advanced Capabilities
- **Mixed precision** - FP16/BF16 training & inference
- **Quantization** - INT8/INT4 with BitsAndBytes
- **Distributed** - Multi-GPU/NPU training
- **PEFT/LoRA** - Parameter-efficient fine-tuning

### 📦 Easy Integration
- **PyTorch-compatible API** via mindtorch
- **Safetensors** support for fast loading
- **Model Hub mirrors** for faster downloads
- **Comprehensive documentation**
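The zero-code-change behavior described above comes from `import mindnlp` providing a PyTorch-compatible layer (mindtorch) at import time, so existing HuggingFace code dispatches to MindSpore. A toy, stdlib-only sketch of that import-time patching idea - not mindnlp's actual code, and all names here are made up:

```python
# Toy illustration of import-time backend patching: importing (or
# patching) one module changes what another module's calls dispatch to.
# mindnlp's real mechanism (mindtorch) is far more involved.
import sys
import types

# A stand-in "compute backend" module, playing the role of a
# framework's kernel layer (like torch under transformers).
backend = types.ModuleType("toy_backend")
backend.matmul = lambda a, b: "ran on CPU"
sys.modules["toy_backend"] = backend

# A stand-in "framework" that calls into the backend.
import toy_backend

def framework_forward():
    return toy_backend.matmul(None, None)

print(framework_forward())   # before patching: default backend

# The "import mindnlp" analogue: swap the backend function in place,
# so unchanged framework code transparently runs on the new device.
toy_backend.matmul = lambda a, b: "ran on NPU"

print(framework_forward())   # same call, new backend, zero code changes
```

The key point is that `framework_forward` never changes; only the backend it resolves at call time does, which is why user code needs no edits.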
Made with ❤️ by the MindSpore Lab team