Transformer Practical Examples
Date: 2025-04-17 10:24:06
### Transformer Hands-On Project Examples
#### Machine Translation in Natural Language Processing
In natural language processing, Transformers are widely used to improve machine-translation quality. The self-attention mechanism lets the model capture dependencies between the source and target languages more effectively[^1].
```python
from transformers import MarianMTModel, MarianTokenizer

# Pretrained English-to-Chinese Marian translation model
model_name = "Helsinki-NLP/opus-mt-en-zh"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

def translate(text):
    # Tokenize the input and return PyTorch tensors
    inputs = tokenizer(text, return_tensors="pt")
    # Generate the translation with the model's default decoding settings
    outputs = model.generate(**inputs)
    # Decode token ids back to a string, dropping special tokens
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```
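The self-attention mechanism these translation models rely on can be sketched as scaled dot-product attention. The helper name and tensor shapes below are illustrative, not part of the library API:

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    weights = F.softmax(scores, dim=-1)  # each query's weights sum to 1
    return weights @ v, weights

# Toy tensors: (batch, seq_len, d_k)
q = torch.randn(1, 4, 8)
k = torch.randn(1, 4, 8)
v = torch.randn(1, 4, 8)
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([1, 4, 8])
```

Real Transformer layers run many such attention heads in parallel and add learned projections, but the core computation is exactly this weighted average over the value vectors.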
#### Image Classification with Swin Transformer
Beyond traditional NLP, Transformers have also been applied successfully in computer vision. In image classification, for example, Swin Transformer delivers excellent results. Below is an inference example that loads pretrained weights[^4]:
```python
import torch
from PIL import Image
from torchvision.transforms import Compose, Resize, ToTensor

# Preprocessing: resize to the model's 224x224 input and convert to a tensor.
# (A production pipeline would also normalize with the training mean/std.)
transform = Compose([
    Resize((224, 224)),
    ToTensor()
])

image_path = 'path_to_image.jpg'
img = Image.open(image_path).convert('RGB')
input_tensor = transform(img).unsqueeze(0)  # add a batch dimension

# The network must be constructed before the checkpoint can be loaded, and its
# architecture must match the saved weights; here we assume the swin_tiny
# variant, created via the timm library.
import timm
model = timm.create_model('swin_tiny_patch4_window7_224', pretrained=False)

checkpoint = torch.load('./output/swin_tiny_patch4_window7_224/default/ckpt_epoch_1.pth', map_location='cpu')['model']
model.load_state_dict(checkpoint)
model.eval()

with torch.no_grad():
    output = model(input_tensor)
predicted_class_idx = int(torch.argmax(output))
print(f"Predicted class index: {predicted_class_idx}")
```
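The classifier's raw output is a vector of logits; applying a softmax turns it into class probabilities, from which the top predictions can be read off. A minimal sketch, using made-up logit values for illustration:

```python
import torch

# Hypothetical logits from a classifier head (one image, four classes)
logits = torch.tensor([[1.2, 0.3, 2.5, -0.7]])
probs = torch.softmax(logits, dim=-1)      # probabilities summing to 1
top_prob, top_idx = probs.topk(2, dim=-1)  # two most likely classes
print(top_idx.tolist()[0])  # → [2, 0]
```

In practice the index is then mapped through the dataset's label list (e.g. the ImageNet class names) to get a human-readable prediction.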
#### Text Summarization
Another interesting case is building an automated text-summarization tool for news articles or other documents, which improves efficiency and reduces manual effort. The following snippet creates a simple summarization system[^2]:
```python
from transformers import pipeline
summarizer = pipeline("summarization")  # loads the library's default summarization model
article = """
Your long article text goes here...
"""
# do_sample=False gives deterministic (beam/greedy) output
summary = summarizer(article, max_length=150, min_length=30, do_sample=False)[0]['summary_text']
print(summary)
```
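Summarization models have a fixed maximum input length (on the order of 1024 tokens for the BART-family defaults), so long documents need to be split before summarizing. A minimal word-based chunker, with illustrative window and overlap sizes, might look like:

```python
def chunk_text(text, max_words=400, overlap=50):
    """Split text into overlapping word windows so each fits the model's input limit."""
    words = text.split()
    chunks = []
    start = 0
    while start < len(words):
        chunks.append(" ".join(words[start:start + max_words]))
        if start + max_words >= len(words):
            break
        start += max_words - overlap  # step forward, keeping some context overlap
    return chunks
```

Each chunk can then be passed through `summarizer` separately and the partial summaries concatenated, or summarized once more for a final abstract.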