Hugging Face Transformers: Open Source AI Models

📅 December 05, 2025 ⏱️ 1 min read 🏷️ Generative AI

Hugging Face provides access to thousands of open-source AI models for text, vision, audio, and more.
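To run the examples below you need the `transformers` library plus a backend framework such as PyTorch. A typical setup with pip (package names assume the standard PyPI distributions):

```shell
pip install transformers torch
```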

Basic Model Usage


from transformers import pipeline

# Text generation
generator = pipeline("text-generation", model="gpt2")
result = generator("The future of AI is", max_new_tokens=50)
print(result[0]['generated_text'])

# Sentiment analysis
classifier = pipeline("sentiment-analysis")
result = classifier("I love this product!")
print(result)  # [{'label': 'POSITIVE', 'score': 0.99}]

# Translation
translator = pipeline("translation_en_to_fr")
result = translator("Hello, how are you?")
print(result[0]['translation_text'])

Using Specific Models


from transformers import AutoTokenizer, AutoModelForCausalLM

# Load model and tokenizer (note: this is a 7B-parameter model,
# a multi-gigabyte download that needs substantial RAM or GPU memory)
model_name = "mistralai/Mistral-7B-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Generate text
inputs = tokenizer("The key to success is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)

Question Answering


from transformers import pipeline

qa_pipeline = pipeline("question-answering")

context = """
Python is a high-level programming language known for its simplicity.
It was created by Guido van Rossum and first released in 1991.
"""

question = "Who created Python?"
result = qa_pipeline(question=question, context=context)
print(f"Answer: {result['answer']}")
print(f"Score: {result['score']:.2f}")

Image Classification


from transformers import pipeline
from PIL import Image

# Load classifier
classifier = pipeline("image-classification")

# Classify image
image = Image.open("cat.jpg")
results = classifier(image)

for result in results:
    print(f"{result['label']}: {result['score']:.2f}")

Saving and Reusing Models


from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# Save the model and tokenizer from the previous section
model.save_pretrained("./my_model")
tokenizer.save_pretrained("./my_model")

# Load later
loaded_model = AutoModelForCausalLM.from_pretrained("./my_model")
loaded_tokenizer = AutoTokenizer.from_pretrained("./my_model")

# Use with pipeline
custom_pipeline = pipeline(
    "text-generation",
    model=loaded_model,
    tokenizer=loaded_tokenizer
)

Hugging Face democratizes access to cutting-edge AI models. Explore and build!

🏷️ Tags:
Hugging Face, Transformers, open source AI, pre-trained models, NLP