Instructions for using joaogante/test_text with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers
How to use joaogante/test_text with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="joaogante/test_text")
```

```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("joaogante/test_text")
model = AutoModelForMaskedLM.from_pretrained("joaogante/test_text")
```

- Notebooks
  - Google Colab
  - Kaggle
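A minimal usage sketch of the pipeline path above, assuming the checkpoint behaves like a standard masked-language model (the example sentence is illustrative; the mask token is read from the tokenizer rather than hard-coded, since it varies between model families):

```python
from transformers import pipeline

# High-level helper: downloads the checkpoint and wires up tokenizer + model.
pipe = pipeline("fill-mask", model="joaogante/test_text")

# Query the mask token instead of assuming "[MASK]" or "<mask>".
mask = pipe.tokenizer.mask_token
results = pipe(f"Paris is the {mask} of France.")

# Each result is a dict with the predicted token and its score.
for r in results:
    print(r["token_str"], r["score"])
```

The pipeline returns its candidates sorted by score, so the first entry is the model's top prediction for the masked position.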
Community pull requests:
- #12: Adding `safetensors` variant of this model (opened about 1 year ago by SFconvertbot)
- #11: Add TF weights (opened over 3 years ago by joaogante)
- #10: Update TF weights (opened almost 4 years ago by joaogante)
- #9: Update TF weights (opened almost 4 years ago by joaogante)
- #7: Update TF weights (opened almost 4 years ago by joaogante)
- #6: Update TF weights (opened almost 4 years ago by joaogante)
- #5: Add TF weights (opened almost 4 years ago by joaogante)
- #4: Add TF weights (opened almost 4 years ago by joaogante)
- #3: Add TF weights (opened almost 4 years ago by joaogante)
- #2: Add TF weights (opened almost 4 years ago by joaogante)
- #1: Add TF weights (opened almost 4 years ago by joaogante)