Instructions for using Intel/tvp-base-ANet with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use Intel/tvp-base-ANet with Transformers:
```python
# Load model directly
from transformers import AutoProcessor, TvpForVideoGrounding

processor = AutoProcessor.from_pretrained("Intel/tvp-base-ANet")
model = TvpForVideoGrounding.from_pretrained("Intel/tvp-base-ANet")
```

- Notebooks
- Google Colab
- Kaggle
The checkpoint's processor configuration:

```json
{
  "do_center_crop": false,
  "do_normalize": true,
  "do_resize": true,
  "do_rescale": false,
  "do_padding": true,
  "image_mean": [8.2381, 7.3115, 6.6981],
  "image_std": [9.6335, 9.0659, 8.7213],
  "processor_class": "TvpProcessor",
  "padding_size": {"height": 448, "width": 448},
  "tokenizer": "bert-base-uncased"
}
```
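Per this configuration, each video frame is resized, padded to 448×448, and normalized per channel with the listed mean and std (center-cropping and rescaling are disabled). A minimal NumPy sketch of that padding and normalization, using the values from the config above; the helper functions are illustrative, not the actual `TvpImageProcessor` implementation:

```python
import numpy as np

# Values taken from the processor configuration above.
IMAGE_MEAN = np.array([8.2381, 7.3115, 6.6981])
IMAGE_STD = np.array([9.6335, 9.0659, 8.7213])

def pad_to_size(frame: np.ndarray, height: int = 448, width: int = 448) -> np.ndarray:
    """Zero-pad an HxWx3 frame on the bottom/right up to padding_size (illustrative)."""
    h, w, c = frame.shape
    out = np.zeros((height, width, c), dtype=frame.dtype)
    out[:h, :w] = frame
    return out

def normalize_frame(frame: np.ndarray) -> np.ndarray:
    """Per-channel normalization: (x - mean) / std, broadcast over H and W."""
    return (frame - IMAGE_MEAN) / IMAGE_STD

# Dummy 224x224 frame whose red channel equals the configured mean,
# so that channel normalizes to exactly zero in the unpadded region.
frame = np.full((224, 224, 3), 8.2381)
padded = pad_to_size(frame)
normalized = normalize_frame(padded)
print(padded.shape)  # (448, 448, 3)
```

This mirrors what `processor(videos=..., return_tensors="pt")` does internally before the frames reach the model, so the shapes and statistics of `pixel_values` match what the checkpoint was trained on.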