How to use from the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="grandfso/bge-reranker-v2-m3-openvino")
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("grandfso/bge-reranker-v2-m3-openvino")
model = AutoModelForSequenceClassification.from_pretrained("grandfso/bge-reranker-v2-m3-openvino")

This model was converted to OpenVINO from BAAI/bge-reranker-v2-m3 using optimum-intel via the export space.

First make sure you have optimum-intel installed:

pip install optimum[openvino]

To load the model:

from optimum.intel import OVModelForSequenceClassification

model_id = "grandfso/bge-reranker-v2-m3-openvino"
model = OVModelForSequenceClassification.from_pretrained(model_id)
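Once loaded, the reranker scores (query, passage) pairs: each pair is tokenized together, the model emits a single relevance logit, and a sigmoid maps that logit to a 0-1 score. Below is a minimal sketch of the ranking step; the `rank_passages` helper, the sample texts, and the stand-in logits are all illustrative (not part of the library), so the sketch runs without downloading the model:

```python
import math

def rank_passages(query, passages, score_fn):
    """Score each (query, passage) pair with score_fn and return the
    passages sorted by descending relevance, with scores squashed
    into [0, 1] via a sigmoid."""
    scored = []
    for passage in passages:
        logit = score_fn(query, passage)        # one relevance logit per pair
        score = 1.0 / (1.0 + math.exp(-logit))  # sigmoid -> [0, 1]
        scored.append((passage, score))
    return sorted(scored, key=lambda item: item[1], reverse=True)

# Stand-in logits instead of a real model call (made-up values):
fake_logits = {
    "The giant panda is a bear species endemic to China.": 5.2,
    "Paris is the capital of France.": -3.1,
}
ranked = rank_passages("what is a panda?", list(fake_logits),
                       lambda q, p: fake_logits[p])
```

With the real model, `score_fn` would tokenize the pair (e.g. `tokenizer(query, passage, return_tensors="pt")`), run `model(**inputs)`, and return the single logit from `outputs.logits`.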