MonoConfig
- class lightning_ir.models.cross_encoders.mono.MonoConfig(query_length: int | None = 32, doc_length: int | None = 512, pooling_strategy: 'first' | 'mean' | 'max' | 'sum' | 'bert_pool' = 'first', linear_bias: bool = False, scoring_strategy: 'mono' | 'rank' = 'rank', tokenizer_pattern: str | None = None, **kwargs)[source]
Bases: CrossEncoderConfig

Configuration class for mono cross-encoder models.
- __init__(query_length: int | None = 32, doc_length: int | None = 512, pooling_strategy: 'first' | 'mean' | 'max' | 'sum' | 'bert_pool' = 'first', linear_bias: bool = False, scoring_strategy: 'mono' | 'rank' = 'rank', tokenizer_pattern: str | None = None, **kwargs)[source]
Initialize the configuration for mono cross-encoder models.
- Parameters:
query_length (int | None) – Maximum number of tokens per query. If None, queries are not truncated. Defaults to 32.
doc_length (int | None) – Maximum number of tokens per document. If None, documents are not truncated. Defaults to 512.
pooling_strategy (Literal["first", "mean", "max", "sum", "bert_pool"]) – Pooling strategy for the embeddings. Defaults to “first”.
linear_bias (bool) – Whether to use bias in the final linear layer. Defaults to False.
scoring_strategy (Literal["mono", "rank"]) – Scoring strategy to use. Defaults to “rank”.
tokenizer_pattern (str | None) – Optional pattern for tokenization. Defaults to None.
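To make the parameter list above concrete, the following is a minimal, self-contained sketch of such a configuration object. Note this is a hypothetical stand-in that only mirrors the documented parameters and defaults, not the actual lightning_ir class (which inherits from CrossEncoderConfig and accepts additional **kwargs):

```python
from dataclasses import dataclass
from typing import Literal, Optional


@dataclass
class MonoConfigSketch:
    """Hypothetical stand-in mirroring MonoConfig's documented parameters."""

    query_length: Optional[int] = 32   # max query tokens; None disables truncation
    doc_length: Optional[int] = 512    # max document tokens; None disables truncation
    pooling_strategy: Literal["first", "mean", "max", "sum", "bert_pool"] = "first"
    linear_bias: bool = False          # bias in the final linear layer
    scoring_strategy: Literal["mono", "rank"] = "rank"
    tokenizer_pattern: Optional[str] = None  # optional tokenization pattern


# Override selected defaults, as one would when configuring a model.
config = MonoConfigSketch(doc_length=256, scoring_strategy="mono")
print(config.doc_length, config.scoring_strategy)
```

With the real class, the same keyword arguments would be passed to `MonoConfig(...)` directly.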
Methods
__init__([query_length, doc_length, ...]) – Initialize the configuration for mono cross-encoder models.
Attributes
model_type – Model type for mono cross-encoder models.