KLDivergence
- class lightning_ir.loss.listwise.KLDivergence[source]
Bases: ListwiseLossFunction
Kullback-Leibler Divergence loss for listwise ranking tasks.
KL Divergence loss for listwise ranking treats both the ground-truth relevance labels and the predicted scores as probability distributions over the entire list of items. The loss is computed by minimizing the divergence between these two distributions, which aligns the global ranking structure rather than only local, pairwise comparisons.
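The computation can be sketched as follows. This is a minimal illustration, not the library's implementation: it assumes both targets and scores are converted to distributions via softmax over the list dimension, and that the divergence is averaged over queries (the actual normalization and masking in lightning_ir may differ).

```python
import torch
import torch.nn.functional as F


def kl_divergence_loss(scores: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    """Sketch of a listwise KL Divergence loss.

    scores:  (num_queries, list_length) predicted scores
    targets: (num_queries, list_length) relevance labels
    """
    # Treat labels and scores as log-probability distributions over the list.
    target_log_dist = F.log_softmax(targets, dim=-1)
    score_log_dist = F.log_softmax(scores, dim=-1)
    # KL(target || score), averaged over the queries in the batch.
    return F.kl_div(
        score_log_dist, target_log_dist, log_target=True, reduction="batchmean"
    )
```

When the predicted distribution matches the target distribution exactly, the loss is zero; any mismatch yields a strictly positive value, pushing the model to reproduce the full target ranking distribution.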
Originally proposed in: On Information and Sufficiency
Methods
compute_loss(output, batch) — Compute the Kullback-Leibler Divergence loss.
- compute_loss(output: LightningIROutput, batch: TrainBatch) → torch.Tensor[source]
Compute the Kullback-Leibler Divergence loss.
- Parameters:
output (LightningIROutput) – The output from the model containing scores.
batch (TrainBatch) – The training batch containing targets.
- Returns:
The computed loss.
- Return type:
torch.Tensor