
RADio

This repository contains the supporting material for the paper 📻 RADio – Rank-Aware Divergence metrIcs to measure nOrmative diversity in news recommendations.

1 Computing RADio with rank-aware JS-divergence

```bash
git clone https://anonymous.4open.science/r/RADio/
pip install -r requirements.txt
python metrics_calculations.py
```
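As an illustration of the kind of computation `metrics_calculations.py` performs, below is a minimal, simplified sketch (not the repository's actual implementation) of a rank-aware Jensen-Shannon divergence between a ranked recommendation list and its context. The `1/log2(rank+1)` discount and the toy topic data are assumptions chosen for illustration; see the paper for the exact weighting scheme.

```python
import numpy as np

def rank_weights(n, ranked=True):
    """Rank discount: items higher in the list weigh more.
    (Hypothetical choice of 1/log2(rank+1); the paper defines the exact scheme.)"""
    if not ranked:
        return np.ones(n) / n
    w = 1.0 / np.log2(np.arange(n) + 2)  # ranks 1..n -> 1/log2(2), 1/log2(3), ...
    return w / w.sum()

def categorical_distribution(categories, weights, support):
    """Weighted distribution of a categorical attribute over a fixed support."""
    dist = np.zeros(len(support))
    index = {c: i for i, c in enumerate(support)}
    for c, w in zip(categories, weights):
        dist[index[c]] += w
    return dist

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence (base 2, bounded in [0, 1])."""
    p, q = p + eps, q + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log2(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Example: topic-Calibration-style comparison between a ranked recommendation
# and an (unranked) context such as the user's reading history.
support = ["politics", "sports", "finance"]
recommendation = ["politics", "politics", "sports", "finance", "sports"]
history = ["sports", "sports", "finance", "sports", "politics", "sports"]

p = categorical_distribution(recommendation, rank_weights(len(recommendation)), support)
q = categorical_distribution(history, rank_weights(len(history), ranked=False), support)
print(f"rank-aware JS divergence: {js_divergence(p, q):.4f}")
```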

2 Additional Material

2.1 Jensen-Shannon as an f-Divergence
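The full derivation is given in the paper; as a minimal sketch of the result, the Jensen-Shannon divergence belongs to the f-divergence family. One common way to state this is via the generator below (notation assumed here, not taken verbatim from the paper):

```latex
D_f(P \,\|\, Q) = \sum_x q(x)\, f\!\left(\frac{p(x)}{q(x)}\right),
\qquad
f(t) = \frac{t}{2}\log\frac{2t}{1+t} + \frac{1}{2}\log\frac{2}{1+t},
```

which recovers

```latex
D_f(P \,\|\, Q)
  = \frac{1}{2}\,\mathrm{KL}\!\left(P \,\Big\|\, \frac{P+Q}{2}\right)
  + \frac{1}{2}\,\mathrm{KL}\!\left(Q \,\Big\|\, \frac{P+Q}{2}\right)
  = \mathrm{JSD}(P, Q).
```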

2.2 RADio with KL divergence

RADio framework with DART metrics based on KL divergence, applied to recommendation algorithms on the MIND dataset. From left to right: Calibration (topic), Calibration (complexity), Fragmentation, Activation (the affect column), Representation, and Alternative Voices. The metrics are computed on a random sample of 35,000 users, with rank-awareness and without a cutoff.

| rec_type | calibration_topic | calibration_complexity | fragmentation | affect | representation | alternative_voices |
|----------|-------------------|------------------------|---------------|--------|----------------|--------------------|
| lstur    | 2.6038            | 1.1432                 | 7.7201        | 0.1481 | 0.1078         | 0.0142             |
| naml     | 2.5333            | 1.1287                 | 7.3926        | 0.1531 | 0.1047         | 0.0127             |
| npa      | 2.5945            | 1.1390                 | 7.6202        | 0.1521 | 0.1237         | 0.0134             |
| nrms     | 2.5013            | 1.1204                 | 7.4519        | 0.1442 | 0.1114         | 0.0113             |
| pop      | 2.9384            | 1.1082                 | 7.6377        | 0.1605 | 0.1028         | 0.0102             |
| random   | 3.6038            | 1.5985                 | 8.6295        | 0.8079 | 1.1248         | 0.0420             |
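For readers who want to reproduce a table like the one above from per-user scores, the sketch below shows one way to aggregate them with pandas. The CSV name and column layout (one row per user/recommender pair, one column per metric) are hypothetical and do not describe the repository's actual output format.

```python
import pandas as pd

# Hypothetical per-user output: one row per (user, rec_type) pair,
# one column per RADio metric (file and column names are illustrative).
scores = pd.read_csv("radio_per_user_scores.csv")

metrics = [
    "calibration_topic", "calibration_complexity", "fragmentation",
    "affect", "representation", "alternative_voices",
]

# Mean divergence per recommender strategy, as in the table above.
table = scores.groupby("rec_type")[metrics].mean().round(4)
print(table)
```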

2.3 Jensen-Shannon and Kullback-Leibler divergence with and without rank-awareness

Figure: Jensen-Shannon divergence for each DART metric, with and without rank-awareness, with a cutoff @10. Boxplot with median and the interquartile range in the inner box.

Figure: Kullback-Leibler divergence for each DART metric, with and without rank-awareness, with a cutoff @10. Boxplot with median and the interquartile range in the inner box.

2.4 Jensen-Shannon divergence for all recommender strategies without cutoff

Figure: Jensen-Shannon divergence for each DART metric for all neural recommender strategies, with and without rank-awareness, and without a cutoff. Without rank-awareness and cutoff no divergence is found for the Activation, Representation and Alternative Voices metrics, as in these cases the recommendation and the context are identical. Boxplot with median and the interquartile range in the inner box.
