Distillation from Heterogeneous Models for Top-K Recommendation

This repository provides the source code of "Distillation from Heterogeneous Models for Top-K Recommendation", accepted at TheWebConf (WWW 2023) as a research paper.

1. Overview

We present HetComp, a framework that effectively compresses the valuable but difficult-to-learn ensemble knowledge of heterogeneous models, generating a lightweight model with high recommendation performance.
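As a rough illustration of this idea, the sketch below distills an ensembled top-K ranking from heterogeneous teachers into a small student via a listwise (Plackett-Luce style) ranking-matching loss. This is a minimal sketch under assumed names (`MFStudent`, `listwise_kd_loss`, and the pre-computed ensembled list), not the repository's actual training code.

```python
import torch
import torch.nn as nn

class MFStudent(nn.Module):
    """A lightweight matrix-factorization student (hypothetical stand-in)."""
    def __init__(self, num_users, num_items, dim=64):
        super().__init__()
        self.user_emb = nn.Embedding(num_users, dim)
        self.item_emb = nn.Embedding(num_items, dim)

    def score(self, users, items):
        # users: (B,), items: (B, K) -> per-item relevance scores (B, K)
        u = self.user_emb(users).unsqueeze(1)   # (B, 1, d)
        v = self.item_emb(items)                # (B, K, d)
        return (u * v).sum(-1)

def listwise_kd_loss(scores):
    """Listwise ranking-matching loss. Columns of `scores` follow the
    teacher-side ranking, best first; minimizing this maximizes the
    Plackett-Luce likelihood of the student reproducing that order."""
    # denom[:, k] = log sum_{j >= k} exp(scores[:, j])
    rev = torch.flip(scores, dims=[1])
    denom = torch.flip(torch.logcumsumexp(rev, dim=1), dims=[1])
    return (denom - scores).sum(dim=1).mean()

# Hypothetical usage: `ensembled_topk` (B, K) holds item ids obtained by
# merging the heterogeneous teachers' per-user rankings (the hard part
# that HetComp addresses); the student learns to match that ranking.
student = MFStudent(num_users=1000, num_items=5000)
users = torch.randint(0, 1000, (32,))
ensembled_topk = torch.randint(0, 5000, (32, 40))
loss = listwise_kd_loss(student.score(users, ensembled_topk))
loss.backward()
```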

2. Main Results

Training curves for w/o KD, DCD, and HetComp. Testing recall is measured every 10 epochs; after convergence, we plot the last value.

2-a. Benchmark setup

2-b. Generalization setup

We found that the sampling process for top-ranked unobserved items is unnecessary, and that removing it gives considerable performance improvements for the ranking matching KD methods (i.e., RRD, MTD, CL-DRD, and DCD). For this reason, we remove the sampling process for all ranking matching methods in our experiments, as sketched below.
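To make the change concrete, here is a hedged sketch (hypothetical names, reusing `student.score` and `listwise_kd_loss` from the overview sketch above): the sampled variant draws a subset of positions from the teacher's top-K list before matching, while passing `sample_size=None` matches the full list, which is the setting used in our experiments.

```python
import torch

def topk_matching_scores(student, users, teacher_topk, sample_size=None):
    """Score the teacher's top-K list (columns ordered best-first).
    An integer `sample_size` mimics the sampled variant of the ranking
    matching methods; `None` keeps the full list (no sampling)."""
    if sample_size is not None:
        K = teacher_topk.size(1)
        keep = torch.sort(torch.randperm(K)[:sample_size]).values  # preserve order
        teacher_topk = teacher_topk[:, keep]
    return student.score(users, teacher_topk)

# Full-list matching (our setting):
# scores = topk_matching_scores(student, users, teacher_topk)
# loss = listwise_kd_loss(scores)
```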

3. Requirements

3-a. Dataset

3-b. Software

  • Python version: 3.6.10
  • PyTorch version: 1.10.1

3-c. Others
