PyTorch-based modular, configuration-driven framework for knowledge distillation. 🏆18 methods including SOTA are implemented so far. 🎁 Trained models, training logs and configurations are available…
[ICPR 2020] "Neural Compression and Filtering for Edge-assisted Real-time Object Detection in Challenged Networks" and [MobiCom EMDL 2020] "Split Computing for Complex Object Detectors: Challenges …
[JCDL WOSP 2020] "Citations Beyond Self Citations: Identifying Authors, Affiliations, and Nationalities in Scientific Papers"
Code for running all the background services for COVID-19 efforts.
[IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural Netw…
Code for the StatNLP course homework.