Model Compression & Optimization

Model compression has emerged as an important area of research for deploying deep learning models on IoT devices. However, compression alone is often not sufficient to fit a model within the memory of a single device, so the model must be distributed across multiple devices. This leads to a distributed inference paradigm in which communication costs become another major bottleneck. To this end, we focus on knowledge distillation and teacher-student architectures for distributed model compression, as well as data-independent model compression.
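As a concrete illustration of the teacher-student idea, the sketch below shows the standard knowledge-distillation loss (Hinton et al., 2015): a compact student network is trained to match the teacher's temperature-softened output distribution in addition to the ground-truth labels. This is a minimal PyTorch sketch; the hyperparameters (`temperature`, `alpha`) and the particular loss blend are illustrative assumptions, not the exact formulation used in the publications below.

```python
# Minimal knowledge-distillation loss sketch (illustrative, not the
# exact setup from the cited papers).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.9):
    """Blend the soft-target KL loss with the standard hard-label loss."""
    # Soften both output distributions with the temperature, then
    # penalize the student for diverging from the teacher.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)  # rescale so gradients match the hard loss
    # Ordinary cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

In practice, `teacher_logits` come from a frozen, pre-trained teacher evaluated on the same batch, and `alpha` controls how much the student relies on the teacher's soft targets versus the hard labels.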


Selected Publications


Bhardwaj, Kartikeya; Lin, Ching-Yi; Sartor, Anderson; Marculescu, Radu

Memory- and Communication-Aware Model Compression for Distributed Deep Learning Inference on IoT Journal Article

In: ACM Transactions on Embedded Computing Systems (TECS), vol. 18, no. 5s, pp. 1–22, 2019.


Sartor, Anderson Luiz; Becker, Pedro Henrique Exenberger; Wong, Stephan; Marculescu, Radu; Beck, Antonio Carlos Schneider

Machine Learning-Based Processor Adaptability Targeting Energy, Performance, and Reliability Proceedings Article

In: 2019 IEEE Computer Society Annual Symposium on VLSI (ISVLSI), pp. 158–163, IEEE, 2019.

