R^2: Range Regularization for Model Compression and Quantization
Authors: Arnav Kundu*, Chungkuk Yoo*, Srijan Mishra*, Minsik Cho, Saurabh Adya
*Equal Contributors
Model parameter regularization is a widely used technique to improve generalization, but it can also be used to shape weight distributions for various purposes. In this work, we shed light on how weight regularization can assist model quantization and compression techniques, and then propose range regularization (R^2) to further boost the quality of model optimization by focusing on outlier prevention. By effectively regulating the minimum and maximum weight values of a distribution, we mold the overall distribution into a tight shape so that model compression and quantization techniques can better utilize their limited numeric representation power. We introduce L∞ regularization, its extension margin regularization, and a new soft-min-max regularization to be used as a regularization loss during full-precision model training. Coupled with state-of-the-art quantization and compression techniques, models trained with R^2 perform better on average, particularly at lower bit weights with a 16x compression ratio. We also demonstrate that R^2 helps parameter-constrained models like MobileNetV1 achieve significant improvements of around 8% for 2-bit quantization and 7% for 1-bit compression.
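For intuition, the sketch below shows one plausible way to add an L∞-style range penalty to a training loss in PyTorch. It only illustrates the general idea of discouraging outlier weight values during full-precision training and is not the paper's exact R^2 formulation; the names `range_penalty` and `lambda_r2` are hypothetical and introduced for this example.

```python
# Minimal sketch (assumed PyTorch API): an L-inf style range penalty on weights.
# Not the paper's exact R^2 formulation; `range_penalty` and `lambda_r2` are
# hypothetical names used only for illustration.
import torch
import torch.nn as nn


def range_penalty(model: nn.Module) -> torch.Tensor:
    """Sum of per-layer maximum absolute weight values (an L-inf style term)."""
    device = next(model.parameters()).device
    penalty = torch.zeros((), device=device)
    for module in model.modules():
        if isinstance(module, (nn.Conv2d, nn.Linear)):
            # Penalize the most extreme weight in each layer to tighten its range.
            penalty = penalty + module.weight.abs().max()
    return penalty


# Usage inside an ordinary training step:
#   loss = task_loss + lambda_r2 * range_penalty(model)
#   loss.backward()
```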