Poster
in
Workshop: Workshop on Machine Learning and Compression

Weight-Sharing Method for Upsampling Layer from Feature Embedding Recursive Block

Jinwoo Hyun · YunKyong Hyon · Mira Lee · Sunju Lee · Taeyoung Ha · Young Rock KIM


Abstract:

In the field of super-resolution, a Laplacian pyramid framework-based model must estimate the result of an inverse convolution. In general, a transposed convolution is applied to approximate this inverse. In this process, the transposed convolution can be designed efficiently so as to reduce the number of trainable weights. In this study, we propose a new model compression method that removes the transposed convolution layer and instead shares the weights of the convolution layer trained in the feature embedding stage, and we explain the resulting performance.
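A minimal sketch of the idea, under assumptions not stated in the abstract: a PyTorch implementation, a 64-channel feature-embedding convolution with a 3×3 kernel, and ×2 upsampling. The upsampling step reuses the embedding layer's weight tensor inside `F.conv_transpose2d` rather than training a separate `nn.ConvTranspose2d`, which is one plausible reading of the proposed weight sharing; the paper's actual block structure and weight layout may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SharedUpsampleBlock(nn.Module):
    """Feature embedding conv whose weights are reused for upsampling.

    Hypothetical sketch: channel count, kernel size, and stride are
    illustrative assumptions, not values from the paper.
    """

    def __init__(self, channels: int = 64, kernel_size: int = 3):
        super().__init__()
        # Convolution trained in the feature embedding stage.
        self.embed = nn.Conv2d(channels, channels, kernel_size, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Feature embedding (stride 1, same spatial size).
        feat = F.relu(self.embed(x))

        # Upsampling: reuse the embedding weights as the kernel of a
        # transposed convolution instead of learning a separate layer.
        # With kernel 3, stride 2, padding 1, output_padding 1 the
        # spatial size is doubled exactly.
        up = F.conv_transpose2d(
            feat,
            self.embed.weight,      # shared weights, no extra parameters
            bias=None,
            stride=2,
            padding=1,
            output_padding=1,
        )
        return up


if __name__ == "__main__":
    block = SharedUpsampleBlock()
    x = torch.randn(1, 64, 32, 32)
    print(block(x).shape)  # torch.Size([1, 64, 64, 64])
```

Compared with a dedicated `nn.ConvTranspose2d(64, 64, 3, stride=2, ...)`, this sketch adds no new trainable weights for the upsampling step, which is the compression effect the abstract describes.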