Global Prompt Cell: A Portable Control Module for Effective Prompt Tuning

Figure: Global Prompt Cell model architecture (best viewed in color)

Abstract

As a novel approach to tuning pre-trained models, prompt tuning freezes the model's parameters on downstream tasks and instead inserts trainable prompt embeddings into the input of the first layer. However, previous methods have mainly focused on the initialization of these prompt embeddings, and the lack of a sound strategy for training and utilizing them across layers has become a limiting factor in the effectiveness of prompt tuning. To address this issue, we introduce the Global Prompt Cell (GPC), a portable control module for prompt tuning that selectively preserves prompt information across all encoder layers. Our experimental results demonstrate a 5.8% improvement on SuperGLUE datasets compared to vanilla prompt tuning.
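To make the idea concrete, below is a minimal PyTorch sketch of how a GPC-style module might sit on top of vanilla prompt tuning: trainable prompt embeddings are prepended to the input of a frozen encoder, and a small gate re-injects selectively preserved prompt states after every layer. The gating formula, the `GlobalPromptCell` and `PromptTunedEncoder` names, and the default prompt length are illustrative assumptions for exposition, not the paper's exact implementation.

```python
import torch
import torch.nn as nn


class GlobalPromptCell(nn.Module):
    """Hypothetical GPC-style gate (assumption, not the paper's exact design):
    at each encoder layer, mix the globally carried prompt states with the
    layer's transformed prompt states, so prompt information can be
    selectively preserved across the whole stack."""

    def __init__(self, hidden_size: int):
        super().__init__()
        # Per-dimension gate deciding how much carried prompt to keep.
        self.gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, carried: torch.Tensor, layer_out: torch.Tensor) -> torch.Tensor:
        # carried, layer_out: (batch, prompt_len, hidden)
        z = torch.sigmoid(self.gate(torch.cat([carried, layer_out], dim=-1)))
        return z * carried + (1.0 - z) * layer_out


class PromptTunedEncoder(nn.Module):
    """Vanilla prompt tuning (frozen backbone, trainable prompt embeddings
    prepended to the input) with an optional GPC-style gate per layer."""

    def __init__(self, encoder_layers: nn.ModuleList, hidden_size: int,
                 prompt_len: int = 20, use_gpc: bool = True):
        super().__init__()
        self.prompt = nn.Parameter(torch.randn(prompt_len, hidden_size) * 0.02)
        self.layers = encoder_layers
        for p in self.layers.parameters():
            p.requires_grad = False  # backbone stays frozen, as in prompt tuning
        self.gpc = GlobalPromptCell(hidden_size) if use_gpc else None
        self.prompt_len = prompt_len

    def forward(self, token_embeds: torch.Tensor) -> torch.Tensor:
        batch = token_embeds.size(0)
        prompt = self.prompt.unsqueeze(0).expand(batch, -1, -1)
        hidden = torch.cat([prompt, token_embeds], dim=1)
        carried = prompt
        for layer in self.layers:
            hidden = layer(hidden)
            if self.gpc is not None:
                # Re-inject selectively preserved prompt states.
                updated = self.gpc(carried, hidden[:, :self.prompt_len])
                hidden = torch.cat([updated, hidden[:, self.prompt_len:]], dim=1)
                carried = updated
        return hidden


# Usage with a toy frozen backbone (shapes only; real use would load a
# pre-trained encoder):
layers = nn.ModuleList(
    nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True)
    for _ in range(2)
)
model = PromptTunedEncoder(layers, hidden_size=64)
out = model(torch.randn(3, 10, 64))  # -> (3, 30, 64): 20 prompt + 10 token states
```

Under this sketch, only `self.prompt` and the GPC gate receive gradients, which matches the abstract's framing: the portable control module is the sole addition on top of an otherwise frozen model.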

Publication
In the 12th CCF International Conference on Natural Language Processing and Chinese Computing (NLPCC 2023)
Nuwa Xi
Graduate Student

Hi there. This is Nova 😊, currently a master's student at the HIT-SCIR lab. My research interests include Natural Language Processing and its applications in science.