Xiaolong Li*, Youping Gu*, Xi Lin*, Weijie Wang, Bohan Zhuang
* Equal contribution
Preprint • December 2025
PSA introduces an efficient attention mechanism to accelerate video understanding and generation. It leverages a multi-level sparse attention strategy, enabling the model to mitigate information loss while preserving computational efficiency under a low compute budget.
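The abstract does not spell out PSA's exact design, but the general flavor of a multi-level sparse attention, where each query attends to a fine local window plus a coarse, pooled summary of the full sequence instead of every token, can be sketched as follows. The two-level split, mean-pooling, window size, and all function names here are illustrative assumptions, not PSA's actual method:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multilevel_sparse_attention(q, k, v, window=4, pool=4):
    """Illustrative two-level sparse attention (not PSA's exact algorithm).

    Each query attends to (1) a fine level: a local window of keys, and
    (2) a coarse level: mean-pooled key/value blocks summarizing the rest
    of the sequence. Cost per query is O(window + n/pool) instead of O(n).
    """
    n, d = q.shape
    # Coarse level: mean-pool keys/values into blocks of size `pool`.
    n_blocks = n // pool
    k_coarse = k[: n_blocks * pool].reshape(n_blocks, pool, d).mean(axis=1)
    v_coarse = v[: n_blocks * pool].reshape(n_blocks, pool, d).mean(axis=1)

    out = np.zeros_like(q)
    for i in range(n):
        # Fine level: keys in a local window around query i.
        lo, hi = max(0, i - window), min(n, i + window + 1)
        keys = np.concatenate([k[lo:hi], k_coarse], axis=0)
        vals = np.concatenate([v[lo:hi], v_coarse], axis=0)
        scores = keys @ q[i] / np.sqrt(d)
        out[i] = softmax(scores) @ vals
    return out
```

The coarse level preserves a compressed view of distant context (mitigating the information loss of purely local sparse attention), while the fine level keeps exact attention where it matters most.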