Yi Su
2 papers · Latest:
Natural Language Processing
Kwai Summary Attention Technical Report
Kwai Summary Attention (KSA) reduces LLM long-context modeling costs by compressing historical contexts into learnable summary tokens.
2604.24432
Computer Vision
Hierarchical Mesh Transformers with Topology-Guided Pretraining for Morphometric Analysis of Brain Structures
A hierarchical transformer with topology-guided pretraining enables robust morphometric analysis of brain structures across diverse mesh types and clinical features.
2604.05215