Fan Yang
8 papers · Latest:
Unifying Sparse Attention with Hierarchical Memory for Scalable Long-Context LLM Serving
SPIN unifies sparse attention with hierarchical memory, significantly boosting LLM serving throughput and reducing latency for long contexts.
GLM-5V-Turbo: Toward a Native Foundation Model for Multimodal Agents
GLM-5V-Turbo is a new foundation model integrating multimodal perception natively for enhanced agent reasoning, planning, and tool use across diverse contexts.
Observation of field-odd and field-free superconducting diode effects in $\mathrm{Mo}_2\mathrm{C}$ nanoflakes
Researchers observed both field-odd and field-free superconducting diode effects in centrosymmetric $\mathrm{Mo}_2\mathrm{C}$ nanoflakes, enabling new quantum electronics.
Weak Magnetic Sensing via Floquet Driving in an Active Cavity Magnon Coupled System
This paper presents a room-temperature, miniaturized weak magnetic field sensor using an active cavity-magnon system and Floquet driving.
Beyond Nodes vs. Edges: A Multi-View Fusion Framework for Provenance-Based Intrusion Detection
PROVFUSION is a multi-view framework that fuses attribute, structure, and causality signals to improve provenance-based intrusion detection accuracy.
BLaDA: Bridging Language to Functional Dexterous Actions within 3DGS Fields
BLaDA is a zero-shot framework that translates open-vocabulary language instructions into precise, functional dexterous robot actions using 3D Gaussian Splatting.
Gemini: A Family of Highly Capable Multimodal Models
Gemini is a new family of multimodal AI models excelling in image, audio, video, and text understanding, achieving state-of-the-art results on numerous benchmarks, including human-expert performance on MMLU.
Baichuan 2: Open Large-scale Language Models
Baichuan 2 is a series of large-scale, open-source multilingual language models that achieve state-of-the-art performance across general and specialized benchmarks.