    AI Tools

Why Pruning Prevails for One-Shot MoE Compression

By Awais · March 11, 2026

    [Submitted on 15 Oct 2025 (v1), last revised 10 Mar 2026 (this version, v2)]

REAP the Experts: Why Pruning Prevails for One-Shot MoE Compression, by Mike Lasby and 5 other authors

Abstract: Sparsely-activated Mixture-of-Experts (SMoE) models offer efficient pre-training and low latency, but their large parameter counts create significant memory overhead, motivating research into expert compression. Contrary to recent findings favouring expert merging on discriminative benchmarks, we find that expert pruning is a superior strategy for generative tasks. We demonstrate that existing merging techniques introduce an irreducible error due to the loss of fine-grained routing control over experts. Leveraging this insight, we propose Router-weighted Expert Activation Pruning (REAP), a novel pruning criterion that considers both router gate-values and expert activation norms to minimize the reconstruction error bound. Across a diverse set of SMoE models ranging from 20B to 1T parameters, REAP consistently outperforms merging and other pruning methods on generative benchmarks, especially at 50% compression. Notably, our method achieves near-lossless compression on code generation tasks with Qwen3-Coder-480B and Kimi-K2, even after pruning 50% of experts.

    Submission history

From: Mike Lasby
[v1] Wed, 15 Oct 2025 18:29:28 UTC (359 KB)
[v2] Tue, 10 Mar 2026 14:03:57 UTC (373 KB)

Tags: Compression, MoE, One-Shot, Prevails, Pruning