PulseAugur

QB-LIF neuron boosts SNN efficiency with learnable scale and burst spiking

Researchers have introduced QB-LIF, a novel neuron model for spiking neural networks (SNNs) that addresses the information throughput limitations of binary spike coding. QB-LIF reformulates burst spiking using a learnable scale for membrane potential quantization, allowing layers to adapt their resolution. This approach maintains hardware efficiency by folding the learned scale into synaptic weights and uses a specialized surrogate gradient for stable optimization.
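To make the idea concrete, here is a minimal, illustrative sketch of what a quantized-burst LIF step could look like. This is an assumption-laden reconstruction from the abstract, not the authors' code: the function name, the soft-reset choice, and the specific quantizer (round-and-clip to an integer burst count, rescaled by a learnable `scale`) are all hypothetical. The key point it shows is that the neuron emits a small integer spike count per timestep instead of a single binary spike, and that the `scale` factor multiplies the count, so at inference it can be folded into the next layer's weights and the hardware only ever sees integers.

```python
import numpy as np

def qb_lif_step(v, x, scale, tau=2.0, max_level=3):
    """One timestep of a hypothetical QB-LIF-style neuron (illustrative only).

    v         : membrane potential carried over from the previous timestep
    x         : input current at this timestep
    scale     : learnable quantization scale (per layer or per neuron)
    max_level : maximum burst count; log2(max_level + 1) bits per timestep
    """
    v = v / tau + x                                      # leaky integration
    level = np.clip(np.round(v / scale), 0, max_level)   # integer burst count
    out = level * scale                                  # scaled burst output
    v = v - out                                          # soft reset by the emitted amount
    return v, out

# Example: a strong input produces a multi-spike burst in one timestep,
# where a binary LIF neuron could emit at most a single spike.
v, out = qb_lif_step(v=0.0, x=1.5, scale=0.5)
```

In training, the hard `round` would be paired with a surrogate gradient (the paper mentions a specialized one; a straight-through estimator is the usual baseline choice). At deployment, multiplying the next layer's weight matrix by `scale` lets the neuron transmit only the integer `level`, preserving event-driven, multiplication-free accumulation.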

Summary written by gemini-2.5-flash-lite from 2 sources.

IMPACT Introduces a new neuron model that improves accuracy and efficiency for spiking neural networks, potentially enabling more performant neuromorphic hardware.

RANK_REASON This is a research paper introducing a new method for spiking neural networks.


COVERAGE [2]

  1. arXiv cs.CV TIER_1 · Dewei Bai, Hongxiang Peng, Jiajun Mei, Yang Ren, Hong Qu, Dawen Xia, Zhang Yi

    QB-LIF: Learnable-Scale Quantized Burst Neurons for Efficient SNNs

    arXiv:2604.25688v1 Announce Type: new Abstract: Binary spike coding enables sparse and event-driven computation in spiking neural networks (SNNs), yet its 1-bit-per-timestep representation fundamentally limits information throughput. This bottleneck becomes increasingly restricti…

  2. arXiv cs.CV TIER_1 · Zhang Yi ·

    QB-LIF: Learnable-Scale Quantized Burst Neurons for Efficient SNNs

    Binary spike coding enables sparse and event-driven computation in spiking neural networks (SNNs), yet its 1-bit-per-timestep representation fundamentally limits information throughput. This bottleneck becomes increasingly restrictive in deep architectures under short simulation …