PulseAugur
LIVE 09:06:06

CIFAR-100

PulseAugur coverage of CIFAR-100 — every cluster mentioning CIFAR-100 across labs, papers, and developer communities, ranked by signal.

Total · 30d: 27 (27 over 90d)
Releases · 30d: 0 (0 over 90d)
Papers · 30d: 27 (27 over 90d)
[TIER MIX · 90D / RELATIONSHIPS / SENTIMENT · 30D — chart panels; 1 day with sentiment data]

RECENT · PAGE 2/2 · 28 TOTAL
  1. RESEARCH · CL_18358 ·

    New research advances federated learning for privacy and heterogeneity

    Researchers are developing new methods to improve federated learning, a technique that allows models to train on decentralized data without compromising privacy. Several papers introduce novel algorithms for handling da…
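The cluster above covers new federated-learning algorithms; as background, the baseline they typically build on is classic FedAvg, which averages client model parameters weighted by local dataset size. A minimal sketch (the papers' own algorithms are not reproduced here):

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Classic FedAvg aggregation: average client parameter vectors,
    each weighted by the number of local training examples."""
    total = sum(client_sizes)
    return sum(
        (n / total) * w for w, n in zip(client_weights, client_sizes)
    )

# Two clients: the second holds 3x more data, so it dominates the average.
clients = [np.array([1.0, 2.0]), np.array([3.0, 6.0])]
sizes = [1, 3]
print(fedavg(clients, sizes))  # [2.5, 5.0]
```

Raw data never leaves the clients; only the parameter vectors are aggregated, which is the privacy property the cluster's papers refine further.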

  2. RESEARCH · CL_06561 ·

    Researchers develop POUR, a provably optimal method for unlearning AI representations

    Researchers have developed a new method called POUR (Provably Optimal Unlearning of Representations) to effectively remove specific concepts or training data from machine learning models without requiring a full retrain…

  3. RESEARCH · CL_08221 ·

    RDCNet achieves state-of-the-art image classification with novel dilated convolution

    Researchers have introduced RDCNet, a novel architecture designed to improve image classification accuracy. The network integrates a Multi-Branch Random Dilated Convolution module for capturing fine-grained features and…
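The summary does not give RDCNet's exact module, but the core mechanism it names, dilated convolution, is standard: spacing kernel taps `dilation` samples apart widens the receptive field without adding parameters. A generic 1-D illustration (RDCNet's module is 2-D and multi-branch):

```python
import numpy as np

def dilated_conv1d(x, kernel, dilation=1):
    """Valid-mode 1-D convolution with dilated taps: kernel element j
    reads input position i + j * dilation."""
    k = len(kernel)
    span = (k - 1) * dilation + 1  # effective receptive field width
    out = [
        sum(kernel[j] * x[i + j * dilation] for j in range(k))
        for i in range(len(x) - span + 1)
    ]
    return np.array(out)

x = np.arange(8, dtype=float)
k = np.array([1.0, 1.0, 1.0])
print(dilated_conv1d(x, k, dilation=1))  # 3-wide receptive field
print(dilated_conv1d(x, k, dilation=2))  # same 3 taps span 5 inputs
```

A multi-branch design like RDCNet's would run several such convolutions with different dilation rates in parallel and combine their outputs, mixing fine and coarse context.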

  4. RESEARCH · CL_06359 ·

    New research tackles Fast Adversarial Training with dynamic guidance and a fair benchmark

    Researchers have developed a new strategy called Distribution-aware Dynamic Guidance (DDG) to improve the robustness of AI models trained using Fast Adversarial Training (FAT). DDG addresses issues like catastrophic ove…
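For context on the cluster above: Fast Adversarial Training generates its training perturbations with a single FGSM step, nudging each input coordinate by `eps` in the sign of the loss gradient. DDG itself is not reproduced here; this is only the FGSM building block it improves on:

```python
import numpy as np

def fgsm_perturb(x, grad, eps):
    """Single-step FGSM perturbation: move each coordinate eps in the
    direction that increases the loss (sign of the input gradient)."""
    return x + eps * np.sign(grad)

# Toy linear model: loss L = -y * (w @ x), so grad_x L = -y * w.
w = np.array([0.5, -2.0, 1.0])
x = np.array([1.0, 1.0, 1.0])
y = 1.0
grad_x = -y * w
x_adv = fgsm_perturb(x, grad_x, eps=0.1)
print(x_adv)  # [0.9, 1.1, 0.9]
```

Because the perturbation uses only one gradient step, FAT is cheap but prone to the catastrophic overfitting the blurb mentions, where models become robust to single-step attacks yet fail against iterative ones.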

  5. RESEARCH · CL_05095 ·

    New AI methods enhance out-of-distribution detection and representation learning

    Researchers have developed UFCOD, a novel framework for few-shot cross-domain out-of-distribution (OOD) detection. UFCOD leverages information-geometric analysis of diffusion trajectories, extracting 'Path Energy' and '…

  6. RESEARCH · CL_04908 ·

    Federated Learning uses spectral entropy for data-free client contribution estimation

    Researchers have developed a novel method for estimating client contributions in Federated Learning without requiring access to client data. This approach utilizes the spectral entropy of final-layer updates to measure …
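One plausible reading of "spectral entropy of final-layer updates" is the Shannon entropy of a weight-update matrix's normalized singular-value spectrum: a low-rank (concentrated) update scores low, a diffuse update scores high. A hypothetical sketch under that assumption, not the paper's exact estimator:

```python
import numpy as np

def spectral_entropy(update: np.ndarray) -> float:
    """Shannon entropy of the normalized singular-value spectrum of a
    weight-update matrix (illustrative reading of the cluster summary)."""
    s = np.linalg.svd(update, compute_uv=False)
    p = s / s.sum()          # normalize singular values to a distribution
    p = p[p > 0]             # drop zero entries before taking logs
    return float(-(p * np.log(p)).sum())

# A rank-1 update concentrates its spectrum; a random update spreads it.
rng = np.random.default_rng(0)
low_rank = np.outer(rng.normal(size=16), rng.normal(size=8))
diffuse = rng.normal(size=(16, 8))
print(spectral_entropy(low_rank) < spectral_entropy(diffuse))  # True
```

Because the score depends only on the update matrix a client already transmits, no client data is needed, matching the "data-free" claim.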

  7. RESEARCH · CL_03001 ·

    New research suggests fine-tuning regimes significantly impact continual learning evaluations

    A new paper argues that the fine-tuning regime, specifically the trainable parameter subspace, is a critical variable in evaluating continual learning methods. Researchers found that the relative performance rankings of…

  8. RESEARCH · CL_03012 ·

    New GEM activation functions offer smoother, rational alternatives to ReLU

    Researchers have introduced Geometric Monomial (GEM), a new family of activation functions designed for deep neural networks. These functions utilize purely rational arithmetic and offer $C^{2N}$-smoothness, aiming to i…