PulseAugur

AI research explores emotion learning, solar forecasting, and Transformer efficiency

Researchers have developed SolarTformer, a deep learning model that uses a transformer architecture with self-attention for more accurate short-term solar power forecasting. The model integrates meteorological data and power-station-specific metadata to capture temporal dependencies and spatial variability, outperforming previous methods. Separately, a new study predicts induced pleasure from social media content using multimodal fusion and transformer-based architectures, reaching an accuracy of 0.6624. Another paper compares n-gram models with neural networks such as LSTMs and Transformers for event-log prediction, finding that n-grams offer comparable accuracy with significantly fewer resources. Finally, a paper introduces sub-token routing within LoRA-adapted transformers, compressing information within token representations to improve efficiency and enhance both language modeling and downstream task performance.
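As a rough illustration of the forecasting idea, the sketch below shows an attention-based short-horizon forecaster in the spirit of the SolarTformer description (not the authors' code): a standard PyTorch TransformerEncoder applies self-attention over a lookback window of hourly meteorological features, and static station metadata is fused by concatenation before the prediction head. All feature counts, layer sizes, and the concatenation-based fusion are assumptions made for illustration. Toy sketches of the n-gram event-log baseline and of the plain LoRA mechanism underlying the sub-token routing paper follow the coverage list below.

```python
# Minimal, illustrative attention-based short-term solar forecaster
# (assumed architecture; NOT the SolarTformer authors' implementation).
import torch
import torch.nn as nn


class TinySolarTransformer(nn.Module):
    def __init__(self, n_weather_feats=6, n_station_feats=4,
                 d_model=64, n_heads=4, n_layers=2, horizon=1):
        super().__init__()
        # Project per-timestep weather features into the model dimension.
        self.input_proj = nn.Linear(n_weather_feats, d_model)
        # Learned positional embedding over (up to) a 512-step lookback window.
        self.pos_emb = nn.Parameter(torch.zeros(1, 512, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=128,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Static station metadata (capacity, tilt, latitude, ...) is fused by
        # concatenation after temporal encoding.
        self.head = nn.Sequential(
            nn.Linear(d_model + n_station_feats, 64),
            nn.ReLU(),
            nn.Linear(64, horizon),
        )

    def forward(self, weather, station):
        # weather: (batch, lookback, n_weather_feats); station: (batch, n_station_feats)
        x = self.input_proj(weather) + self.pos_emb[:, :weather.size(1)]
        h = self.encoder(x)      # self-attention over the lookback window
        h_last = h[:, -1]        # representation of the most recent timestep
        return self.head(torch.cat([h_last, station], dim=-1))


# Toy usage: batch of 8 stations, 24-hour lookback, one-step-ahead forecast.
model = TinySolarTransformer()
pred = model(torch.randn(8, 24, 6), torch.randn(8, 4))
print(pred.shape)  # torch.Size([8, 1])
```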

Summary written by gemini-2.5-flash-lite from 9 sources.

IMPACT Introduces new methods for solar forecasting, emotion prediction, event-log prediction, and efficient transformer adaptation.

RANK_REASON The cluster contains multiple academic papers detailing new models and methods, including SolarTformer and sub-token routing in LoRA.

Read on arXiv cs.LG →

COVERAGE [9]

  1. arXiv cs.LG TIER_1 · Ankush Pratap Singh, Houwei Cao, Yong Liu

    CHUCKLE -- When Humans Teach AI To Learn Emotions The Easy Way

    arXiv:2510.09382v2 Announce Type: replace Abstract: Curriculum learning (CL) structures training from simple to complex samples, facilitating progressive learning. However, existing CL approaches for emotion recognition often rely on heuristic, data-driven, or model-based definit…

  2. arXiv cs.LG TIER_1 · Ankan Basu, Jyotiraditya Roy, Aditya Datta, Prayas Sanyal, Sumanta Banerjee

    SolarTformer: A Transformer Based Deep Learning Approach for Short Term Solar Power Forecasting

    arXiv:2604.24306v1 Announce Type: new Abstract: Accurate forecasting of solar power output is essential for efficient integration of renewable energy into the grid. In this study, an attention-based deep learning model, inspired by transformer architecture, is used for short-term…

  3. arXiv cs.LG TIER_1 · Nastaran Dab, Raziyeh Zall, Mohammadreza Kangavari

    Modeling Induced Pleasure through Cognitive Appraisal Prediction via Multimodal Fusion

    arXiv:2604.23753v1 Announce Type: cross Abstract: Multimodal affective computing analyzes user-generated social media content to predict emotional states. However, a critical gap remains in understanding how visual content shapes cognitive interpretations and elicits specific aff…

  4. arXiv cs.AI TIER_1 · Sumanta Banerjee

    SolarTformer: A Transformer Based Deep Learning Approach for Short Term Solar Power Forecasting

    Accurate forecasting of solar power output is essential for efficient integration of renewable energy into the grid. In this study, an attention-based deep learning model, inspired by transformer architecture, is used for short-term solar power forecasting. Our proposed model, "S…

  5. Hugging Face Daily Papers TIER_1

    SolarTformer: A Transformer Based Deep Learning Approach for Short Term Solar Power Forecasting

    Accurate forecasting of solar power output is essential for efficient integration of renewable energy into the grid. In this study, an attention-based deep learning model, inspired by transformer architecture, is used for short-term solar power forecasting. Our proposed model, "S…

  6. arXiv cs.LG TIER_1 · Paul Zeinaty

    Promoting Simple Agents: Ensemble Methods for Event-Log Prediction

    We compare lightweight automata-based models (n-grams) with neural architectures (LSTM, Transformer) for next-activity prediction in streaming event logs. Experiments on synthetic patterns and five real-world process mining datasets show that n-grams with appropriate context wind…

  7. Hugging Face Daily Papers TIER_1

    Promoting Simple Agents: Ensemble Methods for Event-Log Prediction

    We compare lightweight automata-based models (n-grams) with neural architectures (LSTM, Transformer) for next-activity prediction in streaming event logs. Experiments on synthetic patterns and five real-world process mining datasets show that n-grams with appropriate context wind…

  8. arXiv cs.CL TIER_1 · Wei Wang

    Sub-Token Routing in LoRA for Adaptation and Query-Aware KV Compression

    Sub-token routing offers a finer control axis for transformer efficiency than the coarse units used in most prior work, such as tokens, pages, heads, or layers. In this paper, we study routing within a token representation itself in LoRA-adapted transformers. The motivation is th…

  9. Hugging Face Daily Papers TIER_1

    Sub-Token Routing in LoRA for Adaptation and Query-Aware KV Compression

    Sub-token routing offers a finer control axis for transformer efficiency than the coarse units used in most prior work, such as tokens, pages, heads, or layers. In this paper, we study routing within a token representation itself in LoRA-adapted transformers. The motivation is th…
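To make the comparison in items 6-7 concrete, here is a toy n-gram next-activity predictor of the kind used as a lightweight baseline against LSTMs and Transformers. This is illustrative only: the paper's ensemble methods and context-window tuning are not reproduced, and the simple backoff scheme is an assumption.

```python
# Toy n-gram next-activity predictor for event logs (illustrative baseline,
# not the paper's ensemble method).
from collections import Counter, defaultdict


class NGramNextActivity:
    def __init__(self, n=3):
        self.n = n                          # context = up to n-1 preceding events
        self.counts = defaultdict(Counter)  # context tuple -> next-activity counts

    def fit(self, traces):
        for trace in traces:                # each trace is a list of activity labels
            padded = ["<s>"] * (self.n - 1) + list(trace)
            for i in range(self.n - 1, len(padded)):
                nxt = padded[i]
                # Record every context length up to n-1 so prediction can
                # back off to shorter windows for unseen prefixes.
                for k in range(self.n):
                    self.counts[tuple(padded[i - k:i])][nxt] += 1

    def predict(self, prefix):
        padded = ["<s>"] * (self.n - 1) + list(prefix)
        # Try the longest context first, then back off to shorter ones.
        for k in range(self.n - 1, -1, -1):
            ctx = tuple(padded[len(padded) - k:]) if k > 0 else tuple()
            if self.counts[ctx]:
                return self.counts[ctx].most_common(1)[0][0]
        return None


# Toy usage on three short process traces.
model = NGramNextActivity(n=3)
model.fit([["create", "review", "approve", "pay"],
           ["create", "review", "reject"],
           ["create", "review", "approve", "pay"]])
print(model.predict(["create", "review"]))  # "approve" (seen twice vs "reject" once)
```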
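Items 8-9 operate inside LoRA-adapted transformers. As background only, here is a minimal sketch of a vanilla LoRA linear layer, the adapter mechanism the paper builds on; the sub-token routing and query-aware KV compression contributions themselves are not shown, and the rank and scaling values are arbitrary.

```python
# Background sketch: a plain LoRA adapter on a linear projection
# (vanilla LoRA only; not the paper's sub-token routing).
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    def __init__(self, in_features, out_features, r=8, alpha=16):
        super().__init__()
        # Frozen pretrained projection; only the low-rank factors A and B train.
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))  # zero init: no change at start
        self.scaling = alpha / r

    def forward(self, x):
        # y = W x + (alpha / r) * B A x: the low-rank update is added on top
        # of the frozen base projection.
        return self.base(x) + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)


layer = LoRALinear(64, 64)
out = layer(torch.randn(2, 10, 64))  # (batch, tokens, features)
print(out.shape)                     # torch.Size([2, 10, 64])
```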