liquid air
PulseAugur coverage of liquid air — every cluster mentioning liquid air across labs, papers, and developer communities, ranked by signal.
No coverage in the last 90 days.
Liquid AI releases LFM2-24B-A2B, an efficient 24B parameter MoE model
Liquid AI has released an early checkpoint of its LFM2-24B-A2B model, a sparse Mixture of Experts (MoE) architecture with 24 billion total parameters and 2 billion active parameters per token. This model demonstrates th…
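The headline figures (24B total, 2B active parameters per token) follow from sparse MoE routing: each token is sent through only a top-k subset of experts, so most weights sit idle on any given forward pass. A minimal sketch of top-k gating, with illustrative expert counts and sizes that are assumptions, not LFM2-24B-A2B's actual configuration:

```python
import numpy as np

# Hypothetical top-k MoE routing for one token; all sizes below are
# illustrative assumptions, not Liquid AI's published architecture.
rng = np.random.default_rng(0)

n_experts = 32        # total experts in a MoE layer (assumed)
k = 2                 # experts activated per token (assumed)
d_model = 2048        # hidden size (assumed)

x = rng.standard_normal(d_model)              # one token's hidden state
router_w = rng.standard_normal((n_experts, d_model))

logits = router_w @ x                         # router score per expert
topk = np.argsort(logits)[-k:]                # indices of the k best experts
gates = np.exp(logits[topk])
gates /= gates.sum()                          # softmax over selected experts only

# Only the chosen experts run for this token, so the active parameter
# count is roughly k/n_experts of the layer's total expert parameters.
active_fraction = k / n_experts
print(f"routed to experts {topk}, active fraction {active_fraction:.3f}")
```

With 2 of 32 experts active, only ~6% of expert parameters participate per token, which is how a model's active count can be an order of magnitude below its total.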
MIT, Liquid AI, and Max Planck develop CompreSSM for AI model compression during training
Researchers from MIT, Liquid AI, and the Max Planck Institute have developed a novel technique called CompreSSM. This method enables the compression of AI architectures during the training process itself. The innovation…
Shopify CTO details AI integration, new workflows, and deployment challenges
Shopify CTO Mikhail Parakhin discussed the company's extensive AI integration, highlighting a significant shift in model quality around December that accelerated adoption. He emphasized that the primary challenges in AI…