18 articles

Multi-stage system reduces computational expense while maintaining performance through strategic human-model interaction.

New research applies neural networks to chronic rhinosinusitis prediction, knee osteoarthritis grading, Alzheimer's progression modeling, and medical image reasoning.

New model reduces hallucinations in law, medicine, and finance while preserving fast response times.

New benchmark reveals how large language models drift from original constraints during multi-turn collaborative refinement.

New method expands adapter expressivity without increasing parameter count, addressing core efficiency-performance tradeoff.

New research reveals large language models cannot faithfully sample from probability distributions, a critical gap for stochastic systems.

The Chinese AI startup's new flagship processes vastly longer prompts while undercutting Western competitors on price.

GPT-5.5 now runs Codex on NVIDIA GB200 infrastructure, expanding AI agents into knowledge work.

New model closes gap with frontier AI systems while improving efficiency over previous versions.

The new model excels at coding, research, and data analysis while topping benchmark leaderboards.

New analysis suggests large language models acquire intelligence in reverse order, challenging unlimited scaling assumptions.

Chinese AI lab Minimax releases M2.7 model to compete in a crowded LLM marketplace.