arXiv:2512.16969v1 Announce Type: new
Abstract: Despite advances in scientific AI, a coherent framework for Scientific General Intelligence (SGI), the ability to autonomously conceive, investigate, and reason across scientific domains, remains lacking. We present an operational SGI definition grounde…

arXiv:2512.08554v2 Announce Type: replace
Abstract: In this article, we give two extended space formulations, respectively, for the induced tree and path polytopes of chordal graphs with vertex and edge variables.
These formulations are obtained by proving that the induced tree and path extended …
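
Background (generic, not the specific tree/path formulations above): an extended space formulation describes a polytope $P \subseteq \mathbb{R}^n$ as the projection of a higher-dimensional polyhedron,
$$P = \{\, x \in \mathbb{R}^n : \exists\, y \in \mathbb{R}^m \ \text{with} \ Ax + By \le b \,\},$$
so membership in $P$ is certified by the auxiliary variables $y$; the point of such formulations is that the lifted description $(A, B, b)$ can be much more compact than any description of $P$ in the original variables alone.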

arXiv:2512.17017v1 Announce Type: new
Abstract: Working with abstract information often relies on static, symbolic representations that constrain exploration. We introduce Explorable Ideas, a framework that externalizes abstract concepts into explorable environments where physical navigation coordi…

arXiv:2510.16882v2 Announce Type: replace
Abstract: Supervised fine-tuning (SFT) is a commonly used technique to adapt large language models (LLMs) to downstream tasks. In practice, SFT on a full dataset is computationally expensive and sometimes suffers from overfitting or bias amplification. This…
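
Background sketch (illustrative only; the paper's data-selection approach is cut off above): a minimal supervised fine-tuning loop on a hand-picked subset of examples using the Hugging Face transformers API. The model name, toy data, and subset rule below are assumptions made for the sketch, not the paper's choices.

# Minimal SFT-on-a-subset sketch (illustrative; not the paper's method).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "sshleifer/tiny-gpt2"   # tiny public model, chosen only for the demo
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Toy stand-in for a downstream dataset; a real pipeline would select the
# subset from the full SFT data here (the selection rule is the key design choice).
pairs = [("Q: capital of France? A:", " Paris"),
         ("Q: capital of Japan? A:", " Tokyo")]
subset = pairs[:1]

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
for prompt, response in subset:
    batch = tokenizer(prompt + response, return_tensors="pt")
    # Causal-LM SFT: next-token cross-entropy with labels equal to the inputs.
    loss = model(**batch, labels=batch["input_ids"]).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()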

arXiv:2510.08461v2 Announce Type: replace
Abstract: We introduce a refinement-based Christoffel sampling (RCS) algorithm for least squares approximation in the span of a given, generally non-orthogonal set of functions $\Phi_n = \{\phi_1, \dots, \phi_n\}$. A standard sampling strategy for this prob…
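
Background (the standard Christoffel sampling strategy, not the refinement step introduced above): with $\{p_j\}_{j=1}^n$ an $L^2(\mu)$-orthonormal basis of $\mathrm{span}\,\Phi_n$, one samples points from the density $\frac{K_n(x)}{n}\, d\mu(x)$, where
$$K_n(x) = \sum_{j=1}^{n} |p_j(x)|^2, \qquad w(x_i) = \frac{n}{K_n(x_i)},$$
and solves the weighted problem $\min_{v \in \mathrm{span}\,\Phi_n} \frac{1}{M} \sum_{i=1}^{M} w(x_i)\,|f(x_i) - v(x_i)|^2$, which is stable and near-optimal with $M$ on the order of $n \log n$ samples.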

arXiv:2505.14886v2 Announce Type: replace
Abstract: Winning competitive debates requires sophisticated reasoning and argumentation skills. Competitive debate poses unique challenges: (1) Time constraints force debaters to make strategic choices about which points to pursue rather than cov…

arXiv:2512.17276v1 Announce Type: new
Abstract: Machine learning approaches for Alzheimer's disease (AD) diagnosis face a fundamental challenge: clinical assessments are expensive and invasive, leaving ground truth labels available for only a fraction of neuroimaging datasets. We introduce Multi v…

arXiv:2512.17532v1 Announce Type: new
Abstract: Multimodal Large Language Models (MLLMs) struggle to maintain reliable performance under extreme real-world visual degradations, which impede their practical robustness. Existing robust MLLMs predominantly rely on implicit training/adaptation that focuses sol…

arXiv:2512.17607v1 Announce Type: new
Abstract: Physics-informed neural networks (PINNs) have recently emerged as a prominent paradigm for solving partial differential equations (PDEs), yet their training strategies remain underexplored. While hard prioritization methods inspired by finite element …
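
Background (the standard PINN objective, generic notation): the network parameters $\theta$ are trained on a composite residual loss over interior and boundary collocation points,
$$\mathcal{L}(\theta) = \frac{1}{N_r} \sum_{i=1}^{N_r} \bigl| \mathcal{N}[u_\theta](x_i) \bigr|^2 + \frac{\lambda}{N_b} \sum_{j=1}^{N_b} \bigl| \mathcal{B}[u_\theta](x_j) \bigr|^2,$$
where $\mathcal{N}$ is the PDE operator, $\mathcal{B}$ encodes boundary/initial conditions, and $\lambda$ balances the two terms; prioritization-style training strategies typically reweight or resample the collocation points $x_i$ during training.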

arXiv:2412.16827v2 Announce Type: replace
Abstract: As intelligent reflecting surface (IRS) has emerged as a new and promising technology capable of configuring the wireless environment favorably, channel estimation for IRS-assisted multiple-input multiple-output (MIMO) systems has garnered extensi…
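
Background (a commonly used narrowband uplink signal model, generic notation): with $\mathbf{G}$ the user-IRS channel, $\mathbf{H}_r$ the IRS-BS channel, $\mathbf{H}_d$ the direct channel, and $\boldsymbol{\theta}$ the vector of IRS reflection coefficients,
$$\mathbf{y} = \mathbf{H}_r\, \mathrm{diag}(\boldsymbol{\theta})\, \mathbf{G}\, \mathbf{x} + \mathbf{H}_d\, \mathbf{x} + \mathbf{n},$$
and channel estimation typically recovers the cascaded channel $\mathbf{H}_r\, \mathrm{diag}(\cdot)\, \mathbf{G}$ from pilots observed under several IRS phase configurations.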

arXiv:2512.17663v1 Announce Type: new
Abstract: We study the computational complexity of scheduling jobs on a single speed-scalable processor with the objective of capturing the trade-off between the (weighted) flow time and the energy consumption. This trade-off has been extensively explored in th…
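
Background (the standard single-processor speed-scaling model, generic notation): the processor runs at a controllable speed $s(t) \ge 0$ and consumes power $s(t)^{\alpha}$ for some constant $\alpha > 1$; job $j$ with release time $r_j$, weight $w_j$, and processing volume $p_j$ completes at time $C_j$ once it has received $p_j$ units of work. The flow-time-plus-energy objective is
$$\min \;\; \sum_{j} w_j\,(C_j - r_j) \;+\; \int_{0}^{\infty} s(t)^{\alpha}\, dt,$$
with $w_j = 1$ giving the unweighted flow time case.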

arXiv:2512.17077v1 Announce Type: new
Abstract: Diffusion Large Language Models (dLLMs) have emerged as a promising alternative to Autoregressive Models (ARMs), utilizing parallel decoding to overcome sequential bottlenecks. However, existing research focuses primarily on kernel-level optimizations…