Lightweight Language Models are Prone to Reasoning Errors for Complex Computational Phenotyping Tasks arxiv.org/abs/2507.23146

Global, Regional, and National Burden of Chronic Kidney Disease Attributable to High Body Mass Index (BMI) among Individuals Aged 20-54 Years from 1990 to 2021: An Analysis of the Global Burden of Disease Study arxiv.org/abs/2507.23537

Theoretical modeling and quantitative research on aquatic ecosystems driven by multiple factors arxiv.org/abs/2507.19553

Review of Deep Learning Applications to Structural Proteomics Enabled by Cryogenic Electron Microscopy and Tomography arxiv.org/abs/2507.19565

Posterior bounds on divergence time of two sequences under dependent-site evolutionary models arxiv.org/abs/2507.19659

Pre-exposure prophylaxis and syphilis in men who have sex with men: a network analysis arxiv.org/abs/2507.19711

External light schedules can induce nighttime sleep disruptions in a Homeostat-Circadian-Light Model for sleep in young children arxiv.org/abs/2507.19772

Sequence-based protein-protein interaction prediction and its applications in drug discovery arxiv.org/abs/2507.19805

Attractive and Repulsive Perceptual Biases Naturally Emerge in Generative Adversarial Inference arxiv.org/abs/2507.19944

Signed Higher-Order Interactions for Brain Disorder Diagnosis via Multi-Channel Transformers arxiv.org/abs/2507.20205

Comparison of Optimised Geometric Deep Learning Architectures, over Varying Toxicological Assay Data Environments arxiv.org/abs/2507.17775

CM-UNet: A Self-Supervised Learning-Based Model for Coronary Artery Segmentation in X-Ray Angiography arxiv.org/abs/2507.17779

Improving reproducibility of cheminformatics workflows with chembl-downloader arxiv.org/abs/2507.17783

Multimodal Recurrent Ensembles for Predicting Brain Responses to Naturalistic Movies (Algonauts 2025) arxiv.org/abs/2507.17897

Synthesis of nanoparticles from carboxymethyl cellulose using one-pot hydrothermal carbonization for Drug Entrapment Studies arxiv.org/abs/2507.18299

ARTreeFormer: A Faster Attention-based Autoregressive Model for Phylogenetic Inference arxiv.org/abs/2507.18380

Qoto Mastodon