Remote Inference of Cognitive Scores in ALS Patients Using a Picture Description. (arXiv:2309.06989v1 [cs.CL])
Auto-Regressive Next-Token Predictors are Universal Learners. (arXiv:2309.06979v1 [cs.LG])
Dynamic Causal Disentanglement Model for Dialogue Emotion Detection. (arXiv:2309.06928v1 [cs.CL])
Native Language Identification with Big Bird Embeddings. (arXiv:2309.06923v1 [cs.CL])
Continual Learning with Dirichlet Generative-based Rehearsal. (arXiv:2309.06917v1 [cs.CL])
Towards the TopMost: A Topic Modeling System Toolkit. (arXiv:2309.06908v1 [cs.CL])
Gpachov at CheckThat! 2023: A Diverse Multi-Approach Ensemble for Subjectivity Detection in News Articles. (arXiv:2309.06844v1 [cs.CL])
Comparative Analysis of Contextual Relation Extraction based on Deep Learning Models. (arXiv:2309.06814v1 [cs.CL])
Cognitive Mirage: A Review of Hallucinations in Large Language Models. (arXiv:2309.06794v1 [cs.CL])
Scaled Prompt-Tuning for Few-Shot Natural Language Generation. (arXiv:2309.06759v1 [cs.CL])
CONVERSER: Few-Shot Conversational Dense Retrieval with Synthetic Data Generation. (arXiv:2309.06748v1 [cs.CL])
Enhancing Keyphrase Generation by BART Finetuning with Splitting and Shuffling. (arXiv:2309.06726v1 [cs.CL])
Simultaneous Machine Translation with Large Language Models. (arXiv:2309.06706v1 [cs.CL])
VLSlice: Interactive Vision-and-Language Slice Discovery. (arXiv:2309.06703v1 [cs.CV])
Benchmarking Procedural Language Understanding for Low-Resource Languages: A Case Study on Turkish. (arXiv:2309.06698v1 [cs.CL])
Statistical Rejection Sampling Improves Preference Optimization. (arXiv:2309.06657v1 [cs.CL])
RT-LM: Uncertainty-Aware Resource Management for Real-Time Inference of Language Models. (arXiv:2309.06619v1 [cs.LG])
Narrative as a Dynamical System. (arXiv:2309.06600v1 [cs.CL])
Do Generative Large Language Models need billions of parameters? (arXiv:2309.06589v1 [cs.CL])
Can humans help BERT gain "confidence"? (arXiv:2309.06580v1 [cs.CL])
All recent Computation and Language articles on arXiv.org for the Fediverse
Inspired by https://twitter.com/arxiv_cscl