Scaling TransNormer to 175 Billion Parameters. (arXiv:2307.14995v1 [cs.CL]) 

Incrementally-Computable Neural Networks: Efficient Inference for Dynamic Inputs. (arXiv:2307.14988v1 [cs.LG]) 

PanGu-Coder2: Boosting Large Language Models for Code with Ranking Feedback. (arXiv:2307.14936v1 [cs.CL]) 

ARC-NLP at PAN 2023: Transition-Focused Natural Language Inference for Writing Style Detection. (arXiv:2307.14913v1 [cs.CL]) 

ARC-NLP at PAN 2023: Hierarchical Long Text Classification for Trigger Detection. (arXiv:2307.14912v1 [cs.CL]) 

Retrieval-based Text Selection for Addressing Class-Imbalanced Data in Classification. (arXiv:2307.14899v1 [cs.CL]) 

MESED: A Multi-modal Entity Set Expansion Dataset with Fine-grained Semantic Classes and Hard Negative Entities. (arXiv:2307.14878v1 [cs.CL]) 

Exploiting the Potential of Seq2Seq Models as Robust Few-Shot Learners. (arXiv:2307.14856v1 [cs.CL]) 

ArcGPT: A Large Language Model Tailored for Real-world Archival Applications. (arXiv:2307.14852v1 [cs.CL]) 

Turkish Native Language Identification. (arXiv:2307.14850v1 [cs.CL]) 

What Makes a Good Paraphrase: Do Automated Evaluations Work? (arXiv:2307.14818v1 [cs.CL])

Models of reference production: How do they withstand the test of time? (arXiv:2307.14817v1 [cs.CL])

Improving Aspect-Based Sentiment with End-to-End Semantic Role Labeling Model. (arXiv:2307.14785v1 [cs.CL]) 

Emotion4MIDI: a Lyrics-based Emotion-Labeled Symbolic Music Dataset. (arXiv:2307.14783v1 [eess.AS]) 

Turning Whisper into Real-Time Transcription System. (arXiv:2307.14743v1 [cs.CL]) 

Evaluating Generative Models for Graph-to-Text Generation. (arXiv:2307.14712v1 [cs.CL]) 

Improving Natural Language Inference in Arabic using Transformer Models and Linguistically Informed Pre-Training. (arXiv:2307.14666v1 [cs.CL]) 

Metric-Based In-context Learning: A Case Study in Text Simplification. (arXiv:2307.14632v1 [cs.CL]) 

Speed Reading Tool Powered by Artificial Intelligence for Students with ADHD, Dyslexia, or Short Attention Span. (arXiv:2307.14544v1 [cs.CL]) 

Plug and Pray: Exploiting off-the-shelf components of Multi-Modal Models. (arXiv:2307.14539v1 [cs.CR]) 
