RT @ambrafer
Friends heading to Rovereto in May: the deadline for abstract submission has been extended! 👇 twitter.com/cimec_unitrento/st

RT @sachipaul18
can vouch that this is an incredible post-bacc opportunity for anyone interested in language and the brain (specifically post-stroke aphasia or alexia)!!! amazing community & perfect for people interested in pursuing neuropsychology, speech, or neurology! cognitiverecoverylab.com/join-

RT @viscog
Our lab is looking for Seattle-area folks who have a vision condition called amblyopia (also known as lazy eye) for a research study. We're also looking for a few age-matched controls (18 or 19 years old). Get in touch if you're interested!

RT @alindleyw
Out now in Current Biology: @jdyeatman, @cvnlab and I found huge task effects on activity in the “visual word form area.” The patterns of response modulation and functional connectivity seem to be uniquely related to engagement in a linguistic task.
authors.elsevier.com/a/1gieB3Q

Thrilled & grateful to @NatEyeInstitute for the lab’s first grant, to study the variability of brain reorganization in blindness!
Thanks to my collaborators and many mentors (many off twitter/Mast), and reach out to me if you want to work together on this project – I’m recruiting!
---
RT @StriemAmit
Please RT: My lab is recruiting a postdoc to work on individual differences of plasticity in blindness: how brain manifests in d…
twitter.com/StriemAmit/status/

RT @martin_hebart
I'm thrilled that THINGS-data is now online! We provide 3 massive datasets of fMRI, MEG, & behavior in response to up to 1854 objects and >22k images. We hope this will allow studying objects in vision, memory & language with unique semantic breadth! Link: doi.org/10.25452/figshare.plus

RT @lina_teichmann
Yay for team science: THINGS-data is finally out thanks to @martin_hebart @OliverContier @Chris_I_Baker & everyone else involved! I really enjoyed working on THINGS-MEG and am still trying to make sense of this incredibly rich dataset. A 🧵on initial findings & what’s yet to come twitter.com/martin_hebart/stat

RT @MonacoSimona
I am hiring a post-doc! Come work with me at CIMeC @cimec_unitrento in Italy to figure out the neural mechanisms of predictions for action execution. I am looking for expertise in brain imaging, ideally MEG or EEG. Please RT!

RT @leylaisi
Our lab is looking for a postdoc! We study the neural basis of social vision using a combination of (naturalistic) neuroimaging, behavior, and computational modeling. More details on the lab and position are at our website: isiklab.org
Please RT


🎉🎉 Our big consortium paper, led by Sam Sober at Emory, showing high-channel-count EMG electrodes that can isolate action potentials from many motor units across a range of behaviors and species, is now out in preprint form.

biorxiv.org/content/10.1101/20

It's been a huge team effort on many fronts! A special thanks to those in my lab who did the work getting these going in NHPs (and humans): Jonathan Michaels, Tomomichi Oya, Mehrdad Kashefi and Rhonda Kersten.

If kick-ass EMG interests you, check out the paper, and find more practical details (maybe even how to get some) on the consortium website here: internationalmotorcontrol.org

Myomatrix arrays for high-definition muscle recording

Neurons coordinate their activity to produce an astonishing variety of motor behaviors. Our present understanding of motor control has grown rapidly thanks to new methods for recording and analyzing populations of many individual neurons over time. In contrast, current methods for recording the nervous system's actual motor output - the activation of muscle fibers by motor neurons - typically cannot detect the individual electrical events produced by muscle fibers during natural behaviors and scale poorly across species and muscle groups. Here we present a novel class of electrode devices ("Myomatrix arrays") that record muscle activity at cellular resolution across muscles and behaviors. High-density, flexible electrode arrays allow for stable recordings from the muscle fibers activated by a single motor neuron, called a "motor unit", during natural behaviors in many species, including mice, rats, primates, songbirds, frogs, and insects. This technology therefore allows the nervous system's motor output to be monitored in unprecedented detail during complex behaviors across species and muscle morphologies. We anticipate that this technology will allow rapid advances in understanding the neural control of behavior and in identifying pathologies of the motor system.

Competing Interest Statement: The authors have declared no competing interest.

www.biorxiv.org

Grateful to my many family members, friends, and colleagues who take to the streets to protest against the threat to Israel’s democracy.
I’m with you at heart, even if I can’t be there in person.

Working memory and imagery in early visual cortex biorxiv.org/content/10.1101/20 on decoding memorized feature-continuous gratings from visual cortex #fMRI activity; #neuroscience

Working memory and imagery in early visual cortex

It has been suggested that visual images are memorized across brief periods of time by vividly imagining them as if they still were there. Such visual imagery has been proposed as a key mechanism underlying working memory. However, the ability to evoke visual imagery varies substantially across individuals, raising the question whether people with weak or even absent visual imagery might use other, non-visual strategies for memorization. Here, we systematically investigated this question in two groups of participants, very strong and very weak imagers, who were asked to remember images across brief delay periods. We were able to reliably reconstruct the memorized stimuli from early visual cortex during the delay period. Importantly, the quality of reconstruction was equally accurate for both strong and weak imagers. The decodable information also closely reflected behavioral precision in both groups, suggesting it could potentially contribute to behavioral performance, even in the extreme case of completely aphantasic individuals. Our data thus suggest that strong and weak imagers do not differ in how early visual areas encode memory-related information and that this information might even be equally used in both groups.

Competing Interest Statement: The authors have declared no competing interest.

www.biorxiv.org
Qoto Mastodon

QOTO: Question Others to Teach Ourselves
An inclusive, academic-freedom instance.
All cultures welcome.
Hate speech and harassment strictly forbidden.