
RT @sachipaul18
can vouch that this is an incredible post-bacc opportunity for anyone interested in language and the brain (specifically post-stroke aphasia or alexia)!!! amazing community & perfect for people interested in pursuing neuropsychology, speech, or neurology! cognitiverecoverylab.com/join-

RT @ElisaRFerre
Our new discovery covered by @SmithsonianMag "Why Newborn Chicks Love Objects That Defy Gravity" @eli_versace @QMUL @BirkbeckScience @BirkbeckUoL @bbkpsychology

smithsonianmag.com/science-nat

RT @viscog
Our lab is looking for Seattle-area folks who have a vision condition called amblyopia (also known as lazy eye) for a research study. We're also looking for a few age-matched controls (18 or 19 years old). Get in touch if you're interested!

Happy , I guess...
---
RT @datta_lab
Female mice are studied less than males in part because of a belief held by some that the estrous cycle makes all female mouse behavior more variable. But is this true? Check out our latest, led by @DanaRubiLevy in collab with @ShanskyLab! bit.ly/3ZQEIHD
twitter.com/Datta_Lab/status/1

RT @alindleyw
Out now in Current Biology: @jdyeatman, @cvnlab and I found huge task effects on activity in the “visual word form area.” The patterns of response modulation and functional connectivity seem to be uniquely related to engagement in a linguistic task.
authors.elsevier.com/a/1gieB3Q

Thrilled & grateful to @NatEyeInstitute for the lab’s first grant to study the variability of brain reorganization in blindness!
Thanks to my collaborators and many mentors (many off twitter/Mastodon). Reach out to me if you want to work together on this project – I’m recruiting!
---
RT @StriemAmit
Please RT: My lab is recruiting a postdoc to work on individual differences of plasticity in blindness: how brain manifests in d…
twitter.com/StriemAmit/status/

RT @martin_hebart
I'm thrilled that THINGS-data is now online! We provide 3 massive datasets of fMRI, MEG, & behavior in response to up to 1854 objects and >22k images. We hope this will allow studying objects in vision, memory & language with unique semantic breadth! Link: doi.org/10.25452/figshare.plus

RT @lina_teichmann
Yay for team science: THINGS-data is finally out thanks to @martin_hebart @OliverContier @Chris_I_Baker & everyone else involved! I really enjoyed working on THINGS-MEG and am still trying to make sense of this incredibly rich dataset. A 🧵 on initial findings & what’s yet to come twitter.com/martin_hebart/stat

RT @lady_ginseng
Are you interested in scientific editing as a full-time career and have a PhD in computational and/or systems neuroscience?

Then please apply to join our @NatureNeuro editorial team! Application deadline is March 31st.

More info below 👇🏾 twitter.com/NatureNeuro/status

RT @MonacoSimona
I am hiring a post-doc! Come work with me at CIMeC @cimec_unitrento in Italy to figure out the neural mechanisms of predictions for action execution. I am looking for expertise in brain imaging, ideally MEG or EEG. Please RT!

RT @leylaisi
Our lab is looking for a postdoc! We study the neural basis of social vision using a combination of (naturalistic) neuroimaging, behavior, and computational modeling. More details on the lab and position are at our website: isiklab.org
Please RT

Please RT: My lab is recruiting a postdoc to work on individual differences in plasticity in blindness: how brain plasticity manifests in different individuals, and how that may account for their compensatory skills and use of assistive devices.
samp-lab.facultysite.georgetow

🎉🎉 Our big consortium paper, led by Sam Sober at Emory, showing that high-channel-count EMG electrodes can isolate action potentials from many motor units across a range of behaviors and species, is now out in preprint form.

biorxiv.org/content/10.1101/20

It's been a huge team effort on many fronts! A special thanks to those in my lab who did the work getting these going in NHPs (and humans): Jonathan Michaels, Tomomichi Oya, Mehrdad Kashefi and Rhonda Kersten.

If kick-ass EMG interests you, check out the paper, and find more practical details (maybe even on how to get some) on the consortium website here: internationalmotorcontrol.org

Grateful to my many family members, friends, and colleagues who take to the streets to protest against the threat to Israel’s democracy.
I’m with you at heart, even if I can’t be there in person.

Working memory and imagery in early visual cortex: biorxiv.org/content/10.1101/20 – on decoding memorized feature-continuous gratings from visual cortex #fMRI activity. #neuroscience

RT @RezaAzadi_
Our new study shows that optogenetic stimulation of high-level visual cortex results in distortions of the concurrent contents of vision.

@AfrazArash @sbohn8 @RozFranklin51
@CurrentBiology
cell.com/current-biology/fullt
(1/6)

RT @StephanieRossit
Please RT! Motion-tracking experts: are you using video (from webcams or other cameras), and what software would you recommend for tracking hand movements from video?

RT @MariusPeelen
The CAOs workshop in Rovereto (Italy) is back! May 4-6, 2023. Great lineup of speakers. Registration now open. event.unitn.it/cimec-caos/
