RT @UrbMobRafal
So honored to receive @ERC_Research Starting Grant! Super excited to start running my #COeXISTENCE grant and see what will happen when we start sharing our cities with AI machines at @JagiellonskiUni in Kraków, Poland. Details: https://rafalkucharskipk.github.io/COeXISTENCE/ https://twitter.com/ERC_Research/status/1595011727711940608
RT @DrewLinsley
Check out our new paper, to appear at NeurIPS. We show that DNNs are becoming progressively *less* aligned with human perception as their ImageNet accuracy increases. Ignore the elections, Elon, and FTX for a moment — this is important!
https://serre-lab.github.io/Harmonization/
#catsofmastodon
Mastodon, meet Frank. In rare moments in which he doesn't want to murder his surroundings, he is actually a sweet cat!
This place feels a lot nicer & nerdier than Twitter, so I actually feel like sharing a little: 😋
We have recently released the 2022 update of the Metapsy meta-analytic database for depression psychotherapy (415 studies), which can be analyzed here: https://www.metapsy.org/databases/.
Detailed documentation & download here: https://docs.metapsy.org/databases/depression-psyctr/
...and you can directly retrieve the data in R using https://data.metapsy.org
Hi all!
#introduction #ml #neuralnetwork #science
I'm a PhD candidate doing research into ML for biotechnology. In my free time, I'm also a cofounder of a startup making tools for real estate analysis.
Don't expect any startup threads from me, though; I'm more into science than business!
My ML research focuses on few-shot tabular learning and on applying it in my biotech work.
The latter centers on metagenomics.
Mastodon friends, I wrote a thread on Twitter that asks for commitments to three behaviors to maximize the chances for a successful transition of the community discussion to this platform. Please have a look and retweet it if you are willing to commit to the behaviors for November:
RT @karwowskaz
Thank you @polonium_org for the opportunity to talk about my research. The number of questions assured me that there is a very bright future for gut microbiome research!
First of all, they specify a fast-learning "online" network and a slow-learning "target" one.
For a given sample, the online network tries to predict the target network's embedding.
The catch?
They use different augmentations!
Finally, the online network is trained with gradient descent, while the target network's weights are updated as an exponential moving average of the online network's weights.
This way, the networks are taught to produce consistent embeddings of an observation across different ways of introducing noise.
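The online/target scheme above can be sketched in a few lines of numpy. This is a toy illustration only: single linear layers stand in for the networks, Gaussian noise stands in for the augmentations, and the hyperparameter values (`tau`, `lr`) are assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the two networks: one linear layer each.
W_online = rng.normal(size=(4, 8))  # fast-learning "online" network
W_target = W_online.copy()          # slow-learning "target" network

def augment(x):
    # Stand-in augmentation: add Gaussian noise, so the two
    # views are different noisy versions of the same sample.
    return x + 0.1 * rng.normal(size=x.shape)

tau = 0.99   # EMA decay for the target weights (assumed value)
lr = 0.05    # learning rate for the online network (assumed value)
x = rng.normal(size=8)

for _ in range(200):
    v1, v2 = augment(x), augment(x)  # two different views of the same sample
    pred = W_online @ v1             # online embedding of view 1
    targ = W_target @ v2             # target embedding of view 2
    err = pred - targ
    # Gradient step on 0.5 * ||pred - targ||^2 w.r.t. the online weights.
    W_online -= lr * np.outer(err, v1)
    # Target weights: exponential moving average of the online weights.
    W_target = tau * W_target + (1 - tau) * W_online
```

Real methods of this kind add projection/prediction heads and a stop-gradient on the target branch; the last line is the EMA update described in the post.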
AI guy. Tooting interesting publications and statistics that catch my eye