Links for 2022-11-12
Sanofi signs latest billion-dollar AI drug discovery deal, this time with Insilico https://venturebeat.com/ai/sanofi-signs-latest-billion-dollar-ai-drug-discovery-deal-this-time-with-insilico/amp/
Character.ai integrates image generation into chatbot dialogues https://twitter.com/character_ai/status/1588207276090802176
“Thread on recent tools I like for writing prompts and experimenting with GPT‑3” https://threadreaderapp.com/thread/1588247865503010816.html
Finetuning large multilingual language models on English tasks with English prompts enables task generalization to non-English languages that appear only in the pretraining corpus. https://arxiv.org/abs/2211.01786
“Can large language models write prompts…for themselves? Yes, at a human-level (!) if they are given the ability to experiment and see what works.” https://arxiv.org/abs/2211.01910
Generating Human-level Text with Contrastive Search in Transformers https://huggingface.co/blog/introducing-csearch
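The core of contrastive search is a simple scoring rule: pick the top-k candidate token that maximizes (1 − α)·model confidence − α·degeneration penalty, where the penalty is the candidate's maximum cosine similarity to tokens already in the context. A toy sketch of that rule (not the transformers implementation; the probabilities and 2-D embeddings below are made up for illustration):

```python
import numpy as np

def contrastive_search_score(probs, cand_embs, ctx_embs, alpha=0.6):
    """Score candidates as (1 - alpha) * p(v) - alpha * penalty, where the
    penalty is each candidate embedding's max cosine similarity to any
    embedding of a token already in the context (discourages repetition)."""
    ctx = ctx_embs / np.linalg.norm(ctx_embs, axis=1, keepdims=True)
    cand = cand_embs / np.linalg.norm(cand_embs, axis=1, keepdims=True)
    penalty = (cand @ ctx.T).max(axis=1)  # max similarity to context tokens
    return (1 - alpha) * probs - alpha * penalty

# Toy example: candidate 0 is more probable but nearly parallel to a
# context embedding, so the degeneration penalty lets candidate 1 win.
probs = np.array([0.7, 0.3])
ctx_embs = np.array([[1.0, 0.0], [0.0, 1.0]])
cand_embs = np.array([[0.99, 0.01], [0.5, -0.5]])
scores = contrastive_search_score(probs, cand_embs, ctx_embs)
print(scores.argmax())  # → 1
```

In the transformers library itself, this decoding mode is enabled by passing `penalty_alpha` and `top_k` to `model.generate()`, as the linked blog post describes.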
“A Single Dose of Psilocybin for Treatment-Resistant Depression...A brand new study published in the New England Journal of Medicine suggests a single dose of 25mg psilocybin can significantly improve the symptoms of depression for at least 12 weeks...” https://twitter.com/alieninsect/status/1588038619834437632
Accident With 1918 Pandemic Virus Raises Questions About Pathogen Research https://theintercept.com/2022/11/01/pandemic-1918-flu-virus-biosafety/
Too much efficiency makes everything worse: overfitting and the strong version of Goodhart's law https://sohl-dickstein.github.io/2022/11/06/strong-Goodhart.html
Canadian universities largely unable to tell genuine Indigenous employees from fraudsters: report https://nationalpost.com/news/canada/eliminate-the-fraudsters-report-delves-into-indigenous-identity-at-universities
If artificial neural networks (ANNs) infringe copyright, so do all biological neural networks (human brains). Just like human brains, ANNs demonstrably do not merely copy their training data; they learn how to generate similar data.
Human brains cannot code or create art out of the box but rely on crystallized intelligence in the form of human culture. Just like ANNs, human brains are exposed to hundreds of thousands of hours of rich multimodal data before they become proficient at specific tasks. Mirroring this cultural transmission of knowledge, ANNs compress terabytes of human culture into a few gigabytes of weights by encoding the algorithms that generated the original data.
Saying that ANNs steal your work is just a psychological defense mechanism by people who are starting to realize that their skills are not as unique as they thought.