Links for 2022-08-15
“This paper changed my thinking about what future language models will be good at, mostly in a really concerning way…if we can generate unlimited training data for coding in particular, this suggests that *future language models will be far better at coding than anything else.*” https://threadreaderapp.com/thread/1558347542101839873.html
Scientists doubled working memory in mice by selectively inhibiting PDE4 and PDE5 https://www.reddit.com/r/ExperimentalNootropic/comments/wnznj8/scientists_doubled_working_memory_in_mice_by/
In 1953, John Wheeler accidentally left a file describing how to build the Hydrogen Bomb on a train https://physicstoday.scitation.org/doi/10.1063/PT.3.4364
Romans just didn't get probability: "such extreme [dice shape] variation was acceptable because makers and users understood roll outcomes as the product of fate, rather than chance or probability." https://link.springer.com/article/10.1007/s12520-022-01599-y
GPT-3 solves chemistry problems better when copyright notices are added to the prompt https://chemrxiv.org/engage/chemrxiv/article-details/62c5c622244ce03b8e3c4f21
If you want to learn the maths behind neural networks, with Python code examples http://neuralnetworksanddeeplearning.com/
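The core idea the book builds up to is backpropagation: applying the chain rule to push the loss gradient back through each weight. Here is a minimal pure-Python sketch of that idea, a tiny 2-2-1 sigmoid network trained on XOR by hand-derived gradients (the book's own examples use NumPy and MNIST; the network shape, seed, and learning rate here are illustrative assumptions):

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR training set: the classic example a single neuron cannot learn.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
Y = [0, 1, 1, 0]

# Weights: two hidden neurons (2 inputs + bias each), one output neuron (2 inputs + bias).
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_o = [random.uniform(-1, 1) for _ in range(3)]

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    o = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])
    return h, o

def loss():
    # Mean squared error over the whole training set.
    return sum((forward(x)[1] - y) ** 2 for x, y in zip(X, Y)) / len(X)

lr = 0.5
initial = loss()
for _ in range(5000):
    for x, y in zip(X, Y):
        h, o = forward(x)
        # Output-layer error signal: dL/do * do/dz, where sigmoid'(z) = o * (1 - o).
        d_o = (o - y) * o * (1 - o)
        for j in range(2):
            # Chain rule back through hidden unit j.
            d_h = d_o * w_o[j] * h[j] * (1 - h[j])
            w_h[j][0] -= lr * d_h * x[0]
            w_h[j][1] -= lr * d_h * x[1]
            w_h[j][2] -= lr * d_h
        w_o[0] -= lr * d_o * h[0]
        w_o[1] -= lr * d_o * h[1]
        w_o[2] -= lr * d_o
final = loss()
```

After training, `final` should be well below `initial`; the same loop structure, vectorised with NumPy, is essentially what the book's first chapters implement.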
“Real explanations will sometimes sound weird, crazy, or too complicated because reality itself is often weird, crazy, or too complicated.” https://slimemoldtimemold.com/2022/01/11/reality-is-very-weird-and-you-need-to-be-prepared-for-that/
Skin exposure to UVB light induces a skin-brain-gonad axis and sexual behavior https://www.cell.com/cell-reports/fulltext/S2211-1247(21)01013-5
The Earth is getting greener due to increased CO2 concentration in the atmosphere, and this greening is predicted to lead to substantial cooling, according to an article in Nature Communications. https://www.nature.com/articles/s41467-022-28305-9