Since 2012, the compute required to train a model to AlexNet-level performance on ImageNet has been falling exponentially, halving every 16 months, for a 44x improvement in total.
By contrast, Moore's Law would only have yielded an 11x cost improvement: https://openai.com/blog/ai-and-efficiency/
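The numbers roughly check out. A quick sanity check (assuming the post's window runs from 2012 to mid-2019, and a classic 24-month Moore's Law doubling period, both assumptions on my part):

```python
import math

halving_months = 16   # efficiency halving period reported by the post
total_gain = 44       # total efficiency improvement reported by the post

# How long does a 44x gain take at one halving per 16 months?
months_needed = math.log2(total_gain) * halving_months
print(f"{months_needed / 12:.1f} years")  # about 7.3 years, i.e. 2012 -> mid-2019

# Moore's Law over the same window (assumed: doubling every 24 months).
moore_gain = 2 ** (months_needed / 24)
print(f"~{moore_gain:.0f}x")  # in the same ballpark as the post's 11x
```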
Introducing #csharp Source Generators, try it out with the latest #dotnet 5 preview! https://devblogs.microsoft.com/dotnet/introducing-c-source-generators/
"My Bay Area friends treat people as naturally motivated, and assume that if someone acts unmotivated, it’s because they’ve spent so long being taught to suppress their own desires that they’ve lost touch with innate enthusiasm. Personified China treats people as naturally unmotivated, and assumes that if someone acts unmotivated, it’s because they haven’t been trained to pursue a goal determinedly without getting blown around by every passing whim."
This is cool!
The Bartered Bride (Smetana) – Garsington Opera: https://www.youtube.com/watch?v=bBHSzJoDNC4&feature=youtu.be&t=4830
Bayes' theorem, and making probability intuitive
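The theorem itself is one line, P(H|E) = P(E|H)·P(H) / P(E), but the numbers are what make it intuitive. A minimal worked example with made-up figures (a test with 90% sensitivity, a 9% false-positive rate, and a 1% base rate):

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
# All numbers below are hypothetical, chosen only for illustration.
p_h = 0.01              # prior: P(disease)
p_e_given_h = 0.90      # sensitivity: P(positive | disease)
p_e_given_not_h = 0.09  # false-positive rate: P(positive | no disease)

# Total probability of a positive test, over both hypotheses.
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior: despite the positive result, the disease is still unlikely.
p_h_given_e = p_e_given_h * p_h / p_e
print(f"{p_h_given_e:.3f}")  # -> 0.092
```

Even with a fairly accurate test, a positive result only raises the probability from 1% to about 9%, because false positives from the large healthy population dominate.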
"love of order is above all else about appearances. Streets arranged in grids, people waiting in clean lines, cars running at the same speed… But everything that looks good doesn’t necessarily work well. In fact, those two traits are opposed more often than not: efficiency tends to look messy, and good looks tend to be inefficient."
"The idea of this year’s experiment is simple: using per-test coverage data flowing from CI to run only that part of your test suite that might have been affected by changes you did."
I would love to see this work in a team of mine in the future.
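The core of the idea fits in a few lines: keep a map from each test to the source files it touched (the per-test coverage flowing from CI), then intersect that with the changed files. A hypothetical sketch, with all names invented for illustration:

```python
# Per-test coverage, as it might be exported from CI: test name -> files it
# executed. These entries are made up for the example.
coverage = {
    "test_parser": {"parser.py", "tokens.py"},
    "test_renderer": {"renderer.py"},
    "test_cli": {"cli.py", "parser.py"},
}

def affected_tests(changed_files, coverage):
    """Select only the tests whose recorded coverage overlaps the change."""
    changed = set(changed_files)
    return sorted(test for test, files in coverage.items() if files & changed)

print(affected_tests(["parser.py"], coverage))  # -> ['test_cli', 'test_parser']
```

A real implementation also has to fall back to running everything when coverage data is stale or a new file appears, which is where most of the engineering effort goes.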