So yeah I'm really happy about this new DeOldify model actually......
When an AI is trained on words, weird things can happen to the physical domain.
I asked the @OpenAI API about horses.
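If you want to poke at the same thing yourself, a minimal sketch against the current OpenAI Python client looks roughly like this (the model name and prompt are placeholders, not necessarily what was used in the original experiment):

```python
# Minimal sketch: asking the OpenAI API about horses.
# Assumes OPENAI_API_KEY is set in the environment; model choice is a placeholder.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Tell me something surprising about horses."}],
)
print(response.choices[0].message.content)
```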
C programmers are like
Since 2012, the compute needed to train to AlexNet-level performance on ImageNet has been decreasing exponentially, halving every 16 months, for a 44x improvement in total.
By contrast, Moore's Law would have yielded only an 11x cost improvement over the same period: https://openai.com/blog/ai-and-efficiency/
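A quick back-of-the-envelope check on those numbers, sketched in Python below; the 24-month Moore's Law doubling time and the implied window length are my own assumptions for the arithmetic, not figures taken from the post:

```python
import math

# Consistency check on the quoted figures: 44x less compute needed,
# with the required compute halving every 16 months.
algorithmic_gain = 44
halving_months = 16

# Window length implied by "44x with a halving every 16 months".
window_months = halving_months * math.log2(algorithmic_gain)
print(f"implied window: {window_months:.0f} months (~{window_months / 12:.1f} years)")
# -> roughly 87 months, i.e. about 7.3 years (2012 to 2019)

# Over that same window, a ~24-month Moore's Law doubling (assumption) gives:
moores_law_gain = 2 ** (window_months / 24)
print(f"Moore's Law gain over the same window: ~{moores_law_gain:.0f}x")
# -> roughly 12x, in the same ballpark as the ~11x figure cited above
```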