RT @citnaj@twitter.com

So yeah I'm really happy about this new DeOldify model actually......

🐦🔗: twitter.com/citnaj/status/1275

RT @JanelleCShane@twitter.com

When an AI is trained on words, weird things can happen to the physical domain.

I asked the @OpenAI@twitter.com API about horses.
aiweirdness.com/post/621186154

🐦🔗: twitter.com/JanelleCShane/stat

RT @OpenAI@twitter.com

Since 2012, the amount of compute needed to train a model to AlexNet-level performance on ImageNet has been decreasing exponentially, halving every 16 months, for a total improvement of 44x.

By contrast, Moore's Law would only have yielded an 11x cost improvement: openai.com/blog/ai-and-efficie

🐦🔗: twitter.com/OpenAI/status/1257
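A quick back-of-the-envelope check of the two numbers in that tweet. This is only a sketch: the ~7.3-year span (AlexNet in 2012 to the 2019 measurement) and the 2-year Moore's Law doubling period are assumptions, not stated in the tweet itself.

```python
# Rough sanity check of the efficiency figures quoted above.
# Assumptions: ~7.3 years between AlexNet (2012) and the 2019 measurement,
# and a 2-year doubling period for Moore's Law.
months = 7.3 * 12                          # ~88 months

algorithmic_gain = 2 ** (months / 16)      # training compute halves every 16 months
moores_law_gain = 2 ** (months / 24)       # hardware cost halves roughly every 24 months

print(f"algorithmic efficiency gain: ~{algorithmic_gain:.0f}x")  # ~44x
print(f"Moore's Law alone:           ~{moores_law_gain:.0f}x")   # ~13x here; the blog post quotes 11x
```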
