This is insane. @email@example.com just replaced video codecs with a neural network.
We'll all be controlling digital face puppets of ourselves on video calls in the future! 👹
4 years ago we started the @firstname.lastname@example.org. Today, Google announced the availability of Cilium as the new GKE networking dataplane.
What a great honor for everyone who has contributed to the Cilium project and to eBPF overall.
The background story:
This is the best zero-shot prompt style I've found for generating short poetry snippets with GPT-3 (at the default temperature of 0.7 and 0/0 frequency/presence penalties).
You can also tweak the poem's tone by adjusting the student's adjectives (clever, depressed, etc.) and name (to mimic a particular style).
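The recipe above can be sketched as a small prompt template. The student name, adjective slot, and topic below are hypothetical placeholders for illustration, not the exact prompt from the original post:

```python
# A minimal sketch of a zero-shot "student poem" prompt in this style.
# The names and topic are made-up placeholders, not the original prompt.

def poetry_prompt(name: str, adjective: str, topic: str) -> str:
    """Frame the completion as a student's poem so the model
    continues in verse; swap the adjective to shift the tone."""
    return (
        f"{name} is a {adjective} student who writes short poems.\n"
        f"Here is {name}'s latest poem, about {topic}:\n\n"
    )

prompt = poetry_prompt("Emily", "clever", "the sea")
print(prompt)
```

The resulting string would then be sent to the completions endpoint with temperature 0.7 and both penalties at 0, per the settings mentioned above.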
Euro area #RetailTrade +5.7% in June over May, +1.3% over June 2019 https://ec.europa.eu/eurostat/en/web/products-press-releases/-/4-05082020-AP
"Imagine all the extraordinary things we'll be able to do once computers have literally hundreds of CPU cores!"
Give GPT-3 a color scale and an emoji. Get back new scales based on color of the emoji. HOW DOES IT KNOW.
A surprising result: We found that smooth activation functions are better than ReLU for adversarial training and can lead to substantial improvements in adversarial robustness.
So yeah, I'm really happy about this new DeOldify model, actually…
When an AI is trained on words, weird things can happen to the physical domain.
I asked the @OpenAI@twitter.com API about horses.
Since 2012, the amount of compute for training to AlexNet-level performance on ImageNet has been decreasing exponentially — halving every 16 months, in total a 44x improvement.
By contrast, Moore's Law would only have yielded an 11x cost improvement: https://openai.com/blog/ai-and-efficiency/
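A quick sanity check of the two rates, assuming a roughly 7-year (84-month) window from AlexNet in 2012 and the conventional ~24-month doubling for Moore's Law (both window and doubling period are assumptions for illustration):

```python
# Compare Moore's Law cost improvement with the measured rate of
# algorithmic efficiency gains over an assumed 84-month window.

months = 84

# Moore's Law: cost halves roughly every 24 months.
moore = 2 ** (months / 24)       # ~11.3x, matching the quoted 11x

# Algorithmic efficiency: compute to reach AlexNet-level accuracy
# halves every ~16 months.
efficiency = 2 ** (months / 16)  # ~38x from the fitted rate; the 44x
                                 # figure was measured directly

print(round(moore), round(efficiency))
```

The fitted 16-month halving rate lands in the same ballpark as the directly measured 44x, while the Moore's Law extrapolation reproduces the 11x comparison in the post.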