• 0 Posts
  • 8 Comments
Joined 2 years ago
cake
Cake day: November 5th, 2023

  • One of the major breakthroughs wasn’t just compute hardware, it was things like the “Attention Is All You Need” paper that spawned all the latest LLMs and multimodal models (video generation, music generation, classification, sentiment analysis, etc.). So there has been an insane amount of improvement in the neural network architectures themselves (recurrent neural nets, LSTMs, convolutional neural nets, Transformers). RNNs date back to 1972, and LSTMs only came out in 1997, come to find out.
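    The core idea from that paper, scaled dot-product attention, is surprisingly small. Here’s a minimal NumPy sketch (single head, no masking or learned projections, so not the full multi-head version from the paper):

    ```python
    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)
        # numerically stable softmax over each row
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        return weights @ V

    # toy example: 3 tokens, 4-dimensional embeddings
    rng = np.random.default_rng(0)
    Q = rng.normal(size=(3, 4))
    K = rng.normal(size=(3, 4))
    V = rng.normal(size=(3, 4))
    out = scaled_dot_product_attention(Q, K, V)
    print(out.shape)  # (3, 4): one weighted mix of the values per query token
    ```

    Each output row is just a softmax-weighted average of the value vectors, which is why it parallelizes so well on GPUs compared to step-by-step RNNs.
    
    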

    Around 2011–2012 was when we got good image recognition (AlexNet winning ImageNet in 2012 was the big moment). Transformers started after the Attention paper in 2017. Now the models are being used to improve themselves, so the singularity is heading our way pretty quickly.