> Perceptrons, which I believe are the simplest neural networks, go back to the 1950s. What's changed is that the hardware we can run them on has gotten so much faster, so much more efficient, and so much more powerful, and the data sets we can work with have gotten so much bigger. So now we can solve these problems, and it's kind of awesome what we can do. #AI #machinelearning
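For readers who haven't seen one, a perceptron really is this small. Below is a minimal sketch of Rosenblatt's 1950s-era learning rule, trained on the AND function; the function names and toy dataset are illustrative assumptions, not from the posts above.

```python
# Minimal perceptron sketch (illustrative; names and data are assumptions).
# Rosenblatt's rule: nudge the weights whenever the prediction is wrong.

def train_perceptron(samples, epochs=10, lr=0.1):
    """Learn weights w and bias b with the classic perceptron update."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, target in samples:
            # Step activation: output 1 if the weighted sum crosses zero.
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - pred
            # Update only on mistakes (err is -1, 0, or +1).
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# AND is linearly separable, so the perceptron is guaranteed to converge.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
print([predict(w, b, x) for x, _ in data])  # → [0, 0, 0, 1]
```

The whole algorithm is a handful of additions and comparisons, which is the point: the math hasn't changed much since the 1950s, only the scale at which we can run it.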
It’s incredible how, even now, running a normal multi-layer neural network on a CPU is quite slow. The GPU is a big advancement, and the other thing is that there have been some algorithmic advances on the vision side of deep learning. #AI
Why does pretty much everyone associate #DeepLearning with Google's TensorFlow? It's just a nightmare to use in practice.
Despite the tremendous hype around deep learning, the number of people who are actually using it to build real things that make a difference is probably very low. #AI #NeuralNetworks
You have a bunch of people with physics PhDs who maybe wrote some R code in graduate school. And they suddenly have to compile all these packages with GPU support so they can get CUDA running, and they're just like, "We can't do that." #AI #machinelearning #startup
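The usual first stumbling block in that scenario is discovering whether the CUDA toolchain is even visible before attempting a GPU build. A hedged sketch of that sanity check (the function name is an assumption for illustration; `nvcc` and `nvidia-smi` are the real CUDA compiler and driver utility):

```python
# Illustrative environment check before trying to compile packages with
# GPU support: is the CUDA toolchain on PATH at all?
import shutil

def cuda_toolchain_visible():
    """Return True if the CUDA compiler (nvcc) is findable on PATH."""
    return shutil.which("nvcc") is not None

def driver_visible():
    """Return True if the NVIDIA driver utility (nvidia-smi) is on PATH."""
    return shutil.which("nvidia-smi") is not None

print(cuda_toolchain_visible(), driver_visible())
```

If either check fails, no amount of recompiling R or Python packages will help: the toolchain or driver has to be installed first, which is exactly the wall those physics PhDs hit.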