TIL:
Jason Morris:
Can someone ELI5 this for me? Is the memory available at training time or inference time? How is it connected to the network? And what does it mean that it is analogous to a Turing machine or von Neumann whatchamacallit?
@Jason Morris These constructs define an external "memory" that the network reads from and writes to at inference time. I'm still going through the NTM (Neural Turing Machine) and DNC (Differentiable Neural Computer) papers, but basically, Siegelmann and Sontag showed in 1992 that Recurrent Neural Networks (RNNs) are Turing-complete.
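Roughly speaking, the memory is just a matrix that the controller network reads from and writes to with soft (differentiable) attention, so gradients flow through it during training and the same read/write mechanism is what runs at inference. A minimal numpy sketch of NTM-style content-based addressing (the function names and toy sizes here are my own, not from the papers):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def content_address(memory, key, beta):
    """Attention weights over memory rows: cosine similarity to the key, sharpened by beta."""
    sims = memory @ key / (np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8)
    return softmax(beta * sims)  # one weight per memory slot, sums to 1

def read(memory, w):
    """Read vector = weighted sum of memory rows (fully differentiable)."""
    return w @ memory

def write(memory, w, erase, add):
    """NTM-style write: erase a fraction of each addressed row, then add new content."""
    memory = memory * (1.0 - np.outer(w, erase))
    return memory + np.outer(w, add)

# Toy run: 4 memory slots, each 3 wide
mem = np.random.randn(4, 3)
key = np.random.randn(3)          # in a real model the controller emits this
w = content_address(mem, key, beta=5.0)
r = read(mem, w)                  # what the network "recalls" this step
mem = write(mem, w, erase=np.full(3, 0.5), add=np.random.randn(3))
print("weights:", w, "\nread vector:", r)
```

The memory contents themselves aren't learned weights; they typically start fresh for each sequence, and what training adjusts is the controller that emits the keys, erase, and add vectors.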