True, but you have to do it one at a time to look at the variables. Also, the exception message doesn't tell you which operator among several is the issue. This just makes it easier. :)
Thanks, Jeremy. :) I didn't go into super huge detail in the article on the implementation part as most readers won't have interest in language nerd details like you and I do.
I wondered about that. It might work to just call my internal pyviz("some python code in string") function from the debugger. It'll execute in context (maybe?)
One of the biggest challenges when writing code to implement deep learning networks is getting all of the tensor (matrix and vector) dimensions to line up properly, even when using predefined network layers. This article describes a new library called TensorSensor that clarifies exceptions by augmenting messages and visualizing Python code to indicate the shape of tensor variables. It works with TensorFlow, PyTorch, and NumPy, as well as higher-level libraries like Keras and fastai.
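To make the problem concrete, here is a minimal sketch of the kind of dimension mismatch the article is talking about, using plain NumPy (the shapes here are made up for illustration). The default exception reports raw dimensions but doesn't say which variable or operator in a larger expression is at fault, which is the gap TensorSensor fills.

```python
import numpy as np

# Hypothetical shapes for illustration: a weight matrix and an
# input vector whose length doesn't match the matrix columns.
W = np.random.rand(764, 100)   # 764 x 100 weight matrix
x = np.random.rand(200, 1)     # input with the wrong length (should be 100)

try:
    y = W @ x                  # 100 columns vs. 200 rows: incompatible
except ValueError as e:
    # NumPy's message mentions a "core dimension" mismatch but not
    # which of your variables caused it.
    print(e)

# Per the article, wrapping the same statement in TensorSensor
# augments the message and visualizes the offending subexpression:
# import tsensor
# with tsensor.clarify():
#     y = W @ x
```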
Glad to be of service. Yeah, I just couldn't see what RNNs were doing through all of the neural net stuff. Much easier to think of it as just gradually transforming a vector. I also can't see linear algebra w/o visualizing different size boxes :)
Vanilla recurrent neural networks (RNNs) form the basis of more sophisticated models, such as LSTMs and GRUs. There are lots of great articles, books, and videos that describe the functionality, mathematics, and behavior of RNNs, so don't worry, this isn't yet another rehash. (See below for a list of resources.) My goal is to present an explanation that avoids the neural network metaphor, stripping it down to its essence—a series of vector transformations that result in embeddings for variable-length input vectors.
Thank you for this. As someone who understands linear algebra well, I feel the neural network formalism obfuscates the core meaning of these processes. It's good for describing composition, but not good for understanding the parts in between.
Looking forward to your take on LSTMs and transformers ;).