In my quest to simplify the current state of matrix differential calculus, I have been developing a more elegant, unified way of taking derivatives of functions regardless of their input and output shapes, with a more direct path to implementation. This work has grown into the following series, which continues to expand as I apply what I've learned so far.
Program committee member for the NeurIPS 2018 Workshop on Security in Machine Learning and the SafeML ICLR 2019 Workshop
Best defense paper at the NIPS 2017 ML & Security Workshop
I am an avid dancer in the ballroom, Latin, and hip hop styles.
I am also an enthusiastic cook and love experimenting with different ingredients.