Act II — Research

Teaching matter to think.
My research lives at the meeting of three disciplines. Resistive random-access memory (RRAM) gives us a substrate where computation and storage are no longer separate things. Analog & in-memory computing turns that substrate into a working machine. And optimization & learning ask what such a machine could solve that GPUs cannot.
I am building the bridge across all three — designing distributed primal-dual algorithms that run natively on RRAM crossbars, and chasing the day when neuromorphic hardware quietly outperforms a rack of GPUs on the world's hardest linear programs.
Read recent work →