Tag: neural networks

  • Learners and Poly

    Brendan Fong, David Spivak and Remy Tuyeras cooked up a vast generalisation of neural networks in their paper Backprop as Functor: A compositional perspective on supervised learning. Here’s a nice introduction to neural networks for category theorists by Bruno Gavranovic. At 1.49m he tries to explain supervised learning with neural networks in one slide. Learners […] (A rough sketch of the learner definition is appended after this list.)

  • Deep learning and toposes

    Judging from this and that paper, deep learning is the string theory of the 2020s for geometers and representation theorists.

      String theory is the 90s answer to the tears of algebraic geometers worldwide trying to write the "Applications" part of their grant proposals. https://t.co/AboZ5WkPtc
      — algebraic geometer BLM (@BarbaraFantechi) December 17, 2021

    If you want…
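
About the first item above, and only as a rough sketch of my own (paraphrasing the definition in Backprop as Functor, not anything taken from the post itself): a learner from a set $A$ to a set $B$ consists of a parameter set $P$ together with three maps

$$I : P \times A \to B, \qquad U : P \times A \times B \to P, \qquad r : P \times A \times B \to A,$$

the implementation, the update (think: one gradient-descent step on the parameters) and the request (think: the error sent back to the input). Learners compose, and the point of the paper is that gradient descent and backpropagation assemble into a functor from a category of parametrised functions to this category of learners.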