In this talk we discuss a nonsmooth implicit function theorem equipped with an operational calculus that makes it usable for practical problems in machine learning. This calculus is special in that it is compatible with backpropagation, allowing one, for instance, to replace derivatives with Clarke Jacobians in the usual differentiation formulas for a wide class of nonsmooth problems. We present several applications, such as training neural networks with implicit layers and differentiating solutions to nonsmooth optimization problems. Finally, to show the sharpness of our assumptions, we present numerical experiments showcasing the extremely pathological gradient dynamics one can encounter when applying implicit algorithmic differentiation without the hypotheses of the theorem.
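
To make the mechanism concrete, here is a minimal sketch (not the authors' implementation) of how implicit differentiation of a nonsmooth layer can be made compatible with backpropagation. It differentiates a fixed-point layer x = relu(Wx + b) through the implicit-function-theorem formula, using a custom vector-Jacobian product in JAX; the residual F, the solver, and all names (implicit_layer, fwd_solver) are illustrative assumptions, and at kinks of relu automatic differentiation returns one selection of the Clarke Jacobian.

```python
import jax
import jax.numpy as jnp

# Sketch: implicit differentiation of a nonsmooth fixed-point "layer"
# x = relu(W x + b), i.e. the root of F(x, W, b) = x - relu(W x + b).
# The implicit function theorem gives dx/dtheta = -(d_x F)^{-1} d_theta F;
# the backward rule below implements the corresponding vector-Jacobian product.

def fwd_solver(f, x0, n_iter=100):
    # Plain fixed-point iteration; assumes f is a contraction in x.
    x = x0
    for _ in range(n_iter):
        x = f(x)
    return x

@jax.custom_vjp
def implicit_layer(W, b):
    f = lambda x: jax.nn.relu(W @ x + b)
    return fwd_solver(f, jnp.zeros_like(b))

def implicit_layer_fwd(W, b):
    x_star = implicit_layer(W, b)
    return x_star, (W, b, x_star)

def implicit_layer_bwd(res, v):
    W, b, x_star = res
    F = lambda x, W, b: x - jax.nn.relu(W @ x + b)
    # Jacobian of the residual in x at the solution (a Clarke selection at kinks).
    J_x = jax.jacobian(F, argnums=0)(x_star, W, b)
    # Adjoint system J_x^T u = v, then push -u through the parameter Jacobian.
    u = jnp.linalg.solve(J_x.T, v)
    _, vjp_theta = jax.vjp(lambda W, b: F(x_star, W, b), W, b)
    gW, gb = vjp_theta(-u)
    return gW, gb

implicit_layer.defvjp(implicit_layer_fwd, implicit_layer_bwd)

# Usage: gradients of a scalar loss flow through the implicit layer.
key = jax.random.PRNGKey(0)
W = 0.1 * jax.random.normal(key, (3, 3))  # small norm so the map is a contraction
b = jnp.ones(3)
loss = lambda W, b: jnp.sum(implicit_layer(W, b) ** 2)
print(jax.grad(loss, argnums=(0, 1))(W, b))
```

The design choice illustrated here is the one the abstract refers to: the backward pass never unrolls the solver, it only evaluates (generalized) Jacobians of the residual at the computed solution, which is exactly where the hypotheses of the nonsmooth implicit function theorem matter.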