Designing Invariant and Equivariant Neural Networks
Yaron Lipman (Weizmann Institute)

Many tasks in machine learning (ML) require learning functions that are invariant or equivariant with respect to symmetry transformations of the data. For example, graph classification is invariant to permutations of the graph's nodes, while recognizing the shape of a point cloud is invariant to both permutations and Euclidean motions of its points. Designing parametric models (i.e., neural networks) that are invariant or equivariant to the symmetries of the data by construction has proven successful in many ML tasks involving data such as images, sets, point clouds, and graphs. In designing an invariant/equivariant neural network model, a few factors should be taken into account: (i) the expressive/approximation power of the model; (ii) the computational and memory complexity of the model; and (iii) the model's practical performance (inductive bias).
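For concreteness (the notation here is mine, not from the abstract): given a group \(G\) acting on the input and output spaces, a function \(f\) is invariant when its output is unchanged by the group action, and equivariant when its output transforms along with the input:

\[
f(g \cdot x) = f(x) \;\; \text{(invariance)}, \qquad f(g \cdot x) = g \cdot f(x) \;\; \text{(equivariance)}, \qquad \forall g \in G.
\]

A graph classifier, for instance, should satisfy the first with \(G = S_n\) acting by node permutation, while a per-node feature map should satisfy the second.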

In this talk I will review two methodologies for designing invariant/equivariant networks: the intrinsic method and the extrinsic method. The intrinsic method first characterizes invariant/equivariant primitive functions, such as linear transformations, and then composes these with non-linear activations to build the final parametric model, as sketched below.
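As a toy illustration of the intrinsic route, here is a minimal numpy sketch (the names and hyperparameters are mine) of a permutation-equivariant linear layer on sets, built from a pointwise part plus a pooled part, composed with a pointwise nonlinearity:

```python
import numpy as np

class EquivariantLinear:
    """Permutation-equivariant linear layer on a set X in R^{n x d}:
    L(X) = X @ A + (1/n) * 1 1^T X @ B  (the standard form of such
    maps, up to a bias term)."""
    def __init__(self, d_in, d_out, rng):
        self.A = rng.normal(size=(d_in, d_out)) / np.sqrt(d_in)
        self.B = rng.normal(size=(d_in, d_out)) / np.sqrt(d_in)

    def __call__(self, X):
        pooled = X.mean(axis=0, keepdims=True)  # permutation-invariant pooling
        return X @ self.A + pooled @ self.B     # broadcast back to every element

def equivariant_net(X, layers):
    """Compose equivariant linear layers with pointwise ReLUs;
    the composition stays permutation-equivariant."""
    for layer in layers[:-1]:
        X = np.maximum(layer(X), 0.0)
    return layers[-1](X)

# f(PX) = P f(X) for any permutation matrix P:
rng = np.random.default_rng(0)
net = [EquivariantLinear(3, 16, rng), EquivariantLinear(16, 3, rng)]
X = rng.normal(size=(5, 3))
P = np.eye(5)[rng.permutation(5)]
assert np.allclose(P @ equivariant_net(X, net), equivariant_net(P @ X, net))
```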

final parametric model. Extrinsic methods, on the other hand, apply symmetrization to general parametric

functions. In the talk I will review some earlier works in this space, and provide an in-depth description of

Frame Averaging , a recent symmetrization approach, that in some cases allows designing efficient and maximally

expressive invariant/equivariant models.
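To give a flavor of the extrinsic route, here is a minimal sketch of frame-averaging-style symmetrization for E(d)-invariant functions of point clouds (a common instantiation, not necessarily the talk's exact construction; the PCA-based frame assumes a covariance with non-degenerate eigenvalues, and the backbone f and all names are placeholders of mine). Instead of averaging over the entire group, the backbone is averaged over a small input-dependent frame, here the 2^d sign flips of the covariance eigenvectors, with translations handled by centering:

```python
import numpy as np
from itertools import product

def pca_frame(Xc):
    """Frame for O(d): covariance eigenvectors up to per-axis sign flips,
    i.e. 2^d orthogonal matrices (assumes non-degenerate eigenvalues)."""
    _, V = np.linalg.eigh(Xc.T @ Xc)
    d = Xc.shape[1]
    return [V * np.array(s) for s in product([-1.0, 1.0], repeat=d)]

def frame_average(f, X):
    """Symmetrized backbone <f>(X) = mean_{R in F(X)} f(X_centered @ R);
    invariant to rotations, reflections, and translations of X."""
    Xc = X - X.mean(axis=0)  # quotient out translations
    return np.mean([f(Xc @ R) for R in pca_frame(Xc)], axis=0)

# The averaged function is E(d)-invariant even though f alone is not:
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal transform
f = lambda P: P[:, 0].sum()                   # not invariant by itself
assert np.isclose(frame_average(f, X), frame_average(f, X @ Q.T + 1.5))
```

The point of the frame is efficiency: the average runs over 2^d elements rather than the full (here continuous) group, while retaining exact invariance and the expressive power of the underlying backbone.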

