Optimization on nonlinear manifolds has applications in many areas, including machine learning, statistics, robotics, imaging, and even computational complexity theory. We show that when the underlying manifold is negatively curved, optimization can be more difficult than in (flat) Euclidean spaces. The talk will provide a brief introduction to manifolds, curvature, optimization on manifolds and its applications, and Nesterov's accelerated gradient method in Euclidean spaces. We will then present our main results, which show that acceleration (in the sense of Nesterov's accelerated gradient method) is unachievable on a large class of negatively curved Riemannian manifolds when the algorithm receives exact gradient and function value information. Our results build on recent work of Hamilton and Moitra (2021), who showed that acceleration on the class of strongly geodesically convex functions is unachievable in the hyperbolic plane when the algorithm receives gradients and function values corrupted by a small amount of noise.
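For reference (this is a standard textbook formulation, not part of the results above), one common form of Nesterov's accelerated gradient method for an L-smooth, mu-strongly convex function f on R^n, with condition number kappa = L/mu, is the two-step momentum iteration below; the sqrt(kappa) speed-up over plain gradient descent is what is meant by "acceleration" here, and it is this speed-up that the results above show cannot be obtained on negatively curved manifolds.

% Standard Euclidean formulation of Nesterov's accelerated gradient method
% for an L-smooth, \mu-strongly convex f, with \kappa = L/\mu:
\begin{align*}
  x_{k+1} &= y_k - \tfrac{1}{L}\,\nabla f(y_k), \\
  y_{k+1} &= x_{k+1} + \frac{\sqrt{\kappa}-1}{\sqrt{\kappa}+1}\,\bigl(x_{k+1} - x_k\bigr),
\end{align*}
% which attains f(x_k) - f(x^\ast) = O\bigl((1 - 1/\sqrt{\kappa})^k\bigr),
% compared with O\bigl((1 - 1/\kappa)^k\bigr) for plain gradient descent.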