Gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the gradient (or of the approximate gradient) of the function at the current point. If instead one takes steps proportional to the positive of the gradient, one approaches a local maximum of that function; the procedure is then known as gradient ascent.
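
A minimal sketch of the iterative update in Python may help make the procedure concrete. The function names (gradient_descent, grad_f), the quadratic example objective, the fixed step size, and the iteration count are illustrative assumptions, not part of the original text:

```python
import numpy as np

def gradient_descent(grad, x0, step_size=0.1, num_steps=100):
    """Minimize a function by repeatedly stepping against its gradient.

    grad: callable returning the gradient of the objective at a point.
    x0: starting point (array-like).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(num_steps):
        # Update rule: x <- x - step_size * grad(x).
        # Using +step_size instead would step along the positive gradient,
        # i.e. gradient ascent toward a local maximum.
        x = x - step_size * grad(x)
    return x

# Example: f(x, y) = x^2 + y^2 has gradient (2x, 2y) and its minimum at the origin.
grad_f = lambda x: 2 * x
print(gradient_descent(grad_f, x0=[3.0, -4.0]))  # approaches [0.0, 0.0]
```

With a sufficiently small fixed step size, each update decreases the quadratic objective, so the iterates converge toward the local (here, global) minimum at the origin.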