We seek a method of minimizing/maximizing functionals rather than ordinary functions. A functional is a mapping that takes in a function and outputs a scalar. The most familiar example is the definite integral.
Example functional:
\[
J[y] = \int_a^b F\left(x, y(x), y'(x)\right)\,dx.
\]
Here, \(y=y(x)\) is the function being varied. We seek \(\overline{y}(x)\) that minimizes \(J[y]\). A classical problem is the Brachistochrone problem, minimizing the time for a bead to slide under gravity along a frictionless curve.
Bernoulli showed, using energy conservation (\(v=\sqrt{2gy}\)), that the descent time is
\[
T[y] = \int_0^{x_1} \sqrt{\frac{1+(y')^2}{2gy}}\,dx,
\]
where \(y\) is measured downward from the release point.
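The brachistochrone's well-known answer, a cycloid, can be sanity-checked numerically. The sketch below is illustrative rather than part of the original argument: it assumes endpoints \((0,0)\to(1,1)\) with \(y\) measured downward and \(g=9.81\ \mathrm{m/s^2}\), and compares the exact straight-line descent time \(2/\sqrt{g}\) against the cycloid through the same endpoints.

```python
import math

g = 9.81           # m/s^2 (assumed value for this sketch)
x1, y1 = 1.0, 1.0  # endpoints: drop from (0, 0) to (1, 1), y measured downward

# Straight ramp y = x: T = ∫_0^1 sqrt((1 + y'^2) / (2 g y)) dx = 2 / sqrt(g)
t_line = 2.0 / math.sqrt(g)

# Cycloid x = a(θ - sin θ), y = a(1 - cos θ). Find the final parameter θ1 from
# (θ1 - sin θ1) / (1 - cos θ1) = x1 / y1 by bisection; then T = θ1 * sqrt(a / g).
def ratio(theta):
    return (theta - math.sin(theta)) / (1.0 - math.cos(theta))

lo, hi = 1.0, 3.0  # ratio is increasing on this bracket
for _ in range(80):
    mid = 0.5 * (lo + hi)
    if ratio(mid) < x1 / y1:
        lo = mid
    else:
        hi = mid
theta1 = 0.5 * (lo + hi)
a = y1 / (1.0 - math.cos(theta1))
t_cycloid = theta1 * math.sqrt(a / g)

print(t_line, t_cycloid)  # the cycloid beats the straight ramp
```

With these endpoints the cycloid comes in around 0.58 s versus roughly 0.64 s for the straight line, consistent with the cycloid being the minimizer.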
General Theory
In standard calculus, minima are found by setting derivatives to zero. In variational calculus, we set the functional derivative equal to zero, which yields an ordinary differential equation (ODE) that the minimizing function must satisfy.
For the functional
\[
J[y] = \int_a^b F(x, y, y')\,dx
\]
with boundary conditions \(y(a)=A,\; y(b)=B\), we seek \(\overline{y}(x)\) minimizing \(J[y]\). Consider perturbations
\[
y(x) = \overline{y}(x) + \varepsilon\,\eta(x),
\]
where \(\eta\) is an arbitrary smooth function with \(\eta(a)=\eta(b)=0\), so that the perturbed curve still satisfies the boundary conditions.
Define \(\phi(\varepsilon)=J[\overline{y}+\varepsilon\eta]\). Since \(\overline{y}\) minimizes \(J\), the ordinary function \(\phi\) has a minimum at \(\varepsilon=0\), which requires
\[
\phi'(0)=0.
\]
Expanding by differentiating under the integral sign:
\[
\phi'(0) = \int_a^b \left( \frac{\partial F}{\partial y}\,\eta + \frac{\partial F}{\partial y'}\,\eta' \right) dx = 0.
\]
Integrating the second term by parts (the boundary term vanishes because \(\eta(a)=\eta(b)=0\)) gives
\[
\int_a^b \left( \frac{\partial F}{\partial y} - \frac{d}{dx}\frac{\partial F}{\partial y'} \right) \eta\,dx = 0.
\]
Since \(\eta\) is arbitrary, the integrand must vanish:
\[
\frac{\partial F}{\partial y} - \frac{d}{dx}\frac{\partial F}{\partial y'} = 0.
\]
This is the Euler–Lagrange Equation.
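As a sanity check on the derivation (my own sketch, not part of the original argument), take the simple integrand \(F=(y')^2\). Its Euler–Lagrange equation is \(y''=0\), whose solution through \(y(0)=0,\; y(1)=1\) is the straight line \(y=x\). The discretization and the sine perturbation below are arbitrary choices; the point is that any admissible perturbation increases the discretized functional.

```python
import math

def J(ys, a=0.0, b=1.0):
    """Discretized J[y] = ∫ (y')^2 dx via forward differences on [a, b]."""
    n = len(ys) - 1
    h = (b - a) / n
    return sum(((ys[i + 1] - ys[i]) / h) ** 2 * h for i in range(n))

n = 100
xs = [i / n for i in range(n + 1)]

# Straight line y = x: solves the Euler-Lagrange equation y'' = 0 for F = (y')^2.
straight = xs[:]

# Admissible perturbation eps * sin(pi x): vanishes at both endpoints.
def perturbed(eps):
    return [y + eps * math.sin(math.pi * x) for x, y in zip(xs, straight)]

print(J(straight))        # ≈ 1.0, the minimum value
print(J(perturbed(0.1)))  # strictly larger for any eps != 0
```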
Example Problem
Consider
\[
J[y] = \int_0^2 \left( y^2 + (y')^2 \right) dx
\]
with boundary conditions \(y(0)=0,\; y(2)=2\).
Here \(F(x,y,y')=y^2+(y')^2\). Compute:
\[
\frac{\partial F}{\partial y} = 2y, \qquad \frac{\partial F}{\partial y'} = 2y', \qquad \frac{d}{dx}\frac{\partial F}{\partial y'} = 2y''.
\]
So the Euler–Lagrange equation is
\[
2y - 2y'' = 0 \quad\Longrightarrow\quad y'' - y = 0.
\]
The characteristic equation is \(\lambda^2-1=0 \implies \lambda=\pm 1\), so the general solution is
\[
y(x) = A e^{x} + B e^{-x}.
\]
Apply the boundary conditions. From \(y(0)=0\): \(A+B=0\), so \(A=-B\). From \(y(2)=2\):
\[
A e^{2} + B e^{-2} = 2.
\]
Substitute \(A=-B\):
\[
A\left(e^{2}-e^{-2}\right) = 2 \quad\Longrightarrow\quad A = \frac{2}{e^{2}-e^{-2}} \approx 0.28.
\]
Then \(A=-B\approx 0.28\). So
\[
y(x) \approx 0.28\left(e^{x}-e^{-x}\right),
\]
or equivalently
\[
y(x) = \frac{2\sinh x}{\sinh 2}.
\]
This function minimizes the functional \(J[y]\): since \(F=y^2+(y')^2\) is convex in \((y,y')\), the extremal found from the Euler–Lagrange equation is in fact the global minimizer.
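A quick numeric check of this result (my own sketch; the midpoint discretization and the particular perturbation are arbitrary choices) confirms the coefficient \(A\approx 0.28\), the boundary values, and that perturbing the extremal raises the discretized functional.

```python
import math

# Coefficient from the boundary conditions: A = 2 / (e^2 - e^{-2})
A = 2 / (math.e**2 - math.e**-2)

def y(x):
    """The extremal y(x) = A(e^x - e^{-x}) = 2 sinh(x) / sinh(2)."""
    return A * (math.exp(x) - math.exp(-x))

def J(f, n=2000):
    """Discretized J[f] = ∫_0^2 (f^2 + (f')^2) dx, midpoint rule + centered differences."""
    h = 2.0 / n
    total = 0.0
    for i in range(n):
        xm = (i + 0.5) * h
        fp = (f(xm + h / 2) - f(xm - h / 2)) / h  # centered derivative at the midpoint
        total += (f(xm) ** 2 + fp ** 2) * h
    return total

# Any admissible competitor (same boundary values) should give a larger J.
def competitor(x):
    return y(x) + 0.3 * math.sin(math.pi * x / 2)  # perturbation vanishes at x = 0 and x = 2

print(round(A, 2))          # 0.28, as above
print(y(0.0), y(2.0))       # boundary conditions: 0 and 2
print(J(y), J(competitor))  # J(y) is the smaller of the two
```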