Define $\norm{x}=\sqrt{\innerproduct{x}{x}}, ~x \in X$. This $\norm{\cdot}$ is a norm on $X$, so a space $X$ with the norm induced by the inner product is in particular a normed space.
\end{defi}
\begin{lem}
The Cauchy-Schwarz inequality holds
\[
\forall x, y \in X: \quad\abs{\innerproduct{x}{y}}\le\norm{x}\norm{y}
\]
as well as the triangle inequality
\[
\forall x, y \in X: \quad\norm{x + y}\le\norm{x} + \norm{y}
\]
\end{lem}
The norm $\norm{x}=\sqrt{\innerproduct{x}{x}}$ induced by an inner product satisfies the parallelogram equality
\[
\norm{x + y}^2 + \norm{x - y}^2 = 2\norm{x}^2 + 2\norm{y}^2
\]
(without proof). This implies that $l^p, L^p(A)$ and $C(A)$ are not inner product spaces (for $p \ne2$), since their norms do not satisfy this equality.
This can be shown explicitly for $l^p$. Consider the sequences
\begin{align*}
x &= (1, 1, 0, 0, \cdots) & y &= (1, -1, 0, 0, \cdots)
\end{align*}
Then $\norm{x}=\norm{y}=2^{\frac{1}{p}}$ and $\norm{x + y}=\norm{x - y}=2$.
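Plugging these values into the parallelogram equality shows explicitly why it fails for $p \ne2$:
\[
\norm{x + y}^2 +\norm{x - y}^2 = 8, \qquad 2\norm{x}^2 + 2\norm{y}^2 = 4 \cdot 2^{\frac{2}{p}},
\]
and $8 = 4 \cdot 2^{\frac{2}{p}}$ holds if and only if $p = 2$.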
\begin{defi}
An inner product space $X$ that is complete in the norm generated by the inner product is said to be a Hilbert space.
A Hilbert space is a Banach space. A subspace $Y$ of an inner product space $X$ is defined to be a vector subspace of $X$, with the inner product restricted to $Y \times Y$.
\end{defi}
\begin{thm}
Let $Y$ be a subspace of a Hilbert space $H$. Then
\begin{enumerate}[(i)]
\item $Y$ is complete $\iff$ $Y$ is closed in $H$
\item $Y$ is finite-dimensional $\implies$ $Y$ is complete
\item $H$ is separable $\implies$ $Y$ is separable
\end{enumerate}
(A set $X$ is separable if there exists a countable $M \subset X$ such that $M$ is dense in $X$.)
\end{thm}
\begin{proof}
\noproof
\end{proof}
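As an illustration of (i), consider the subspace $Y \subset l^2$ of all sequences with only finitely many non-zero terms. It is dense in $l^2$ but not equal to $l^2$, hence not closed, and therefore not complete.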
\begin{defi}
An element $x \in X$ is said to be orthogonal to an element $y \in X$ if $\innerproduct{x}{y}=0$. One also says that $x$ and $y$ are orthogonal in that case, and it is denoted as $x \perp y$.
Similarly, let $A, B \subset X$. Then
\begin{align*}
x \perp A &\iff\forall a \in A: \quad x \perp a \\
A \perp B &\iff\forall a \in A ~\forall b \in B: \quad a \perp b
\end{align*}
Let $M$ be a non-empty subset of $X$ and $x \in X$. The distance between $x$ and $M$ is then defined as
\[
\delta = \inf_{y \in M}\norm{x - y}
\]
A subset $M \subset X$ is said to be convex if
\[
\forall x, y \in M ~\forall\alpha\in [0, 1]: \quad (\alpha x + (1 - \alpha) y) \in M
\]
\end{defi}
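As a concrete example of orthogonality, in $L^2([-\pi, \pi])$ the functions $x(t) =\sin t$ and $y(t) =\cos t$ satisfy
\[
\innerproduct{x}{y} =\int_{-\pi}^{\pi}\sin(t)\cos(t)\dd{t} =\frac{1}{2}\int_{-\pi}^{\pi}\sin(2t)\dd{t} = 0,
\]
so $x \perp y$. Every ball in a normed space is a typical example of a convex set.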
\begin{thm}\label{thm:17.11}
Let $X$ be an inner product space and $M$ a non-empty, complete, convex subset of $X$. Then for every $x \in X$ there exists a unique $y \in M$ such that
\[
\norm{x - y} = \delta = \inf_{\tilde{y} \in M}\norm{x - \tilde{y}}
\]
\end{thm}
The idea of the previous remark can be extended to infinite-dimensional inner product spaces. Let $\set{e_1, \cdots, e_n}$ be an orthonormal set in an infinite-dimensional inner product space $X$.
With some $x \in X$, consider
\[
y := \sum_{k=1}^n \innerproduct{x}{e_k} e_k, \quad z := x - y
\]
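One checks directly that $z \perp y$: for $j \le n$ we have $\innerproduct{y}{e_j}=\sum_{k=1}^n \innerproduct{x}{e_k}\innerproduct{e_k}{e_j}=\innerproduct{x}{e_j}$, hence $\innerproduct{z}{e_j}=\innerproduct{x}{e_j}-\innerproduct{y}{e_j}= 0$. The Pythagorean relation then gives Bessel's inequality
\[
\sum_{k=1}^n \abs{\innerproduct{x}{e_k}}^2 =\norm{y}^2 \le\norm{y}^2 +\norm{z}^2 =\norm{x}^2
\]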
For an orthonormal sequence $(e_k)$ in a Hilbert space $H$ and scalars $\alpha_k$, write $S_n :=\sum_{k=1}^n \alpha_k e_k$ and $R_n :=\sum_{k=1}^n \abs{\alpha_k}^2$. For $m < n$ we have $\norm{S_n - S_m}^2 =\sum_{k=m+1}^n \abs{\alpha_k}^2 = R_n - R_m$, which proves that $(S_n)$ is a Cauchy sequence in $H$ if and only if $(R_n)$ is a Cauchy sequence in $\realn$.
Now we want to prove the second statement. For this, let $x =\sum_{k=1}^{\infty}\alpha_k e_k$. By orthonormality we can compute for $k \le n$ that $\innerproduct{S_n}{e_k}=\sum_{j=1}^n \alpha_j \innerproduct{e_j}{e_k}=\alpha_k$.
Since $S_n \conv{n \rightarrow\infty} x$, the continuity of the inner product implies
\[
\innerproduct{x}{e_k} =\lim_{n \rightarrow\infty}\innerproduct{S_n}{e_k} =\alpha_k
\]
Consider the space $L^2([-1, 1])$, which is separable and consists of all real-valued functions $x$ on the domain $[-1, 1]$ such that
\[
\int_{-1}^1 \abs{x(t)}^2 \dd{t} < \infty
\]
We want to find an orthonormal basis of functions for this space. For that we will consider the linearly independent set of polynomials $M =\set[n \ge0]{x_n}$,
where $x_n(t)= t^n, ~t \in[-1, 1]$. Then $\closure{\spn{M}}= L^2([-1, 1])$, so $M$ is a total set. However, it is not orthonormal because, for example, $\innerproduct{x_0}{x_2}=\int_{-1}^1 t^2 \dd{t}=\frac{2}{3}\ne0$. Applying the Gram-Schmidt process to $M$ yields the orthonormal functions $e_n(t)=\sqrt{\frac{2n + 1}{2}}\, P_n(t)$.
These $P_n(t)$ are called the (unassociated) Legendre polynomials. The set $\set[n \ge0]{e_n}$ constructed in this way is an orthonormal basis in $L^2([-1, 1])$:
\[
x = \sum_{n=0}^{\infty}\innerproduct{x}{e_n} e_n, \quad\forall x \in L^2([-1, 1])
\]
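For illustration, the first few Legendre polynomials are
\[
P_0(t) = 1, \quad P_1(t) = t, \quad P_2(t) =\frac{1}{2}\left(3 t^2 - 1\right), \quad P_3(t) =\frac{1}{2}\left(5 t^3 - 3 t\right)
\]
In general they can be obtained from Rodrigues' formula $P_n(t) =\frac{1}{2^n n!}\frac{d^n}{d t^n}\left(t^2 - 1\right)^n$.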
\item Hermite Polynomials
Consider $L^2(\realn)$. We can see that $t^n \not\in L^2(\realn)$ because
\[
\int_{\realn}\abs{t^n}^2 \dd{t} = \infty
\]
Instead, consider $M =\set[n \ge0]{x_n}$ with
\[
x_n(t) = t^n e^{-\frac{t^2}{2}}, \quad t \in\realn