# Polynomials

Assume that \(f\) is sufficiently smooth that it can be approximated by a global polynomial \(p\) \begin{align} \begin{split} f \left( \boldsymbol{x} \right) & \approx p \left( \boldsymbol{x} \right) \\ &= \sum_{j=1}^{N} \alpha_{j} \boldsymbol{\phi}_{j}\left( \boldsymbol{x}\right), \\ &= \boldsymbol{P} \boldsymbol{\alpha} \end{split} \label{equ:polyapprox}{\tag{1}} \end{align}

defined as a weighted sum of \(N\) known basis polynomials, where

\begin{align} \boldsymbol{\phi}_{j} \left(\boldsymbol{x} \right)&=\prod_{k=1}^{d}\phi^{(k)}_{j_{k}}\left(x^{(k)}\right), & \left(j_1, \ldots, j_d \right) \in \mathbb{N}_0^d, \end{align}

and where \(\left[\boldsymbol{P} \right]_{i,j}=\boldsymbol{\phi}_{j}\left(\boldsymbol{x}_{i}\right)\) for discretised values \(\boldsymbol{x}_{i} \in \mathcal{X}\). Note that the polynomials \(\boldsymbol{\phi}_{j}\) defined in this way are mutually orthonormal in \(L^2\) weighted by \(\boldsymbol{\rho}\). More specifically, the constituent univariate polynomials must satisfy
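As an illustrative sketch of the matrix form \(f \approx \boldsymbol{P}\boldsymbol{\alpha}\), the snippet below builds \(\boldsymbol{P}\) for a one-dimensional Legendre basis on \([-1, 1]\) and recovers the coefficients \(\boldsymbol{\alpha}\) by least squares. The choice of function, grid, and basis size here are assumptions made purely for illustration; the text itself is general and \(d\)-dimensional.

```python
import numpy as np
from numpy.polynomial.legendre import legvander

# A smooth model function to approximate (chosen for illustration).
f = lambda x: np.exp(x)

x = np.linspace(-1.0, 1.0, 50)   # discretised values x_i in X
N = 8                            # number of basis polynomials (assumed)

# [P]_{i,j} = L_j(x_i): Legendre-Vandermonde matrix up to degree N - 1.
P = legvander(x, N - 1)

# Solve min ||P alpha - f(x)||_2 for the coefficients alpha.
alpha, *_ = np.linalg.lstsq(P, f(x), rcond=None)

residual = np.max(np.abs(P @ alpha - f(x)))
print(residual)   # small for a smooth f, reflecting fast convergence
```

Because \(e^{x}\) is analytic, only a handful of terms are needed before the residual falls below machine-visible levels, previewing the exponential convergence discussed below.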

\begin{equation} \int_{\mathcal{X}^{(k)}}\phi_{g}\left(x^{(k)}\right)\phi_{h}\left(x^{(k)}\right)\rho_{k}\left(x^{(k)}\right)dx^{(k)}=\delta_{gh}, \end{equation}

where \(\delta_{gh}\) is the Kronecker delta; subscripts \(g\) and \(h\) denote polynomial orders. The above expression makes explicit how the choice of the weight function \(\rho_{k} \left( x^{(k)} \right)\) determines the orthogonal polynomial family. For instance, if \(\rho_{k} \left( x^{(k)} \right)\) were the uniform distribution on \(\mathcal{X}^{(k)} = [a, b]\), then \(\left\{ \phi_{j} \left( x^{(k)} \right) \right\}_{j=0}^{\infty}\) would correspond to Legendre polynomials; for Gaussian weights one would use Hermite polynomials, and so on. Details about these weight-polynomial pairs can be found in [1]. Succinctly stated, choosing the correct weight-polynomial pair leads to exponential convergence in \(\eqref{equ:polyapprox}\), thereby reducing the number of terms \(N\) required. As a consequence, this also reduces the number of model evaluations needed to estimate the unknown coefficients \(\left(\alpha_1, \ldots, \alpha_{N} \right)\).
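The orthonormality condition above can be checked numerically. The sketch below (an assumption of this text, not library code) scales the classical Legendre polynomials to unit norm, \(\phi_j = \sqrt{2j+1}\, L_j\), and verifies that their Gram matrix under the uniform density \(\rho(x) = 1/2\) on \([-1, 1]\) is the identity, using Gauss-Legendre quadrature:

```python
import numpy as np
from numpy.polynomial.legendre import leggauss, legval

# Gauss-Legendre rule: exact for polynomial integrands of degree <= 39.
nodes, weights = leggauss(20)

def phi(j, x):
    # sqrt(2j + 1) * L_j(x): Legendre polynomial scaled to unit norm
    # with respect to the uniform density rho(x) = 1/2 on [-1, 1].
    c = np.zeros(j + 1)
    c[j] = 1.0
    return np.sqrt(2 * j + 1) * legval(x, c)

# Gram matrix G[g, h] = int phi_g(x) phi_h(x) rho(x) dx, rho = 1/2.
G = np.array([[np.sum(weights * 0.5 * phi(g, nodes) * phi(h, nodes))
               for h in range(5)]
              for g in range(5)])

print(np.allclose(G, np.eye(5)))   # True: delta_{gh} recovered
```

Swapping the density and polynomial family together (e.g. a Gaussian weight with probabilists' Hermite polynomials) preserves this identity; mismatching them does not.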

Before moving on to the computation of the coefficients, a few words on the computation of the individual univariate orthogonal polynomials \(\phi_{j}\left(x^{(k)} \right)\) are in order. Orthogonal polynomials of any order may be generated by a three-term recurrence relation

\begin{equation} \beta_{j+1} \phi_{j+1}\left(x^{(k)} \right) = \left(x^{(k)} - \alpha_{j} \right) \phi_{j}\left(x^{(k)} \right) - \beta_{j}\phi_{j-1}\left(x^{(k)} \right), \; \; \; \; \textrm{for} \; \; \; j=0, 1, 2, 3, \ldots \end{equation}

where \(\left(\alpha_{j}, \beta_{j} \right)\) are recurrence coefficients that depend on \(\rho_{k}\left( x^{(k)} \right)\) and are known explicitly for the classical weight functions [2]. A four-term recurrence formula can also be derived for the derivatives of orthogonal polynomials.
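The recurrence can be sketched in a few lines. For the uniform density on \([-1, 1]\) the classical coefficients are \(\alpha_j = 0\) and \(\beta_j = j/\sqrt{4j^2 - 1}\); the function name below is an assumption made for this sketch, and the result is checked against the closed form \(\phi_2 = \sqrt{5}\,(3x^2 - 1)/2\):

```python
import numpy as np

def eval_orthonormal_legendre(n, x):
    """Evaluate phi_0, ..., phi_n at points x via the recurrence
    beta_{j+1} phi_{j+1} = (x - alpha_j) phi_j - beta_j phi_{j-1},
    with alpha_j = 0 and beta_j = j / sqrt(4 j^2 - 1) (uniform weight)."""
    x = np.asarray(x, dtype=float)
    beta = lambda j: j / np.sqrt(4.0 * j * j - 1.0)
    phi = np.zeros((n + 1,) + x.shape)
    phi[0] = 1.0                        # phi_0 = 1 has unit norm
    if n >= 1:
        phi[1] = x * phi[0] / beta(1)   # phi_{-1} = 0 starts the recurrence
    for j in range(1, n):
        phi[j + 1] = (x * phi[j] - beta(j) * phi[j - 1]) / beta(j + 1)
    return phi

x = np.linspace(-1.0, 1.0, 7)
phi = eval_orthonormal_legendre(2, x)

# phi_2 should equal sqrt(5) * (3x^2 - 1) / 2, the orthonormal
# Legendre polynomial of order two.
print(np.allclose(phi[2], np.sqrt(5) * (3 * x**2 - 1) / 2))   # True
```

Evaluating via the recurrence, rather than through an explicit monomial expansion, is numerically stable even at high polynomial orders.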

## References

[1] Dongbin Xiu and George Em Karniadakis. The Wiener–Askey polynomial chaos for stochastic differential equations. SIAM Journal on Scientific Computing, 24(2):619–644, 2002.

[2] Walter Gautschi. On generating orthogonal polynomials. SIAM Journal on Scientific and Statistical Computing, 3(3):289–317, 1982.