Introduction#

For nearly five years now, equadratures [1], a pure Python code, has been openly available to the computational methods community. Over the last two years alone, it has been downloaded well over 40,000 times, and it is used across industry, government, and academia. Today, equadratures is a model development platform for building explainable and robust models that, unlike many deep learning frameworks, do not require large cloud infrastructure to train. Models built in equadratures can be used for a wide range of tasks spanning uncertainty quantification, sensitivity analysis, numerical integration, optimisation, clustering, parameter-space exploration, dimension reduction, surrogate modelling, and even digital twinning.

When originally developed five years ago, equadratures (then known as Effective Quadratures) was purpose-built for non-intrusive uncertainty quantification through polynomial chaos expansions [2, 3, 4]. The unique selling point of the code was its adoption of least squares and compressed sensing approaches for estimating polynomial coefficients, rather than the more conventional tensor and sparse grid strategies. This permitted greater anisotropy in the selection of basis functions used in the polynomial expansions, as well as a reduction in the number of samples required as problem dimensionality rises. The overall workflow is best summarised in the steps below.

1. The user prescribes input marginal distributions for each uncertainty.
2. A least squares or compressed sensing framework is used to generate a design of experiment.
3. The model is evaluated at the design of experiment.
4. Polynomial coefficients are computed.
5. Moments and sensitivities (Sobol' indices) are determined from the coefficients.
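In equadratures this workflow is wrapped by classes such as `Parameter`, `Basis` and `Poly`; the sketch below reproduces its numerical core in plain NumPy for a single uniform input. The test model, polynomial order and 3x oversampling ratio are illustrative assumptions, not library defaults.

```python
import numpy as np
from numpy.polynomial.legendre import legvander

# Hypothetical model of a single uniform input on [-1, 1]
def model(x):
    return np.exp(x) * np.sin(2.0 * x)

order = 6
rng = np.random.default_rng(0)

# Randomly sampled design of experiment, oversampled 3x relative
# to the number of basis terms
x = rng.uniform(-1.0, 1.0, size=3 * (order + 1))

# Legendre Vandermonde matrix, rescaled so each column is orthonormal
# with respect to the uniform probability measure on [-1, 1]
A = legvander(x, order) * np.sqrt(2.0 * np.arange(order + 1) + 1.0)

# Evaluate the model at the design and solve the least-squares
# problem for the polynomial coefficients
y = model(x)
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# With an orthonormal basis, moments follow directly from the
# coefficients: the mean is the first one and the variance is the
# sum of squares of the rest
mean = coeffs[0]
variance = np.sum(coeffs[1:] ** 2)
```

Here the number of samples simply scales linearly with the basis size; the least-squares literature cited in this section provides sharper sampling rates and weighting schemes.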

This shift towards least squares and compressed sensing is well captured in the literature [5, 6], as are the different strategies for arriving at well-conditioned matrices and sampling distributions [7, 8, 9]. There is a clear trend towards random sampling techniques paired with well-worn algorithms for identifying well-conditioned submatrices, e.g., QR with column pivoting and SVD-based subset selection [10], among other techniques rooted in convex optimisation (see [11] for a comparison).
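As a sketch of the subset selection idea (not the library's internal implementation), the greedy pivoting step of QR with column pivoting can be used to extract a well-conditioned square submatrix from an oversampled Legendre Vandermonde matrix; the pool size and polynomial degree below are arbitrary choices:

```python
import numpy as np
from numpy.polynomial.legendre import legvander

order = 10
rng = np.random.default_rng(1)

# Large pool of candidate points; row i of A holds the first
# (order + 1) Legendre polynomials evaluated at point i
pool = rng.uniform(-1.0, 1.0, size=500)
A = legvander(pool, order)

# Greedy QR-with-column-pivoting on A^T: repeatedly pick the candidate
# point whose column has the largest residual norm, then deflate that
# direction out of the remaining columns (a Gram-Schmidt step)
R = A.T.copy()
chosen = []
for _ in range(order + 1):
    j = int(np.argmax(np.linalg.norm(R, axis=0)))
    chosen.append(j)
    q = R[:, j] / np.linalg.norm(R[:, j])
    R = R - np.outer(q, q @ R)

# The selected rows form a square, well-conditioned submatrix on which
# the polynomial coefficients can be solved for directly
A_sub = A[chosen, :]
```

Because each deflation removes the direction just selected, near-duplicate points are automatically avoided, which is precisely why the resulting submatrix stays well conditioned.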

Over the past few years, equadratures has grown in capability, ingesting and synthesising key advances in the literature to accelerate application impact. One of the most fruitful advances has been in parameter-space dimension reduction, where the central idea is to ascertain whether a function admits a dimension-reducing subspace, i.e., a few linear combinations of all the variables which may be used for function approximation. While the idea is itself not unprecedented [12], it has experienced a resurgence owing to a prodigious number of publications under the banners of active subspaces [13], sufficient dimension reduction [14] and ridge functions [15] (to name a few). These works have been championed by researchers spanning both academia and industry, with impactful use cases [16, 17, 18, 19] that likely serve as a precursor to further advances in function approximation. A practical outlook on the success of data-driven dimension reduction in computational science is reinforced by the notion that we trust our models within a relatively small parameter space and conduct our parameter studies accordingly. Thus, function values around a notional centroid may be well approximated via linear projections of neighbouring parameter values.
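A minimal sketch of the gradient-based active subspace recipe, assuming gradients are available and using a hypothetical ridge function f(x) = sin(w·x) in ten variables:

```python
import numpy as np

d = 10
w = np.ones(d) / np.sqrt(d)          # hypothetical true ridge direction

def f_grad(x):
    # gradient of f(x) = sin(w @ x); the chain rule gives cos(w @ x) * w,
    # so every gradient sample is a scalar multiple of w
    return np.cos(w @ x) * w

rng = np.random.default_rng(2)
X = rng.uniform(-1.0, 1.0, size=(200, d))

# Monte Carlo estimate of the gradient covariance C = E[grad f grad f^T]
G = np.array([f_grad(x) for x in X])
C = G.T @ G / len(X)

# Eigenvectors of C with large eigenvalues span the active subspace;
# eigh returns eigenvalues in ascending order
eigvals, eigvecs = np.linalg.eigh(C)
active_dir = eigvecs[:, -1]          # leading eigenvector

alignment = abs(active_dir @ w)      # ~1 when the subspace is recovered
```

For a true ridge function the gradient covariance is exactly rank one, so a single eigenvector recovers the subspace; in practice a gap in the eigenvalue spectrum is used to choose the subspace dimension.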

Beyond dimension reduction, ancillary progress on robust methods for dealing with correlations in the inputs, i.e., identifying independent polynomial bases on correlated spaces [20], has been important in driving the uptake of polynomial-based methods and thus of equadratures. This builds upon prior work with Nataf and Rosenblatt transformations [21]. These advances have permitted the construction of stable global polynomials for both high-order and high-dimensional problems. This naturally leads one to consider leveraging polynomials across a wider range of challenges, including optimisation, multi-fidelity modelling, spatial-field modelling and dimension reduction.
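For the special case of jointly Gaussian inputs, the decorrelating map at the heart of the Nataf transform reduces to a Cholesky whitening; the sketch below (with an assumed 2x2 correlation matrix) recovers independent coordinates on which a standard tensor-product polynomial basis applies:

```python
import numpy as np

# Hypothetical correlated bivariate Gaussian inputs with an assumed
# correlation of 0.8 between the two variables
Sigma = np.array([[1.0, 0.8],
                  [0.8, 1.0]])
L = np.linalg.cholesky(Sigma)

rng = np.random.default_rng(3)
Z = rng.standard_normal((5000, 2))   # independent standard normals
X = Z @ L.T                          # correlated input samples

# Inverse map: whiten X back to independent standard-normal
# coordinates, on which independent Hermite polynomials can be built
U = X @ np.linalg.inv(L).T

corr = np.corrcoef(U.T)              # should be close to the identity
```

For non-Gaussian marginals the full Nataf transform additionally maps each input through its distribution function before and after this whitening step.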

It is important to remark that global smoothness and continuity are certainly not guaranteed for all problems. Thus, strategies to fit multiple polynomials over a domain are extremely useful, especially when working with problems characterised by a relatively large parameter space. This may take the form of trust-region methods [22] or tree-based methods [23], where polynomials are defined over subdomains in a recursive manner, based on certain approximation error criteria. Within equadratures, these ideas have found utility in polynomial regression tree models and trust-region optimisation methods. In fact, for the latter, if one further assumes that a subspace-based dimension-reducing representation exists, then, from the perspective of the optimiser, finding such projections iteratively as the trust region migrates through a larger parameter space may be incredibly valuable, both in terms of convergence rate and optimality (see [24]).
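A minimal trust-region loop with a local quadratic polynomial model, in the spirit of the methods cited above; the one-dimensional objective, acceptance threshold and radius update rules are all illustrative choices rather than the library's implementation:

```python
import numpy as np

# Hypothetical smooth objective; its minimiser sits near x = 1.747
def f(x):
    return np.exp(0.3 * x) + (x - 2.0) ** 2

x, radius = 0.0, 1.0
for _ in range(30):
    # Fit a local quadratic model to three samples in the trust region
    pts = np.array([x - radius, x, x + radius])
    coeffs = np.polyfit(pts, f(pts), 2)

    # Minimise the model inside the region (vertex, clipped to bounds)
    if coeffs[0] > 0:
        cand = float(np.clip(-coeffs[1] / (2 * coeffs[0]),
                             x - radius, x + radius))
    else:
        cand = pts[np.argmin(np.polyval(coeffs, pts))]

    # Ratio of actual to model-predicted reduction drives acceptance
    predicted = np.polyval(coeffs, x) - np.polyval(coeffs, cand)
    actual = f(x) - f(cand)
    rho = actual / predicted if predicted > 1e-14 else 0.0

    if rho > 0.1:
        x = cand                      # accept the step
    radius = min(2.0 * radius, 2.0) if rho > 0.75 else 0.5 * radius
    if radius < 1e-6:
        break
```

The key mechanism is the ratio test: when the local polynomial predicts the objective well the region expands, and when it does not the region shrinks until the model becomes trustworthy again.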

References#

[1] Pranay Seshadri and Geoffrey Parks. Effective-Quadratures (EQ): polynomials for computational engineering studies. The Journal of Open Source Software, 2:166, 2017.

[2] Dongbin Xiu and George Em Karniadakis. The Wiener–Askey polynomial chaos for stochastic differential equations. SIAM Journal on Scientific Computing, 24(2):619–644, 2002.

[3] Paul G. Constantine, Michael S. Eldred, and Eric T. Phipps. Sparse pseudospectral approximation method. Computer Methods in Applied Mechanics and Engineering, 229:1–12, 2012.

[4] Pranay Seshadri, Akil Narayan, and Sankaran Mahadevan. Effectively subsampled quadratures for least squares polynomial approximations. SIAM/ASA Journal on Uncertainty Quantification, 5(1):1003–1023, 2017.

[5] Albert Cohen, Mark A. Davenport, and Dany Leviatan. On the stability and accuracy of least squares approximations. Foundations of Computational Mathematics, 13(5):819–834, 2013.

[6] Nora Lüthen, Stefano Marelli, and Bruno Sudret. Sparse polynomial chaos expansions: literature survey and benchmark. SIAM/ASA Journal on Uncertainty Quantification, 9(2):593–649, 2021.

[7] Ling Guo, Akil Narayan, and Tao Zhou. Constructing least-squares polynomial approximations. SIAM Review, 62(2):483–508, 2020.

[8] Mohammad Hadigol and Alireza Doostan. Least squares polynomial chaos expansion: a review of sampling strategies. Computer Methods in Applied Mechanics and Engineering, 332:382–407, 2018.

[9] Ji Peng, Jerrad Hampton, and Alireza Doostan. A weighted ℓ1-minimization approach for sparse polynomial chaos expansions. Journal of Computational Physics, 267:92–111, 2014.

[10] Åke Björck. Numerical Methods in Matrix Computations. Volume 59. Springer, 2015.

[11] Pranay Seshadri, Gianluca Iaccarino, and Tiziano Ghisu. Quadrature strategies for constructing polynomial approximations. In Uncertainty Modeling for Engineering Applications, pages 1–25. Springer, 2019.

[12] Alexander M. Samarov. Exploring regression structure using nonparametric functional estimation. Journal of the American Statistical Association, 88(423):836–847, 1993.

[13] Paul G. Constantine. Active Subspaces: Emerging Ideas for Dimension Reduction in Parameter Studies. SIAM, 2015.

[14] R. Dennis Cook and Liqiang Ni. Sufficient dimension reduction via inverse regression: a minimum discrepancy approach. Journal of the American Statistical Association, 100(470):410–428, 2005.

[15] Allan Pinkus. Ridge Functions. Volume 205. Cambridge University Press, 2015.

[16] Pranay Seshadri, Shahrokh Shahpar, Paul Constantine, Geoffrey Parks, and Mike Adams. Turbomachinery active subspace performance maps. Journal of Turbomachinery, 140(4):041003, 2018.

[17] Ashley D. Scillitoe, Bryn Ubald, Pranay Seshadri, and Shahrokh Shahpar. Design space exploration of stagnation temperature probes via dimension reduction. In Turbo Expo: Power for Land, Sea, and Air, volume 84089, V02CT35A057. American Society of Mechanical Engineers, 2020.

[18] Paul G. Constantine and Paul Diaz. Global sensitivity metrics from active subspaces. Reliability Engineering & System Safety, 162:1–13, 2017.

[19] Paul G. Constantine, Michael Emory, Johan Larsson, and Gianluca Iaccarino. Exploiting active subspaces to quantify uncertainty in the numerical simulation of the HyShot II scramjet. Journal of Computational Physics, 302:1–20, 2015.

[20] John D. Jakeman, Fabian Franzelin, Akil Narayan, Michael Eldred, and Dirk Pflüger. Polynomial chaos expansions for dependent random variables. Computer Methods in Applied Mechanics and Engineering, 351:643–666, 2019.

[21] Lukas Novak and Drahomir Novak. Polynomial chaos expansion for surrogate modelling: theory and software. Beton- und Stahlbetonbau, 113:27–32, 2018.

[22] Jorge Nocedal and Stephen J. Wright. Trust-region methods. In Numerical Optimization, pages 66–100. Springer, 2006.

[23] Leo Breiman, Jerome H. Friedman, Richard A. Olshen, and Charles J. Stone. Classification and Regression Trees. Routledge, 2017.

[24] James C. Gross and Geoffrey T. Parks. Optimization by moving ridge functions: derivative-free optimization for computationally intensive functions. Engineering Optimization, pages 1–23, 2021.