Remark on Algorithm 1012: Computing projections with large data sets
In ACM TOMS Algorithm 1012, the DELAUNAYSPARSE software is given for performing Delaunay interpolation
in medium to high dimensions. When extrapolating outside the convex hull of the training set, DELAUNAYSPARSE
calls the nonnegative least squares solver DWNNLS to compute projections onto the convex hull. However,
DWNNLS and many other available sum-of-squares optimization solvers were not designed for problems with many
variables, which arise from the large training sets that are typical in machine learning applications.
Thus, a new PROJECT subroutine is given, based on the highly customizable quadratic programming solver BQPD. The
new routine is shown to be as robust as the original DELAUNAYSPARSE implementation for projections onto both
synthetic and real-world data sets where other available solvers frequently fail. Although it is intended as an
update to DELAUNAYSPARSE, the solution is likely to be of independent interest as well, given the difficulty and
prevalence of the underlying projection problem.
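The projection described above amounts to a constrained least squares problem: given training points x_1, ..., x_n and an extrapolation point z, find convex combination weights w >= 0 with sum(w) = 1 that minimize the distance from z to the weighted combination of the x_i. The following Python sketch is an illustration only, not the Fortran PROJECT/BQPD implementation in DELAUNAYSPARSE; it solves the problem with SciPy's dense nonnegative least squares routine, enforcing the convexity constraint as a heavily weighted extra equation in the style of weighted NNLS codes such as DWNNLS.

```python
import numpy as np
from scipy.optimize import nnls

def project_to_hull(pts, z, penalty=1.0e6):
    """Project the point z onto the convex hull of the rows of pts.

    Solves  min ||pts.T @ w - z||  s.t.  w >= 0,  sum(w) = 1
    by appending the convexity constraint as a heavily weighted
    equation and calling a dense nonnegative least squares solver
    (the same trick used by weighted NNLS codes such as DWNNLS).
    """
    n, d = pts.shape
    # First row enforces sum(w) = 1 (approximately, via the large penalty);
    # the remaining d rows match the coordinates of z.
    A = np.vstack([penalty * np.ones((1, n)), pts.T])
    b = np.concatenate(([penalty], z))
    w, _ = nnls(A, b)
    w /= w.sum()          # clean up any small violation of sum(w) = 1
    return pts.T @ w, w   # the projection and its convex weights

# Example: project a point outside the unit cube back onto the hull
# of 200 random training points in 10 dimensions.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.random((200, 10))
    z = np.full(10, 1.5)
    proj, w = project_to_hull(pts, z)
    print("distance to hull:", np.linalg.norm(proj - z))
```

Dense active-set solvers of this kind were not designed for the large training sets described above, which is what motivates the BQPD-based PROJECT routine.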
Chang, Tyler H., Layne T. Watson, Sven Leyffer, Thomas C. H. Lux, and H. M. J. Almohri. "Remark on Algorithm 1012: Computing Projections with Large Data Sets." ACM Transactions on Mathematical Software, vol. 50, no. 2, 2024, Article 2, pp. 1–8.