Use the steps below for a local installation of HARFE. Requires Python >= 3.6.
```shell
$ git clone https://github.com/esaha2703/HARFE.git
$ cd HARFE
$ pip install -e .
```
Use the code below to learn a function from sampled data:

```python
import numpy as np
from harfe import harfe
from harfe.utils import generate_omega_bias, feature_matrix

dimension = 10    # input dimension
num_points = 500  # number of sampling points
inp = np.random.uniform(-1, 1, (num_points, dimension))       # input data
out = inp[:, 1]**2 + inp[:, 5]*inp[:, 7] + np.cos(inp[:, 9])  # output function f(x) = x_2^2 + x_6*x_8 + cos(x_10)

# generate random weights and biases
weights, bias = generate_omega_bias(rows = 5000, columns = dimension, weight = 1, par1 = -1, par2 = 1,
                                    distribution = 'norm-uni', bool_bias = True, sparsity = 2)

# build the random feature matrix of the form A = sin(weights*inp + bias) and normalize its columns
A = feature_matrix(inp, weights, bias, activation = 'sin', dictType = 'SRF')
scale_A = np.linalg.norm(A, axis = 0)
A /= scale_A

# run the HARFE algorithm
coeff, rel_error, iterations, _ = harfe(out, A, N = 5000)

# recover the function from the learnt coefficients via out ~ A*coeff
out_recovered = np.matmul(A, coeff)
print('Relative error:', rel_error[-1], '\nIterations required:', iterations)
```
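To evaluate the learnt model at new inputs, the same random features and the training-time column norms `scale_A` must be reused; otherwise the coefficients no longer match the normalized columns they were fit against. The sketch below illustrates this bookkeeping with the plain feature map $\sin(x\,W + b)$ standing in for the package's `feature_matrix` helper, so the snippet is self-contained; the names `features`, `fit_scaling`, and `predict` are ours, not part of the package, and we assume `weights` stored with shape (dimension, num_features).

```python
import numpy as np

def features(x, weights, bias):
    # random feature map: sin(x @ weights + bias), as in the text
    return np.sin(x @ weights + bias)

def fit_scaling(A):
    # column norms of the *training* feature matrix; reuse these at test time
    return np.linalg.norm(A, axis=0)

def predict(x_new, weights, bias, scale_A, coeff):
    # normalize the new feature columns with the training norms,
    # then apply the learnt coefficients: y_hat = (A_new / scale_A) @ coeff
    return (features(x_new, weights, bias) / scale_A) @ coeff
```

If the scaling step is skipped at prediction time, the output is off by a per-feature factor and the test error degrades even though the training fit was good.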
Given data $\{(\mathbf{x}_k, y_k)\}_{k=1}^{m}$, HARFE solves the problem of representing $y_k \approx \sum_{j=1}^{N} c_j\, \phi(\langle \mathbf{x}_k, \boldsymbol{\omega}_j \rangle + b_j)$ with a sparse coefficient vector $\mathbf{c}$.

Let $A \in \mathbb{R}^{m \times N}$ be the random feature matrix with entries $A_{k,j} = \phi(\langle \mathbf{x}_k, \boldsymbol{\omega}_j \rangle + b_j)$, so the problem becomes

$$\min_{\mathbf{c}} \; \|\mathbf{y} - A\mathbf{c}\|_2^2 + m \lambda \|\mathbf{c}\|_2^2 \quad \text{subject to} \quad \|\mathbf{c}\|_0 \le s.$$

We solve this iteratively using the Hard Thresholding Pursuit algorithm, i.e.,

Start with $\mathbf{c}^0 = \mathbf{0}$ and repeat:
- $S^{n+1} = \{\, s \text{ largest entry indices of } (1 - \mu m \lambda)\,\mathbf{c}^n + \mu A^{*} (\mathbf{y} - A \mathbf{c}^n) \,\}$
- $\mathbf{c}^{n+1} = \operatorname{argmin} \{\, \|\mathbf{y} - A\mathbf{z}\|_2^2 + m \lambda \|\mathbf{z}\|_2^2 : \operatorname{supp}(\mathbf{z}) \subset S^{n+1} \,\}$.
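The two-step iteration above can be sketched directly in NumPy. This is an illustrative re-implementation of the hard-thresholding-pursuit update for the sparse ridge problem, not the packaged `harfe()` routine; the function name `htp_ridge` and the default step size `mu = 1.0` and stopping rule are our choices.

```python
import numpy as np

def htp_ridge(y, A, s, lam, mu=1.0, max_iter=50, tol=1e-8):
    """Hard Thresholding Pursuit for min ||y - A c||^2 + m*lam*||c||^2, ||c||_0 <= s."""
    m, N = A.shape
    c = np.zeros(N)
    for _ in range(max_iter):
        # support step: s largest entries of (1 - mu*m*lam) c^n + mu A^* (y - A c^n)
        g = (1 - mu * m * lam) * c + mu * (A.conj().T @ (y - A @ c))
        S = np.argsort(np.abs(g))[-s:]
        # coefficient step: ridge-regularized least squares restricted to supp(z) in S
        As = A[:, S]
        c_new = np.zeros(N)
        c_new[S] = np.linalg.solve(As.conj().T @ As + m * lam * np.eye(s),
                                   As.conj().T @ y)
        if np.linalg.norm(c_new - c) <= tol * max(np.linalg.norm(c), 1.0):
            c = c_new
            break
        c = c_new
    return c
```

The restricted argmin is a standard ridge normal-equations solve on the selected columns, which is what makes each iterate exactly $s$-sparse.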
- func_harfe.ipynb: Demonstrates approximation of the type-1 Friedman function using HARFE.
- data_harfe.ipynb: Demonstrates function approximation on the propulsion dataset using HARFE.
Email esaha@uwaterloo.ca if you have any questions, comments, or suggestions. Please cite the associated paper if you find the code useful:
```bibtex
@misc{https://doi.org/10.48550/arxiv.2202.02877,
  doi = {10.48550/ARXIV.2202.02877},
  url = {https://arxiv.org/abs/2202.02877},
  author = {Saha, Esha and Schaeffer, Hayden and Tran, Giang},
  keywords = {Machine Learning (stat.ML), Machine Learning (cs.LG), Optimization and Control (math.OC), FOS: Computer and information sciences, FOS: Mathematics},
  title = {HARFE: Hard-Ridge Random Feature Expansion},
  publisher = {arXiv},
  year = {2022},
  copyright = {arXiv.org perpetual, non-exclusive license}
}
```