ChristophRaab / nso

Code release for the KI 2020 Paper "Low-Rank Subspace Override for Unsupervised Domain Adaptation".

Home Page: https://link.springer.com/chapter/10.1007/978-3-030-58285-2_10


Python source code is now available!

For a demo, run:
python code/nso_demo.py
Requirements: Scikit-Learn, NumPy
The implementation is oriented towards the Scikit-Learn interface.

Nyström Subspace Override

MATLAB source code for the KI 2020 paper "Low-Rank Subspace Override for Unsupervised Domain Adaptation" (see the Springer link above).

The folders are self-explanatory. If you encounter any problems with the repository, please open an issue here or write me a message!

Demo and Reproducing:

For a demo and to reproduce the performance and runtime results, run demo.m.

Main file:

nso.m (Submission Algorithm)

Secondary Files:

pd_ny_svd.m
libsvm (folder)
augmentation.m

Tutorial:

% Train NSO: returns a libsvm model, the modified kernel K, and the sample counts m, n
[model, K, m, n] = nso(Xs', Xt', Ys, options);
% Classify the target data using libsvm's precomputed-kernel format
[label, acc, scores] = svmpredict(full(Yt), [(1:n)', K(m+1:end, 1:m)], model);
fprintf('\n NSO %.2f%% \n', acc(1));

Assume training data Xs of size d x m and test data Xt of size d x n, with label vectors Ys and Yt sized accordingly. nso accepts the data and an options struct, through which the user can specify:

NSO Specific:
options.landmarks ~ Number of Landmarks (int)
SVM Specific: 
options.gamma ~ Gamma of SVM (int)
options.smvc ~ C Parameter of SVM (int)
options.ker ~ Kernel Type "linear|rbf|lab" (string)

The function outputs a libsvm model and a kernel over the training and test data, where the training data has been modified by the NSO algorithm.
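
Putting this together, below is a minimal end-to-end sketch. The synthetic data and the concrete option values (landmark count, kernel, gamma, C) are illustrative assumptions; only the nso and svmpredict calling conventions are taken from the tutorial above.

% Minimal usage sketch; the data and parameter values are assumptions.
d = 20; m = 100; n = 80;                % feature dimension, #source, #target samples
Xs = randn(d, m); Ys = randi(2, m, 1);  % source data (d x m) with labels
Xt = randn(d, n); Yt = randi(2, n, 1);  % target data (d x n) with labels
options.landmarks = 10;                 % number of Nystroem landmarks
options.ker = 'rbf';                    % kernel type: linear | rbf | lab
options.gamma = 1;                      % SVM kernel gamma
options.smvc = 10;                      % SVM C parameter
[model, K, m, n] = nso(Xs', Xt', Ys, options);
[label, acc, scores] = svmpredict(full(Yt), [(1:n)', K(m+1:end, 1:m)], model);
fprintf('\n NSO %.2f%% \n', acc(1));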

Reproducing Plots:

Figure 1 (sensitivity of the landmark parameter): landmarkperformance_plot.m
Figure 2 (process of NSO): plot_process.m

Reproducing the deep learning results:

See this repository for the source code, data, and a guide to reproducing the deep learning results.

Abstract of Submission:

Domain adaptation focuses on the reuse of supervised learning models in a new context. Prominent applications can be found in robotics, image processing or web mining. In these areas, learning scenarios change by nature, but often remain related and motivate the reuse of existing supervised models. While the majority of domain adaptation algorithms utilize all available source and target domain data, we show that efficient domain adaptation requires only a substantially smaller subset from both domains. This makes it more suitable for real-world scenarios where target domain data is rare. The presented approach finds domain-invariant representations for source and target data to address domain differences by overriding orthogonal basis structures. By employing a low-rank approximation, the approach remains low in computational time. The presented idea is evaluated on typical domain adaptation tasks with standard benchmark data.
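
To make the core idea concrete, here is a minimal MATLAB sketch of one plausible reading of "overriding orthogonal basis structures" with a low-rank approximation: the source data is rebuilt from the target's orthogonal basis. This is only an illustration of the abstract's idea, not the Nystroem-based algorithm implemented in nso.m.

% Illustrative rank-k subspace override (an assumption-based sketch, not nso.m).
% Xs: d x m source data, Xt: d x n target data, k: assumed low rank.
k = 10;
[~, Ss, Vs] = svd(Xs, 'econ');     % source spectrum and right singular vectors
[Ut, ~, ~]  = svd(Xt, 'econ');     % target's orthogonal (left) basis
% Override: express the source data in the target's basis at rank k.
Xs_override = Ut(:, 1:k) * Ss(1:k, 1:k) * Vs(:, 1:k)';
% Xs_override (d x m) can then be used to train a classifier that is
% applied to Xt, since both now share the same dominant basis.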

About


License: MIT License


Languages

C++ 54.0% · MATLAB 23.4% · C 12.1% · HTML 6.6% · Python 2.7% · TeX 0.7% · Makefile 0.5%