
AdaBoost


About Us

We are a group of machine learning enthusiasts aiming to implement the AdaBoost algorithm from scratch.
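For readers new to the algorithm, here is a minimal sketch of the AdaBoost idea in C++, using decision stumps on toy 1-D data with labels in {-1, +1}. It is only an illustration of the technique, not this library's API; every name in it is invented for the example.

    // Minimal AdaBoost sketch: decision stumps on toy 1-D data, labels in {-1, +1}.
    // Purely illustrative; all names here are invented and not this library's API.
    #include <cmath>
    #include <initializer_list>
    #include <iostream>
    #include <vector>

    struct Stump {
        double threshold; // raw prediction is +1 if x > threshold, else -1
        int polarity;     // +1 keeps that prediction, -1 flips it
        double alpha;     // weight of this stump in the final vote
    };

    int stump_predict(const Stump& s, double x) {
        return (x > s.threshold ? 1 : -1) * s.polarity;
    }

    int main() {
        std::vector<double> X = {0.1, 0.3, 0.5, 0.7, 0.9};
        std::vector<int> y = {-1, -1, 1, 1, 1};
        const std::size_t n = X.size();
        std::vector<double> w(n, 1.0 / n); // start with uniform sample weights

        std::vector<Stump> ensemble;
        for (int round = 0; round < 5; ++round) {
            // Greedily pick the stump with the smallest weighted error.
            Stump best{0.0, 1, 0.0};
            double best_err = 1.0;
            for (double t : X) {
                for (int pol : {1, -1}) {
                    Stump cand{t, pol, 0.0};
                    double err = 0.0;
                    for (std::size_t i = 0; i < n; ++i)
                        if (stump_predict(cand, X[i]) != y[i]) err += w[i];
                    if (err < best_err) { best_err = err; best = cand; }
                }
            }
            // Stump vote weight: alpha = 0.5 * ln((1 - err) / err).
            best.alpha = 0.5 * std::log((1.0 - best_err) / (best_err + 1e-10));
            // Re-weight samples so misclassified points matter more next round.
            double sum = 0.0;
            for (std::size_t i = 0; i < n; ++i) {
                w[i] *= std::exp(-best.alpha * y[i] * stump_predict(best, X[i]));
                sum += w[i];
            }
            for (double& wi : w) wi /= sum; // renormalize to a distribution
            ensemble.push_back(best);
        }

        // Final classifier: sign of the alpha-weighted vote of all stumps.
        double x = 0.6, vote = 0.0;
        for (const Stump& s : ensemble) vote += s.alpha * stump_predict(s, x);
        std::cout << "prediction for x = 0.6: " << (vote >= 0 ? 1 : -1) << "\n";
    }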

Technologies

We are using the following technologies in our project,

  1. C++
  2. Python
  3. CUDA C
  4. Google Test
  5. Boost.Python (a small binding sketch follows this list)
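As a hypothetical illustration of how these pieces fit together, Boost.Python can expose a routine from the C++ core to Python. The module and function names below are invented for the example:

    // Hypothetical sketch: exposing a C++ function to Python with Boost.Python.
    #include <boost/python.hpp>

    // Toy routine standing in for the C++ core.
    double weighted_error(double misclassified_weight, double total_weight) {
        return misclassified_weight / total_weight;
    }

    BOOST_PYTHON_MODULE(adaboost_core) {
        boost::python::def("weighted_error", weighted_error);
    }

Compiled as a shared library, this module could then be used from Python with import adaboost_core.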

Building from source

Linux

  1. Clone the repository to your local machine, git clone https://github.com/codezonediitj/adaboost
  2. If you moved into the cloned adaboost directory, move back to its parent, cd ../
  3. Execute, mkdir build-adaboost
  4. Execute, cd build-adaboost
  5. Execute, cmake -D[OPTIONS] ../adaboost
  6. Execute, make. Do not use make -j5 if you are using -DINSTALL_GOOGLETEST=ON, otherwise make will try to link the tests against gtest and gtest_main before GoogleTest has been installed on your system.
  7. To run the tests, execute, ./bin/*. Ensure that you used the option -DBUILD_TESTS=ON in step 5 above.
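Putting the steps together, a typical session might look like the following (assuming you stay in the directory from which you cloned, and enabling the tests as an example):

    git clone https://github.com/codezonediitj/adaboost
    mkdir build-adaboost
    cd build-adaboost
    cmake -DBUILD_TESTS=ON ../adaboost
    make
    ./bin/*    # run the test binaries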

Windows

  1. Execute, git clone https://github.com/codezonediitj/adaboost
  2. If you moved into the cloned adaboost directory, move back to its parent, cd ../
  3. Execute, mkdir build-adaboost
  4. Execute, cd build-adaboost
  5. Install CMake from https://cmake.org/download/. You can also follow the steps given at https://cgold.readthedocs.io/en/latest/first-step/installation.html#windows
  6. Open the CMake GUI, put the adaboost directory in the source code field and the build-adaboost directory in the build binaries field.
  7. Select the CMake options (see below) you want to use for building, then click Configure and Generate to generate the build files.

We provide the following options for cmake,

  1. BUILD_TESTS

By default OFF; set it to ON if you wish to run the tests. Test binaries are placed in bin under your build directory.

  2. INSTALL_GOOGLETEST

By default ON; set it to OFF if you do not want to overwrite the GoogleTest already installed on your system. Note that it uses this release of googletest.

  3. CMAKE_INSTALL_PREFIX

Defines the path where the library is to be installed. Required if you are not installing to /usr/local/include on Linux based systems.
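For example, on Linux a configure step that enables the tests, keeps the existing GoogleTest, and sets a custom install path might look like this (the prefix path is illustrative):

    cmake -DBUILD_TESTS=ON -DINSTALL_GOOGLETEST=OFF -DCMAKE_INSTALL_PREFIX=/usr/local ../adaboost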

Installing

Follow the steps for building from source. After that run the following,

Linux

sudo make install

Windows

cmake --install <path to your build directory>

How to contribute?

Follow the steps given below,

  1. Fork https://github.com/codezonediitj/adaboost
  2. Execute, git clone https://github.com/codezonediitj/adaboost/
  3. Change your working directory to the cloned adaboost directory.
  4. Execute, git remote add origin_user https://github.com/<your-github-username>/adaboost/
  5. Execute, git checkout -b <your-new-branch-for-working>.
  6. Make changes to the code.
  7. Add your name and email to the AUTHORS, if you wish to.
  8. Execute git add . to stage your changes.
  9. Execute, git commit -m "your-commit-message".
  10. Execute, git push origin_user <your-current-branch>.
  11. Make a PR.

That's it, 11 easy steps for your first contribution. For future contributions just follow steps 5 to 11. Before starting work, always check out master and pull the recent changes using the remote origin, then follow steps 5 to 11 again.
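In commands, syncing before starting new work might look like this:

    git checkout master
    git pull origin master                  # fetch the latest changes
    git checkout -b <your-new-branch>       # then continue from step 5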

See you soon with your first PR.

Guidelines

We recommend that you introduce yourself on our Gitter channel, https://gitter.im/codezoned2017/Lobby. In your introduction you can include the literature you have studied relevant to AdaBoost, some projects, and prior experience with the technologies mentioned above.

Please follow the rules and guidelines given below,

  1. For Python we follow the numpydoc docstring guide.
  2. For C++ we follow our own coding style mentioned here.
  3. For C++ documentation we follow the Doxygen style guide. Refer to various modules in the existing master branch for the pattern; a small illustration follows this list.
  4. Follow the Pull Request policy given here. All changes are made through Pull Requests; no direct commits to the master branch.
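As a hypothetical illustration of the Doxygen style (the function below is invented; see the master branch for the project's actual pattern):

    /**
     * @brief Computes the vote weight of a weak learner from its weighted error.
     *
     * @param error Weighted classification error, expected to lie in (0, 1).
     * @return The learner's vote weight, 0.5 * ln((1 - error) / error).
     */
    double learner_weight(double error);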

Keep contributing!!
