Mstfakts / Building-Detection-MaskRCNN

Building detection from the SpaceNet dataset by using Mask RCNN.


Building-Detection-MaskRCNN v2.0

The aim of this project is to detect buildings in bird's-eye-view (satellite) images. I used the SpaceNet dataset. Because I had limited time, I used matterport's Mask R-CNN implementation rather than coding all the details of Mask R-CNN myself.

  • If you only need the pretrained weights file, leave a star on this project and send an email to mstfakts98@gmail.com. I will respond within 5-6 hours (as soon as possible).
  • If you need more clarification or have any questions, please open an issue first.
  • If the notebook files do not open on GitHub, open them in nbviewer by copy-pasting the file's link.

You may want to check all the results of version 2.0:

TestResult_v2/Test1.png TestResult_v2/Test2.png TestResult_v2/Test3.png TestResult_v2/Test8.png

Here is an output from version 1.0, so you can see how much the model improved (all former outputs): TestResult/successful (1).png

The project is divided into 5 steps:

1- Download the SpaceNet Dataset

To download the dataset to your computer, visit SpaceNet's website and check the 'Dependencies' section. As it says, you need an AWS account; then install the AWS Command Line Interface (CLI). In this project, I downloaded the SpaceNet Buildings Dataset V2 but only used the 'AOI 2 - Vegas' part of it. Then copy-paste the command lines shown on the web page into the CLI.
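As a rough sketch of what those commands look like (the bucket path below is an assumption and may have changed; always copy the exact command from SpaceNet's page):

```shell
# Configure the AWS CLI with your account credentials first
aws configure

# Download the 'AOI 2 - Vegas' training tarball of the Buildings Dataset V2.
# NOTE: this bucket/key is an assumption -- verify it against SpaceNet's page.
aws s3 cp s3://spacenet-dataset/spacenet/SN2_buildings/tarballs/SN2_buildings_train_AOI_2_Vegas.tar.gz .

# Unpack the archive
tar -xzf SN2_buildings_train_AOI_2_Vegas.tar.gz
```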

2- What is Mask-RCNN?

To comprehend what Mask R-CNN is, you can read the original paper. Here are some helpful websites that helped me out a lot:

3- From Theory to Implementation

Okay, we have the idea of Mask R-CNN, but how are we going to implement all that theory in real code? If you are not pressed for time, please try to implement it yourself. However, I had limited time for this project, so I used matterport's Mask R-CNN implementation. You can search for and find other implementations. Do not forget to check the projects listed on matterport's Mask R-CNN page. It is also possible to find similar projects and read their code; it will improve your coding skills.

4- Preprocess the Data

After you download the dataset, you can check what the GeoJSON files look like by running Display_GeoJSON.ipynb
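If you just want a quick look without opening the notebook, a GeoJSON label file is plain JSON; a minimal sketch (the file name in the comment is illustrative) that extracts the building polygons:

```python
import json

def list_building_polygons(geojson_path):
    """Return the exterior ring of each building polygon in a GeoJSON label file."""
    with open(geojson_path) as f:
        data = json.load(f)
    polygons = []
    for feature in data.get("features", []):
        geom = feature["geometry"]
        if geom["type"] == "Polygon":
            # The first ring is the exterior boundary; coordinates are (lon, lat, [z])
            polygons.append(geom["coordinates"][0])
    return polygons

# Example (hypothetical file name):
# for ring in list_building_polygons("buildings_AOI_2_Vegas_img1.geojson"):
#     print(len(ring), "vertices")
```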

While I was working on this project, I got lots of errors because of the names of the training files. After you download the training dataset, you will notice that the file names are not sequential: the data includes 'RGB-PanSharpen_AOI_2_Vegas_img1.tif' and 'RGB-PanSharpen_AOI_2_Vegas_img3.tif' but not 'RGB-PanSharpen_AOI_2_Vegas_img2.tif'. So we need to put them in order. To do this, run Rename_Files.ipynb
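The renaming idea can be sketched as follows (Rename_Files.ipynb is the actual implementation; this is a minimal stand-alone version assuming the Vegas naming scheme):

```python
import os
import re

def make_sequential(folder, prefix="RGB-PanSharpen_AOI_2_Vegas_img", ext=".tif"):
    """Rename files like <prefix><n><ext> so their indices run 1..N with no gaps."""
    pattern = re.compile(re.escape(prefix) + r"(\d+)" + re.escape(ext) + "$")
    # Collect (old_index, filename) pairs and sort by the numeric index
    indexed = sorted(
        (int(m.group(1)), name)
        for name in os.listdir(folder)
        if (m := pattern.match(name))
    )
    for new_idx, (old_idx, name) in enumerate(indexed, start=1):
        if new_idx != old_idx:
            os.rename(
                os.path.join(folder, name),
                os.path.join(folder, f"{prefix}{new_idx}{ext}"),
            )
```

Because gaps can only shrink an index, each new index is never larger than the old one, so renaming in ascending order never collides with a file that has not been processed yet.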

Now the dataset's file names are in order, but its format is TIFF. So I converted the TIFF files to RGB (PNG) files by running TIF_to_PNG.py
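The core of that conversion is mapping the 16-bit pixel values of each band down to 8-bit. A sketch of that rescaling step (TIF_to_PNG.py does the full file conversion; a percentile stretch is one common choice, not necessarily the exact one used there):

```python
import numpy as np

def stretch_to_8bit(band, low_pct=2, high_pct=98):
    """Rescale a 16-bit image band to 0-255 using a percentile stretch."""
    band = band.astype(np.float64)
    lo, hi = np.percentile(band, [low_pct, high_pct])
    if hi <= lo:  # flat band: avoid division by zero
        return np.zeros(band.shape, dtype=np.uint8)
    scaled = (band - lo) / (hi - lo)
    return (np.clip(scaled, 0, 1) * 255).astype(np.uint8)

# Stack three stretched bands into an RGB array, then save it as PNG
# (e.g. with Pillow: Image.fromarray(rgb).save("out.png"))
```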

To train the model, we also need the labels. We will create these label masks from the TIFF files and their corresponding GeoJSON files; the TIFF file is needed to map the GeoJSON coordinates onto the pixels of the specific picture. You only need to follow Create_Masks.ipynb
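Create_Masks.ipynb does this with proper geospatial tooling, but conceptually the rasterization step is a point-in-polygon test per pixel. A slow pure-Python sketch of that idea (it assumes the polygon vertices have already been converted from geographic to pixel coordinates, which is what the TIFF's geo-transform provides):

```python
import numpy as np

def rasterize_polygon(pixel_coords, height, width):
    """Even-odd fill: set pixels whose centers fall inside the polygon to 1.

    pixel_coords: list of (col, row) vertices in pixel space.
    """
    mask = np.zeros((height, width), dtype=np.uint8)
    n = len(pixel_coords)
    for r in range(height):
        for c in range(width):
            x, y = c + 0.5, r + 0.5  # pixel center
            inside = False
            for i in range(n):
                x1, y1 = pixel_coords[i]
                x2, y2 = pixel_coords[(i + 1) % n]
                # Does this edge cross the horizontal ray through (x, y)?
                if (y1 > y) != (y2 > y):
                    x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                    if x < x_cross:
                        inside = not inside
            if inside:
                mask[r, c] = 1
    return mask
```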

If you want to see an RGB image and its corresponding mask, run Display_Mask_and_RGB_Image.ipynb

5- Training the Model & Testing

To train the model, you need to know what size of model your computer can handle. For example, I was using 'resnet101' as the backbone, but I got an OOM (Out Of Memory) error, so I reduced it to 'resnet50'. If possible, try to work with 'resnet101'. Also, do not forget to adjust the configuration part according to your computer and dataset; please analyze SpaceNet_train.py. Another thing that may help: I share my trained .h5 file (leave a star on this project and send an email to the address given above). I trained for 151 epochs with all layers ('all') of the model trainable.
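The adjustments above live in a Config subclass in matterport's API. A sketch of the kind of configuration involved (the values are illustrative, not necessarily those in SpaceNet_train.py; tune them to your hardware):

```python
from mrcnn.config import Config

class BuildingConfig(Config):
    """Example configuration for building detection on SpaceNet."""
    NAME = "spacenet_buildings"
    BACKBONE = "resnet50"      # dropped from 'resnet101' to avoid OOM
    GPU_COUNT = 1
    IMAGES_PER_GPU = 1         # lower this first if you still hit OOM
    NUM_CLASSES = 1 + 1        # background + building
    STEPS_PER_EPOCH = 500

# Training with all layers trainable, as in matterport's API:
# model.train(dataset_train, dataset_val,
#             learning_rate=config.LEARNING_RATE,
#             epochs=151, layers='all')
```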

That's it. If you need more clarification or have any questions, please open an issue.
