
Implicit Neural Representations for Remote Sensing Scene

About The Project

Project page: https://shadowxzt.github.io/Implicit-remote-sensing-scene.github.io/

Remote sensing imagery, captured from a top view, can cover a large range of visual content. However, compared with ground-view data, it usually lacks informative details of the scene. In this context, recent progress in neural rendering and implicit neural representations makes photorealistic cross-view synthesis possible, i.e., predicting the ground-view image given the top-view data. Cross-view synthesis enables scene understanding from both remote sensing and street views. Many applications can benefit from this task, including autonomous driving and navigation. Moreover, view synthesis between ground view and top view also bridges the gap between the computer vision and remote sensing communities and extends the application areas of remote sensing methods.

(example figure)

Prerequisites


Before the following steps, please make sure that your TUM VPN is connected.

  1. First, connect to the annotation server from Windows PowerShell.
  2. Then, run the following script in the PowerShell terminal:

    ssh -L localhost:8002:172.17.0.3:8000 xshadow@AI4EO2.sipeo.lrg.tum.de

  3. Finally, open a browser and use one of the accounts below to log into the annotation system:

    localhost:8002

Please do not use the root account. The following accounts are currently available:

username: user1  passwd: user1
username: user2  passwd: user2
  4. In the table page, find a starting sample and left-click on it to start the annotation.

Annotation Instructions

  1. Annotate about 9 (more than 6) pairs of corresponding points. Different points are displayed in different colors. (See the sketch after this list for one possible downstream use of the annotated pairs.)

  2. Try to find semantically corresponding points between the two given images.

  3. Left-click on the first image and then left-click on the second image to annotate a pair of corresponding points.

  4. Use the mouse wheel to control the size of the points. This is useful for double-checking the annotated points.

  5. Corner points of windows and buildings are usually the best choices.

  6. Pressing CTRL+S or clicking the next arrow will automatically save the annotations.

  7. Use the following link to view and select image samples for annotation:

    localhost:8002/table

  8. Use the following link to view a summary of the annotation progress:

    localhost:8002/summary
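
Once exported (the server's export format is not documented here), the annotated point pairs can be used, for example, to estimate a geometric relation between the two views. The sketch below is a minimal OpenCV illustration assuming the pairs are available as two N×2 arrays of pixel coordinates; the coordinate values shown are made-up placeholders.

    import numpy as np
    import cv2

    # Hypothetical annotated correspondences: N x 2 pixel coordinates per image.
    # The values below are placeholders, not real annotations. At least 4 pairs
    # are needed for a homography and 8 for the fundamental matrix, so the ~9
    # pairs recommended above are enough for either.
    pts_img1 = np.array([[120, 45], [310, 80], [400, 210], [95, 300], [250, 150],
                         [330, 260], [60, 120], [410, 60], [200, 330]], dtype=np.float64)
    pts_img2 = np.array([[100, 60], [290, 95], [380, 230], [85, 310], [240, 170],
                         [315, 275], [55, 140], [395, 70], [190, 345]], dtype=np.float64)

    # Homography (appropriate if the matched points are roughly coplanar, e.g. a facade).
    H, h_mask = cv2.findHomography(pts_img1, pts_img2, cv2.RANSAC, 5.0)

    # Fundamental matrix for a general (non-planar) scene.
    F, f_mask = cv2.findFundamentalMat(pts_img1, pts_img2, cv2.FM_RANSAC, 3.0, 0.99)

    print("H =\n", H)
    print("F =\n", F)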

(back to top)


Contact

Zhitong Xiong - zhitong.xiong@tum.de

Project Link: https://shadowxzt.github.io/Implicit-remote-sensing-scene.github.io/

(back to top)


Problem statement:

Assume that we have a 3D city model, which can be reconstructed by multi-view stereo (MVS) from multiple satellite images. Given the 3D model and street-view images at several locations, we aim to continuously render novel views at other locations.
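
For reference, the sketch below is a minimal illustration of the rendering step with trimesh and pyrender, not the repository's actual Mesh_render.py: it loads the textured city mesh and renders it from a camera placed at a street-view location in the model's coordinate frame. The mesh path, eye height, viewing target and field of view are placeholder assumptions, and the model is assumed to be z-up.

    import numpy as np
    import trimesh
    import pyrender

    def look_at(eye, target, up=(0.0, 0.0, 1.0)):
        """Camera-to-world pose for pyrender (the camera looks along its local -z axis)."""
        eye, target, up = (np.asarray(v, dtype=np.float64) for v in (eye, target, up))
        forward = target - eye
        forward /= np.linalg.norm(forward)
        right = np.cross(forward, up)
        right /= np.linalg.norm(right)
        pose = np.eye(4)
        pose[:3, 0] = right
        pose[:3, 1] = np.cross(right, forward)
        pose[:3, 2] = -forward
        pose[:3, 3] = eye
        return pose

    # Placeholder path; the actual mesh file in the repository may differ.
    tm = trimesh.load("Data_sample/city_model.obj", force="mesh")

    scene = pyrender.Scene(ambient_light=[0.5, 0.5, 0.5])
    scene.add(pyrender.Mesh.from_trimesh(tm))

    # Camera at a street-view location (X, Y as in the panos.txt sample below),
    # with an assumed eye height of 2 model units and an arbitrary viewing target.
    cam_pose = look_at(eye=[6056.96, 4400.23, 2.0], target=[6100.0, 4400.23, 2.0])
    scene.add(pyrender.PerspectiveCamera(yfov=np.deg2rad(75.0)), pose=cam_pose)
    scene.add(pyrender.DirectionalLight(color=np.ones(3), intensity=3.0), pose=cam_pose)

    renderer = pyrender.OffscreenRenderer(viewport_width=800, viewport_height=600)
    color, depth = renderer.render(scene)  # color: HxWx3 uint8, depth: HxW float32
    renderer.delete()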

Data description:

  • Top-view satellite image. The red dot indicates the location where the street-view images were captured.
  • The corresponding 3D city model (mesh and texture) with pixel-wise segmentation annotations, georeferenced to the top-view image and the street-view images.
  • The corresponding street-view images at coordinate (6063, 4597) of the 3D model. The filenames of these images encode the camera poses (location, pitch, fov and heading).
  • The corresponding 360° street-view image.
  • An illustration of an initial idea for novel street-view synthesis.

In Data_sample/panos.txt:

    Pano_ID,              Lat,               Lon,          Y in 3D model,     X in 3D model

    AXD-LhS8HF3v1XajBnV2nA, 60.1644797333783, 24.92898345052589, 4400.232103129849,  6056.963851299137

Pano_ID can be used to download the street-view image; Lat and Lon can be used to download the street-view image or the top-view satellite image; X and Y can be used to locate the camera in the 3D mesh model and render the synthetic image.
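
A minimal parsing sketch for Data_sample/panos.txt, assuming it is comma-separated with a single header line in the column order shown above (field handling is otherwise unverified):

    import csv

    panos = []
    with open("Data_sample/panos.txt") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header line (Pano_ID, Lat, Lon, Y, X)
        for row in reader:
            if not row:
                continue
            pano_id, lat, lon, y, x = (c.strip() for c in row[:5])
            panos.append({
                "pano_id": pano_id,   # used to download the street-view image
                "lat": float(lat),    # geographic coordinates
                "lon": float(lon),
                "y": float(y),        # location in the 3D mesh model
                "x": float(x),
            })

    print(panos[0]["pano_id"], panos[0]["x"], panos[0]["y"])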

Mesh_render.py is sample code for rendering the synthetic image. Using the street-view API, we can obtain the corresponding real image.
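
The specific street-view API is not named in this README; as an assumption, the sketch below uses the Google Street View Static API via requests, querying either by Pano_ID or by Lat/Lon with the heading/pitch/fov parameters encoded in the image filenames. The API key and output path are placeholders.

    import requests

    STREETVIEW_URL = "https://maps.googleapis.com/maps/api/streetview"
    API_KEY = "YOUR_API_KEY"  # placeholder, not a real key

    def download_street_view(out_path, pano_id=None, lat=None, lon=None,
                             heading=0, pitch=0, fov=90, size="640x640"):
        """Fetch a street-view crop by panorama ID or by latitude/longitude."""
        params = {"size": size, "heading": heading, "pitch": pitch,
                  "fov": fov, "key": API_KEY}
        if pano_id is not None:
            params["pano"] = pano_id
        else:
            params["location"] = f"{lat},{lon}"
        resp = requests.get(STREETVIEW_URL, params=params, timeout=30)
        resp.raise_for_status()
        with open(out_path, "wb") as f:
            f.write(resp.content)

    # Example using the panos.txt sample row shown above.
    download_street_view("AXD-LhS8HF3v1XajBnV2nA.jpg",
                         pano_id="AXD-LhS8HF3v1XajBnV2nA", heading=90, fov=75)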
