seeingSpace

A lightfield botanist's guide to neural rendering


Sehender Raum / Seeing Space

Catching up with newer research in image-based rendering: a TL;DR on how traditional computer graphics will change with neural rendering, and how it fits with computer vision, machine learning, and capture hardware.

With neural rendering, computer graphics and vision might be heading for a moment where we can see and remove some limiting assumptions about what the field is about. Some disjoint pieces may just fall into place: computer graphics and vision now have a shared framework rather than sitting in their own little boxes. We got things wrong, and the possibilities are exciting. I spent an eternity on tools for generating photoreal 3D environments; now it is as easy as Snapchat. We can have a real-time Google Earth. I am allowed to be damned excited.

The content has moved to the wiki of this repo.

Don't feel like reading and just want to try NeRFs yourself? See Getting started and NeRF frameworks.

Overview and papers:

- Nice intro video: https://www.youtube.com/watch?v=isKbsNKArJU
- NeRF 101 video: https://youtu.be/sr1LG47J5uc?si=l4K59vOOlJyRv6KH

1) Understanding neural rendering (START HERE)

2) NeRF (radiance fields) basics, frameworks, and real-world uses

3) NeRF for 3D mapping: aka Google Live View and Apple Fly Around

4) NeRF editing: relighting, geometry extraction, scene segmentation

5) Dynamic and generative NeRFs: text-to-NeRF

6) NeRF rendering, compositing, and web viewing

7) Related fields (photogrammetry, LIDAR, SLAM, etc.)

Hands on:

Join the NeRF Discord at https://discord.com/invite/atzsCcAXEh or the Nerfstudio Discord at https://discord.gg/jKBErnzw and build the future with us.

Why does it matter? It might be the biggest change to graphics since the mid-1980s: we can now render lightfields, the holy grail of graphics. Light fields have been known since 1908, but until three years ago they were just too expensive and impractical to create and render ("flying cars": possible, but why do it?). You used to need at least the camera in the picture below; now you can render one from 30 frames of compressed video from YouTube and a gaming machine or the cloud...
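To make "rendering a radiance field" concrete, here is a minimal NumPy sketch of the volume-rendering step at the heart of NeRF: compositing per-sample densities and colors along a camera ray. The densities and colors below are made-up stand-ins for what a trained network would output, not any particular implementation.

```python
import numpy as np

def render_ray(sigmas, colors, deltas):
    """Composite densities and colors along one ray using the
    standard NeRF volume-rendering quadrature."""
    # Opacity of each segment: alpha_i = 1 - exp(-sigma_i * delta_i)
    alphas = 1.0 - np.exp(-sigmas * deltas)
    # Transmittance T_i: probability the ray reaches sample i unoccluded
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    weights = trans * alphas                  # contribution of each sample
    rgb = (weights[:, None] * colors).sum(axis=0)
    return rgb, weights

# Toy ray: 4 samples, with a dense "red" blob at the third sample
sigmas = np.array([0.0, 0.1, 50.0, 0.1])      # invented densities
colors = np.array([[0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0],
                   [1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])          # invented RGB per sample
deltas = np.full(4, 0.25)                     # spacing between samples
rgb, weights = render_ray(sigmas, colors, deltas)
# The dense red sample dominates, and almost nothing behind it contributes.
```

Training a NeRF is then just fitting the network so that rays rendered this way reproduce the input photos.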

Getting hands on: Google Colab setup for NVIDIA Instant NeRF, so you can try before buying a new GPU. Step by step.
