This is fantastic. Having an underlying 3D representation should make this viable for videogames. This is not super dissimilar to what Meta showed at Connect a few days ago, although I noticed Meta's version of this had some sort of normal map generated as well?
Generating normals is not particularly difficult once you have triangles. I believe the method is to take one corner of the triangle, get the two vectors pointing towards the two other corners, and then take the cross product of these two vectors. This gives you a new vector that is perpendicular to the triangle's face, i.e. the face normal (you then normalize it to unit length, and the order of the two vectors determines which side of the triangle it points out of).
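For a single triangle, a minimal sketch in Python/numpy (my own illustration, nothing here is from the article):

    import numpy as np

    def face_normal(a, b, c):
        # Unit normal of the triangle with corners a, b, c.
        # Winding order matters: swapping b and c flips the direction.
        a, b, c = (np.asarray(p, dtype=float) for p in (a, b, c))
        n = np.cross(b - a, c - a)        # perpendicular to both edge vectors
        length = np.linalg.norm(n)
        if length == 0.0:                 # degenerate (zero-area) triangle
            raise ValueError("degenerate triangle")
        return n / length                 # normalize to unit length

    # A triangle lying in the XY plane gets a +Z (or -Z, depending on winding) normal:
    print(face_normal([0, 0, 0], [1, 0, 0], [0, 1, 0]))  # -> [0. 0. 1.]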
There's little point in rendering normals computed from a triangle mesh to a normal map. The point of using textures is that they enable greater detail without having to add many vertices. With the right normal map, you can make a flat triangle look curved or bumpy.
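To illustrate why (a toy sketch of my own, assuming simple Lambertian diffuse shading): per-pixel lighting uses the normal fetched from the texture instead of the flat geometric normal, so intensity can vary across a single flat triangle.

    import numpy as np

    def lambert(normal, light_dir):
        # Diffuse intensity for a unit-length normal and light direction
        return max(np.dot(normal, light_dir), 0.0)

    light_dir = np.array([0.577, 0.577, 0.577])   # roughly normalized light direction
    flat_normal = np.array([0.0, 0.0, 1.0])       # the triangle's geometric normal
    mapped_normal = np.array([0.3, 0.0, 0.954])   # a perturbed normal read from the map

    print(lambert(flat_normal, light_dir))    # same value across the whole flat triangle
    print(lambert(mapped_normal, light_dir))  # varies per pixel -> apparent surface detail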
Computing a normal map from video that still looks good if you change the lighting is a bit more difficult.
> There's little point in rendering normals computed from a triangle mesh to a normal map.
I disagree. It's a common workflow that I've used many times. The idea is that you bake the normals from a high-poly model with millions of verts into a texture, and then apply that texture to a low-poly model with matching UVs.
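For what it's worth, the last step of such a bake amounts to re-expressing the sampled high-poly normal in the low-poly surface's tangent space. A rough sketch of that encoding (my own illustration, assuming an orthonormal tangent/bitangent/normal basis):

    import numpy as np

    def encode_tangent_space_normal(hi_normal, tangent, bitangent, lo_normal):
        # Project a sampled high-poly world-space normal into the low-poly
        # TBN basis and pack it into the usual 0..255 RGB normal-map encoding.
        tbn = np.stack([tangent, bitangent, lo_normal]).astype(float)  # rows = T, B, N
        n = tbn @ np.asarray(hi_normal, dtype=float)                   # world -> tangent space
        n /= np.linalg.norm(n)
        return np.round((n * 0.5 + 0.5) * 255).astype(np.uint8)        # [-1, 1] -> [0, 255]

    # A high-poly normal equal to the low-poly normal encodes as the familiar
    # "flat" normal-map blue, (128, 128, 255):
    print(encode_tangent_space_normal([0, 0, 1], [1, 0, 0], [0, 1, 0], [0, 0, 1]))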
> Computing a normal map from video that still looks good if you change the lighting is a bit more difficult.
Yes, I agree. But it's also difficult to generate a clean mesh, and we can see that they've succeeded in doing that. The point I tried (and failed) to make is that if they can generate a mesh like that from video, then generating normals shouldn't be too much of a stretch.