I’m working on an online-capable, full-scale space flight simulation. My goal is, among other things, to be able to board the Starship at the SpaceX base on Earth and experience the entire flight into orbit, to the Moon, or to Mars, online. Thanks to WebGPU, the new web graphics standard, such a project is possible for the first time.

Since the Starship begins its journey on Earth, the Earth must of course be rendered at its original scale to give a realistic impression. In both the Mars and the Earth screenshots, the sun is not an image but an actual geometric 3D sphere with the real solar radius at the real distance from the sun. So if you flew long enough, you could actually fly to the sun.
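To give a rough idea of the scales involved, here is a minimal three.js-style sketch, assuming one scene unit equals one meter; the constant names, segment counts and camera settings are only illustrative, not taken from my actual code:

```ts
import * as THREE from 'three';

// Real physical dimensions in meters (1 world unit = 1 m is an assumption here).
const SUN_RADIUS = 6.9634e8;        // ~696,340 km
const ASTRONOMICAL_UNIT = 1.496e11; // mean Earth-Sun distance, ~149.6 million km

// The sun as an actual geometric sphere, not a billboard or a background image.
const sun = new THREE.Mesh(
  new THREE.SphereGeometry(SUN_RADIUS, 64, 32),
  new THREE.MeshBasicMaterial({ color: 0xfff4e0 })
);

// Placed at the real distance from the (Earth-centered) origin.
sun.position.set(ASTRONOMICAL_UNIT, 0, 0);

const scene = new THREE.Scene();
scene.add(sun);

// At distances like these, depth precision needs extra care (for example a
// logarithmic depth buffer) to avoid z-fighting near the planet's surface.
const camera = new THREE.PerspectiveCamera(60, 16 / 9, 0.1, 2 * ASTRONOMICAL_UNIT);
```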

The screenshots are from my first attempt to show all of Mars and Earth at their original scale with real map material, approaching from the distance of space down to the planet’s surface. For this I work with three.js, and it already works quite well. In the screenshots I used a stitching LOD geometry system and a large-scale map loader. But for what I have in mind, I had to improve the whole thing significantly so that I can use much higher-resolution map material even more efficiently. To that end I worked on two very important system components that Unity and the Unreal Engine also use to display highly detailed landscapes: a CDLOD geometry system and a virtual texture system.

CDLOD (Continuous Distance-Dependent Level of Detail) Geometry System

In the left picture you can see the stitching LOD geometry system that I used to create the planets. The skirts turned out to be unnecessary because the stitching works so well that I never saw any gaps between the LODs. The disadvantage of stitching LOD is that as you move, new areas suddenly pop in at higher resolution, and that can be handled more elegantly. So I asked myself: how do Unity and the Unreal Engine do this with their landscapes? The answer was a CDLOD geometry system. So I got to work and developed one myself. To give a better impression of it, I made a video. Since I had also started wanting to simulate the ocean, I rendered it in the video as a moving water surface.

New areas initially appear with the same resolution as the previous, coarser area, so there is no noticeable change, and then, depending on the camera position, they morph into the higher-resolution geometry. The whole thing runs without any problems in the Chrome browser on my Samsung Galaxy Tab S8.
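To illustrate the morphing idea, here is a minimal sketch of the classic CDLOD vertex morph, written in TypeScript for readability; in practice this runs in the vertex shader. The names and the 0.7 morph-start fraction are illustrative assumptions, not values from my implementation:

```ts
// Each quadtree LOD level covers a distance range [start, end].
interface LodRange { start: number; end: number; }

// Morph factor: 1 → fully fine-level grid, 0 → fully coarse-level grid.
// Morphing begins at a fraction of the range so most of the range stays sharp.
function morphFactor(distanceToCamera: number, range: LodRange, morphStart = 0.7): number {
  const morphBegin = range.start + (range.end - range.start) * morphStart;
  const t = (distanceToCamera - morphBegin) / (range.end - morphBegin);
  return 1 - Math.min(Math.max(t, 0), 1);
}

// Morph a vertex of the fine grid toward the position it would have on the
// coarser grid, so new detail fades in continuously instead of popping.
function morphVertexXZ(
  x: number, z: number,
  gridSpacing: number, // spacing of the fine grid in world units
  morph: number
): [number, number] {
  const coarse = gridSpacing * 2;
  const coarseX = Math.round(x / coarse) * coarse;
  const coarseZ = Math.round(z / coarse) * coarse;
  // Interpolate between the coarse-grid position and the true fine position.
  return [coarseX + (x - coarseX) * morph, coarseZ + (z - coarseZ) * morph];
}
```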

A Virtual Texture System

An online-capable virtual texture system for WebGPU was quite a challenge. For this I had to initiate some extensions in three.js together with the developers. But what exactly is a virtual texture system, and why is it so incredibly valuable? Another name for virtual textures is megatextures. With a virtual texture system it is possible to use gigantic textures (16k, 32k, 64k, …) highly efficiently. The best-known example is Google Maps. Since the camera can never capture the entire area at maximum resolution, it makes no sense (and in terms of computing and memory requirements is not even possible) to load the entire texture. A virtual texture system only loads the parts of a texture that lie in the camera’s field of view and, depending on the distance, only at the necessary resolution. The video at the bottom left, from the Unreal Engine documentation (or the associated source), illustrates this. And since I wanted to develop something just as good, right next to it is a video of my virtual texture system applied to a complex 3D model by grafxbox, a good-looking spaceship that I bought on CGTrader. At the moment I’m in the process of integrating the normal texture as well, and after that the specular texture, so that light reflections can also be fully appreciated.
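To make the principle concrete, here is a minimal sketch of the page selection at the heart of a virtual texture system, with illustrative page and texture sizes (the names and numbers are assumptions, not my actual implementation); a real system does the per-pixel part on the GPU and streams the requested pages into a physical cache texture:

```ts
const PAGE_SIZE = 128;          // texels per page edge (assumption)
const VIRTUAL_SIZE = 65536;     // 64k virtual texture (assumption)
const MAX_MIP = Math.log2(VIRTUAL_SIZE / PAGE_SIZE); // coarsest level = one page

interface PageId { x: number; y: number; mip: number; }

// Pick a mip level from the on-screen texel footprint, like hardware mip selection.
function selectMip(texelsPerPixel: number): number {
  const mip = Math.log2(Math.max(texelsPerPixel, 1));
  return Math.min(Math.max(Math.round(mip), 0), MAX_MIP);
}

// Map a UV coordinate and a mip level to the page that contains it.
function pageForUv(u: number, v: number, mip: number): PageId {
  const pagesPerSide = VIRTUAL_SIZE / PAGE_SIZE / (1 << mip);
  return {
    x: Math.min(Math.floor(u * pagesPerSide), pagesPerSide - 1),
    y: Math.min(Math.floor(v * pagesPerSide), pagesPerSide - 1),
    mip,
  };
}

// Collect pages that are visible but not yet resident, so only those get loaded.
const resident = new Set<string>();
function requestVisiblePage(u: number, v: number, texelsPerPixel: number, queue: PageId[]) {
  const page = pageForUv(u, v, selectMip(texelsPerPixel));
  const key = `${page.mip}/${page.x}/${page.y}`;
  if (!resident.has(key)) queue.push(page);
}
```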

A virtual texture system does for textures something similar to what a CDLOD system does for geometry. In addition, the texture fragments within the camera’s field of view must be loaded on demand, and this has to work quickly and efficiently; I use multithreading for this. I downloaded the highest-resolution maps of Earth from my NASA account. These are enormous amounts of data, but with my virtual texture system I can now use them to significantly improve my previous program, which already runs on my tablet as you can see in the screenshots above. With it I could basically build my own Google Maps, but that’s not my goal.

Only with a virtual texture system can I texture a Starship in such detail that you can not only see it from the outside but also enter it and experience it from the inside. 3D models with 32k textures and more make it possible to experience a new world online. On the Starship, a 16k texture would achieve a resolution of about 3 mm per texture pixel, and a 32k texture about 1.5 mm per texture pixel. I have seen several times in the three.js forum that people want to be able to use high-resolution textures. On the one hand there are limits imposed by the browser, and on the other hand there is the enormous resource consumption of very large textures themselves. A virtual texture system solves a lot of these problems, and I am prepared to make all of this open source and to document it cleanly so that everyone can work with it quickly.
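On the multithreading: the usual web way to get texture decoding off the main thread is a pool of Web Workers. Here is a minimal sketch of that idea; the file name, tile URL scheme and message shape are hypothetical, not my actual loader:

```ts
// tile-worker.ts (hypothetical file): fetch and decode one page off the main thread.
onmessage = async (e: MessageEvent<{ url: string; key: string }>) => {
  const blob = await (await fetch(e.data.url)).blob();
  const bitmap = await createImageBitmap(blob);
  // ImageBitmap is transferable, so it is handed back without copying.
  postMessage({ key: e.data.key, bitmap }, [bitmap]);
};

// Main thread (hypothetical names): request a missing page and upload the result.
const worker = new Worker(new URL('./tile-worker.ts', import.meta.url), { type: 'module' });
worker.postMessage({ url: 'tiles/3/5/7.png', key: '3/5/7' });
worker.onmessage = (e) => {
  // Copy e.data.bitmap into the physical page-cache texture and mark the page resident.
};
```

The texel-density figures above are simple arithmetic: texel size is the covered surface length divided by the texture resolution. The ~50 m unwrapped extent for the Starship used below is my assumption for illustration:

```ts
function texelSizeMm(coveredLengthMeters: number, textureResolution: number): number {
  return (coveredLengthMeters / textureResolution) * 1000;
}

console.log(texelSizeMm(50, 16384).toFixed(1)); // ≈ 3.1 mm per texel with a 16k texture
console.log(texelSizeMm(50, 32768).toFixed(1)); // ≈ 1.5 mm per texel with a 32k texture
```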
