Imagine a weather app that lets you see detailed 3D models of the massive cloud structures above: the elegant wisps of cirrus, the bulky storm clouds rolling across the sky… we have the technology to do this, we just haven’t done it yet.
The concept is simple: a network of cameras pointed at the sky, spread far apart to capture different perspectives, with software that combines all the images into a live, interactive 3D model.
This technique is called photogrammetry, and it is already widely used with aerial drones to make 3D models of surface features.
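The geometric core of that process can be sketched in a few lines. Once two distant cameras have matched the same cloud feature in their images, its position can be estimated by triangulating the two viewing rays. The camera positions and the feature here are hypothetical numbers for illustration; a real photogrammetry pipeline would also handle lens calibration and feature matching, which this sketch assumes are already done.

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def scale(v, s): return tuple(x * s for x in v)
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def triangulate(cam_a, ray_a, cam_b, ray_b):
    """Estimate a 3D point from two viewing rays.

    The rays from two real cameras rarely intersect exactly, so we
    return the midpoint of their closest approach.
    """
    u, v = normalize(ray_a), normalize(ray_b)
    w = sub(cam_a, cam_b)
    a, b, c = dot(u, u), dot(u, v), dot(v, v)
    d, e = dot(u, w), dot(v, w)
    denom = a * c - b * b          # approaches 0 when the rays are parallel
    s = (b * e - c * d) / denom    # parameter along ray A
    t = (a * e - b * d) / denom    # parameter along ray B
    p = add(cam_a, scale(u, s))    # closest point on ray A
    q = add(cam_b, scale(v, t))    # closest point on ray B
    return scale(add(p, q), 0.5)

# Two hypothetical sky cameras 1 km apart, both sighting a cloud
# feature 2 km overhead (all units in metres).
cloud = (0.0, 0.0, 2000.0)
cam_a, cam_b = (0.0, 0.0, 0.0), (1000.0, 0.0, 0.0)
est = triangulate(cam_a, sub(cloud, cam_a), cam_b, sub(cloud, cam_b))
print(est)
```

Repeating this for thousands of matched features across many cameras is what builds up the full 3D cloud model; the farther apart the cameras, the better the depth estimate.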
In terms of hardware, “whole sky cameras” (wiki) could capture hemispheric images of everything above the horizon. Each camera would then be set up as a live stream (similar to WU’s weather webcam system), with location data attached to the feed.
Initially, a company wanting to pioneer this idea could blanket one specific area with webcam coverage as a proof of concept to showcase the 3D models. As the network grows, the site would automatically merge other people’s webcams into a global, crowdsourced model.
Awe-inspiring sculptures of colorful sunsets, meandering tornadoes, and billowing volcanic eruptions: that’s what we can capture with this technology.
Beyond satisfying the curious mind, this technology would also be valuable for research: atmospheric studies (improved meteorological prediction, fluid dynamics), flight-simulation models, and more.
If you'd like to contribute to this idea, feel free to share your GitHub project in the comment section below, and with your permission I'll embed it in this article.