We are a group of 4 high-school students and 2 electrical engineering students interested in science, space, physics, music, programming, etc.
The inspiration for this project came from the challenges presented to us by NASA. All of the team members were amazed and inspired by the James Webb Space Telescope, which is scheduled to launch in 2021. This was a great opportunity for us to make our ideas come true.
GitHub: https://github.com/prometheussac/webby
Webby is, as mentioned, an interactive app made to teach people about the James Webb Space Telescope and about space in general. The game itself is simple and the language used is not complicated, so it is a good starting point for anyone who wants to learn about space. From our own childhoods we know that it is easier to learn through a game, and that is what we made: an educational game.
The game takes you through all of the telescope's parts, shows how they fit together, and includes some interesting facts about each piece.
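The real game code is in the GitHub repository linked above; purely as an illustration of the idea, the part-by-part walkthrough could be driven by a small data structure like the following C# sketch (the class, names and ordering here are hypothetical and not taken from the Webby source):

using System;
using System.Collections.Generic;

// Hypothetical model of the part data the game walks through.
// Names, order and facts are illustrative; the real Webby code may differ.
class TelescopePart
{
    public string Name;
    public string Fact;
    public TelescopePart(string name, string fact) { Name = name; Fact = fact; }
}

class PartCatalog
{
    // Pieces in the order the player assembles them, each with a fun fact.
    static readonly List<TelescopePart> AssemblyOrder = new List<TelescopePart>
    {
        new TelescopePart("Primary mirror", "18 gold-coated beryllium segments."),
        new TelescopePart("Sunshield", "Five layers, roughly the size of a tennis court."),
        new TelescopePart("Spacecraft bus", "Houses power, propulsion and communications.")
    };

    static void Main()
    {
        for (int i = 0; i < AssemblyOrder.Count; i++)
        {
            TelescopePart part = AssemblyOrder[i];
            Console.WriteLine("Step " + (i + 1) + ": attach the " + part.Name
                + " - did you know? " + part.Fact);
        }
    }
}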
There are many things we can and want to improve. The first improvement will come once the telescope is launched. As soon as the telescope launches and begins its long voyage across the cosmos, the game will gain an interface that combines Virtual Reality and haptic feedback inside a virtually designed "spaceship". The "passengers", i.e. the people using the game, will be able to look through the "windows" of the spaceship, and the images they see will be the actual images sent back by the telescope itself. Using state-of-the-art haptic feedback technology they will also be able to interact with both the spaceship and each other, bringing cooperation and collaboration to another dimension. Through a simple seek-and-find action, users of the app will be able to see every bit of information the telescope has gathered about a planet, and they will also see a representation of what the planet could actually look like in real life, produced by an algorithm that transforms planetary and stellar data into a visual representation (a rough sketch of such a mapping is shown below).
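That visualization algorithm does not exist yet; purely as an illustration, its core would be a mapping from measured quantities to display properties. A very rough C# sketch, assuming the only inputs are the host star's effective temperature in kelvin and the planet's radius in Earth radii (the names, thresholds and colours here are placeholders chosen for this example):

using System;

// Crude, hypothetical sketch: map star temperature and planet size to display values.
// The thresholds follow the usual spectral-class colour conventions (M stars look
// orange-red, G stars yellow-white, A/B stars blue-white); a real renderer would
// use a proper black-body colour model.
class PlanetVisualizer
{
    // Approximate on-screen colour of the host star from its effective temperature (K).
    static string StarColorHex(double temperatureK)
    {
        if (temperatureK < 3700) return "#FF6A00";   // M-class: orange-red
        if (temperatureK < 5200) return "#FFB347";   // K-class: orange
        if (temperatureK < 6000) return "#FFF4C1";   // G-class: yellow-white (Sun-like)
        if (temperatureK < 7500) return "#FFFFFF";   // F-class: white
        return "#BBD0FF";                            // A/B-class: blue-white
    }

    // Scale planet radius (in Earth radii) to a pixel radius for drawing.
    static int PlanetPixelRadius(double earthRadii)
    {
        return (int)Math.Max(2, Math.Round(10 * Math.Sqrt(earthRadii)));
    }

    static void Main()
    {
        // Example: a roughly Sun-like star (5778 K) with a 1.6 Earth-radius planet.
        Console.WriteLine("Star colour: " + StarColorHex(5778));
        Console.WriteLine("Planet radius on screen: " + PlanetPixelRadius(1.6) + " px");
    }
}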
We used cell phones and laptops as hardware tools, but the main focus of our project was the software. We used Microsoft Visual Studio Express (C#), Photoshop, Postman...
https://images.nasa.gov/docs/images.nasa.gov_api_docs.pdf
https://jwst.nasa.gov/content/features/3dInteractive.html
https://webbtelescope.org/quick-facts/mission-launch-quick-facts
https://www.nasa.gov/mission_pages/webb/main/index.html
https://apod.nasa.gov/apod/image/1910/PIA12797-full.jpg
https://apod.nasa.gov/apod/image/1901/sombrero_spitzer_3000.jpg
We also used photos from the NASA sites (a short sketch of querying the NASA image library API from the first link above is shown below).
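The NASA image and video library documented in the first resource above exposes a simple HTTP search endpoint at images-api.nasa.gov. Below is a minimal C# sketch of such a query; the search term, the fields printed and the result limit are only examples, not the exact calls Webby makes:

using System;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

// Minimal sketch of querying the NASA image library API (images-api.nasa.gov),
// the service documented in the PDF linked above.
class NasaImageSearch
{
    static async Task Main()
    {
        using var client = new HttpClient();
        string url = "https://images-api.nasa.gov/search"
                   + "?q=james%20webb%20space%20telescope&media_type=image";

        string json = await client.GetStringAsync(url);

        // The response is Collection+JSON: collection.items[], with metadata in
        // each item's data[] array and thumbnail URLs in its links[] array.
        using JsonDocument doc = JsonDocument.Parse(json);
        JsonElement items = doc.RootElement.GetProperty("collection").GetProperty("items");

        int shown = 0;
        foreach (JsonElement item in items.EnumerateArray())
        {
            string title = item.GetProperty("data")[0].GetProperty("title").GetString();
            string thumb = item.GetProperty("links")[0].GetProperty("href").GetString();
            Console.WriteLine(title + " -> " + thumb);
            if (++shown == 5) break;  // just show the first few results
        }
    }
}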
Data we will be using to improve the project:
https://developers.google.com/vr/discover/360-degree-media
https://developers.google.com/vr/develop/web/vrview-web
and more ...