
Developers:
Marcos Urios: https://marcosurios.com
Nestor Sabater: https://nsabater.com
Elaine Zhang
“Developing a wider approach to human-machine interaction (HMI), with a special focus on upcoming space journeys.”
Improving control of information and systems, simplifying interactions.
Target user: any space visitor.
Clients: brands specialized in Human-Machine Interaction, both inside and outside the space sector. NASA could also make use of our services.
By combining a set of technologies such as Leap Motion for motion tracking, Project North Star / Microsoft HoloLens, real-time rendering engines, GPS, AI, and a set of infrared sensors, we can achieve a very accurate mixed-reality environment that improves the workflow of astronauts in outer space, as well as taking everyday life to the next level.
- Hands-free device control: controlling a system hands-free is already possible today.
- Holographic display of valuable information: already real and achievable with current hardware.
- Spatial orientation helper
- Remote systems control
- Open to new technologies
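As a rough illustration of the hands-free control idea, the sketch below maps a recognized gesture to a system command. The gesture names and command mapping are our own assumptions for this sketch, not part of any real Leap Motion or HoloLens API.

```python
# Illustrative sketch only: gesture names and commands are hypothetical
# placeholders, not identifiers from an actual SDK.

GESTURE_COMMANDS = {
    "swipe_left": "previous_panel",   # move to the previous info panel
    "swipe_right": "next_panel",      # move to the next info panel
    "pinch": "select",                # confirm the highlighted item
    "open_palm": "show_menu",         # bring up the main menu
}

def dispatch(gesture: str) -> str:
    """Map a recognized gesture to a system command; unknown gestures are no-ops."""
    return GESTURE_COMMANDS.get(gesture, "noop")

print(dispatch("pinch"))  # select
print(dispatch("fist"))   # noop
```

In a real deployment, the tracking layer (e.g. Leap Motion) would emit the gesture events, and the commands would drive the helmet display or a remote system.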
Of course, this is just a preview developed in ~32 hours, so it is far from complete. More tests, UI improvements, and technology combinations are still required.
Testing all the technologies together is the next step toward completing this project; unfortunately, we had no access to Project North Star devices at the time of this contest.
There are many solutions for this; the ones best integrated with the technologies above are Unreal Engine and Unity 5.
Used for real-time hand motion tracking, Leap Motion allows us not only to track movement accurately but also to detect directions and gestures.
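To show how direction and gesture detection can be derived from tracked positions, the sketch below classifies a horizontal swipe from a sequence of palm positions. This is a simplified stand-in we wrote for illustration; it does not use the actual Leap Motion SDK, and the 0.15 m threshold is an arbitrary assumption.

```python
# Hypothetical sketch: classify a horizontal swipe from palm positions.
# Each position is an (x, y, z) tuple in metres; threshold is arbitrary.

def classify_swipe(palm_positions, threshold=0.15):
    """Return 'right', 'left', or None based on net x-axis displacement."""
    if len(palm_positions) < 2:
        return None
    dx = palm_positions[-1][0] - palm_positions[0][0]
    if dx > threshold:
        return "right"
    if dx < -threshold:
        return "left"
    return None

# Palm moves ~0.20 m to the right across three frames.
frames = [(0.00, 0.10, 0.20), (0.08, 0.10, 0.20), (0.20, 0.10, 0.20)]
print(classify_swipe(frames))  # right
```

A production system would also smooth the position stream and gate on velocity, but the displacement test above captures the core idea.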

This setup facilitates integration into an astronaut's helmet, displaying images directly on the inside of the helmet visor.
The North Star headset is published under the GPL license.
You can also access the project on GitHub.
NASA 3D models
Unreal Engine free models
Astronaut animations: Adobe Mixamo