Revisiting the Moon will involve many challenges. We will learn how to use the Moon's resources to sustain missions of greater distance and duration. It will not be easy: radiation, isolation, and dangerous, unexpected environments will test our limits like never before. We will face these challenges and expand the boundaries of human presence in the interest of exploration.
Our approach inspires people by giving them a platform that stylizes their images with random or custom moon images. The results can be used for any purpose, e.g. in games, on clothing, etc.
We offer random moon pictures provided by the NASA Image and Video Library API: users simply click the "Random Style" button to get a new style image.
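The snippet below is a minimal sketch of how such a random style image could be fetched from the NASA Image and Video Library's public search endpoint; the query, field access, and helper name are illustrative and not the exact code of our platform.

import random
import requests

def random_moon_image_url(query="moon"):
    # The public search endpoint returns a JSON "collection" whose items
    # carry preview links to the actual image files.
    resp = requests.get(
        "https://images-api.nasa.gov/search",
        params={"q": query, "media_type": "image"},
        timeout=10,
    )
    resp.raise_for_status()
    items = resp.json()["collection"]["items"]
    return random.choice(items)["links"][0]["href"]

print(random_moon_image_url())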
For the stylization itself we use deep learning, specifically a neural style transfer algorithm that composes one image in the style of another using a neural network. Neural style transfer is an optimization technique that takes two images, a content image and a style reference image (in our case a moon picture), and blends them together so that the output looks like the content image, but “painted” in the style of the style reference image.
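To make the two roles concrete, here is a minimal sketch of the content and style loss terms that neural style transfer balances when optimizing the output image. The feature maps are random placeholders standing in for activations of a pretrained CNN such as VGG, and the weights and function names are illustrative, not the ones used by this project.

import numpy as np

def gram_matrix(features):
    # features: (height, width, channels) activations from one CNN layer.
    h, w, c = features.shape
    flat = features.reshape(h * w, c)
    # Channel-to-channel correlations summarize the layer's "style".
    return flat.T @ flat / (h * w * c)

def content_loss(content_feat, output_feat):
    # The output should reproduce the content image's features.
    return float(np.mean((content_feat - output_feat) ** 2))

def style_loss(style_feat, output_feat):
    # The output should reproduce the style image's feature correlations.
    return float(np.mean((gram_matrix(style_feat) - gram_matrix(output_feat)) ** 2))

# Random placeholders; in practice these come from the same CNN layer(s)
# applied to the content image, the style image, and the current output.
rng = np.random.default_rng(0)
content_feat = rng.standard_normal((32, 32, 64))
style_feat = rng.standard_normal((32, 32, 64))
output_feat = rng.standard_normal((32, 32, 64))

# Illustrative weights; training trades the two terms off against each other,
# much like the --content-weight flag in the commands below.
total = 1.5e1 * content_loss(content_feat, output_feat) + 1e2 * style_loss(style_feat, output_feat)
print("total loss to minimize over the output image:", total)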
In addition to pictures, our platform also supports stylizing videos.
Technologies used:
Requirements:
How to run the project:
In order to train a new style transfer network:
python style.py --style path/to/style/img.jpg \
--checkpoint-dir checkpoint/path \
--test path/to/test/img.jpg \
--test-dir path/to/test/dir \
--content-weight 1.5e1 \
--checkpoint-iterations 1000 \
--batch-size 20
In order to evaluate a style transfer network:
python evaluate.py --checkpoint path/to/style/model.ckpt \
--in-path dir/of/test/imgs/ \
--out-path dir/for/results/
You can even stylize a video:
python transform_video.py --in-path path/to/input/vid.mp4 \
--checkpoint path/to/style/model.ckpt \
--out-path out/video.mp4 \
--device /gpu:0 \
--batch-size 4
Source Code: https://github.com/Hulkstance/spaceapps