Tuesday 24 February 2015

More images of Nicolas Cage

I've refined my method a little and created a couple of images that I'm happy with.

It now fills in the background with unique images (though as of 25 February this isn't working correctly, due to upgrades to the algorithm that places the background).

These two images have been good tests so far.



This image is nearing the level that I want.

Tuesday 17 February 2015

Main work so far

I've been putting together a project that uses some image-processing techniques to produce a static image as its result.

I've also been using (whisper) image search engine scraping to download unique images from the web in bulk.

So far I have been able to produce an example image showing where I'm up to with this project:


As you can most likely see, I'm trying to recreate an image out of other unique images.

My next step is to fill the background with images first before anything else.
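The post doesn't show how a tile gets matched to part of the reference image, so here is a minimal sketch of one common approach: score each candidate image against a region by the distance between their mean colours, and place the best-scoring one. The function names and the mean-colour scoring are my assumptions, not the project's actual code.

```python
# Illustrative photomosaic matching: pick the tile whose average
# colour is closest to the average colour of a reference region.
# (assumed scoring scheme -- the post doesn't describe its own)

def mean_rgb(pixels):
    """Average (r, g, b) of a flat list of RGB tuples."""
    n = len(pixels)
    return (sum(p[0] for p in pixels) / n,
            sum(p[1] for p in pixels) / n,
            sum(p[2] for p in pixels) / n)

def score(region_pixels, tile_pixels):
    """Lower is better: squared distance between mean colours."""
    a = mean_rgb(region_pixels)
    b = mean_rgb(tile_pixels)
    return sum((x - y) ** 2 for x, y in zip(a, b))

def best_tile(region_pixels, tiles):
    """Greedily pick the tile that scores best against the region."""
    return min(tiles, key=lambda t: score(region_pixels, t))
```

A mostly-red region should pick a red tile over a blue one; the real project presumably uses a finer-grained comparison than a single mean per tile.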

Technical updates:
 - I want unique images, so I'm using perceptual hashing to check incoming images for uniqueness.
 - I was storing images in a flat list until I realised that the Hamming-distance uniqueness check would be better suited to a hashmap, so I've built a new data structure based on the assumption that I'll be computing the Hamming distance between the hashes held in the hashmap.
 - I've got some ideas in mind for trying to fill in the remaining space on the image:
     - Grid up the reference image into sections that each have the same number of pixels as the average image, place one image per area, and then do passes of one area per image (currently it only does multiple passes of one area per image).
     - Store all the placements performed during processing, sort them by score, then bump images that sit close to or under higher-scored images down the ranking until they are sufficiently separated. This would work best with some way of balancing separation against score.
 - I should probably try PyPy rather than the standard CPython interpreter I'm using currently, because while it's faster than doing this by hand, I'm doing quite a large amount of processing (specifically image processing and list comprehensions) in a language that is notoriously slow.
 - I need to eventually get this program onto a computer with more than 10 GB of RAM, because I can't scale images up as much as I'd like: as soon as the JPEGs get decompressed into memory (~50 MB each once decoded; a 4000×4000 RGB image is about 48 MB uncompressed), things die.
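The hashing and hashmap bullets above can be sketched together. This is my own illustration, not the project's code: `average_hash`, `UniqueIndex`, the distance threshold, and the chunking scheme are all assumptions. The useful trick is the pigeonhole principle: if two 64-bit hashes differ in at most 3 bits, then splitting each hash into 4 chunks guarantees at least one chunk is identical, so near-duplicate lookup only needs exact hashmap hits, and the full Hamming check runs only on those few candidates.

```python
# Sketch: 64-bit average hash + a chunk-bucketed hashmap so that
# "is there a hash within Hamming distance 3 of this one?" doesn't
# require scanning every stored hash. All names/thresholds assumed.

HASH_BITS = 64   # 8x8 average hash
MAX_DIST = 3     # at or under this Hamming distance = "duplicate"
CHUNKS = 4       # pigeonhole: <=3 differing bits across 4 chunks
                 # means at least one 16-bit chunk matches exactly

def average_hash(gray_8x8):
    """64-bit aHash from 64 grayscale values (0-255): each bit is
    whether that pixel is brighter than the image's average."""
    avg = sum(gray_8x8) / len(gray_8x8)
    bits = 0
    for v in gray_8x8:
        bits = (bits << 1) | (1 if v > avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def chunks_of(h):
    """Split a hash into (index, 16-bit value) bucket keys."""
    size = HASH_BITS // CHUNKS
    mask = (1 << size) - 1
    return [(i, (h >> (i * size)) & mask) for i in range(CHUNKS)]

class UniqueIndex:
    """Hashmap keyed on hash chunks; the expensive full Hamming
    check only runs on hashes sharing a chunk with the query."""

    def __init__(self):
        self.buckets = {}  # (chunk_index, chunk_value) -> [hashes]

    def is_duplicate(self, h):
        seen = set()
        for key in chunks_of(h):
            for other in self.buckets.get(key, []):
                if other not in seen:
                    seen.add(other)
                    if hamming(h, other) <= MAX_DIST:
                        return True
        return False

    def add(self, h):
        for key in chunks_of(h):
            self.buckets.setdefault(key, []).append(h)
```

The bucket keys include the chunk index so that chunk 0 of one hash can never collide with chunk 2 of another.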
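The first fill idea above (gridding the reference into cells whose area matches the average tile) comes down to a bit of arithmetic. A minimal sketch, with assumed function names, of how the grid dimensions and cell boxes might be derived:

```python
# Sketch: choose a rows x cols grid so each cell holds roughly
# avg_tile_pixels pixels, then enumerate the cell bounding boxes.
# (assumed helper names; the post doesn't show its placement code)
import math

def grid_dimensions(ref_w, ref_h, avg_tile_pixels):
    """Rows/cols so each cell has ~avg_tile_pixels pixels, keeping
    cells close to the reference image's aspect ratio."""
    cells = max(1, round((ref_w * ref_h) / avg_tile_pixels))
    cols = max(1, round(math.sqrt(cells * ref_w / ref_h)))
    rows = max(1, math.ceil(cells / cols))
    return rows, cols

def cell_boxes(ref_w, ref_h, rows, cols):
    """(left, top, right, bottom) box for every grid cell; integer
    division keeps the cells flush with the image edges."""
    return [(c * ref_w // cols, r * ref_h // rows,
             (c + 1) * ref_w // cols, (r + 1) * ref_h // rows)
            for r in range(rows) for c in range(cols)]
```

For example, a 1000×1000 reference with 10,000-pixel average tiles yields a 10×10 grid, i.e. one image per 100×100 cell for the first pass.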