The flare explanation doesn't seem quite right to me. Flare usually causes two kinds of artifacts: one adds haziness to the whole image; the other creates ghost artifacts shaped like the lens aperture. Neither applies to the iPhone 5 purple images, and flare won't turn blue into purple. Purple means a lack of green, and green is the dominant contributor to the luminance channel. If you check the purple haze in the images reported on CNN, the purple pixels are about [255, 150, 255] in sRGB space, so luminance is somehow not preserved in the final output image. Given that the purple haze appears in overexposed areas, it's odd that the green channel is the lowest while R and B are saturated. I suspect there may be a bug in the iPhone 5's camera image signal processing (ISP) pipeline, perhaps something like inappropriate handling of signal range during processing that causes unwanted per-channel clipping.
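To illustrate the kind of clipping bug I mean, here's a minimal sketch (my own toy model, not Apple's actual pipeline): a 3x3 color correction matrix with the negative off-diagonal terms typical of camera ISPs, applied after per-channel sensor clipping, with a second naive clip to the output range. The matrix values and the sensor triplet are made up for illustration; the point is that an overexposed, near-white highlight where R and B saturate slightly before G comes out magenta/purple rather than white, with the green channel suppressed and luminance not preserved.

```python
import numpy as np

# Hypothetical color correction matrix (CCM). Real ISP matrices also have
# negative off-diagonal terms like this; the exact values are illustrative.
CCM = np.array([
    [ 1.6, -0.4, -0.2],
    [-0.3,  1.5, -0.2],
    [-0.2, -0.4,  1.6],
])

def process(sensor_rgb, sensor_max=1.0):
    """Clip each channel to the sensor's full-well range, apply the CCM,
    then naively clip each channel again to the output range [0, 1]."""
    clipped = np.clip(sensor_rgb, 0.0, sensor_max)
    out = CCM @ clipped
    return np.clip(out, 0.0, 1.0)

# An overexposed highlight: R and B saturate the sensor, G is just below
# saturation (plausible for a very bright, slightly bluish light source).
hot = np.array([3.0, 0.9, 3.5])

out = process(hot)
print(out)  # green ends up lowest while R and B pin at 1.0 -> magenta

# Rec.709 luma of the reported [255, 150, 255] pixel: green carries most
# of the weight, so a suppressed green channel drags luminance down.
luma = 0.2126 * 255 + 0.7152 * 150 + 0.0722 * 255
print(luma)
```

Here the true scene color is essentially white, but because the clipping is done per channel at two different stages, the hue shifts toward purple instead of blowing out to neutral. A correct pipeline would detect that the sensor channels are saturated and desaturate the highlight toward white rather than trusting the matrixed values.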