As a photographer, I’m ashamed to say that I only recently learnt the term “computational photography” while reading up on the capabilities of the iPhone 11 Pro as a camera. I’m so impressed by the images I’m seeing out there, but I’m holding out for the iPhone 12 despite the temptation to upgrade from my iPhone 8 (which is very capable of producing amazing images, except that it’s limited by its field of view).
A photography-enthusiast-slash-engineer friend once asked me about using manual lenses, so I shared my knowledge and experience with them. But the conversation came to a stop when he started talking about the physics of photography.
“Sorry, people like me go by feels.”
I stopped the discussion because, as an art student, there was no way I could keep up. The science of photography is something I never really paid attention to because it’s not in my field of interest.
A good image conveys emotions. That’s all I care about.
Yet, ever since I learnt about computational photography, I’ve been very interested in what the future may hold for photographers and videographers like myself.
I asked the same friend if he thinks computational photography can replace mirrorless cameras and DSLRs, and whether the future of photography will be pocketable devices like our smartphones. This was what he had to say:
Hard to say. If [the] industry keeps working at it. And computers get more powerful, maybe. Like how CG replaced many props in movies. […] Physics still plays a part in CG. But there are physics engines (coded algorithms and calculations) to make things real. One day something as small as [a] smartphone can replace the big cameras – compare the current cameras to the huge camera obscuras [in the past]. [It’s] not too unimaginable.
That really made me wish the future were now.
The past year has seen me asking my travel buddies whether I should take my Fujifilm XT2 on my travels or travel light. I even conducted a poll on Instagram Stories, and the majority voted that I should take my camera with me. And so, I went with the majority.
But after a few overseas, non-work-related trips, I think I have found my answer.
During my trips, I found myself consistently reaching for my iPhone, which fits nicely into a small shoulder bag, far more often than I wished for my Fujifilm XT2, which was left in the hotel room most of the time because it’s just too cumbersome on days I wasn’t planning to shoot. As a bokeh whore, I sometimes do wish I had my XT2 with me, but there’s really nothing stopping me from producing interesting pictures with my iPhone. With all the photography and videography apps available these days, it’s not too difficult to create a picture good enough for personal viewing and social media.
Although the image quality is far from what my Fujifilm XT2 is capable of, what my iPhone 8 offers is speed of setup and light weight. Sometimes it’s just quicker to take out your smartphone, swipe to the camera app and press the shutter button, without having to fiddle with all the camera settings. Computational photography does most of the work for you. And decently so.
Apps I use:
Instagram | Lightroom | VSCO | Moment
If you’re looking for super sharp and clean images in low light, the iPhone might disappoint. However, I recently tried shooting with my iPhone 8 at Christmas Wonderland and was quite impressed by the amount of detail I could get in the pictures with the Moment app.
Taean, South Korea. Great dynamic range.
Jangtaesan in Daejeon, South Korea. After a long hike, the last thing I wanted to do was to pull out my XT2 (although I eventually did). This picture is good enough for my own viewing pleasure.
Going through these pictures only made me realise how much I would love to travel light with just an iPhone in my pocket, and I am really excited for what the next iPhone in 2020 will bring.
What are your thoughts on computational photography?