Daniel Sinclair @_DanielSinclair · Sep. 14, 2019 · 1 min read

One of the really interesting opportunities of having a wide-angle and a super-wide-angle lens on an iPhone, at least when dual capture is possible, is that you can shoot in both horizontal and vertical at the same time, and let crop algorithms handle it in post.

These are what the ratios between the three sensors on the iPhone 11 Pro look like. No matter which lens you're capturing with, or which direction you're holding the device, it could algorithmically pull out to capture both viable vertical and horizontal video or photographs.
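As a rough sketch of that pull-out crop, here's how you might compute the largest centered landscape and portrait crops that fit inside a wider sensor frame. The frame dimensions below are hypothetical, not the actual iPhone 11 Pro sensor specs:

```python
def centered_crop(frame_w, frame_h, ratio_w, ratio_h):
    """Largest centered crop with aspect ratio_w:ratio_h inside a frame_w x frame_h frame."""
    # Start by using the full width; shrink if the implied height overflows.
    crop_w = frame_w
    crop_h = frame_w * ratio_h // ratio_w
    if crop_h > frame_h:
        crop_h = frame_h
        crop_w = frame_h * ratio_w // ratio_h
    x = (frame_w - crop_w) // 2
    y = (frame_h - crop_h) // 2
    return x, y, crop_w, crop_h

# Hypothetical 4:3 super-wide capture frame, 4032 x 3024 pixels.
horizontal = centered_crop(4032, 3024, 16, 9)  # landscape 16:9 crop
vertical = centered_crop(4032, 3024, 9, 16)    # portrait 9:16 crop
```

Both crops come out of the same frame, which is the whole point: one capture, two orientations.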

I'm surprised Apple hasn't actually done this yet. For photographs, at least, a third-party developer could pull that off, though it would demand a new format. Really cool possibilities. The most interesting momentum in vertical since Spectacles.

The more sensors, the more possibilities. Can't wait to see what becomes possible as we encroach on Light-like devices.

It's unfortunate that Lytro didn't make it, & that their assets were sold off in a fire sale. Light-field cameras will surely miniaturize & reappear in consumer devices at some point in the future. What is possible at the high-end, like what was proposed w/ Lytro Cinema, is awesome:

The Lytro Illum 'wigglegrams' still have a bit of a cult following. It's really cool what you can capture digitally that you can't capture on film, even if they've always felt a bit too uncanny.  https://lissyellephotoblog.tumblr.com/post/94859825167/these-are-some-images-that-ive-taken-in 

There's something about the Nishika and Nimslo film lenticulars that makes them just so unique. The combination of unpredictable film grain with this really obscure and weird camera design is irreplaceable.

Similar to the Lytro's post-processed focal point & depth of field, the Light L16 can capture just enough light-field data to produce great results. Apple is largely leaning on ML edge detection & artificial blurring for this, and the results can be wacky. It improves with a larger array.
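A toy sketch of the artificial-blur approach (pure Python, hypothetical grayscale image and depth map): pixels whose estimated depth is far from the chosen focal plane get blurred, so any error in the ML-estimated depth/segmentation map blurs pixels that belong to the subject, which is where the wacky artifacts come from.

```python
def fake_bokeh(image, depth, focal_depth, threshold=0.1):
    """Box-blur pixels whose depth is far from the chosen focal plane.

    image: 2D list of grayscale values; depth: 2D list of 0..1 depth estimates.
    In a real pipeline the depth map comes from an ML model; mistakes there
    are what produce halos and smeared edges around the subject.
    """
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if abs(depth[y][x] - focal_depth) <= threshold:
                continue  # close to the focal plane: leave sharp
            # 3x3 box blur, clamped at the image borders
            vals = [image[j][i]
                    for j in range(max(0, y - 1), min(h, y + 2))
                    for i in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out
```

Real light-field data sidesteps the estimation step entirely, because the depth falls out of the capture itself.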
