
What Apple did last month could lead to better images from the 2020 iPhone cameras


Google's Pixel line of handsets has become known for its outstanding photographic capabilities thanks to Google's image-processing prowess. Apple, with an eye on the Pixel's reputation, took photography on the iPhone 11 and iPhone 11 Pro to a new level this year. A new ultra-wide camera was added along with a Night mode feature that helps users snap photos in low-light environments. Night mode is similar to the Pixel's Night Sight option, and many believe that Apple has surpassed Google to become the leader in low-light photography, though others say that Huawei's Night Shot Mode still leads in this area of smartphone photography.

Apple also might have topped the HDR+ system used on the Pixels, which increases dynamic range, reduces noise, and enhances the sharpness and colors of photos. Like HDR+, Apple's Deep Fusion system snaps a burst of pictures to help create the best image. With HDR+, these exposures are combined to produce a superior photo. With Deep Fusion, 24 million pixels are examined over one second, with AI employed to select the pixels that, when put together, produce a sharper photo with less noise.
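Neither Google nor Apple publishes the details of these pipelines, but the core idea behind burst photography can be sketched: capture several frames, align them, and merge them so that random sensor noise averages out. The snippet below is a deliberately naive illustration of that merging step (the function name and the plain-averaging approach are assumptions for this example), not Apple's or Google's actual algorithm, which aligns tiles, rejects motion, and weights pixels far more carefully.

```python
# Illustrative sketch only: a naive burst merge in the spirit of HDR+ / Deep Fusion.
# Assumes the frames have already been aligned to each other.
import numpy as np

def merge_burst(frames):
    """Average a burst of aligned exposures (each H x W x 3, float in [0, 1])."""
    stack = np.stack(frames, axis=0)
    # Averaging N frames reduces random noise by roughly sqrt(N).
    return stack.mean(axis=0)
```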

Apple traditionally purchases smaller niche companies

And Apple is continuing to work on improving the iPhone's cameras. Filings that surfaced today in the U.K. (via Bloomberg) reveal that Apple has acquired a company called Spectral Edge. The company uses machine learning to help produce sharper photographs with more accurate colors; a photo taken with an infrared sensor is blended with a regular photo to produce the enhanced image.
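Spectral Edge has not detailed its method publicly, so the following is only a rough illustration of the general idea of visible/infrared fusion: high-frequency detail recovered from a near-infrared frame is blended into the visible photo. The function name, parameters, and the simple high-pass approach are assumptions made for the sake of the example.

```python
# Hypothetical sketch of RGB/NIR fusion; Spectral Edge's actual technique is proprietary.
import numpy as np
from scipy.ndimage import gaussian_filter

def fuse_rgb_nir(rgb, nir, strength=0.5):
    """rgb: H x W x 3 in [0, 1]; nir: H x W in [0, 1]."""
    detail = nir - gaussian_filter(nir, sigma=2.0)      # high-frequency NIR detail
    fused = rgb + strength * detail[..., None]          # add detail to each color channel
    return np.clip(fused, 0.0, 1.0)
```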

It isn't clear how much Apple paid for the company. Filings show that the firm raised more than $5 million last year. The documents released today list Apple corporate lawyer Peter Denwood as a director of Spectral Edge, with the appointment taking place on November 8th. On that same date, a number of the company's directors were relieved of their positions. And in five days, a document will be made available related to the "withdrawal of a person with significant control" of the outfit.

 

Deep Fusion helped Apple take its photography to the next level this year

Outside of a few large acquisitions over the years, such as the $3 billion it spent to buy Beats Electronics back in 2014 and the $1 billion it spent to buy Intel's smartphone modem business earlier this year, Apple traditionally buys small, niche companies and quickly uses the purchase to add features to its products. A good example is the 2012 purchase of biometric firm AuthenTec, which resulted in the introduction of Touch ID on the iPhone 5s the very next year. The report notes that Spectral Edge is based in Cambridge, U.K., where Apple has opened offices over the last few years to work on AI capabilities for the Siri digital assistant. There is no guarantee that Apple will ever use the technology it has acquired with Spectral Edge; the company has been known to make purchases simply to acquire talent from a target firm. After all, none other than Apple CEO Tim Cook has said that Apple makes an acquisition every two to three weeks on average.

 

The picture on the right shows how Spectral Edge's technology enhances the colors of a photo taken with a smartphone. The photo first appeared in a 2016 TechCrunch story

Apple is already expected to add a Time of Flight (ToF) sensor to the 2020 iPhone models. This might dovetail with Spectral Edge's technology, since ToF also requires the use of an infrared beam. With ToF, the time it takes for this beam to bounce off a subject and return to the phone is used to calculate accurate depth information; the data can be used to improve AR capabilities, produce a more natural bokeh blur on portraits, and create a 3D map that could deliver a secure rear-facing facial recognition system.
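The underlying geometry is simple: depth is half the distance the infrared light travels on its round trip, so depth = (speed of light × round-trip time) / 2. Real ToF sensors typically infer that time from the phase shift of modulated light rather than timing a single pulse; the helper below (name and usage invented for illustration) just shows the basic relation.

```python
# Basic time-of-flight relation: depth is half the round trip of an infrared pulse.
SPEED_OF_LIGHT_M_S = 299_792_458.0  # metres per second

def tof_depth_m(round_trip_seconds):
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0

# Example: a round trip of ~6.67 nanoseconds corresponds to a subject about 1 metre away.
print(tof_depth_m(6.67e-9))  # ≈ 1.0
```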
 

 
