Getting to know Cinematic Mode in Apple’s new iPhones
The main-camera aperture on the two Pro models of the iPhone 13 series is wider than on last year's phones, moving from f/1.6 to f/1.5, and pixel size has grown from 1.7 microns to 1.9 microns. Beyond these improvements, however, Cinematic mode is probably the most attractive feature of the new lineup, and notably it is available not only on the two Pro models but also on the iPhone 13 and iPhone 13 mini. This mode makes it possible to change the depth-of-field effect and even shift the point of focus after a video has been recorded. To introduce it, Apple invited Kathryn Bigelow, the Oscar-winning director, and Greig Fraser, the Emmy-winning cinematographer, to its launch event.
Cinematic mode lets the user adjust the aperture (f-stop) to change the depth of field after a clip has been shot. It also allows both the point and the subject of focus to be changed, so the focal plane of a frame can be shifted as smoothly as in professional cinema. Cinematic mode works much like the regular video mode, recording 16:9 footage at 1080p and 30 frames per second. The difference is that instead of the resolution and frame rate, the f-number is shown in the upper-left corner of the screen; tapping it lets you control the depth of field, blurring the background by lowering the f-number or bringing more of the scene into focus by raising it.
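The link between the f-number and how blurred an out-of-focus point appears can be sketched with the standard thin-lens circle-of-confusion formula. This is a generic optics illustration with hypothetical numbers, not Apple's actual rendering pipeline:

```python
def coc_diameter_mm(focal_mm, f_number, focus_dist_mm, subject_dist_mm):
    """Thin-lens circle-of-confusion diameter: how blurred a point at
    subject_dist_mm appears when the lens is focused at focus_dist_mm."""
    aperture_mm = focal_mm / f_number          # lower f-number = wider aperture
    return (aperture_mm
            * abs(subject_dist_mm - focus_dist_mm) / subject_dist_mm
            * focal_mm / (focus_dist_mm - focal_mm))

# Lowering the f-number (opening the aperture) enlarges the blur disc
# for a background point 4 m away with focus set at 1.5 m:
blur_f15 = coc_diameter_mm(26, 1.5, 1500, 4000)   # f/1.5
blur_f8  = coc_diameter_mm(26, 8.0, 1500, 4000)   # f/8
assert blur_f15 > blur_f8
# A point exactly at the focus distance stays perfectly sharp:
assert coc_diameter_mm(26, 1.5, 1500, 1500) == 0.0
```

This is why tapping the f-number down in Cinematic mode blurs the background: the simulated aperture diameter grows, and the rendered blur disc grows with it.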
When the red record button is pressed in Cinematic mode, the iPhone starts generating depth data for every frame, a process that demands a very fast processor. While the camera is rolling, the iPhone automatically detects people and other subjects and then uses neural networks to decide when to change focus. For example, when a subject turns their head toward the camera or enters the frame, the focus shifts smoothly.
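The smooth focus pull described above can be mimicked with a simple easing curve between two focus distances. The function below is a hypothetical sketch of such a transition, not Apple's implementation:

```python
def rack_focus(start_mm, end_mm, frame, total_frames):
    """Interpolate the focus distance over a focus pull using
    smoothstep easing (slow start, slow finish, no abrupt jump)."""
    t = frame / total_frames            # normalized progress in [0, 1]
    s = t * t * (3 - 2 * t)             # smoothstep: eases in and out
    return start_mm + (end_mm - start_mm) * s

# A 30-frame pull from a subject at 1.2 m to one at 3 m:
distances = [rack_focus(1200, 3000, f, 30) for f in range(31)]
assert distances[0] == 1200 and distances[30] == 3000
assert distances[15] == 2100.0          # midpoint of the eased pull
```

An easing curve like this is what makes an automated focus change read as a deliberate cinematic rack rather than a mechanical snap.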
To identify and estimate depth, the iPhone 13 series uses a technique called stereo disparity between the wide and ultra-wide cameras. The resulting depth map is analyzed by software, which selectively blurs the foreground and background to simulate the selected focal length. Even though these phones' small sensors produce less natural background separation than a dedicated camera, the depth map, combined with additional computation similar to Portrait mode, lets the phone blur the background slightly and convey a sense of depth while filming.
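Stereo disparity relates depth to the pixel offset between the two camera views roughly as depth = focal length × baseline / disparity. The sketch below illustrates that relationship with made-up camera parameters, not the iPhone's actual calibration:

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Depth of a point from the pixel offset (disparity) between
    two horizontally separated views: depth = f * B / d."""
    if disparity_px <= 0:
        return float("inf")             # no measurable offset: treat as infinitely far
    return focal_px * baseline_mm / disparity_px

# Hypothetical rig: 1400 px focal length, 12 mm baseline between lenses.
near = depth_from_disparity(1400, 12, 24)   # large disparity -> close point
far  = depth_from_disparity(1400, 12, 6)    # small disparity -> distant point
assert near < far
assert far == 4 * near                      # depth is inversely proportional to disparity
```

Computing this for every pixel of every frame is what produces the per-frame depth map, and it is why the feature leans so heavily on processing power.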
As mentioned, users can shift focus to any part of the scene, or even work through a sequence of subjects based on their set design and mise-en-scène, racking focus from one subject to another.
According to Johnnie Manzari, a designer on Apple's Human Interface team, the iPhone 13 series cameras intelligently anticipate and focus on a subject as it enters the frame, drawing on additional information from the ultra-wide camera. If the subject looks at another person or object, the camera automatically shifts focus to it, and when the subject looks away, focus automatically returns to the original person.
We have seen aspects of this technology before, but Apple's implementation, and its attention to the overall experience of this way of filming, give it special appeal. After years of digital cameras relying on simpler autofocus heuristics, such as prioritizing the center of the frame, the closest subject, or detected faces, we are now looking at a process that creates a better experience and better-looking footage than those earlier approaches.
The ability to make focus and depth-of-field decisions after capture could revolutionize not only mobile videography but the wider videography industry, freeing cinematographers and directors to concentrate on the action and capture the moment while filming.
Of course, the quality of the depth-of-field effect is not yet known, but it will probably resemble wide-angle Portrait shots and share their limitations. As mentioned earlier, Cinematic mode is currently limited to 1080p at 30 frames per second. Dolby Vision HDR is available for it, however: the system intelligently grades each frame and uses dynamic metadata to widen the output's dynamic range, so that on other compatible devices the footage does not look flat and depthless the way statically graded HDR video can. The mode also supports a wide color gamut, so colors are displayed beyond the range of the sRGB and Rec. 709 color spaces.
Apple is not the first manufacturer to court cinematography. Samsung and some Chinese phone makers have already offered simulated background blur in video, and companies such as Sony have specialized in this area, shipping smartphones with proper shutter-angle control, ten-bit HDR output, and even 4K/120fps capture; Apple answers these with its advanced Dolby Vision recording system.
What sets Apple's new effort apart is the company's push to package these features together in a form that ordinary users can actually use; a goal that looks very attractive.