Steve Jobs and Jony Ive left Apple's design process with a streak of product idealism: the company strives to make products that feel simple even though they combine advanced hardware design with enormous software code bases.
That philosophy has produced hugely successful products that customers adore, and a new iPhone 15 camera feature is a perfect example of where this idealistic approach can lead.
Keep it simple
In 2016, Apple introduced Portrait Mode on the iPhone, aiming to produce smartphone photos that resembled those taken with high-end cameras and long lenses. It worked by analyzing the photo and applying a simulated blur to the background.
The initial version of Portrait Mode, shipped in iOS 10.1, was rudimentary and chatty. Because it worked only under narrow conditions, it would prompt users to step back, move closer, or find a subject with better lighting.
Apple has come a long way since then. Thanks to machine-learning algorithms and improved camera sensors, today's Portrait photos look far better than those from 2016. And with the release of the iPhone 15, things have reached a new level: Portrait shots are now captured automatically, without even activating Portrait Mode.
Apple's commitment to simplicity drives strategic technical choices. Why should users have to decide in advance that a shot deserves Portrait data? Portrait Mode was originally confined to a separate mode of the Camera app because of hardware limitations, but today's iPhones can determine on their own whether a shot warrants capturing Portrait information. That removes the burden from the user.
Visualizing the ideal Camera application
This feature got me thinking about Apple's overall goals for the Camera app. Naturally, Apple wants to give users control over the high-quality cameras built into today's iPhones, which is why numerous advanced options are available via the Settings app or by tapping icons within the Camera app itself. And for those who want to push the cameras to their limits, third-party apps such as Halide or Obscura are available.
But most people who use the iPhone as a camera don't care about those technicalities. They just want to tap the shutter button, capture the moment, and have it look great with minimal effort. That's why I'm convinced there's a fundamental belief inside Apple Park that the Camera app should stay simple, and that Apple should keep building hardware in service of that goal.
What is Apple's ideal Camera app? In my view, it's one with only the essential modes: stills and video. Every interface element has to justify its existence; anything unnecessary gets cut. Apple has also worked to streamline zooming and to feed high-quality sensor data into its image processing so users don't have to think about it. A single tap of the shutter button actually triggers a series of captures at different resolutions and exposures, which are then processed into the desired final image.
And it can go further. Why shouldn't Action Mode kick in automatically? Why couldn't every shot contain multiple full-resolution frames, letting you choose the best one later? (Some cameras can already detect when a subject is blinking and delay the shot until the blink is over. That's just the beginning of what they can do.)
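The "choose the best one later" idea boils down to scoring each frame in a burst and keeping the winner. Here's an illustrative sketch of that selection step; this is purely a toy example of the concept (frames modeled as 2D lists of luminance values, sharpness as a crude local-contrast sum), not Apple's actual pipeline, which works on raw sensor data with far more sophisticated, ML-driven scoring.

```python
def sharpness(frame):
    """Score a frame by summing squared differences between neighboring
    pixels -- a crude proxy for edge detail (blurry frames score low)."""
    score = 0
    # Horizontal neighbors.
    for row in frame:
        for a, b in zip(row, row[1:]):
            score += (a - b) ** 2
    # Vertical neighbors.
    for r1, r2 in zip(frame, frame[1:]):
        for a, b in zip(r1, r2):
            score += (a - b) ** 2
    return score

def best_frame(burst):
    """Return the index of the sharpest frame in a burst."""
    return max(range(len(burst)), key=lambda i: sharpness(burst[i]))

# A high-contrast (sharp) frame vs. a flat (blurry) one:
sharp = [[0, 255, 0], [255, 0, 255], [0, 255, 0]]
blurry = [[128, 128, 128], [128, 128, 128], [128, 128, 128]]
print(best_frame([blurry, sharp]))  # -> 1
```

The same pattern extends naturally to other criteria: swap in a "no one is blinking" score or a smile detector, and the phone can quietly keep the frame you would have picked yourself.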
The endgame is probably letting you point your iPhone at a scene, record, and have software automatically pull out the best video clips and still images into a gallery for you. Actually, the real endgame is doing that recording from a wearable device, so your arms don't tire from holding up an iPhone for an entire birthday party. That may still be years away, but it's safe to assume Apple is already envisioning it. Keep it simple.
The Action Button, simplicity enabler
When Apple added the Action Button to the iPhone 15 Pro, some critics scoffed at mapping it to the Camera app, since there are already several ways to launch the Camera from the lock screen.
That's true, but each of those methods requires pulling your phone out of your pocket and swiping or tapping in exactly the right spot, which probably means looking down to confirm you've got the gesture right. Those shortcuts are easier than unlocking with Face ID and hunting for the Camera app icon, but there's still room for something simpler.
Now consider reaching into your pocket with the Action Button mapped to the camera. You pull the phone out, holding it by the edges, and by the time it's up at eye level your finger is already resting on the Action Button. Press, and the camera appears. With your finger still in place, press down again and you've taken a photo. It's a physical action on a physical button, the kind that settles into muscle memory. In short: it's simpler.
And that, in the end, is Apple's whole purpose: keep it simple.