Inside the phone, a special “neural engine” gives the device artificial intelligence (AI) and augmented reality (AR) capabilities. The engine could signal the future of smartphones, which increasingly rely on machine-learning algorithms. What could the arrival of iOS 11 mean for UX designers? Let’s explore.
Get to Know the iPhone A11 Processor
The iPhone X is replete with exciting new features, from face-scanning ID recognition to the elimination of the home button. The most dramatic change, however, is inside the phone, in the A11 Bionic processor. The A11 processor Apple developed to power the iPhone X is a 64-bit, six-core processor that Apple described as the most powerful chip ever in a smartphone. Here’s a breakdown of what makes the A11 Bionic so groundbreaking:
- 4.3 billion transistors
- Two performance cores that are 25 percent faster than the A10’s
- Four high-efficiency cores that are 70 percent faster than the A10’s
- Second-generation performance controller
- Handles multithreaded workloads 70 percent faster than the previous controller
- First Apple-designed GPU (30 percent faster than the A10’s)
- GPU offers same performance with half the power consumption
- Optimized processor for 3D gaming and machine learning
- Faster low-light autofocus
- Hardware multiband image noise reduction
- Apple-designed video encoder with real-time image analysis
- Secure Enclave to protect Face ID data
Part of the A11 Bionic processing system is the neural engine, a dual-core engine that can handle 600 billion operations per second. Apple designed the neural engine specifically to accelerate AI software, using “artificial neural networks” that can process speech and images efficiently; because this work happens on-device, the neural engine does not send any data to Apple. The new capabilities the neural engine brings to the iPhone could change the future of smartphones everywhere.
Benefit from the Exciting New Neural Engine
Apple’s neural engine is a glimpse into the potential future for smartphone technology. A breakdown of the engine shows that there is a pair of processing cores for handling machine learning algorithms. These algorithms give the iPhone X the ability to recognize facial features, create animojis, and handle augmented reality apps with finesse. It masterfully takes on complex AI tasks without lag time thanks to those 600 billion operations per second.
The mainstay feature of the new engine is its impressive AI and AR capability. Apple’s approach to artificial intelligence is changing how users experience AI on their mobile devices. Historically, apps and processors have offloaded AI features to the cloud. Using the cloud conserves battery power, but it’s less convenient and less secure. Apple found a way to bring users the benefits of mobile AI without the cloud-related drawbacks: dedicating part of the A11 processor to AI workloads.
Apple brought AI straight to mobile back in June 2016, when it introduced differential privacy: the company’s way of masking users’ identities with statistical noise when collecting data. The neural engine, however, goes further by putting AI-capable hardware on the phone itself. Sensitive data no longer needs to leave the device for collection and analysis; keeping the AI hardware on the phone eliminates the need for the cloud as a middleman. The neural engine isn’t just the brainchild of Apple; it’s the direction of the entire industry. Other companies, including Huawei, Google, and Qualcomm, are taking the same on-device approach to AI.
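To make the differential-privacy idea concrete, here is a minimal, generic sketch of the Laplace mechanism, the textbook technique of adding calibrated statistical noise so that no individual contribution can be singled out. This is an illustration of the general principle only; the function names are hypothetical, and Apple’s production system uses its own (local) differential-privacy algorithms, not this code.

```python
import math
import random

def laplace_noise(scale):
    """Draw one sample from a Laplace(0, scale) distribution via inverse CDF."""
    u = random.random() - 0.5          # uniform on (-0.5, 0.5)
    sign = -1.0 if u < 0 else 1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count, epsilon=1.0):
    """Report a count with noise calibrated to sensitivity 1 and budget epsilon.

    Any single user changes the count by at most 1, so Laplace noise with
    scale 1/epsilon hides whether that user is in the data.
    """
    return true_count + laplace_noise(1.0 / epsilon)

random.seed(42)
reports = [private_count(1000, epsilon=1.0) for _ in range(10_000)]
average = sum(reports) / len(reports)
# Each individual report is noisy, but the aggregate stays close to the truth.
print(round(average))
```

The design point this illustrates is the one in the paragraph above: the collector learns accurate aggregates while each individual contribution is statistically masked.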
Learn What the Neural Engine Means for the Future of UX Design
When the iPhone X hit the market, UX design changed forever. The phone’s unique features (new screen size, no home button, rounded corners, and richer colors) raised the question of whether website design for mobile needed to change. But while the aesthetics of the iPhone X are exciting, it’s the neural engine that may impact UX design the most. Now that Apple has placed the power of augmented reality and artificial intelligence into the hands of the masses, UX designers need to broaden the scope of their work.
New and improved AI and AR support in iOS 11 means that developers and designers no longer have to bake AR capabilities into the apps themselves, a technique that is expensive and makes it difficult to control the user experience. Instead, developers can rely on the AR components backed by the phone’s neural engine (exposed through frameworks such as ARKit) rather than devising and incorporating their own AR solutions. The new engine has opened the door to an enormous AR-ready consumer base that’s easier than ever for developers to tap into.
Specialized AR applications will continue to be a hot trend in UX. The iPhone X marks the beginning of what is sure to be a revolution in the efficiency and dependency of AR applications for mobile. To keep up with the times, UX design will have to adapt. The only thing limiting future AR apps is the imagination of the designers and developers. Bringing AR to the mainstream in a convenient, consistent, and reliable way will naturally change the way developers need to think about AR for mobile.
Thrive in the New AR Environment as a UX Designer
Apple has proven time and time again that change can be a very good thing. The iPhone X’s neural engine is certainly transforming AI and AR for mobile, but this only means that developers have new tools at their disposal for creating exceptional user experiences. The trick is to understand what the new engine means for UX and UI, and how to use its capabilities to the developer’s advantage. Some tips on how UX designers can capitalize on the A11’s features are as follows:
- Establish the trust of the user. The rollout of the iPhone X (specifically, its facial-recognition feature) spawned some security concerns among users. It is now more important than ever to establish trust with users in the design of new apps. Apps and websites must give the first impression of seamless security.
- Resist hasty AR app rollouts. Developers may feel pressured to crank out new apps that take advantage of the iPhone X’s AR and AI capabilities. It is important, however, to resist the temptation to publish quickly. With about one in four people never returning to apps after initial use (primarily because of UX problems), developers need to spend time optimizing and testing designs before publishing them.
- Test, re-test, and re-test again. It’s difficult to predict how users will respond to new AR apps, but developers can maximize the odds of positive returns by taking the time to test app performance, making adjustments, and re-testing until they feel confident in the app’s design. Testing and re-testing is important for any AR-related endeavors.
- Work with the marketing team. UX developers and marketing teams will need to collaborate to maximize the value of a new AR product or app. Marketing will need to produce online content that promotes the AR app before it ever launches. New developments need to be a team effort.
- Be unique without being radical. Developers should exercise creative freedom in pursuit of the next big thing, but they should also take user preferences into consideration. Take the flop of Google Glass as an example: Glass required users to make a radical shift in how they used technology, users weren’t ready for that shift, and the product was discontinued.
Predictions for augmented and virtual reality place worldwide revenues at $162 billion by 2020. It is critical for UX professionals to learn about the neural engine in the iPhone X, recognize that this technology is spreading across the industry worldwide, and take advantage of new AR/AI technologies. The task UX designers now face is to come up with new ideas for AR applications that consumers will embrace, engage with, and truly enjoy. It’s a challenge that UX designers can turn into limitless opportunities with the right tools, knowledge, and a progressive mindset.
Main image source: developer.apple.com