Amazon Fire Max 11 review: not the productivity tablet you’re looking for - The Verge



Amazon Fire Max 11 review: not the productivity tablet you’re looking for

The Fire Max 11 has the most advanced and refined hardware of any Amazon tablet yet. But it is held back by the same old problems.

If you buy something from a Verge link, Vox Media may earn a commission. See our ethics statement.

An Amazon Fire Max 11 tablet in its keyboard case on a wooden table.

The Fire Max 11 looks the part of a productive tablet but doesn’t stick the landing.

What does it take to make a tablet more than just a content consumption machine? We’ve seen Apple’s evolving ideas for a productivity tablet for over half a decade; Microsoft directly leaned into productivity from the start.

Amazon, on the other hand, doesn’t seem to have a clue. The company has been selling “Productivity Bundles” for its Fire tablets for years, but aside from a cursory acknowledgment that typing on a keyboard is better for productivity than typing on a glass screen, it doesn’t go much further than a name.

I was hoping that the new Fire Max 11 would show me that Amazon has finally cottoned on to what makes a good productivity tablet, something more than just a large screen you use to watch movies on the couch or entertain a toddler on an airplane. Maybe this would be the Amazon tablet that I could recommend for those who want to get work done on a budget — it’s $329.99 all in with keyboard and stylus, after all. (You can also buy it for $229.99 without the accessories.)

Unfortunately, this is just a continuation of the same old story with Amazon Fire tablets: the Fire Max 11 is a fine device for watching movies you bought on Prime Video (or most any other streaming service of your choice), though it’s not markedly better than Amazon’s even cheaper options. But it’s certainly not something I can really recommend for work. And yes, you guessed it, it’s because of the software.

The Fire Max 11 certainly looks the part of a productivity tablet, especially when you get it with the keyboard case and stylus. It’s got a metal chassis, an eight-megapixel camera in the bezel on the long edge of the screen (i.e., the correct spot), a keyboard with integrated trackpad that snaps to the bottom edge via magnets, and a stylus that clings to the side of the tablet, also using magnets. There’s even a fingerprint scanner built into the power button, a first for a Fire tablet. 

The point is the Max 11 doesn’t look like Amazon’s typical Fire tablets, with their plastic backs and generally awkward cases. Amazon made an effort to justify the Max 11’s higher cost with a nicer design, and in that respect, it succeeded. Squint hard enough, and you could mistake it for a much more expensive iPad Pro or Lenovo Chromebook Duet 3.

Stop squinting, though, and it’s easy to see where Amazon didn’t go far enough. The 11-inch LCD screen has a 2000 x 1200 resolution, with punchy colors and wide viewing angles. It’s bright enough for most any indoor environment and, in a pinch, might even work outdoors in the shade. But its 5:3 aspect ratio is cramped when browsing the web or working in documents, and trying to use the Max 11 in portrait orientation is clumsy and awkward. Put it side by side with Apple’s entry-level ninth-gen iPad, and you can see how much bigger the iPad’s screen is thanks to its 4:3 aspect ratio.

The keyboard magnetically attaches to the bottom of the Max 11 and is powered by the tablet, so there’s no Bluetooth pairing or separate charging to worry about with it. The keys are spaced well enough apart and have decent travel, plus there’s a row of function keys for media and system controls.

The Max 11’s keyboard isn’t terrible, but the trackpad is awful.

The Max 11 is the only tablet in Amazon’s lineup that has an aluminum chassis.

But the trackpad frankly sucks: it’s cramped and sticky, which makes two-finger scrolling and gestures hard to perform. It also only supports inverted scrolling (“natural scrolling,” in Apple parlance), with no option to change it to a more conventional scrolling direction.

Like other tablets that use this kind of keyboard case (the 10th-gen iPad and the already mentioned Chromebook Duet 3 are but two examples), the Max 11 is floppy and wobbly when I try to use it on my lap. You really have to be parked at a desk or table to use the keyboard with the tablet.

One bright spot here is the stylus. It’s a USI 2.0 pen, complete with a button on the side, and it writes smoothly with no perceptible lag. It feels very similar to using an Apple Pencil on an iPad. Samsung’s out-of-the-box writing experience is better with its S Pen, but in terms of hardware, there’s little to complain about with Amazon’s pen.

The eight-core MediaTek processor Amazon’s using in the Max 11 is more powerful than the ones it uses in its lower-tier tablets, and it shows: the Max 11 is snappier and quicker to respond than the others. I can even stream 4K video in the browser, which wasn’t possible on the Fire HD 10 Plus I tested two years ago. The Max 11 won’t hang with Apple’s chips in terms of raw horsepower, but it’s thankfully not a total dog, either.

The Max 11’s skinny aspect ratio means it has much less usable screen space than a ninth-gen iPad, especially in portrait orientation.

As much as Amazon appeared to put effort into the Max 11’s hardware, it seems like it completely forgot about the software. The Max 11 runs the same Fire OS found across Amazon’s lineup, with no improvements or changes to make it more useful for productivity work aside from ensuring support for the stylus and keyboard. It even has ads on the lock screen unless you cough up another $15 to remove them.

The latest version of Fire OS (8.3.1.9) is based on Android 11, a platform that’s nearly three generations out of date. It lacks gestures for navigating the interface, relying instead on three virtual buttons at the bottom of the screen for back, home, and recent apps. It’s capable of split-screening between two apps, but there are no other tweaks or concessions for productivity or multitasking like you’ll find on Android tablets with more modern software. No app dock, no quick launch tray, no pop-up windows.

The homescreen remains a place for Amazon to push you into buying content and products from its various stores, which quickly gets tiresome and spammy-feeling. There are no configurable widgets, no news feeds, nothing beyond basic folders.

The Fire Max 11’s homescreen is basically just a place for Amazon to advertise stuff it wants you to buy.

Two apps side by side is the extent of the Max 11’s multitasking capability.

If you know anything about Amazon’s tablets, you probably know that they don’t have Google’s apps and services available on them, and the Max 11 is no different. That’s not a huge problem when you’re just using a tablet to watch video (unless that video is on YouTube or YouTube TV), but it’s a complete nonstarter for many when it comes to productivity.

In addition to lacking Chrome, Gmail, Google Docs, Google Maps, Google Drive, Google Meet, etc., the Max 11’s app store is missing countless other apps used for getting work done. Outside of Microsoft’s Office suite and Zoom, it’s a ghost town.

The most frustrating part is that this is the exact same problem I encountered two years ago. I’ll save myself the trouble of writing it all again and just copy and paste what I wrote about the Fire HD 10 Plus in 2021:

Here’s a list of productivity apps I use daily for work that are nowhere to be found on the Fire HD 10 Plus (or any other Amazon tablet):

• Slack

• Asana

• Google Meet

• Feedly (the poorly rated third-party app I tried crashed on login)

• Todoist

• SwiftKey

• NY Times (the app in the Amazon store is just a bookmark to the website)

• Bitwarden (Also missing are LastPass, 1Password, and Dashlane. Logging in to apps with my passwords requires juggling my phone and the tablet, and it’s a huge pain.)

• Two-factor authentication apps

• Pocket (Pocket used to be in the Amazon Appstore, but the company has removed it and now instructs Fire tablet owners to sideload the app from its website.)

I’ll add Airtable, Instapaper, Evernote, and Apple Music to that list today. Beyond the fact that there’s been zero progress on available apps in two years, Amazon didn’t even bother to develop a notes or drawing app to be used with its stylus on the Max 11 like it did for the Kindle Scribe. Both Apple and Samsung have built very competent note-taking apps that take advantage of the features of their respective styluses, but Amazon didn’t even try — it expects you to find something in its decrepit app store.

If you’re hoping to use the Max 11 for drawing or artwork, you won’t find many popular art apps in Amazon’s store. There’s no Sketchbook, Clip Studio Paint, or Infinite Painter. (There is an app called Infinite Painter in Amazon’s store, but it is definitely not the one available on other Android devices.) 

Here’s another paragraph from my two-year-old review that’s just as applicable today:

You can get around this problem by sideloading the Google Play Store and its related services onto the Fire HD, but that requires disabling security features, downloading software from sites that don’t have authorization to distribute it, and installing it in a specific order. Frankly, it’s not something most people are going to do, and if Amazon wants to market something called a “Productivity Bundle,” it needs to do a much better job at making its tablet more useful for work tasks.

I was able to draft this article in Google Docs via the Max 11’s rudimentary browser, and I managed my inbox using Microsoft’s Outlook app. (Amazon’s built-in Mail and Calendar apps are so bare-bones, I couldn’t even get them to work with my Google Workspace account.) But when it came time to complete this piece, input it into our CMS, edit and arrange the photos, and publish it, I had to leave the Max 11 behind. Those are things I can do pretty easily on an iPad or even Samsung’s tablets.

If my workflows were more dependent on Microsoft’s apps, such as Word and Teams, I could perhaps use the Max 11 for more things. But even then, the screen is cramped, the trackpad sucks, and I’d just have a much better time on another tablet or even a laptop.

The Fire Max 11 is a cheap tablet for watching video and not much else.

Ultimately, the Max 11 doesn’t change anything about Amazon’s Fire tablets. Its draw is that it’s cheap: the bundle with the stylus and keyboard and six months of Microsoft 365 is the same price as just the ninth-gen iPad alone. That argument is fine for a tablet you’re only going to use for watching video or maybe hand to a kid to keep them entertained on a flight or at a restaurant. Even still, if that’s your planned use case, Amazon has even cheaper options that work just as well for those things.

But when it comes time to get work done, I wouldn’t hesitate for a second to spend a little more money and get something that actually works.

Photography by Dan Seifert / The Verge



How to customize your iPhone’s app icons - The Verge



How to customize your iPhone’s app icons



By creating your own icons from photos or other art, you can add your individual style to your homescreen

iPhone with icons and illustrated background

Illustration by Samar Haddad / The Verge

Have you ever wanted to make your iPhone your own, with your individualized style and flair? Sure, you can change your homescreen wallpaper. But if you really want to personalize your phone, why not create your own app icons?

It’s doable, using Apple’s built-in Shortcuts app. You actually won’t be replacing the icons that the apps came with — rather, you’ll be creating separate shortcuts that lead to the app. It’s a tedious and time-consuming process, but in the end, you can have a fully customized iPhone homescreen.

Here’s how you do it:

  • Before you begin, it’s a good idea to find an icon for your new shortcut. There are a bunch of icon sources online (Flaticon, for example), or if you’re artistic and / or ambitious, you can create your own. Whether you use someone else’s or your own, it’s easiest to save the image to Photos.
  • Okay, let’s begin. Find and tap on the Shortcuts app. It’s pre-installed; if you can’t see it immediately on your homescreen, swipe left until you’re at the App Library and start typing “Shortcuts” into the top search bar.
  • Once you’re in the app, tap on the plus sign in the upper-right corner.
  • On the top of the screen, you’ll see that your new shortcut will be named something like “New Shortcut 1.” If you’d rather have your own name, tap on the arrow next to it and select Rename.

Shortcuts can help you create new bookmarks for your apps.

You can name your shortcut whatever you wish.

  • Once you’ve got your shortcut named, tap on the Add Action button below the name.
  • You’ll find yourself on a page that, at first glance, may seem a bit confusing. Basically, you’re looking at all the various things that you can do with Shortcuts. While it would be worth it to spend some time here and try out some customizations, right now, what we want to do is change your app icon.

Tap Add Action to start creating your shortcut.

The number of actions available can be confusing.

  • Type Open app in the search bar and then tap on the Open App link that will show up.
  • Tap on the word App that appears (rather faintly) next to the word Open in the search bar.

Start typing “Open app” to find the right action.

Tap on the light blue word to choose which app you’re using for the shortcut.

  • You’ll see a list of your phone’s apps; pick the one you want to customize. The name of the app will now be next to the word Open.
  • Tap Done in the upper right corner. You’ll be taken back to your shortcut page.
  • Select the information icon (an “i” in a circle) at the bottom of the screen.
  • Tap Add to Home Screen.

After you’ve chosen the app you’re making a shortcut for, the name will appear next to “Open.”

Now you can add it to the homescreen.

  • You’ll now see a preview of the icon (which will be a standard, uninteresting icon that Shortcuts automatically adds). Don’t worry — we’re going to make it better.
  • Tap on the icon under Home Screen Name and Icon. You’ll have the choice of either taking a photo, choosing a photo, or choosing a file. Assuming you’ve already saved an image in Photos, tap on Choose Photo and select the photo you want to use.
  • If you’ve chosen an existing photo, a highlighted area will indicate what part of the photo will appear as an icon; you can move the photo around until you’re happy with the section indicated. Tap Choose in the lower-right corner.

Don’t like the standard icon? Change it.

The app shows you how much of the photo you can use for your icon.

  • Now, you’ll see your new icon. If you haven’t added a name for your new shortcut, you can still do it here by typing the name next to the icon.
  • All ready? Tap Add in the upper right corner.
  • You should see your new customized icon on your homescreen. Congrats!

You’ll now see what the icon will finally look like.

And here’s your new icon on your homescreen.

You can also hide the original app icon so you’ll just have the new one visible. (You don’t want to delete it completely, of course; that would delete the app.)

  • Long-press on your wallpaper until all your icons start wiggling. Tap on the minus sign of the app you want to hide.
  • On the pop-up menu, tap Remove from Home Screen. The original icon won’t be deleted, just hidden; you can always find it in the App Library.

Long-press on the background to reveal the minus signs.

Tap on the minus sign and select Remove from Home Screen.

One note: when you use your new icon to go to the app, you will occasionally get a small drop-down notice that tells you what the original app is called and reminds you of the fact that it is a shortcut. But the drop-down will only last for a second or two, so it shouldn’t be much of a bother.

Update September 21st, 2022, 4:55PM ET: This article was originally published on June 13th, 2021; it has been updated to accommodate changes in iOS 16.



Apple iPhone 14 Pro review: early adopter island - The Verge



Apple iPhone 14 Pro review: early adopter island

The Dynamic Island is a potentially good idea that’s waiting for the next step

If you buy something from a Verge link, Vox Media may earn a commission. See our ethics statement.

Apple pulled off some unexpected surprises with the iPhone 14 Pro: there had been lots of solid rumors that the company would swap the familiar notch for a pill-shaped cutout housing the front-facing camera and Face ID system, but the new “Dynamic Island” alert system came out of nowhere. And while it was getting clearer that Apple would have to follow the industry in using bigger camera sensors eventually, Apple went even further and rebooted its entire computational photography system as the Photonic Engine.

There’s a lot of that sort of thing in the iPhone 14 Pro, whose prices in the United States still start at $999 and go up. Apple’s late to having an always-on display, but it’s much more vibrant than other always-on displays. In the United States, Apple’s going all in on eSIM, which no one else is really doing. There’s a basic satellite connectivity system that isn’t quite like anything else we’ve heard about, but Apple is going to ship millions of these phones with the service coming online later this year. All in all, there are more beginnings of big ideas in the new iPhone 14 Pro than we’ve seen in an iPhone for a long time.

That’s the easiest way to think about the iPhone 14 Pro — it feels like the first step toward a lot of new things for Apple and the iPhone and maybe the first glimpse of an entirely new kind of iPhone. But that doesn’t mean all these things are perfect yet.

I will admit to having come around to the name “Dynamic Island” — after all, it’s made everyone talk about it, which isn’t normal for a smartphone status indicator system. If Apple wants to make everyone deeply consider ancillary smartphone interface ideas, I am here for it.

The island replaces Apple’s familiar and oft-reviled notch; it’s where the front camera and the Face ID system live since they’ve got to take up some space on the front of the display. Here’s the thing about the notch, though: after a few minutes of using it, it all but disappears.

The island is different: you are supposed to notice it. It’s located lower on the screen than the notch, and it’s a high-contrast interface element if you run your phone in light mode: a black pill shape in the middle of a white screen. You’re going to see it, especially since it’s animating and moving all the time. It blends in better in dark mode — in fact, I would go so far as to say this is the first iPhone that feels like it’s better in dark mode because of it.

Unlike the notch, which disappeared in use, the Dynamic Island begs you to look at it.

So why did Apple turn the discreet notch into a somewhat more obvious island? Over the years, there have been several different status indicator systems added on to iOS. Plugging in a charger or flipping the mute switch brings up an overlay. Having a call in the background puts a green pill in the corner; an app that’s using location is a blue pill. Screen recording and personal hotspot have pill indicators on the other side. Connecting AirPods is another overlay. And some things, like timers and music playing in the background, haven’t really had useful status indicators at all.

The island is Apple’s way of replacing and unifying all those older status systems with a new home for system alerts and making it work for things like music and the new live activities API that’s coming to iOS 16 later this year, which will allow apps to share even more background info for things like your flight status or a sports score. It is not a replacement for notifications — all of those still appear in the notification center and look pretty much the same.

The simplest way of understanding the island is that it’s basically a new widget system built on that live activities API, and the widgets can have three views: the main view in the island, an expanded view, and an ultra-minimal icon when you’ve got two things going at once. If you have more than two things going, Apple has an internal priority list to put the two most important things in the island.

It’s a neat concept, but like all first versions of anything, Apple’s made some choices that really work and some others that… well, it’s the first version.

The Dynamic Island replaces and improves all the system status indications across iOS, but it’s not a replacement for the Notification Center.

A big choice that works is Apple overdoing things in classic Apple fashion: the island is meant to feel more like hardware than software — almost like a secondary display that can get bigger or smaller. To get this to feel right, Apple’s built a new dynamic subpixel antialiasing system that makes the edges of the island up to three times crisper than all the other animations in iOS, which antialias at the pixel level. In normal room lighting, this really works: it feels like the cutout on the display is getting bigger and smaller, and the animations are really fun. (In sunlight or brighter light, you can see the camera sensors, and the illusion goes away, but it’s still cool.)

The other big thing that works is that moving all those disparate status indicators to the island and making them worth paying attention to is actually pretty great. It’s nice having call info right on the screen. It’s genuinely useful having your timers right there. Making things like AirDrop and Face ID all show up in consistent ways in the same place makes those things easier to understand, which is a win.

Here’s where I think Apple missed the mark a little: in the keynote and all the ads, the island is shown as a thing that’s worth interacting with — it’s always moving around and going back and forth between the main view and the expanded view. In reality, well, it’s not like that at all.

In normal usage, you don’t actually interact with the Dynamic Island very much at all.

The island isn’t a primary interface element; it sits over whatever app you’re actually using, and apps are still the main point of the iPhone. In fact, tapping on the island doesn’t open that expanded widget view; it just switches you back to whatever app that controls the widget. To get the expanded widget that’s shown in all the ads, you have to tap and hold. This feels exactly backwards to me. I think a tap should pop open the widget, and I also think you should at least be able to choose between the two behaviors. 

This is the central tension of the island: it’s much more noticeable and useful than the notch, but you’re not really supposed to interact with this thing — it’s background information. Your music is playing, your personal hotspot is active, you plugged in a charger — all stuff you don’t really need to mess with. I got a lot of questions about whether fingerprints will interfere with the selfie camera, and while it doesn’t seem to be a problem, it’s even less of a problem because, as it stands, you don’t touch this thing very much at all. 

The Dynamic Island is one of those things that needs a year of refinement and developer attention before we can really say how important it is

But because the island is so much more prominently highlighted by the animations, you’re still looking at it all the time. In apps that haven’t been updated, it can cover up some content because it sits lower on the display. So right at this second, the tradeoff between how noticeable the island is and how useful it is feels a little imbalanced — it doesn’t quite do enough to justify always being in the way.

All that said, that tradeoff might totally change when the live activities API rolls out later this year. That’s the other big thing Apple did right: it built the hooks to make this whole thing available to third-party developers, and some of the concepts we’ve seen from Lyft and Flighty and others are really exciting. But right now, the Dynamic Island feels like one of those things that need a year of refinement and developer attention before we really know how important it is.

The camera housings are even bigger this year to accommodate larger sensors.

The big feature of the iPhone 14 Pro camera system is the new 48-megapixel main camera sensor. Apple’s a few years late to this trend; Samsung has used 108-megapixel sensors since 2020’s S20 Ultra, and Google added a 50-megapixel sensor to the Pixel 6 Pro last year. Apple’s also updated the ultrawide and 3x telephoto cameras, but they remain the standard 12 megapixels, and the star of the show is certainly the new main sensor.

The basic idea is the same all around: to take better photos, you need to collect as much light as possible, and to do that, you need bigger pixels. But at some point, making the pixels physically bigger is challenging, so instead, you add a lot more physical pixels on a huge sensor and use software to group them into giant virtual pixels. The concept is called pixel binning, and the math on Apple’s binning is straightforward: it uses four pixels to create a single virtual pixel, which means the 14 Pro’s 48-megapixel sensor generally shoots 12-megapixel photos.
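The 4-to-1 binning arithmetic can be sketched in a few lines of Python. This is an illustrative toy, not Apple’s actual pipeline: the 2x2 averaging scheme and the tiny grid are assumptions, but the ratio is the point — grouping every 2x2 block of physical pixels into one virtual pixel is what turns 48 megapixels into 12.

```python
# A minimal sketch of 4-to-1 pixel binning: average each 2x2 block of
# physical pixels into one virtual pixel. The tiny "sensor" below is
# illustrative; a real 48-megapixel sensor is roughly 8000 x 6000 pixels.

def bin_2x2(sensor):
    """Collapse a 2D grid of pixel values into quarter-resolution output."""
    rows, cols = len(sensor), len(sensor[0])
    binned = []
    for r in range(0, rows, 2):
        binned_row = []
        for c in range(0, cols, 2):
            block = (sensor[r][c] + sensor[r][c + 1]
                     + sensor[r + 1][c] + sensor[r + 1][c + 1])
            binned_row.append(block / 4)  # one virtual pixel per 2x2 block
        binned.append(binned_row)
    return binned

# A 4x4 "sensor" becomes a 2x2 output: 16 pixels turn into 4 virtual
# pixels, the same 4:1 ratio that turns 48 megapixels into 12.
sensor = [
    [10, 20, 30, 40],
    [10, 20, 30, 40],
    [50, 60, 70, 80],
    [50, 60, 70, 80],
]
print(bin_2x2(sensor))  # [[15.0, 35.0], [55.0, 75.0]]
```

Each virtual pixel effectively collects the light of four physical ones, which is the whole appeal in dim conditions.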

The Pro’s 48-megapixel sensor generally shoots 12-megapixel images

The other big change to the camera system is that Apple’s running its Deep Fusion processing for mid- and low-light photos earlier in the process on uncompressed image data, which is supposed to improve low-light performance by two to three times depending on the camera you’re shooting with. That’s the change that led to the entire image processing pipeline being rebranded to the “Photonic Engine”; Apple is still doing Smart HDR and all of its other familiar processing, but now there’s a fancy name.

We’ve always called Deep Fusion “sweater mode” because Apple loves to show it off with moody photos of people wearing sweaters in dim lighting, but the effects have always been extremely subtle. And, well, the same is true on the iPhone 14 Pro. Sweater mode on uncompressed data is still sweater mode, it seems.

In general, the 14 Pro and 13 Pro take really similar photos. The 14 Pro is a little cooler and captures a tiny bit more detail at 100 percent in dim lighting, but you really have to go looking for it. That’s true of the main camera as well as the ultrawide, which has a bigger sensor this year and also benefits from Photonic Engine. In very dim light at 100 percent, details from the ultrawide look a bit better compared to the 13 Pro, but you have to look very closely. 

It’s the same in bright light: these photos of Verge senior video producer Mariya Abdulkaf outside look pretty much the same, but if you zoom in to 100 percent, you can see the iPhone 14 Pro is getting a bit more detail and has a nicer background blur because of the substantially larger sensor. It’s really nice — but at Instagram sizes, it’s not particularly noticeable. The Pixel 6 Pro captures even more detail with its pixel-binned 50-megapixel sensor, along with a wider range of colors.

This is about as different as the Pixel and the iPhone have been in a few years. Both phones grab a lot of detail and have great low-light performance, but the Pixel 6 Pro makes very different choices about highlights and shadows while the iPhone 14 Pro is way more willing to let highlights blow out and even more willing to let some vignetting creep in. I really can’t tell you which is “better.” Both of these night mode photos are terrific, and which one you prefer is entirely down to subjective preference.

iPhone 14 Pro night mode (left) vs Pixel 6 Pro night mode (right).

Where the iPhone 14 Pro falls down in these comparisons is really in the details of the processing: Apple’s been ramping up the amount of noise reduction and sharpening it does over the years, and the 14 Pro has the most aggressive sharpening and noise reduction yet. Sometimes it just looks bad: this night skyline shot is an overprocessed mess compared to the Pixel.

iPhone 14 Pro on the left, Google Pixel 6 Pro on the right.

Compared to the Samsung S22 Ultra, the iPhone is less predictable. The S22 Ultra consistently holds on to more color detail in low light, and it’s not as heavy-handed with noise reduction and sharpening. In bright light, the differences between the 14 Pro and the S22 Ultra are more subtle, but Samsung still does a better job with detail. In true Samsung fashion, you get much punchier and warmer colors compared to the more natural look of the iPhone; Samsung’s color ideas are sometimes from a different planet entirely. But photo for photo, the S22 Ultra is more consistent with better fine detail.

Having a big sensor with a lot of pixels opens up other possibilities: in addition to pixel binning, Apple’s also cropping it to generate what it claims is an “optical quality” 2x zoom. Basically, it’s just taking the middle 12 megapixels from that 48-megapixel sensor; if you shoot in ProRAW mode at the full 48 megapixels and just cut out the center of the image, you’ll get the same photo. Hardware-wise, this is still a leap over the 2x telephoto lens of the iPhone 12 Pro from two years ago, but since you don’t get the benefit of pixel binning, it gets into a little trouble in lower-light situations. But it’s nice to have and a nice middle ground between the standard wide and 3x telephoto. 
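The crop-zoom idea above is just arithmetic, and a short Python sketch makes it concrete. The 8000 x 6000 frame size is an assumption for illustration: halving each dimension of the full-resolution frame keeps the middle quarter of the pixels, which is how a 48-megapixel sensor yields a 12-megapixel 2x image with no upscaling.

```python
# A sketch of 2x "crop zoom": keep the centered region of a full-resolution
# frame. Halving each dimension quarters the pixel count, so an assumed
# 8000 x 6000 (48 MP) frame yields a 4000 x 3000 (12 MP) center crop.

def center_crop(frame, crop_rows, crop_cols):
    """Return the centered crop_rows x crop_cols region of a 2D frame."""
    rows, cols = len(frame), len(frame[0])
    r0 = (rows - crop_rows) // 2
    c0 = (cols - crop_cols) // 2
    return [row[c0:c0 + crop_cols] for row in frame[r0:r0 + crop_rows]]

# A 4x4 frame cropped to its center 2x2: each dimension halves, so the
# pixel count drops to a quarter, doubling the effective focal length.
frame = [
    [0, 1, 2, 3],
    [4, 5, 6, 7],
    [8, 9, 10, 11],
    [12, 13, 14, 15],
]
print(center_crop(frame, 2, 2))  # [[5, 6], [9, 10]]
```

Because no pixels are invented, the result is “optical quality” in the sense that every output pixel came straight off the sensor — but, as noted, the crop forgoes the light-gathering benefit of binning.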

Portrait mode on the 14 Pro (left) can’t match Samsung’s S22 Ultra (right) when it comes to cutting around fine details like hair.

That 2x crop is also the default for portrait mode, which doesn’t seem to have improved all that much. Both the S22 Ultra and even the regular S22 take better portrait photos. Samsung’s nailed cutting the subject out of the background down to individual strands of hair, and the 14 Pro isn’t quite there yet. It went ahead and just chopped off part of Mariya’s head in perfectly bright light.

I did take a few photos in ProRAW at 48 megapixels, and there’s a lot of detail in there and a lot of room to edit. If you’re the sort of person who is excited about ProRAW on an iPhone, the iPhone 14 Pro will be endless fun to play with. But I don’t think normal people should take 48-megapixel photos on their phones.

Apple has added autofocus to the selfie camera, which is probably useful in some situations, but compared to the 13 Pro for some regular selfies, the overall differences were so mild that they were barely discernible.

iPhone 13 Pro selfie (left) / iPhone 14 Pro selfie (right).

I asked Verge senior video producer Becca Farsace to play with the video features on the iPhone 14 Pro, and she found that things are looking as good as ever, but there’s not a huge leap over the already excellent iPhone 13 Pro. You can see more in our video above, but here are the highlights:

Cinematic mode on the 13 Pro was more than a little messy last year, but Apple’s continuing to put more work into it, and on the 14 Pro, it does a better job separating faces from the background so it can apply blur. And it can now be used with 4K video resolution. Becca did quite a bit of testing of it and said it works best with faces but struggles with any other kind of subject.

Cinematic mode now supports 4K resolution

Action mode, which is a stabilization system designed to let you leave things like a gimbal at home and still get smooth and steady footage, is the other big new video feature this year. But it comes with some significant compromises: you need a ton of light for it to work, and there’s a massive crop to the footage that’s captured — it maxes out at 2.8K, not 4K. It’s fun to play with, but it’s another feature that feels like it’s a year away from being useful.

As for the basic image quality question, Becca says that, in good light, it’s very hard to tell the difference between the 14 Pro’s footage and the 13 Pro’s, but in low light, the telephoto on the 14 Pro produces a noticeably crisper image with less noise. 

Overall, the iPhone has been the top contender for smartphone video for years, and the 14 Pro maintains that lead. Really, check out the video for more; writing about the video features is like dancing about architecture, you know?

The 14 Pro and Pro Max have displays similar in size to prior models, but they can get brighter when viewing HDR content or when out in direct sunlight.

At long last, Apple added an always-on display mode to the iPhone 14 Pro, which, well, Android phones have had for a long time now. It’s fine! The display refresh rate drops to just one hertz, and the brightness goes extremely low to save battery life. Apple’s done some nice work to keep wallpaper colors accurate in the low-power always-on mode, but honestly, I would prefer a Pixel-style black and white clock to something that sort of looks like my phone is awake all the time. I hope we see some customization options here in the future.

Allison Johnson has the iPhone 14 Pro, while Becca and I tested the iPhone 14 Pro Max. And while battery life certainly ran all day, all three of us felt as though the battery ran down a little bit faster than before. To be fair, all three of us were running around taking lots of photos and videos and generally testing these phones like mad for the past week, but, well, we test a lot of phones like that. Apple claims the 14 Pro and Pro Max will get slightly better battery life than the 13 Pros, and we all still got through a full day with the 14 Pro Max, so maybe that always-on display was just taking its toll. In any event, it’s something we’ll be keeping an eye on over time.

Other than that, the display is slightly brighter than before — it can hit a peak brightness of 1,600 nits when displaying HDR content, up from 1,200 nits, and in bright sunlight, it can go to 2,000 nits. It also retains the 120Hz ProMotion feature from the 13 Pro for smooth scrolling and interactions. I’ve long thought Apple’s mobile displays are consistently the best in the industry, and it’s no different here.

Say goodbye to your SIM card — the 14 Pro is eSIM-only.

In another unexpected move, Apple dropped SIM trays from iPhones in the US, which means it’s time everyone got used to eSIM, which lets you access mobile networks without needing a physical SIM card. The 14 Pro can store at least eight different eSIMs, which is pretty intense, with two of them active at the same time. It worked well in my testing: my AT&T account transferred over Bluetooth from the physical SIM in my iPhone 13 Pro, and I added my Google Fi account from the web with a handful of taps.

Now, it’s not nearly as easy to move eSIM info from iPhones to Android phones, and carriers are certainly going to play some weirdo lock-in games here because they are carriers and weirdo lock-in games are just what they do. It will also likely present issues for travelers who are used to buying local SIM cards in countries they are visiting. And I was hoping Apple would allow iPhone users to scan for available networks and sign up right from the phone like you can do on an iPad, but that doesn’t seem to be an option — you need an eSIM activation kit, which usually involves scanning a QR code.

But being able to add new networks quickly and easily to your phone also theoretically means we can all force the carriers to compete a little more, and that’s definitely a good thing.

The 14 Pro can store at least eight eSIMs, two of which can be active at the same time.

Apple’s emergency satellite connectivity system isn’t rolling out until later this year, but Allison got an early demo on the Apple campus, and it looked slick. The software prompts you to try to make an emergency call on cellular, and if that fails, then it deploys the satellite option. The system walks you through a series of questions to help first responders understand your situation; then, the UI shows you where to point the phone to access a satellite. You’ll even see the satellite icon on the screen changing position as the actual satellite moves across the sky. The demo was under the most controlled of controlled conditions, so take this with a grain of salt, but messages got out in less than 30 seconds, even with a little foliage between the phone and satellite.

Satellite SOS is one of those features that you’re either into or you’re not. If you’re an outdoors enthusiast, it might be appealing as a peace of mind kind of thing. But if you never leave areas with cell coverage, then you probably have no interest in a satellite emergency communication system. In either case, it’s likely going to cost extra at some point — Apple won’t say how much, but it’ll be free for the first two years on the iPhone 14. It won’t be available right off the bat, either — it’s coming in November as an update.  

Satellite SOS is free for two years, but Apple won’t say how much it costs after that

There’s another feature for grim scenarios, and that’s Crash Detection. It appears to work a lot like Google’s very similar Pixel feature and uses input from multiple phone sensors to detect when you’ve been in a car accident. The 14 and 14 Pro (and new Apple Watch models) are equipped with a specialized accelerometer that helps enable the feature, so don’t expect it to come to older iPhone models. Unlike SOS via satellite, Crash Detection doesn’t require any input from the user. If it detects a crash, it will display a prompt to call emergency services or dismiss the notification. If you don’t respond within 20 seconds, it will automatically call for help. We haven’t quite figured out how to crash a car to test it yet, but we’ve got some ideas. Anyway, assuming it works the way it should, it’s free and requires no setup — nice to have even if you never have to use it.

That’s really everything you need to know about the 14 Pro and 14 Pro Max. There’s slightly more to say, but most of it doesn’t change the fundamental iPhone equation: the 14 Pro is the only model this year with Apple’s latest A16 Bionic silicon, which is more powerful and has a faster GPU. It’s fast, sure, but so were the 13 Pro and the 12 Pro before it, so it’s hard to quantify the difference here. Apple’s performance advantage continues to be best expressed as longevity: these phones are so fast, they won’t feel slow for years to come.

The Pro is available in four colors, including the new purple and darker black options. We have the purple and the black, and everyone seems to really like them. But, well, almost everyone puts a case on these phones anyway. Actually, the Max is so big it’s almost not usable without a case to add some grip to the situation.

Apple seems committed to muted colors for its Pro phones.

The way I’ve been thinking about Apple’s current iPhone lineup is that the iPhone 13 Pro was the culmination of a lot of ideas for Apple — it was confident and complete and kind of hard to criticize. The iPhone 14 is basically a slight remix of the 13, and it feels like the same culmination of ideas in many ways.

The iPhone 14 Pro, on the other hand, is the clear beginning of lots of new ideas, like the Dynamic Island, the new camera, and that satellite connectivity system. Because these ideas are new, they’re inherently incomplete. But they’re worth criticizing, which is its own kind of victory and a sign that Apple isn’t holding still with the future of the iPhone. I think we could all stand to think more deeply about how our smartphones work, and things like the Dynamic Island are evidence that Apple is still thinking deeply about parts of the iPhone experience.

The 14 Pro is the clear beginning of lots of new ideas

What I don’t know is if all these new ideas are worth it yet. If you’re the sort of person who’s willing to accept some rough edges to be on the bleeding edge, you’re going to have a lot of fun with the iPhone 14 Pro — in many ways, you’ll be figuring it out right alongside Apple. But if you’re happy with your current phone, it might be worth holding out for another year to see how some of these things work out.

Photography by Amelia Holowaty Krales and video by Becca Farsace. Additional reporting and testing by Allison Johnson and Becca Farsace.

