The Vision Pro’s biggest advantage isn’t Apple’s hardware - The Verge




Apple’s developers already have the tools they need to create apps for the system.


Image: Apple

Apple used the Vision Pro’s $3,499 price tag to give the headset every advantage over the competition. It has dual 4K displays, runs one of the best laptop chips in the business, and comes with sophisticated eye- and hand-tracking technologies. But it also has one advantage money can’t buy: Apple’s developer ecosystem. Perhaps the headset’s single biggest advantage will be the ability for iPhone and iPad developers to easily plug their existing apps into the device’s operating system using familiar tools and frameworks.

Already, the system stands in stark contrast to headsets from Meta, Valve, PlayStation, and HTC, which mostly rely on apps and games made in Unity or OpenXR to power their virtual and augmented reality experiences. While some competitors, like the Meta Quest, have key apps like Microsoft Office, Xbox, and Netflix, offerings beyond these are limited. In the several years that Meta’s headset has been out, only about 400 games and apps have been released on the Meta Quest Store. That small catalog isn’t necessarily a bad thing, but it’s a sign of a serious lack of content optimized for VR.

Unlike other headset ecosystems, though, Apple is promising hundreds of thousands of apps on day one, a feat it’s able to pull off thanks to work on other platforms. Apple will automatically convert iPad and iPhone apps to “a single scalable 2D window” that works on the Apple Vision Pro — with no work required from developers unless they want to make any changes. And for the developers who want to create something new for the headset, Apple is making it easy for those already acquainted with its ecosystem to create apps for visionOS, its new mixed reality operating system.

“visionOS is not so different than iPadOS with ARKit”

“visionOS is not so different than iPadOS with ARKit, the augmented reality kit that developers have had access to for a couple of years now,” Maximiliano Firtman, a longtime mobile and web developer, tells The Verge. “iOS and iPadOS developers will be able to use their classic UIKit apps, Unity apps, or their more recent SwiftUI apps for visionOS.”

The frameworks developers can use to build apps for iOS and iPadOS — SwiftUI, RealityKit, ARKit — have all been “extended for spatial computing,” Apple says, allowing developers to craft immersive AR and VR experiences for the Vision Pro. They can also build their apps with tools already familiar to devs, including Xcode and Unity, as well as Apple’s upcoming Reality Composer Pro, which should let devs “preview and prepare 3D content” for visionOS apps.

Firtman adds that even though the visionOS software development kit isn’t out yet, web developers can still use “WebXR for immersive web apps and web experiences using Safari on visionOS… as most of the knowledge needed to create apps is already out there.”

This means that, in addition to Apple’s native apps, we’ll likely see a lot of iOS and iPadOS apps make their way to the Vision Pro at launch.

For developers making the jump, Apple is encouraging them to expand what their apps can do. A simple port might display an app on the Vision Pro as a “Window,” creating a floating version in mixed reality. Apps with 3D elements might present content as a “Volume” that adds depth that’s viewable from all angles. More immersive apps might build a “Space” that can take up a user’s entire view.
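Those three presentation styles map onto distinct SwiftUI scene types. The visionOS SDK isn’t out yet, so the sketch below is only a rough illustration based on what Apple has shown so far; the app name, view names, and scene IDs are all hypothetical.

```swift
import SwiftUI

// A sketch of a visionOS app declaring all three presentation styles
// Apple describes. Details may change once the visionOS SDK ships.
@main
struct GalaxyApp: App {
    var body: some Scene {
        // "Window": a familiar 2D window floating in mixed reality,
        // essentially what an unmodified iPad app gets for free.
        WindowGroup(id: "main") {
            ContentView()
        }

        // "Volume": a bounded region of 3D content the user can walk
        // around and view from any angle.
        WindowGroup(id: "planet") {
            PlanetView()
        }
        .windowStyle(.volumetric)

        // "Space": a fully immersive scene that can take over the
        // user's entire view.
        ImmersiveSpace(id: "orbit") {
            OrbitView()
        }
    }
}

struct ContentView: View { var body: some View { Text("Hello, visionOS") } }
struct PlanetView: View { var body: some View { Text("3D content here") } }
struct OrbitView: View { var body: some View { Text("Immersive scene") } }
```

The notable design choice is that all three are declared in the same SwiftUI app, so a developer can start with a plain window and add a volume or an immersive space incrementally.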

“Apple will want to feature apps that take advantage of the new Volume and Space app paradigms,” Steve Moser, an iOS developer and the editor-in-chief of The Tape Drive, tells The Verge. “I imagine developers will want to quickly recompile their existing iOS and iPadOS apps for visionOS so that they will be on the visionOS App Store on day one and potentially get an opportunity to be featured by Apple.”

This is good news for Apple, which is looking to prime its App Store with services that make its headset useful. But the approach falls short in one area where Apple’s competitors are strong: gaming. When the device comes out early next year, Apple says it will offer over 100 games from its Arcade service, which is a nice perk, but most of these games aren’t built specifically for VR. That makes a pretty big difference, as users could just as easily whip out their iPhone or iPad to play an Arcade game rather than put on an entire headset just to play Angry Birds Reloaded or Temple Run.

After all, people are buying the Valve Index or the Meta Quest 2 just so they can access libraries of VR-only games like Beat Saber and Half-Life: Alyx. A lack of serious VR titles risks putting the Vision Pro in the same position as the Mac — a device mainly for productivity, not a hub for gaming. While Apple is trying to coax game developers into putting their titles on macOS with a new porting tool, the fact is that most developers aren’t prioritizing Mac as a platform because the majority of gamers use Windows, and up until now, Apple didn’t exactly make it easy to bring over games from other OSes. (We’ll still have to see how well these newly ported games actually perform.)

“They clearly aren’t focused on the current VR ecosystem and game developers like myself, but that may be the right move in the end.”

Even though Apple’s headset might not immediately have some of the riveting experiences that come along with playing VR games like Arizona Sunshine and Blade and Sorcery, it’s not likely to make or break the headset’s success. “They seem to be nailing all of the points that Meta has been fumbling for [the] last few years, namely overall UX,” Blair Renaud, VR game developer and the director of IrisVR, tells The Verge. “They clearly aren’t focused on the current VR ecosystem and game developers like myself, but that may be the right move in the end. For the industry to move forward, we need all the things I mentioned, not just incremental hardware improvements.”

Apple’s slow, careful approach to VR is reflected in the device itself. Instead of presenting you with a somewhat jarring and unfamiliar UI that engulfs your reality, the Vision Pro surfaces a set of recognizable apps that sit atop your real-world environment thanks to video passthrough. There is the option to turn on full VR using the digital crown, but Apple mainly reserves that mode for watching movies and videos. You won’t have to worry about getting used to controllers, either, as you can navigate the device using just your eyes and hands.

Based on first impressions of the Vision Pro, the technology is clearly there for it to succeed. But as with most devices, it’s the apps that will make it. Fortunately for Apple, it’s easier to build on a foundation that’s already established than to build one from scratch.



Microsoft’s Phone Link app now lets you use iMessage from your PC - The Verge




Phone Link finally supports iOS devices and lets you send and receive messages and calls on a Windows device.

The Phone Link app now supports iPhones.

Image: Microsoft

Microsoft is bringing access to iMessage on Windows through its Phone Link app. A preview of the updated app will be available for Windows Insiders today. The Phone Link app allows iPhone users to connect their devices to a Windows laptop or PC and, with the update, will let iPhone users send and receive messages via iMessage, make and receive calls, and see their phone’s notifications inside Windows 11.

Microsoft is using Bluetooth to link Windows devices to iPhones, passing commands and messages to users’ Messages (iMessage) app. That means you’ll be able to message contacts who also have iPhones straight from your PC, but there are some limits. You won’t be able to send pictures in messages or participate in group messages. In the Phone Link app, PC users are shown their iMessage conversations in a simplified form.

“We send the messages back and forth via Bluetooth, Apple I think in turn sends those as iMessage once it gets onto their system,” explains Yusuf Mehdi, Microsoft’s head of consumer marketing, in an interview with The Verge.

The setup process for Phone Link now includes an iPhone option.
Image: Microsoft

You also won’t see the full message history in conversations, as only messages that have been sent or received using Phone Link will be displayed. Microsoft isn’t using blue or green bubbles in Phone Link either, as the company isn’t able to differentiate between a standard text message and one sent via iMessage.

The Phone Link integration for iOS is basic compared to what’s available for Android, but Microsoft has never supported messaging or calls for iPhone users before, so this is a step in the right direction. And because Phone Link is bundled with Windows 11, it’s a native option, unlike the alternative PC link apps we’ve seen from Intel, Dell, and others.

“It’s something we’ve been wanting to do for a long time,” says Mehdi. “The experience with Android is quite good, and we felt like we needed to get something out [for iPhone].”

While there won’t be any photos integration in Phone Link, Microsoft already offers iCloud Photos integration right inside the Windows 11 Photos app. It’s easy to imagine this might appear in Phone Link at some point in the future, too. This new Phone Link support arrives alongside a big new Windows 11 update that includes AI-powered Bing on the taskbar, a screen recording feature, better touch optimizations, and more.

If you’re interested in testing this new Phone Link support for iOS, it will be available for Windows Insiders in the Dev, Beta, and Release Preview channels, but Microsoft is kicking off testing with a “small percentage” of testers this week. “We will increase the availability of the preview to more Insiders over time and based on feedback we receive with this first set of Insiders,” says Microsoft’s Windows team in a blog post.



The iPhone 14 Pro is a demonstration of what our phones could do better - The Verge




The always-on display, Live Activities, and Dynamic Island tease a frictionless future where we spend less time in apps.


Apple’s Live Activities and always-on display are a showcase for what our phones could be.
Photo by Allison Johnson / The Verge

Listening to a baseball game on the radio while following the score on your phone can make you feel a little bit like a psychic. I watch the score tick to 10-8 on the iPhone 14 Pro’s display several pitches before Dave Sims goes wild on the broadcast calling Cal Raleigh’s home run. This is a lot less fun when you see the score go the other direction, but that wasn’t the case on Sunday when the Mariners eventually beat the Blue Jays in extra innings. It was a classic — a grand slam, a tablet smashed in a fit of rage, all the stuff you love to see. I kept the final scorecard on my lock screen even after the game was over, just to keep soaking in the victory.

I’m coming back to the 14 Pro after testing a string of high-powered Android phones. I used the iPhone quite a bit last fall, but some things have changed since then for the phone’s marquee features. The Dynamic Island — that’s the free-floating notch at the top of the screen that hosts at-a-glance info — and the always-on display can do a little more these days since Apple opened up Live Activities to third-party developers. Oh, and baseball season started up again, which is an important use case for me.

At launch, Dynamic Island was limited to tasks like timers and phone call information.
Image: Nilay Patel / The Verge

Live Activities is an iOS 16 feature; it’s not exclusive to the 14 Pro. It’s a way for apps to provide live updates for time-sensitive events. They show up on the lock screen for most iPhones, but on the 14 Pro and Pro Max, some of that information will also appear in the Dynamic Island so it’s visible while you do other things on your phone. On the 14 Pro models, it also remains visible on the always-on display, which, unlike a traditional AOD, is just a dim version of your lock screen. You can set your phone down and still check the game score or the whereabouts of your Uber ride without lifting a finger. 

This neat little trifecta — more apps supporting Live Activities, the Dynamic Island, and the always-on display — brings the 14 Pro’s whole vibe into better focus now than six months ago. And I like it. I like being able to keep tabs on a Mariners game without opting in to notifications or having to pick up my phone and open an app. I like knowing whether my Uber ride is five minutes away or right around the corner without having to obsessively check the app.

Love to be browsing my favorite websites while keeping tabs on the game score.
Photo by Allison Johnson / The Verge

Ultimately, these features help address what I want less of from my phone. I want to spend less time fumbling around in apps — that “What was I doing here?” scrolling when all I wanted to do was check the weather. I want a little less friction as I go about my daily phone chores.

I know I’m not alone. In fact, there seems to be sort of a consensus lately that phones as they exist now are categorically bad, and they need to be replaced with something less disruptive and terrible for our mental health. That’s the thinking behind something like the gadget that Humane, uh, “demoed” at a recent TED Talk. Based on leaked videos, it appears to be some sort of replacement technology for your phone that includes a tiny projector you put in your shirt pocket so you can use your hand as a kind of quick info display. The premise is shaky, and the company is being secretive about what it’s actually making, but it’s hardly the first ill-advised attempt to put something in front of our faces that isn’t a phone.

The thing that the “phones are bad” crowd forgets is that phones are still utterly essential to modern life. How exactly am I going to sign my kid out of daycare using a little projector I clip to my pocket? There are a lot of things we generally like about our phones, too, that aren’t destructive to our mental health. I like that my phone allows me to confidently navigate public transit systems I have no familiarity with. I like that I have a device in my pocket that I can use to video call my parents at a moment’s notice so they can see their grandchild who lives across the country. I like that I can finish a book in the Libby app, browse what’s available from the library, and check out another book all while sitting on the bus.

I have a feeling that apps — not phones — are to blame here. App developers have lots of incentive to keep us scrolling and buying things and very little incentive to help us maintain healthy relationships with our phones. This is how we ended up in our current notification hell, with phone makers throwing us a couple of life preservers in the form of focus modes, weekly screen time totals, and scheduled notification summaries. Thanks, guys.

Apple is providing another little life raft with the 14 Pro’s new hardware features, too, but the lasting impression I have after taking a trip back to Dynamic Island is that they could do a lot more. There are obvious things that just aren’t supported right now but seem well within the current capabilities. While the Uber app supports Live Activities, Uber Eats doesn’t (yet?) support real-time updates on your dinner’s whereabouts. There’s also no way to just opt in to all real-time updates for every game your team plays — instead, you get a notification that the game’s about to start, and tapping it will take you to the Apple TV app to enable live updates. 

Live Activities are designed for events with defined start and end times. (Don’t at me about baseball games going on forever. We have the pitch clock now. Are you happy, you monsters?) They’re events that you have an obvious interest in following in real time, be it a game or a timer or a rideshare ride, and once they’re done, the information goes away. What I’d actually like more of are features that surface info related to my habits and daily activities, which is a little trickier.

Surely there are other useful things my phone could be doing for me that don’t involve selling me something

Why can’t I have bus arrival times appear in a lock screen widget whenever I’m hustling to the transit stop by my house? What if my phone automatically opened up the app our daycare uses when I approach the building, like I do without fail five goddamn times a week? I can set up an automation for this, but it’s far from straightforward and depends on me telling my phone what to do, rather than it anticipating my needs. Beyond that, have you tried setting up an iOS shortcut more complex than “open X app”? You need an advanced engineering degree to understand it. I’ll wager that most iOS users have no idea what an automation is, let alone any interest in setting one up.

The apps on my phone can tell who I’ve been hanging out with lately and what brand of artisanal candles they just bought so they can serve me the right ad. Surely there are other useful things my phone could be doing for me that don’t involve selling me something.

That’s what makes the 14 Pro’s new features feel sort of refreshing. They put useful information where I need it when I need it — mostly without additional input from me. To live up to their full potential, more third-party app makers will need to get on board, but that seems likely to happen given that Dynamic Island looks like it’ll be on all iPhone 15 models. If that’s the case, it’ll be just in time to help me keep an eye on the Mariners’ postseason games.

Correction May 5th, 3:45PM ET: A previous version of this article stated that it wasn’t possible to set up an automation to open a particular app when arriving at a given location. It is possible using a slightly different setup method, and this article has been updated to reflect that — hat tip to MacGyverLite. We regret the error.



Google’s latest Pixel drop adds macro video, cinematic wallpapers, and more - The Verge




Recent updates have been light on new tricks, but the June feature drop adds useful new capabilities and deeper personalization options for Google’s recent phones.


Photo by Amelia Holowaty Krales / The Verge

Google is just a couple weeks out from the release of its first Pixel foldable, and the company is also inching closer to the launch of Android 14. But despite that busy calendar, today, the company is announcing one of the more substantial “feature drop” software updates that it has rolled out to Pixel smartphones in some time. The additions range from camera enhancements to new personal safety features. And Google isn’t stopping at phones; this feature drop also brings new capabilities to the Pixel Watch and Fitbit devices.

For starters, the Pixel 7 Pro’s macro mode can now be used when recording video. That phone hit the market in October, and we’re firmly in June, so I’m more than a little curious about what took so long to make macro video happen. But at least it’s here. “You can create larger-than-life videos of the smallest details, like butterflies fluttering or flowers waving in the wind,” Google wrote in its blog post. The fine print notes that macro video is “not available for all camera apps or modes.” Macro video is exclusive to the Pixel 7 Pro, but all phones from the Pixel 6 onward will gain a new hands-free gesture for activating a timed shutter: you just hold up a palm within the viewfinder frame, and the camera will count down from either three or 10 seconds before snapping a shot.

Cinematic wallpapers will have a parallax effect that separates the subject and background.
Image: Google

If you’ve got a Pixel 6 or newer, you can now create cinematic wallpapers to give your lock screen greater depth and a slick parallax effect. Google says it’s using AI to “transform your 2D wallpaper photos into dynamic 3D scenes for a truly magical look.” Cinematic photos have been one of those Google Photos specialty tricks for a few years, but the app would randomly choose which images in your library were a good fit for the effect. Now, you can make them on demand. And like Apple has already done with iOS, Google is also letting you create custom, emoji-laden wallpapers.

Google is also continuing to build out its personal safety features with this feature drop. After the update, you’ll be able to “use your voice to ask Google Assistant on your Pixel phone to start emergency sharing or to schedule a safety check for some extra peace of mind.”

If you’re out for a night run, just say, “Hey Google, start a safety check for 30 minutes.” If you don’t respond to your safety check in the set duration, your emergency contacts will be notified and your real-time location will be shared.

In countries where the Pixel supports car crash detection, you can configure the phone to notify your emergency contacts — in addition to emergency services — and share your real-time location with them within seconds of an accident being detected.

There are also more subtle tweaks to haptics and adaptive charging. Google says the Pixel 6A and 7A will automatically lessen the intensity of their vibrations when on a hard, flat surface (like a table). Those devices can occasionally exhibit some unpleasant rattling at full power, so it’s nice to see Google acting on customer feedback.

You can now export videos of your Recorder voice memo transcripts.
Image: Google

And adaptive charging “now uses Google AI to help extend the lifespan of your Pixel battery. When you plug in your phone, it can predict a long charging session based on your previous charging habits, and slowly charge to 100 percent one hour before it’s expected to be unplugged.”

Last up are some upgrades to the excellent Recorder app. With this feature drop, you’ll be able to export video clips of your transcriptions (complete with speaker labels), which could be useful for social media purposes or sharing with colleagues. Google says this feature drop will reach some customers as early as today and will continue rolling out over the next few weeks.


