Thoughts after the Apple iPad event, with implications for Vision Pro: Today, Apple positioned the iPad and Vision Pro for professional use, including movie production and sound editing (e.g., Final Cut Pro and Logic Pro on the iPad) and training (Vision Pro). They also updated the Apple Pencil. Here's an exciting idea:

An issue for some with Vision Pro has been the lack of strong hand-controller integration, especially compared to more gaming-centric headsets. For serious use of Vision Pro's initial major pro app, Excel, I think it helps to use a physical keyboard and trackpad, which it does support. But that's not rich enough for many more advanced uses.

I think in the not-too-distant future we'll see the iPad integrated with Vision Pro the way the Mac has started to be, if not more so. You'll use an iPad, perhaps with a Magic Keyboard, and the new Apple Pencil Pro for professional-level control. Having a pencil with squeeze, barrel roll, haptic feedback, hover, etc., along with the current full-motion hand and arm tracking in 3D space, gives you the start of a very rich and precise way of interacting with spatial computing. Moving a pencil on the hard iPad surface could be far superior to waving something in the air or using a joystick. The Mac isn't built for pen input, but the iPad is. I'm thinking long-term, not just about the current headset. The videos they showed of their pro apps on the iPad, and the Vision Pro update that touted a film director using it to oversee the editing and visual effects for an upcoming film, hinted at this convergence to me. I wonder if it's true.

@danb The AvP can essentially project any "device" that seems appropriate for a task into your hands. For haptic feedback, all you might need is a plain plate.

@helge I wasn't thinking of the iPad just as an image. Yes, the AvP can do that, and probably would to some extent. Their pencil, though, is tuned to writing on the iPad's screen and may need some electronics and processing there. (AI for good tracking?)

@danb I think what I'm saying is that it doesn't make much sense to use devices designed for the real world, as-is, within visionOS. Yes, you might need a smarter plate, but it wouldn't need that fancy OLED display, since it goes through the cameras anyway. Better to have the device render directly into your AvP (which is what the macOS screen does).


@helge Yes, I don't think a great OLED display is needed. Yes, like the Mac. I was thinking more of having the thing you are manipulating be not below the pen but elsewhere, like a mouse on the desk with the screen vertical in front of you. I guess I wasn't clear enough.
