The thing that excites me most about the Apple Vision Pro is the ability to capture someone's point of view.
General-purpose vision tracking at this scale and resolution, in a consumer device, is an entirely new thing. And I think it has the potential to alter the course of the human experience.
Cameras can record a scene. Video cameras can record a scene as it unfolds, frame after frame. And a skilled photographer, or a cinematographer and director working together, can direct your gaze to a specific point, region, or sequence. And that's given us amazing pictures, movies, and visual experiences.
But what I think is super interesting about the Apple Vision Pro is the potential to literally see through someone else's eyes. Not just see their field of vision — you can get at that with head- or eyeglass-mounted cameras — but to actually see where they're looking. To know what they're focused on. To lock in with them. To see how they see. To watch them look, from their point of view.
Standing in someone's shoes is one thing, but even if you could do that, you'd still be looking through your own eyes. To literally see as they see, from their point of view, feels groundbreaking.
If I were making an app for this, I'd call it "See With".
Imagine being able to see with an artist. What do they look at when they paint? Where are they looking? How are they looking? How often are they darting around vs. fixing their focus on a specific spot? When they choose a brush, are they looking at the toe, the belly, or the heel of the brush? Or is it the right handle they're after, for the control they seek?
Imagine being able to see with a furniture designer. What do they look at, specifically, when they look at a piece of wood? How do they see the grain? When do they look at the whole vs. the tiniest detail? How do they see their tools? How do their eyes help them make their choices and decisions?
Imagine being able to see with a florist. What part of the flower are they looking at? What catches them? When they look at the entire bouquet, what bits do they focus on first? When they're looking through 36 roses, how do they see them? What are they looking at to pick that specific one?
Imagine being able to see with a programmer. When they look at code, what do they see? Where are they looking? What are they looking for? How often are they referencing this bit or that bit? Do they continually read through the entire thing like a story, or are they just focused on one line at a time? How do they spot a bug?
Same with a writer. A drummer. A driver. A fly fisherman. A landscape designer. A tailor. The list goes on.
I'd find this endlessly fascinating. Add in voiceover where someone can describe what's going through their head while they work, choose, consider, and see, and you've got a first-person observational learning experience that's radically new.
I hope this happens.