Your brain’s visual system is economical. Rather than capturing a big high-resolution image, only a small portion of your retina has light receptors packed together closely enough to capture fine detail. To construct a big useful picture of your surroundings, your brain relies heavily on eye movement.

One nice trick your eyes and brain use is smooth pursuit. You can lock your eyes on a moving object and move them continuously, not as a set of discrete changes, but by actually giving your eyes and head an angular velocity. Your visual system uses low-level details to perform quick error corrections to stay locked on the object. This gives the most powerful part of your retina a stable input image, so you can resolve fine detail even on fast-moving objects.

This is an elegant design. But this design breaks down if you’re looking at a screen.

Moving text is easy to read, unless it’s on a screen

Watch this animation.

[Animation: “The quick brown fox jumps over the lazy dog” scrolling across the screen at 1200 pixels/second]

Press play and try to read the text.

(Turn your phone; it’s better in landscape.)

Painful, right?

Now pick up a piece of paper, a book, or some other device. Move it in front of your screen at the same speed as this animation. You’ll see that it looks amazing compared to the animation on this screen. You’ll see that “1200 pixels/second” is not actually very fast.

The text isn’t moving, it’s teleporting

This may be obvious to some of you.

On a screen refreshing at 120 frames per second (fps), the text isn’t moving smoothly—it’s teleporting between positions every 8.3 milliseconds. Here are 12 frames of the animation:

[Animation: the same text, shown frame by frame at a simulated 120 Hz refresh rate. At 120 frames per second, the text shifts 10 logical pixels per frame.]
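The numbers above fall out of simple arithmetic; here's a quick sketch (only the 120 Hz and 1200 px/s figures come from this post):

```python
# At 120 Hz, each frame is held on screen for ~8.3 ms. Text scrolling at
# 1200 logical pixels/second therefore teleports 10 px between frames.
fps = 120                      # display refresh rate
speed_px_s = 1200              # scroll speed of the text

frame_time_ms = 1000 / fps     # how long each frame is held
shift_px = speed_px_s / fps    # the per-frame jump

print(f"{frame_time_ms:.1f} ms per frame, {shift_px:.0f} px per jump")
```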

What your moving eyes see

When your eyes track moving text on a screen, your eyes sweep smoothly while the text holds still for a frame and then jumps. So the actual image landing on your retina oscillates: it slides backward, then snaps forward, over and over.

[Animation: the actual time-varying input to your retina during “smooth pursuit” (slowed 8x), at a simulated display refresh rate]

[Image: the effective “static” input to your retina during “smooth pursuit” — the average pixel value at each position, a smeared copy of “The quick brown fox”]
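To see where that smeared image comes from, here's a minimal one-dimensional sketch (my own construction, not this post's code): while the display holds a frame still, the tracking eye keeps sweeping, so the retina integrates the held frame over one frame's worth of travel.

```python
import numpy as np

speed_px_s = 1200
fps = 120
shift_per_frame = speed_px_s // fps    # the eye sweeps 10 px while one frame is held

# A 1-D stand-in for "The quick brown fox": a 20-pixel bright bar.
image = np.zeros(200)
image[90:110] = 1.0

# The retina effectively averages the held frame at every offset
# from 0 to shift_per_frame pixels.
offsets = range(shift_per_frame + 1)
retinal = sum(np.roll(image, s) for s in offsets) / len(offsets)

sharp_width = int((image > 0).sum())      # 20 px
blurred_width = int((retinal > 0).sum())  # 30 px: smeared by one frame of travel
```

The blur adds roughly speed/fps pixels to every edge, which is why faster motion or lower refresh rates look worse.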

This phenomenon is called display motion blur. Motion is much more pleasant in the real world than on screens.

Designing for this

In the age of 120 frames-per-second ProMotion™ Retina™ displays, you may have the impression that screens are now good at showing motion. Nope.

As we try to build “buttery-smooth” animations, we need to be aware that it’s a fool’s errand. Every motion animation that is much faster than 1 pixel per frame is going to look terrible if you actually pay attention to it. The upgrade from 60 fps to 120 fps doesn’t make motion animations good, it just makes them less bad. For designers, this is an important constraint; we should avoid creating situations where the user tracks a moving object. These animations can happen, but users shouldn’t be paying too much attention to the moving objects, or they will become unhappy. Until we invent screens that move with your eyes, motion is going to look ugly on screens.
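One way to feel this constraint is to invert it: ask what refresh rate the ~1 pixel-per-frame rule would demand. (The speeds below are illustrative guesses, not measurements from this post.)

```python
def required_fps(speed_px_s, max_shift_px=1.0):
    """Refresh rate needed so motion teleports at most max_shift_px per frame."""
    return speed_px_s / max_shift_px

# Illustrative speeds: a slow drift, this page's demo, a brisk page-swipe.
for speed in (120, 1200, 3000):
    print(f"{speed} px/s needs {required_fps(speed):.0f} Hz")
```

Even the modest demo on this page would need a 1200 Hz display, an order of magnitude past ProMotion.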

Motion in movies

In these animations, your computer is generating individual discrete frames. In video recordings, the camera stores an aggregate blurred image for each frame. So, videos have pre-blurred images; they essentially render the blurry version of “The quick brown fox” above. Like software designers, filmmakers have to design around this; they need to avoid situations where viewers are tracking fast-moving objects.

Pre-blurring the images obviously doesn’t solve the problem of making the image look realistic and readable. It does solve a different problem: making the animation look smooth and blurry, rather than choppy. Game developers intentionally include “motion blur” in games, rendering multiple positions of an object into a single frame to simulate the film effect. This blurs the image, but it also makes it more obvious what motion occurred, which is important in fast-moving video games, where it’s more important to see where an object went than to show its fine detail in individual frames.
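A toy version of that accumulation trick, as my own sketch (the `render` function is hypothetical): draw the object at several sub-frame positions and average them into one output frame.

```python
import numpy as np

def render(position, width=64):
    """Hypothetical renderer: an 8-px bright square on a 1-D scanline."""
    frame = np.zeros(width)
    frame[position:position + 8] = 1.0
    return frame

speed_px_per_frame = 12   # how far the square travels per displayed frame
subsamples = 8            # sub-frame positions blended into one frame

start = 20
blurred = np.mean(
    [render(start + speed_px_per_frame * k // subsamples) for k in range(subsamples)],
    axis=0,
)

# A single sharp render lights 8 pixels; the accumulated frame smears the
# square across its whole path, recording where it went.
sharp_px = int((render(start) > 0).sum())   # 8
smeared_px = int((blurred > 0).sum())       # 18
```

The smear is less readable frame by frame, but it encodes the motion itself, which is the point.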

An interesting corollary of this post: in video games, even without rendered motion blur, you’ll still perceive moving objects as blurry if you’re viewing them on a screen.

Attaching a screen to your face

Screens are incompatible with smooth pursuit, and this has implications for VR / AR / “spatial computing”, since anything that gets rendered into the world will be blurry while you move.

What would happen if you wore an Apple Vision Pro while driving? Let’s assume it works perfectly, with a perfect-resolution screen and perfect passthrough.

Each of these frames is a clear picture of a sign, and yet you perceive a blurry one, because the screen is not behaving like reality.

[Animation: the actual time-varying input to your retina as you track a highway sign through passthrough (slowed 8x)]

[Image: the average pixel at each position — the smeared sign you actually perceive]

[Image: the sharp sign you would see if you took the VR headset off]

[Interactive controls: sign distance, simulated display refresh rate]
This might mean that the soldier or police officer of the future will not use VR with video passthrough.

What about AR glasses? Passthrough VR can create amazing experiences, but it has the problem above. Glasses instead let the real world pass directly through and then add their own photons on top, so the world itself stays sharp.

[Image: an AR kitchen scene, with text overlays labeling the actual food ingredients. Source: Meta]

As you’re working in the kitchen, these text overlays will be blurry. You’ll be forced to pause more often. AR devices have to render motion that otherwise isn’t there, because you are always moving. A screen in the kitchen can actually be useful, because it sits still. But with AR glasses, everything that “sits still” is actually constantly moving on your AR screen.

I wonder what the AR / VR people will come up with. Do their hardware labs have secret rotating screens that move with your eyes? Or will they treat this as a design constraint, and carefully design experiences around the limitations of screens?

(Thanks to Rosanne Liu and Charlie Liu Lewis for reading and listening to drafts of this post.)