Google turns Android into a VR-ready OS
MOUNTAIN VIEW, Calif.—As we’ve been obsessively tracking for about a year and a half now, Google is making a big push into virtual reality. This week at I/O 2016, the company is finally ready to talk about its VR ambitions, and the first news out of the gate is about the “Virtual Reality Mode” built into Android N Developer Preview 3. Google is also announcing a hardware certification program that allows an Android phone to earn the title “VR ready,” and the first “VR ready” phone will be the Nexus 6P.
Google has done a lot of work whipping Android into shape for VR with Android N DP 3. Previously Google’s only smartphone VR project was “Cardboard,” a cardboard box with plastic lenses that could hold a smartphone. Cardboard gave a rough approximation of VR at a very low cost, but it wasn’t a serious platform for real VR immersion.
“Google currently has Cardboard, but Cardboard worked in spite of Android, if you’d like,” explained Android VP of Engineering Dave Burke to Ars. “It’s clever and simple but we never did anything at the platform level to make it work. With N, we have.” In Android N, those changes come down to improving motion-to-photon latency—how quickly you can get the display pixels to change in response to your head moving. When you move in VR, the sensors detect the movement, signal the GPU to draw new frames, and those frames get sent to the display to be drawn. If this doesn’t happen fast enough, you’ll feel sick.
When used with a certified device, Android N can kick over into a low-latency “VR Mode,” which ratchets up the entire processing pipeline. An “exclusive performance mode”—which we spotted in N Preview 2—dedicates a CPU core to the UI thread to try to prevent nausea-inducing hiccups during heavy processing. The motion sensor pathways have been tuned as well, so Android N will get faster updates from the gyroscope and accelerometer.
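Android's exclusive performance mode is implemented inside the platform itself, but the underlying idea of reserving a CPU core for a latency-critical thread can be sketched with Linux's scheduler-affinity call. This is purely illustrative: `pin_to_core` is a made-up helper, not an Android API, and the call is Linux-only.

```python
import os

def pin_to_core(core: int) -> set:
    # Pin the calling process to a single CPU core. Android N's
    # "exclusive performance mode" does something conceptually similar
    # for the UI thread inside the platform; this is just the general
    # affinity mechanism exposed on Linux, shown for illustration.
    if hasattr(os, "sched_setaffinity"):   # Linux-only call
        os.sched_setaffinity(0, {core})    # 0 = the calling process
        return os.sched_getaffinity(0)     # confirm the new CPU mask
    return set()                           # platform doesn't support it
```

With the thread nailed to one core, a burst of background work on the other cores can no longer preempt the rendering loop mid-frame, which is exactly the "nausea-inducing hiccup" the mode is trying to prevent.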
In Android, the GPU normally passes frames to the display in a “double buffering” mode. The screen shows “Frame A” while the GPU draws “Frame B” into an intermediate frame buffer. When it comes time for the next display refresh (this happens 60 times a second), the display swaps out “Frame A” and pulls “Frame B” from the frame buffer. Double buffering prevents the undesirable visual phenomenon known as “screen tearing,” but the extra step of drawing to the frame buffer delays the frame data: you’re a frame slower than you could be.
In VR Mode, Android N switches to single buffering. This skips the intermediate frame buffer and draws the frame data directly to the display, which is the fastest way to get new pixels on the screen. The tricky part is that the GPU frame refresh has to perfectly sync up with the display’s own refresh cycle or you’ll get more screen tearing—a mixing of Frame A with Frame B, which makes the picture look like it was cut in half. A display fills in the pixels row by row, and as Burke puts it, you’re “chasing the scan lines” of the display with the GPU when single buffering. There’s no room for slowdowns or the image suffers.
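The latency cost of that extra buffer hop can be put in rough numbers. Below is a toy model (my simplification, not Android's actual scheduler math): with double buffering the finished frame waits for the next vsync and then takes a full refresh to scan out, while a single-buffered renderer that successfully "chases the scan lines" gets pixels to the panel essentially as they are drawn.

```python
REFRESH_MS = 1000.0 / 60.0  # one refresh interval at 60 Hz, ~16.7 ms

def motion_to_photon_ms(render_ms: float, double_buffered: bool) -> float:
    # Toy latency model. Double buffering: the finished frame sits in
    # the back buffer until the next vsync, then needs a full refresh
    # to scan out row by row. Single buffering: best case, pixels hit
    # the panel as soon as the GPU writes them.
    if double_buffered:
        wait_for_vsync = REFRESH_MS - (render_ms % REFRESH_MS)
        return render_ms + wait_for_vsync + REFRESH_MS
    return render_ms  # assumes the GPU never falls behind the scanout
```

For an 8 ms render, the model gives roughly 33 ms double-buffered versus 8 ms single-buffered, which is why VR Mode is willing to accept the risk of tearing in exchange for the direct path.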
VR Mode also adds “electronic display stabilization” (AKA “Time Warping”) to the mix. In a normal VR frame rendering pipeline, the position of the user’s head is measured, the in-game virtual camera is pointed to a matching position, and the frame is rendered accordingly. On Android N this happens 60 times a second, so you have 16ms to get the frame rendered and pushed to the display. The problem is, the position of the head was measured at the beginning of the frame rendering process, and by the end it’s 16ms out of date. Time Warping improves this by taking the finished frame, measuring the head movement that occurred in those 16ms, and transforming the finished frame a tiny bit to match. This transformation isn’t perfect, but when you’re only moving a tiny distance, it’s a useful “cheat” that cuts down on the perceived latency.
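A radically simplified sketch of that correction: real Time Warp re-projects the image with a full 3D rotation, but for a small pure yaw (left/right) turn the result is approximately a horizontal pixel shift. The field of view and render width below are assumed placeholder values, not specs of any actual headset.

```python
FOV_DEG = 90.0    # assumed horizontal field of view (illustrative)
WIDTH_PX = 1440   # assumed per-eye render width (illustrative)

def timewarp_shift_px(yaw_at_render: float, yaw_at_scanout: float) -> float:
    # The frame was rendered for the head pose measured ~16 ms ago.
    # Re-measure the pose just before scanout and translate the
    # finished image to compensate for the rotation in between.
    degrees_turned = yaw_at_scanout - yaw_at_render
    pixels_per_degree = WIDTH_PX / FOV_DEG
    return degrees_turned * pixels_per_degree
```

With these numbers, a 1° head turn during the render maps to a 16-pixel shift of the finished frame, which is far cheaper than re-rendering the scene and good enough to hide most of the 16ms of staleness.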
The end result of all this work? For the Nexus 6P on Android 6.0 Marshmallow, motion-to-photon latency was over 100ms. On Android N it has been reduced to about a fifth of that: less than 20ms. That should put the headset with the 6P in the same league as the Gear VR, which is advertised as having “sub-20ms” motion-to-photon latency.
Parts of the Android system UI are being updated to work in VR, too. If you’re in a VR app and a notification comes in, Android’s notifications will be rendered in stereo so they can seamlessly pop into the VR environment. And it won’t stop with notifications: Dave Burke told us that Google intends to update even more of the Android UI to be VR compatible. “For what we’re launching with N, it’s just a VR Mode, so it’s up to the app to do something,” Burke said. “And then in the future we’ll have a VR Home, like a launcher.”
Google’s “VR Ready” smartphone certification
Low motion-to-photon latency requires hardware and software working together, so Google will be passing down requirements to OEMs to ensure their devices work with Android’s new VR Mode. This will be a whole new section of the Android Compatibility Definition Document, with requirements and tests to verify that an OEM’s phone is up to Google’s VR standards.
For now, only the Nexus 6P makes the “VR Ready” cut. The Nexus 5X isn’t eligible for one very important reason: it has an LCD screen. AMOLED displays are mandatory.
An LCD works by using a twisted spiral of liquid crystals to control light flow between a pair of polarizers. Applying voltage causes the crystals to straighten out, which, combined with the polarizer layers, turns light off (twisted crystals block the light) or on (straight crystals align with the polarizer and allow light through). Twisting and untwisting the crystals takes time, called the “response time” and usually measured in milliseconds. AMOLED pixels, meanwhile, are literally just tiny LEDs (active matrix organic light-emitting diodes), so flipping them on and off is a much quicker matter of simply applying and removing current.
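The difference can be modeled crudely as a first-order transition: pixel brightness approaches its target exponentially with some time constant. The time constants below are illustrative guesses, not measured figures, but they show why an LCD pixel is still mid-transition a full 60Hz frame after it was told to change, while an OLED pixel is effectively done instantly.

```python
import math

def pixel_level(target: float, tau_ms: float, t_ms: float,
                start: float = 0.0) -> float:
    # First-order model of a pixel transition: brightness approaches
    # the target exponentially with time constant tau. Smaller tau
    # means a faster response and less motion blur.
    return target + (start - target) * math.exp(-t_ms / tau_ms)

LCD_TAU_MS = 5.0     # liquid crystals physically twisting: milliseconds
OLED_TAU_MS = 0.001  # an LED switching current: effectively instant
```

Under this model, one millisecond into a black-to-white transition the LCD pixel has reached under 20% of its target brightness, while the OLED pixel is already there; that lingering smear of in-between states is the motion blur Google's certification is trying to rule out.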
Google’s VR program mandates AMOLEDs due to their much faster pixel response time. This cuts down on motion blur, which is critical when you’re whipping your head around in VR. When it came time to pick displays for the Oculus Rift and HTC Vive, those companies went with AMOLED, too—LCDs just aren’t cut out for VR. However, unlike current consumer VR head-mounted displays, Android displays still only run at 60Hz. This is significantly slower than the 90Hz refresh rates of the Oculus Rift and HTC Vive.
Democratizing Gear VR-style devices
Most of this sounds like it is following the path blazed by Oculus and Samsung with the Gear VR. It’s instructive to read this list of improvements and then watch this Oculus Connect 2014 talk from legendary developer John Carmack. In the talk, Carmack details the various hacks he did to improve the Gear VR, which at the time only worked on the Galaxy Note 4.
For example, Carmack discusses how he needed to cut down Android’s triple buffering graphics pipeline to something faster and how he needed a high-performance mode that disabled many of the power saving features. He talks about how he practically invented Time Warp for modern VR. For the first run of the Gear VR, Carmack described the development process as being like developing for “a game console,” since Oculus was only targeting a single device, the Note 4. Google’s integration of similar improvements directly into Android should open the VR floodgates for the rest of the Android ecosystem. There will now be a standard that all smartphone OEMs can follow to make a VR-ready phone. And, since it’s part of Android, it’s all being open sourced. This will probably result in a ton of Gear VR competitors, but it should help Samsung and Oculus improve their products, too.
Now that the Android software is equipped to pump out decent VR, we just have to settle down and wait for hardware. Surely, Google can’t expect users to continue to use a cardboard box, right? We don’t have any details on a “Google VR” head-mounted device yet, but we’ll keep our eyes peeled at the show.
Via: Ars Technica | Ron Amadeo