We’ve already seen a ton of wonderful things come of developers’ work with augmented reality (AR), but a seemingly non-AR project might prove the missing link that makes AR truly useful on a day-to-day basis. In the video below, Bob Burrough shows off his “environmentally-lit user interface,” which morphs as he navigates from room to room in a home, altering its lighting as he goes. The interface is crude, but we’ve all been at the alpha stage of development, so no judgments there.

Equipped with an add-on camera accessory, the phone continually captures a wide-angle shot of the environment, which is then fed into the scene. This helps the interface judge light sources and their brightness. It’s a bit like running Unreal Engine in real time, in a real-world environment: instead of feigning a light source, your device can identify where light is actually coming from, and how bright it is. This also sidesteps a major hurdle of augmented reality, namely the need to create experiences in environments that are well-lit or have “predictable” lighting conditions.

Skeuomorphism plays a key role in Burrough’s project. At the 1:45 mark in the video, he goes from a well-lit room through a dark hallway, then moves the device from shadow to light a few times. In the dark, the ‘settings’ bar at the top of the screen goes flat, and some on-screen toggles disappear (presumably because they’re not ‘on,’ and therefore unnecessary). When the device detects light, the bar lightens and takes on a nicely shaded quality. Like those old icons from iOS 6 (which had lots of shadows and realistic detailing), this lighting project helps things ‘feel’ a bit more realistic.

The testing is done on an older iPhone, which suggests any processor heavy lifting isn’t so heavy after all. The platform is clearly designed for mobile ARM processors, though it’s unclear how taxing it might ultimately prove on hardware.
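Burrough hasn’t published his implementation, but the core idea described above — sample the environment through a camera, estimate how bright it is, and switch the UI’s shading accordingly — can be sketched simply. The function names, the Rec. 709 luminance weights, and the brightness threshold below are our own illustrative assumptions, not his code:

```python
# Hypothetical sketch: estimate ambient brightness from a wide-angle camera
# frame and pick a UI shading mode, loosely mirroring the demo's behavior.
# All names and thresholds here are assumptions for illustration.

def mean_luminance(frame):
    """Average perceptual luminance of an RGB frame (Rec. 709 weights).

    `frame` is a list of (r, g, b) pixel tuples, components in 0..255.
    """
    total = sum(0.2126 * r + 0.7152 * g + 0.0722 * b for r, g, b in frame)
    return total / len(frame)

def ui_mode(frame, dark_threshold=40.0):
    """Return 'flat' in dim environments, 'shaded' when there's enough light."""
    return "flat" if mean_luminance(frame) < dark_threshold else "shaded"

# A dark hallway frame vs. a well-lit room frame:
dark_frame = [(10, 10, 12)] * 100
bright_frame = [(200, 190, 180)] * 100
print(ui_mode(dark_frame))    # flat
print(ui_mode(bright_frame))  # shaded
```

A real implementation would presumably go further, locating individual light sources and their directions rather than averaging the whole frame, but even this crude global estimate is enough to drive the flat-versus-shaded toggle seen in the video.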
While traipsing around with an accessory clipped to your phone isn’t ideal, the wide-angle lens is necessary. One way this could be better achieved is via a heads-up augmented reality display. Rumors suggest AR glasses are coming, at least from Apple (we’d guess Google might prove a little averse after its Google Glass debacle). Strategically placed cameras or sensors on the eyewear would let your augmented software measure real-world lighting accurately.

Context is king, and for AR there’s nothing more contextual than environment. Though we may overlook it, lighting plays a massive role. Something akin to Burrough’s project for augmented reality would be huge: imagine the ‘fog of war’ becoming real in an AR game, or navigation items for maps integrating seamlessly with the surrounding environment. All told, real-time lighting adjustments would make augmented reality a bit more ‘real’ and a lot less ‘augmented,’ and that’s probably best for its long-term success.