Learning about a world outside the car window

Tony Hicks
5 min read · Jan 4, 2021


Knowledge is like air — everywhere and not necessarily noticeable.

Some people absorb knowledge best without having to focus on the learning itself, benefiting from seeing something and then doing it. It’s a way of connecting the dots more easily, with some explanation.

Riiid Labs wants to help connect more dots by developing augmented reality in the vehicles in which we spend so much time.

“Augmented reality is like a sibling to virtual reality, but they’re different,” said Samy Abbas, a senior UI and UX designer for the Riiid Labs innovation team. “When you put on a virtual reality headset, you’re looking at a different world.

“Augmented reality is overlaying information on top of reality.”

In this case, “reality” is the world someone can see from inside a vehicle. Modern technology can teach in an otherwise mundane setting simply by pointing out landmarks in their historical or scientific context. Artificial intelligence can preview what’s ahead, where to find favorite types of food, or the best places to stay.

It can tell passengers which stars are visible through the darkness. It can point out the people who passed this way before. It can answer whatever questions we might have about just about anything out there. A vehicle becomes so much more than a simple means to get from one place to another.

“This idea began when Rob (Barrett, Riiid Labs chief innovation officer) asked ‘How can education leave the classroom?’” Abbas said. “There’s still in-person class, of course. But why does learning have to stop there? Why can’t learning continue when you’re out in the world?

“Rob framed the next question like this: ‘If you’re eating a Pop Tart, the knowledge baked into this Pop Tart is so much more than we can imagine. How much nutrition is in this? What’s the process that goes into making it? How much knowledge goes into making a Pop Tart?’

“There’s so much knowledge in the world and we don’t see it.”

Abbas said to imagine a high school student out with the family during a school break. Traditionally, a “break” for a student means fleeing the tedium of the same old classroom learning model.

“What if, five years from now, your child is learning physics at school?” Abbas said. “Your family takes winter break to go to Yosemite. What if those physics lessons can be applied to the surrounding environment? As they’re gazing out the window of the family car, the information they just learned in class is now being applied to the outside world. Maybe the augmented reality can break down the physics and place it into the context of the outside world. That’s the superpower here. There are missed opportunities all around them.”

Abbas said planners have moved from conceptualizing to building the requisite AI to spread education into conduits like vehicles, glasses and other everyday objects.

“The virtual Lego blocks to build it are already being constructed today,” he said. “We’re going to have what some are calling reality 2.0. Think about how fast the progress is being made. Just 20 years ago, dads were unfolding maps and Thomas Guides to get directions. We’re living in a completely different reality now, and it’s only been 20 years.”

“Now, fast forward to 20 years from now, and we’ll have all this overlaid information wherever we go on a reality chip,” Abbas said. “It’s so crazy. My child will say ‘How did you do anything?’”

Knowledge will appear and surprise people, Abbas said, like kids playing Pokémon Go.

“It’s world building,” Abbas said. “Can you touch it? Probably. We can tap into that world through a phone screen. It will be built into glasses, or a window or anything else we can think of.”

Officially, Abbas said, the five-person Riiid team is now actively building its concepts. “There’s a million different directions we can take this. Right now, we know we’re building an augmented reality demo for a car, as opposed to a house. The unique thing is you’re moving through space. Now we have to dive deeper and find the constraints, which can also breed unique opportunities.”

Abbas said developers must consider the three different states vehicles can be in: parked, driving approximately 30 miles per hour on surface streets, and moving 70 to 90 miles an hour on highways.

“What makes two and three different is the parallax effect (in which the background and the foreground of a particular view pass at different rates). The closer something is to you, the faster it whizzes by, like trees. Something might only be in the field of view for two to four seconds. Things farther afield take longer; the peak of a mountain can be there for 30 minutes.”

“Where’s the sweet spot? How much time do we have to overlay the information? What’s the minimum distance to give the user time to digest the information overlay?” Abbas asked. “Maybe on a freeway we need to increase that distance. Right now, we’re figuring out what that minimum distance is to consume information.”
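That sweet spot lends itself to a rough back-of-the-envelope calculation. The sketch below is not a Riiid Labs model; it simply assumes a side window that exposes roughly a 90-degree wedge of view and an object sitting at some perpendicular offset from the road, then estimates how long that object stays in view at surface-street versus highway speeds, and how far off the road an overlay would need to sit to stay readable for a few seconds. The function names and the sample numbers are illustrative only.

```python
# Illustrative sketch of the time-in-view question, not a Riiid Labs model.
# Assumptions: the side window shows a ~90-degree wedge centred on the
# perpendicular to the direction of travel, and the object sits at a fixed
# perpendicular offset from the road.

import math

MPH_TO_MPS = 0.44704  # miles per hour -> metres per second


def time_in_view(offset_m: float, speed_mph: float, fov_deg: float = 90.0) -> float:
    """Seconds an object at `offset_m` metres from the road stays visible.

    The object is visible over a road stretch of 2 * offset * tan(fov / 2),
    so dividing by vehicle speed gives the viewing window in seconds.
    """
    speed_mps = speed_mph * MPH_TO_MPS
    visible_stretch_m = 2 * offset_m * math.tan(math.radians(fov_deg / 2))
    return visible_stretch_m / speed_mps


def min_anchor_offset(speed_mph: float, read_time_s: float, fov_deg: float = 90.0) -> float:
    """Smallest perpendicular offset (metres) that keeps an overlay in view
    for `read_time_s` seconds at the given speed."""
    speed_mps = speed_mph * MPH_TO_MPS
    return (read_time_s * speed_mps) / (2 * math.tan(math.radians(fov_deg / 2)))


if __name__ == "__main__":
    # A roadside tree ~15 m away, at surface-street vs. highway speed.
    print(f"tree at 30 mph: {time_in_view(15, 30):.1f} s in view")
    print(f"tree at 80 mph: {time_in_view(15, 80):.1f} s in view")
    # How far off the road must content be anchored to allow ~4 s of reading?
    print(f"min offset for 4 s at 80 mph: {min_anchor_offset(80, 4):.0f} m")
```

Under those assumptions, a roadside tree about 15 metres away stays in view for roughly two seconds at 30 mph but less than a second at 80 mph, while an overlay meant to be read for four seconds on the freeway would need to be anchored some 70 metres off the road, which is the gap between surface streets and highways that Abbas describes.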

Designers are also trying to determine what information should be included.

“What makes something unique? Where are the restaurants we like? One decision we need to make soon is what there is to learn in the space of education. Which topics lend themselves best to this space?”

While not giving specific details as to what designers will use for the first demo, Abbas said to expect relevant subjects like sustainability or astronomy (especially for night driving).

“Do we spread ourselves thin with ten subjects or maybe demo three in a more robust way? Which topics lend themselves best to the demo to show the power behind the tech?” Abbas said.

While the information would mostly be for passengers’ consumption, eventually it could include directions in the driving space.

“Maybe we could highlight the road,” Abbas said. “We could show a student ‘Hey, did you know that 80% of this road is made up of recycled tires?’ That’s just one example.”

The Riiid Labs team is exploring how overlays work with different types of glass, and how the user would interact. Would the information simply appear? Would a user ask questions? Could the user touch the glass to find the name of a plant outside, for example? Would there be a mechanism to zoom in? Can information be layered, and where should it appear?

“We’re looking at that budding tech right now,” Abbas said. “Even things that aren’t viable now but will be in two, three or four years, we can start planning for now. We’re dreaming big, but it’s all rooted in logic.”

“We’re gathering all the info we need right now. We’re researching design patterns, essentially designs that can show up in many applications. We want to piggyback on what other projects have pioneered. You want to be as understandable as possible.”

Exactly.

--

Tony Hicks

Tony is an award-winning journalist who spent more than 20 years writing news, columns, features, and music and film criticism for Bay Area News Group.