Breaking down the language barrier between autonomous cars and pedestrians

A case study proposing a way for self-driving cars to communicate their intentions on the road.

You’re walking down the sidewalk and reach an intersection at the same time as an Uber. The driver raises his hand off the wheel and waves, motioning you to cross. A minute later you reach a second intersection right after another car does. This driver lifts his hand, telling you that he’s going first.

But as you reach the next intersection, a self-driving taxi comes to a stop. How do you know what its plans are? Should you go first or does its AI brain think the car should?

This question is being asked across the autonomy world: how can a self-driving car tell the people around it what it’s thinking?

The lay of the land

Right now, the majority of companies aren’t addressing this question at all. Alphabet’s Waymo and GM’s Cruise haven’t mentioned it once in public, which is strange for two companies that prioritize safety over profitability.

Most companies just have ideas of what they’ll do at some point in the future. Zoox, the startup that’s planning to build an autonomous vehicle (AV) from the ground up instead of retrofitting SUVs the way Waymo and Cruise do, has one of these ideas.

Bloomberg Businessweek: Zoox’s prototypes VH1, VH2, and VH4

“Lines of LEDs on the front and rear of the vehicle would send signals to other drivers… its directional sound system would let out a bleep or a blurp to tell a pedestrian in a crosswalk that the vehicle saw him,” writes Bloomberg Businessweek’s Ashlee Vance.

Although still in the planning phase (its first car is set to debut in 2020), Zoox’s current proposal only strengthens the language barrier between man and machine.

By using external lighting, Zoox is trying to build a “light language” of sorts. The question is, “How will the public be educated about this new standard?”

With the introduction of new colors, flashing patterns, and even new lights on the car itself, there’s a high chance of people being flustered, not knowing what’s going on. The same can be said of the “directional sound system” Vance mentions. Will people understand the distinction between aggressive and peaceful tones?

People resist change; it’s a fact. Take, for example, when the U.S. tried to switch from the imperial system to the metric system. So few people wanted to learn a new measurement system that the effort failed. A handful of towns changed their road signs from miles to kilometers, but the push never went further than that.

drive.ai: a passenger entering one of their taxis in Frisco, TX

Drive.ai is one of the few companies with publicly accessible cars on the road. And as a matter of fact, it’s the only one whose cars communicate with pedestrians at all.

Piloting a ride-hailing service down in Texas, its big orange vans come equipped with screens on every side of the car. By displaying only one language, with text readable only from certain angles, drive.ai has cornered itself into a design problem.

Take my hometown as an example: in New York alone, 800 languages are spoken; English just isn’t the main language of many New Yorkers. Using text to convey intentions locks drive.ai into a certain demographic.

In the U.S., around 21 million people are also visually impaired. For maximum visibility, there has to be contrast on a large scale, and screens are notorious for failing to provide that in broad daylight, where glare becomes a problem.

Nissan Research Center: a slide shown at Google’s Design Is [Autonomous]

Nissan hasn’t announced its car yet, but it has a ton of research under its belt. Combining Zoox’s and drive.ai’s plans, Nissan wants to use external lighting as well as screens.

This brings back the issue of contrast: those lights and screens will have to be extremely bright, especially in the daytime, to draw attention and communicate intention. This form of communication, a language between car and pedestrian, needs to be stronger.
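To get a feel for the scale of that brightness problem, here’s a rough back-of-the-envelope sketch in Python. The sunlight level, asphalt reflectance, and target contrast below are illustrative assumptions on my part, not figures from Nissan or anyone else.

```python
import math

# Rough daylight-visibility sketch for icons shown on pavement.
# Every number here is an illustrative assumption, not a measurement.

AMBIENT_LUX = 100_000      # direct sunlight hitting the pavement (assumed)
ASPHALT_REFLECTANCE = 0.1  # typical diffuse reflectance of asphalt (assumed)
TARGET_CONTRAST = 3.0      # luminance ratio needed for an icon to pop (assumed)

def luminance(illuminance_lux: float, reflectance: float) -> float:
    """Luminance (cd/m^2) of a matte (Lambertian) surface: L = E * rho / pi."""
    return illuminance_lux * reflectance / math.pi

background = luminance(AMBIENT_LUX, ASPHALT_REFLECTANCE)

# For contrast C = (L_background + L_added) / L_background, the light source
# must add (C - 1) times the ambient illuminance inside the icon's footprint.
added_lux = (TARGET_CONTRAST - 1) * AMBIENT_LUX

print(f"Sunlit pavement luminance: {background:,.0f} cd/m^2")
print(f"Extra illuminance the icon needs: {added_lux:,.0f} lux")
```

Under these assumptions, the icon needs roughly 200,000 extra lux; over even one square meter that works out to about 200,000 lumens, orders of magnitude beyond a typical projector, which is exactly why daytime legibility is such a hard problem.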

When I saw this, I had to ask myself, “What’s a solution that requires no learning, but is effective for everyone in the community?”

The spark

New York City is notorious for its energy. Born and raised in New York, I walk fast. And as anyone who has watched New Yorkers cross against the light knows, one of our favorite sayings is, “If it’s red, it’s green.”

Citi Bike was a natural fit for NYC, letting New Yorkers get where they need to go faster than before. Even though the city’s main form of transportation is the subway, Citi Bike, and biking as a whole, is becoming more and more popular in the Big Apple.

Credit to Citi Bike and Blaze

Last year, Motivate put a laser on Citi Bikes’ baskets-of-sorts-but-not-really to project a bike icon on the pavement 15 feet ahead of the rider (they’re looking at you, person who doesn’t look up from their phone). Not only is this visible at all times of day, it’s a perfect warning system for NYC’s nascent network of bike lanes.

Although this laser has its cons (it’s pretty shaky on the pavement and a little small), it got me thinking. “Could this be adapted to AVs?”, “Could this change colors?”, and “What would happen if these lasers were attached to a motor and gimbal?” all popped into my head immediately.
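A quick bit of trigonometry shows both how the laser’s geometry works and why the projection wobbles; the mount height here is my assumption, while the 15-foot throw comes from Motivate.

```python
import math

# Geometry of a basket-mounted laser that throws an icon 15 ft ahead.
# The mount height is an assumption for illustration.

MOUNT_HEIGHT_FT = 3.5     # assumed height of the Citi Bike basket mount
THROW_DISTANCE_FT = 15.0  # how far ahead the icon lands (per Motivate)

# Depression angle below horizontal needed to hit the target distance.
angle = math.atan(MOUNT_HEIGHT_FT / THROW_DISTANCE_FT)
print(f"Required tilt: {math.degrees(angle):.1f} degrees below horizontal")

# Why it looks shaky: distance d = h / tan(theta), so a small angular
# wobble d(theta) shifts the icon by roughly h * d(theta) / sin^2(theta).
jitter_deg = 1.0
shift_ft = MOUNT_HEIGHT_FT * math.radians(jitter_deg) / math.sin(angle) ** 2
print(f"A {jitter_deg:.0f}-degree wobble moves the icon about {shift_ft:.1f} ft")
```

At that shallow angle, a one-degree handlebar wobble slides the icon more than a foot along the road, which matches what you see on the pavement.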

The proof of concept

While playing around in Sketch, I realized that cars could use this technology just like bikes do, only better. Enter Wave: instead of turning its lights red to signal that a pedestrian should stop, a car could project a stop icon onto the pavement, clearly communicating its intention.

Because the projector is attached to a motor, the projection could even stay in place while the car drives toward it, until the car eventually drives “over” the projection itself.
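Here’s a minimal sketch of that stabilization idea, assuming a hypothetical mount height, approach speed, and update rate: the icon’s spot on the road stays fixed in the world, and the motor simply re-aims the projector downward as the car closes the gap.

```python
import math

# Minimal sketch: hold a projected icon at a fixed spot on the road
# while the car approaches it. Mount height, speed, and update rate
# are assumptions for illustration, not specs from any real system.

MOUNT_HEIGHT_M = 0.8  # assumed projector height on the car's nose
ICON_AHEAD_M = 12.0   # icon starts 12 m ahead of the car (assumed)
SPEED_MPS = 5.0       # car approaching at roughly 18 km/h (assumed)
DT_S = 0.5            # re-aim the projector twice a second (assumed)

def tilt_deg(distance_m: float) -> float:
    """Downward tilt that lands the beam `distance_m` ahead of the mount."""
    return math.degrees(math.atan2(MOUNT_HEIGHT_M, distance_m))

car_x, icon_x = 0.0, ICON_AHEAD_M  # world positions along the lane
while icon_x - car_x > 0.5:        # stop just before driving "over" the icon
    remaining = icon_x - car_x
    print(f"car at {car_x:4.1f} m -> tilt {tilt_deg(remaining):5.1f} deg down")
    car_x += SPEED_MPS * DT_S      # the car moves; the icon does not
```

The tilt steepens from about 4 degrees to over 20 as the car closes in, which is exactly the motion a motorized gimbal would have to perform.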

To prototype exactly that, my first stop was the Noun Project, the one-stop shop for icons.

The Noun Project’s results for stop sign

With the majority of the icons being octagons (duh), I set out to design what a car would look like projecting a stop sign onto a crosswalk.

I thought that this way, a pedestrian wanting to cross the street would see that the car was going to go first, and they should stop before crossing.

The “I don’t get it”s

When I showed my friends Wave, I was surprised to be met with confused faces. “Wait, is the car saying it’s stopping, or is it telling the pedestrian to stop?”

I realized a fatal flaw in my design: it wasn’t neutral ground between drivers’ and pedestrians’ separate languages. While cars yield when they see a red octagon, pedestrians only stop when they see a red hand. More confusing still, cars go when they see a green circle, while pedestrians cross when they see a white, walking stick figure.

It was clear that there needed to be a sharp distinction between a projection signaling the car’s intention and one asking the pedestrian to act.

And because the meanings of colors and icons differ around the world, I couldn’t stop at the Noun Project. I needed to scour the web for commonalities among countries’ traffic icons to create colored icons that could be understood by all.

The next step

After researching, I found that for an AV to communicate with a pedestrian, the car needs to project the icons pedestrians already recognize when they’re on the move. In a crosswalk scenario, the car projects a green walk icon to tell the pedestrian they can cross. And once the pedestrian has crossed, the car projects a red hand to signal to others that it’s now the car’s turn.
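As a toy illustration of that hand-off, here is a simple decision function. The inputs, state names, and triggering logic are all my assumptions sketched for clarity, not anything taken from a real AV stack.

```python
from enum import Enum, auto

# Toy model of the crosswalk hand-off described above. The inputs and
# the rules are illustrative assumptions, not a real AV's logic.

class Icon(Enum):
    GREEN_WALK = auto()  # "you may cross"
    RED_HAND = auto()    # "the car is about to go"
    NONE = auto()        # project nothing

def crosswalk_icon(pedestrian_waiting: bool,
                   pedestrian_in_crosswalk: bool,
                   car_stopped: bool) -> Icon:
    """Pick what Wave should project, given a simplified view of the scene."""
    if pedestrian_in_crosswalk or (pedestrian_waiting and car_stopped):
        return Icon.GREEN_WALK  # yield and invite the pedestrian across
    if not pedestrian_waiting and not pedestrian_in_crosswalk:
        return Icon.RED_HAND    # crossing is done: signal the car's turn
    return Icon.NONE            # e.g. still braking; stay silent for now

# The scenario from the paragraph above, step by step:
print(crosswalk_icon(True, False, True))   # Icon.GREEN_WALK (inviting)
print(crosswalk_icon(False, True, True))   # Icon.GREEN_WALK (mid-crossing)
print(crosswalk_icon(False, False, True))  # Icon.RED_HAND (car's turn)
```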

Another advantage of Wave: unlike the second or two a driver spends waving or raising a hand today, these icons can be projected for five times as long. Not only does this increase visibility, it maximizes the safety self-driving cars promise.

The turn

The U.S. is home to one of the strangest traffic conventions ever: the four-way stop. If you think about it, the four-way stop is a symphony of different moving parts. The politics between cars and pedestrians, and between cars and cars, are a lot for an AV to observe and act upon. Melissa Cefkin explains this really well.

Remember that quote about Zoox’s plan earlier in the article? Cefkin actually has the patent.

The biggest pain point for both cars and pedestrians is not knowing what their counterpart’s next move will be. “Do I have enough time to run across the street?” is a common thought for pedestrians, and “Did I get to the intersection first, or did they?” is one for drivers.

Well, Wave can solve this too. With Lidar, a.k.a. the eyes of any self-driving car (or the spinny thing up top), AVs know their planned path two to three football fields in advance. Cars can easily take advantage of that at four-way stops.

Going into an intersection with the intention of making a left turn, a car with Wave can project its planned path many feet ahead, before it ever executes the turn.

This allows pedestrians and other drivers alike to see a self-driving car’s intention and where it plans to go, giving them more knowledge to act upon. Instead of stopping right in front of the car, a pedestrian can stop a foot away from the car’s projected path, knowing exactly what its turning radius will be because of Wave.
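Here’s a rough sketch of how that could work: take the planner’s upcoming waypoints and convert each one into pan and tilt angles for the projector. The sample path, mount height, and range limit below are all hypothetical.

```python
import math

# Sketch: turn a planner's waypoints into projector aim angles so the car
# can "draw" its upcoming left turn on the asphalt. The path, mount
# height, and range limit are hypothetical values for illustration.

MOUNT_HEIGHT_M = 0.8  # assumed projector height on the car
MAX_THROW_M = 10.0    # assumed legible projection range in daylight

# A hypothetical planned left turn: (x, y) waypoints in the car's frame,
# x pointing forward and y to the left, in meters.
planned_path = [(2.0, 0.0), (4.0, 0.2), (6.0, 1.0),
                (7.5, 2.5), (8.5, 4.5), (9.0, 7.0)]

for x, y in planned_path:
    ground_dist = math.hypot(x, y)
    if ground_dist > MAX_THROW_M:
        break  # beyond what the projector can render legibly
    pan = math.degrees(math.atan2(y, x))                          # left/right
    tilt = math.degrees(math.atan2(MOUNT_HEIGHT_M, ground_dist))  # downward
    print(f"waypoint ({x:3.1f}, {y:3.1f}) m -> pan {pan:5.1f} deg, "
          f"tilt {tilt:4.1f} deg")
```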

By radically reducing unintended confrontations, simplifying intersections, and making autonomous vehicles even safer, Wave can change how pedestrians and cars interact before the confusion ever arises.

Waymo’s Firefly with Wave technology superimposed

What now?

Wave can only go so far on my computer. The next step in the process is obviously to test it in the real world. Sadly, that’s where my expertise stops short: not only do I not have a self-driving car to play around with, I don’t even have a dummy car to run experiments on.

The limitations of my computer don’t stop at testing; they also keep me from getting feedback from people outside my circle. That’s why I’m here on Medium, where hopefully a researcher can help me take Wave to the next level (wink, wink).

The self-driving car race is more intense than it’s ever been. Companies like Waymo have started testing programs with real passengers and no driver behind the wheel. Companies like Uber have retreated under the stress of the industry. And companies like Zoox can come out of nowhere to take the fallen’s place.

If you have feedback, I want it. If you have resources to help test Wave, I would love to talk. If you have the cash to sponsor the patent, email me immediately.

For now, I’ll stick to conducting more case studies in the self-driving space. Want a hint of what’s coming next? Well, combine the overall theme of this past post with this one. You can always reach out for more.

