Thursday, June 7, 2018

Singapore relooking road rules to allow for self-driving vehicles

There are various areas being looked at such as how these autonomous vehicles should interact with other vehicles on the roads, says a Ministry of Transport official.

By Kevin Kwang

06 Jun 2018


SINGAPORE: As Singapore drives towards a future with autonomous vehicles (AVs), the Ministry of Transport is starting to look at how road rules can be extended to self-driven vehicles in the future.
Mr Chris Leck, director of the Futures Division at the ministry, said on Wednesday (Jun 6) that those working on AVs are “quite aware” of the need to address issues like who has the right of way in an encounter between a human driver and an autonomous system.

A standards development team within the ministry is currently working on such a framework. It is also looking at areas like data management - “as AVs gather a lot of data”, such as video surveillance of their environment - as well as cybersecurity, Mr Leck shared during a panel session at the ongoing Innovfest Unbound event.

The team is studying international standards too, such as those relating to functional safety for cars, and how these can be incorporated into the framework, he told Channel NewsAsia on the sidelines of the event.

His comments came after another local official, Mr James Tan, spoke about the experience of deploying autonomous wheelchairs at Changi Hospital - a collaboration between the Singapore-MIT Alliance for Research and Technology (SMART), MIT and the National University of Singapore.

Mr Tan, who is the principal engineer of Sensors & IoT at the Government Technology Agency (GovTech), said during the same panel session that two autonomous wheelchairs were deployed at a taxi stand at the hospital to ferry patients over a designated distance of 200m.

While the distance was short, human traffic was heavy, he noted, adding that the algorithm had a rule that said the wheelchair must give way to humans.

In the end, it took nine minutes to travel the 200m, which was not acceptable, the GovTech official said, citing that as an example of how traffic rules for AVs can be complex and need to be carefully thought through.
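To see why a blanket give-way rule is tricky, here is a minimal toy simulation - my own sketch, not GovTech's actual algorithm; the speed, tick size and crowd-density model are all assumptions - of a wheelchair that halts whenever a pedestrian enters its clearance zone:

```python
import random

# Hypothetical toy model (not GovTech's code): a wheelchair that always
# yields whenever any pedestrian is within its clearance zone this tick.

SPEED_MPS = 1.2   # assumed walking-pace cruise speed
ROUTE_M = 200.0   # distance from the taxi stand, per the article
TICK_S = 1.0      # simulation step, in seconds

def pedestrian_nearby(crowd_density: float) -> bool:
    """Crudely model whether a pedestrian blocks the clearance zone this tick."""
    return random.random() < crowd_density

def travel_time(crowd_density: float) -> float:
    """Seconds needed to cover the route under an unconditional give-way rule."""
    random.seed(42)
    position, elapsed = 0.0, 0.0
    while position < ROUTE_M:
        if not pedestrian_nearby(crowd_density):
            position += SPEED_MPS * TICK_S  # path clear, move forward
        elapsed += TICK_S                   # time passes whether moving or waiting
    return elapsed

for density in (0.1, 0.7):
    print(f"crowd density {density:.0%}: ~{travel_time(density) / 60:.1f} min for 200m")
```

At light foot traffic the run takes around three minutes, but at the heavy densities Mr Tan described, the same unconditional rule stretches it to roughly nine - the trade-off between caution and throughput that any traffic rules for AVs will have to weigh.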

That said, Mr Leck shared that the “ultimate goal” is for these self-driving vehicles to interact with and be part of mixed traffic, without the need for dedicated lanes on the roads.

AV TRIALS

Singapore has been a strong proponent of autonomous vehicles for some time now, and it amended the Road Traffic Act last February in order to make AV trials a reality.

With the changes, MOT is allowed to create new rules that can place time and space limits on AV trials, set standards for the design of AV equipment, and impose requirements to share data from the trials.

The regulatory framework can also exempt AVs, operators of AVs and those conducting or participating in trials of AVs from existing provisions of the Road Traffic Act, which make a human driver responsible for the safe use of a motor vehicle while on a public road.

Transport Minister Khaw Boon Wan announced last November that Punggol, Tengah and the Jurong Innovation District will be the first areas in Singapore to have self-driving buses and shuttles plying the roads from 2022.




Commentary: Will my driverless car know what to do when honked at?

Driverless cars are getting smarter. A sociolinguist from the University of Pittsburgh discusses whether they are smart enough to read the visual cues drivers typically give each other.

By Abdesalam Soudi

21 Jan 2018


PITTSBURGH: Recently, while on my way to the University of Pittsburgh’s campus, I made a quick “Pittsburgh left” – taking a left turn just as the light turned green – while facing a driverless car.

Instead of jolting forward or honking as some human drivers would be tempted to do, the car allowed me to go. In this case, the interaction was pleasant. (How polite of the car to let me cut it off!)

But as a sociolinguist who studies human-computer interaction, I started thinking about how self-driving cars will communicate with the human drivers they encounter on the road.

Driving can involve a range of social signals and unspoken rules, some of which vary by country – even by region or city. How will driverless cars be able to navigate this complexity? Can they ever be programmed to do so?

WHAT DRIVERLESS CARS CAN DO

In Pittsburgh, Uber has tested self-driving cars with a backup driver behind the wheel. In Phoenix, Waymo’s cars operate in a limited part of the city without any backup driver at all.

We know driverless cars are equipped with LIDAR, a laser-based sensor that builds a 360-degree picture of the car’s surroundings. Image sensors can interpret signs, lights and lane markings.

A separate radar detects objects, while a computer incorporates all of this information along with mapping data to guide the car.
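As a rough illustration of that fusion step - the data structures and thresholds below are invented for exposition, not any real AV stack's API - consider:

```python
from dataclasses import dataclass
from typing import List

# Illustrative sketch only: fuse detections from several sensors with map
# data into one driving decision. All types and numbers are assumptions.

@dataclass
class Detection:
    source: str       # "lidar", "camera" or "radar"
    kind: str         # e.g. "vehicle", "pedestrian", "sign"
    distance_m: float

def fuse(detections: List[Detection], speed_limit_kmh: float) -> str:
    """Combine sensor detections with map data into a driving decision."""
    nearest = min((d for d in detections if d.kind in ("vehicle", "pedestrian")),
                  key=lambda d: d.distance_m, default=None)
    if nearest and nearest.distance_m < 10.0:
        return "brake"  # an obstacle is close, whichever sensor saw it
    return f"cruise at {speed_limit_kmh} km/h"  # map data supplies the limit

decision = fuse([Detection("lidar", "pedestrian", 8.5),
                 Detection("radar", "vehicle", 40.0)], speed_limit_kmh=50)
print(decision)  # -> "brake"
```

The point of fusing is that a braking decision can be triggered by whichever sensor spotted the obstacle, while slower-changing facts like the speed limit come from mapping data.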

Although ideally autonomous vehicles will be able to “talk” to one another in order to allow smoother navigation and reduce crashes, this technology is still in the early stages.

But any autonomous vehicle will also need to interact with traditional cars and their drivers, as well as pedestrians and bikes, and to cope with unforeseen events like lane closures, disabled stop lights, emergency vehicles and accidents.

This is where things can get murky.

THE COMPLICATED LANGUAGE OF DRIVING

If you’re driving and pass a speed trap, you might flash your headlights at drivers coming in the other direction to let them know.

But flashing headlights can also mean “your high beams are too bright”, “you forgot to put your headlights on” or “go ahead” in situations where it’s unclear who has the right of way.

In order to interpret the meaning, a person will consider the context, including the time of day, the type of road, or even the weather.

But how would an autonomous vehicle react?
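A rule-based interpreter makes the problem concrete. This toy table is entirely hypothetical, but it shows how a single signal maps to different meanings depending on context:

```python
# Toy, hypothetical rule table for disambiguating a headlight flash from an
# oncoming driver, using the kinds of context a human would weigh.

def interpret_flash(is_night: bool, my_lights_on: bool,
                    my_high_beams_on: bool, contested_right_of_way: bool) -> str:
    """Map one ambiguous signal to a meaning based on context."""
    if my_high_beams_on:
        return "dip your high beams"
    if is_night and not my_lights_on:
        return "turn your headlights on"
    if contested_right_of_way:
        return "go ahead, you first"
    return "possible speed trap ahead"

print(interpret_flash(is_night=True, my_lights_on=False,
                      my_high_beams_on=False, contested_right_of_way=False))
```

Even this toy version presupposes the car can reliably sense every context variable - night-time, the state of its own lights, a contested junction - which is exactly where the difficulty lies.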

There are other forms of communication to help us navigate, ranging from honks and sirens, to hand signals and even bumper stickers.

Of course, humans use all sorts of hand gestures – waving a car ahead of them, indicating that another driver needs to slow down, and even making rude gestures when angry.

Sounds can communicate love, anger, arrivals, departures, warnings and more. Drivers can express total disapproval with a hard, extended hit of the horn.

Of course, emergency sirens encourage drivers to make way.

But specific meaning can vary by region or country. For example, a few years ago, Public Radio International ran a story about the language of honking in Cairo, Egypt, which is “spoken” primarily by men.

These honks can have complex constructions. For example, four short honks followed by a long one mean “open your eyes” to warn someone who is not paying attention.

In Pittsburgh, people tend to honk before going through a short, narrow or curvy tunnel.

In Morocco, where I’m originally from, drivers perform varied honks when passing. They’ll honk once before passing to secure cooperation, again as they pass to signal progress, and lastly after they pass to say, “thank you.” Yet this might be confusing or even perceived as rude to drivers in the US.
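One could imagine encoding such conventions as a region-keyed lookup, as in the purely hypothetical sketch below, where only the Cairo and Pittsburgh patterns come from the examples above ("s" for a short honk, "l" for a long one):

```python
# Purely hypothetical honk-pattern lookup keyed by region.
# Only the Cairo and Pittsburgh entries are grounded in the text above.

HONK_PATTERNS = {
    "cairo": {("s", "s", "s", "s", "l"): "open your eyes"},
    "pittsburgh": {("s",): "entering a narrow tunnel"},
}

def interpret_honks(region: str, honks: tuple) -> str:
    return HONK_PATTERNS.get(region, {}).get(honks, "unknown signal")

print(interpret_honks("cairo", ("s", "s", "s", "s", "l")))  # -> "open your eyes"
```

The Moroccan passing honks already break this representation: the same single honk means "about to pass", "passing" or "thank you" depending on the stage of the manoeuvre, so interpretation needs interaction history, not just pattern matching.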

Written communication also plays a role between cars and drivers. For example, signs such as “baby on board” or “students on board” are supposed to encourage the drivers following these vehicles to be even more careful.

Bumper stickers like “Caution: Wide right turn” or “this vehicle makes frequent stops” can be critical to safety.

WHAT IF THERE’S A COMMUNICATION BREAKDOWN?

Vehicles can be taught to “read” road signs, and thus presumably can be taught to recognise common warnings on bumpers.

Yet navigating construction sites or accident scenes may require following directions from a human in a way that cannot be programmed. This creates a huge opportunity for error.

Because hand signals vary widely from region to region (and even person to person), autonomous cars could fail to recognise a signal to go or, more catastrophically, mistakenly follow a hand gesture into a barrier or another car.

This gives me pause. How much knowledge about our societal and linguistic values is built into the system? How can driverless cars learn to interpret hand and auditory signals?

Google cars can apparently recognise hand signals from bikers, but what if the biker doesn’t use standard signals? Who gets to embed the algorithm in the machine, and how are sociolinguistic values assigned?

In my experience, the self-driving car was very polite and didn’t honk or otherwise chastise me for my behaviour (though the human passenger did communicate his displeasure with a gaze). But had I waved it in front of me, would it have been able to respond appropriately?

A 2015 story in Robotics Trends described how a cyclist and a Google car got stuck in a standoff when the car misread the cyclist’s signals.

Cities (and countries) possess a variety of sociolinguistic cues. It remains to be seen if the engineers working on driverless cars will be able to programme these subtle but important differences into these vehicles as more appear on the roads.

Abdesalam Soudi is a sociolinguist at the University of Pittsburgh. This commentary first appeared in The Conversation. Read the original here.









