ACROSS THE TRADES Spring 2019

will be applied to enhance the driving experience.

Sorry, that should be 'human mobility experience' given 'driver' will by then be an archaic term and the cabin will not be encumbered with a steering wheel and other controls. It will be more like the home or office, where technology dominates, and at CES Kia turned up with what it claims is "the automotive industry's first technology converging human senses-oriented in-cabin environment control and AI-based emotional intelligence".

Or, to put it more simply, the tech is called READ, short for Real-time Emotion Adaptive Driver.

Developed in collaboration with the Massachusetts Institute of Technology, the READ system is designed to optimise and personalise the cabin space by analysing a driver's emotional state in real time via AI-based bio-signal recognition tech.

Kia says the technology monitors a driver's emotional state using sensors to read his or her facial expressions, heart rate and electrodermal activity. It then tailors the cabin environment according to its assessment in an effort to create "a more joyful mobility experience".

AI deep-learning technology enables the system to establish a baseline in user behaviour, and then identify patterns and trends to customise the cabin accordingly.

Forming part of the READ system is another claimed world first in the form of virtual touch-type gesture control technology. Dubbed V-Touch, this application employs a 3D camera to monitor users' eyes and fingertips, allowing occupants to control cabin features such as climate, lighting and infotainment via a head-up display and simple hand gestures, thus eliminating the need for conventional switchgear or even touch screens.

Capping it off, the READ system also includes music-response vibration seats, where occupants can 'feel' their favourite songs as well as listen to them. Sensory-based signal processing technology adapts the seat vibrations according to the sound frequencies of the music being played.

The vibrating seats also have settings for massage and, should something go wrong in this utopian accident-free autonomous environment, can provide haptic warnings from the advanced driver-assist systems on board.

Along similar lines, AI company Nuance also used CES to introduce an innovation in its 'Dragon Drive' intelligent automotive assistant platform, using voice, sight, gesture and emotion interaction that "transforms it into a conversational, humanised mobility assistant that will be core to the digital, button-free car of the future".

AN ALTERNATIVE ROUTE

Just as Audi has forged close ties with Disney, American tech giant Intel has joined forces with Warner Bros – using a specially modified BMW X5 SUV – to explore the potential of next-generation entertainment when vehicles drive themselves.

Kia's future self-driving cars might look to read the occupants' mood, but here Intel tech and Warner Bros blockbusters combine to make the journey one that could potentially be 'controlled' by a fictional character, with the trip itself set in a fictional place, breaking the boredom of the daily commute or a long-haul drive.

In this case, the virtual ride – complete with giant screen, projectors, sensory and haptic feedback, and immersive audio and lights – takes place in Gotham City, moderated by Batman's trusted butler Alfred, who comes to life, in a sense, by interacting with the occupants, keeping them comfortable and informed of actual events occurring outside in the real world: traffic jams, road closures, route changes, and so on.

But the chaperone/navigator could be anyone or anything, and the environment

Kia monitors a driver's emotional state using sensors to read their facial expressions, heart rate and electrodermal activity.

A fictional trip hosted by a fictional character, thanks to Intel and WB.