Shenzhen, the tech capital of China, is furthering its digital transformation by experimenting with facial recognition at one of its subway stations. After a quick scan, an algorithm identifies the commuter and deducts the cost of the train ticket from a linked account, a system which may free up time and speed up the daily grind to and from work.

But facial recognition sits at the top of the list of contentious emerging technologies on the verge of becoming mainstream. The arguments for its use are just as strong as those against it, especially within the Asian region.

Here are the big boons and banes of facial recognition tech…

FOR: COUNTERING COUNTERFEITING

Your face is unique. It has its own shape and it moves in its own distinctive ways, which makes it very difficult to convincingly fake or reproduce. Facial recognition technology could therefore add both security and speed to certain applications. Shenzhen is trialling it at train stations, but it is also used around the region and the rest of the world, from ATMs to smartphone social media apps, to quickly and securely link a person to their product or service.

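To make the idea concrete, here is a minimal sketch of how a face-to-account payment gate might work, assuming the system compares face "embeddings" (lists of numbers produced by a face model) against enrolled accounts. The account names, fare, threshold and random embeddings are purely illustrative assumptions; this is not a description of Shenzhen's actual system.

```python
import numpy as np

# Hypothetical enrolled accounts: account id -> (face embedding, balance).
# A real deployment would get embeddings from a trained face model;
# here they are random vectors purely for illustration.
rng = np.random.default_rng(42)
enrolled_accounts = {
    "commuter_001": {"embedding": rng.normal(size=128), "balance": 50.0},
    "commuter_002": {"embedding": rng.normal(size=128), "balance": 12.5},
}

FARE = 2.0             # illustrative flat fare
MATCH_THRESHOLD = 0.6  # cosine similarity required to accept a match


def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def charge_fare(live_embedding):
    """Find the best-matching enrolled face and deduct the fare if confident."""
    best_id, best_score = None, -1.0
    for account_id, record in enrolled_accounts.items():
        score = cosine_similarity(live_embedding, record["embedding"])
        if score > best_score:
            best_id, best_score = account_id, score

    if best_score < MATCH_THRESHOLD:
        return "No confident match - fall back to a ticket or card."

    enrolled_accounts[best_id]["balance"] -= FARE
    return f"Charged {FARE} to {best_id} (similarity {best_score:.2f})."


# Simulate a gate scan of commuter_001 with a little sensor noise.
scan = enrolled_accounts["commuter_001"]["embedding"] + rng.normal(scale=0.05, size=128)
print(charge_fare(scan))
```

The key design point is the threshold: set it too low and strangers get charged to your account; set it too high and legitimate commuters are rejected at the gate.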

AGAINST: ABSOLUTE POWER

Facial recognition only works when algorithms can clearly see your face and compare it to an archive of existing imagery. This means more cameras, and more databases on which your details are recorded, often without your direct consent. The biggest issue this creates is one of constant surveillance and data collection, a form of control that has been abused to devastating effect in the past. There is also the problem of protecting this data from local and foreign threats, which have repeatedly shown themselves capable of breaching even well-defended systems.

FOR: FACE-BASED FLEXIBILITY

Facial recognition technology isn’t just about putting a name to a face. It can also put a name to an emotion. Many of our reactions to stimuli show up in facial expression: muscle twitches, subtle movements of the mouth or brow, and dilation of the eyes, to name a few. Some of these reactions, however, are so subtle and so unconscious that they can be overlooked altogether. Facial recognition technologies could enable a better understanding of individual behaviour, as well as what people like and dislike, which could be applied to important fields such as psychological medicine. These datasets could also be paired with other emerging technologies like AI, much as Shenzhen’s subway experiment will be.

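As a rough illustration of how expression reading reduces a face to data, here is a toy sketch that maps a few hypothetical facial measurements to a coarse emotional label. Real systems use trained models over facial landmarks or "action units", not hand-written thresholds like these; the measurement names and cut-offs below are assumptions for the sake of the example.

```python
def classify_expression(brow_raise, mouth_corner_pull, eye_openness):
    """Map a few hypothetical facial measurements (0.0-1.0) to a coarse label."""
    if mouth_corner_pull > 0.6 and eye_openness > 0.4:
        return "likely positive (smiling)"
    if brow_raise > 0.7 and eye_openness > 0.7:
        return "likely surprised"
    if brow_raise < 0.2 and mouth_corner_pull < 0.3:
        return "likely neutral or negative"
    return "uncertain"


# Example readings a camera-based system might produce for one frame.
print(classify_expression(brow_raise=0.1, mouth_corner_pull=0.8, eye_openness=0.5))
```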

AGAINST: CORPORATE GAZE

The flipside of specialised data about your psychological responses is that it could be sold on to parties looking to exploit those triggers. Perhaps a prominent retailer wants to know which foods bring you the most joy, so it can bombard the accounts linked to your face with ads that cater to you. The same systems could track which areas you frequent, meaning you might keep running into similar ads no matter where you go. It doesn’t sound terrible at first, but given the complexity of the human mind, this data might not reflect your actual wishes. Worse, it could also be sold to people looking to harass you with content that is known to affect you.

FOR: HUMANE TECH

Humans are constantly striving for the next innovation. We seem instinctively driven not just to survive but to thrive, and in the process we sometimes incorporate technology directly into our lives and even our bodies. Facial recognition is one of the pinnacles of helping machines, rather than people, to understand us. A smart machine can take better care of us, sure; but a machine that understands us, and can potentially sympathise with us by reading our faces the way we read those of our friends and family, can lead to better life experiences.

AGAINST: SUPERFLUOUS SPECIES

The cultural panic around AI rising above the human race, and potentially destroying us in the process, gains real momentum when we talk about facial recognition algorithms. If an AI is not trained to be empathetic, it could coldly work out how humans can be emotionally manipulated, and replicate systems that in turn trap and control us. Like the argument above, this is more theory than proof, but forewarned is forearmed: while we needn’t worry about the rise of the machines as they stand today, we may want to pause and reflect before the opportunity to do so is lost.

INNOVATIVE INSIGHTS

Innovatus Media has spoken with AI developers across several enterprise sectors, and we feel we know enough to highlight some key issues in all of these arguments. Facial recognition technology can be defeated by some fairly low-tech means, such as simply covering your face; then again, forcing someone to show their face isn’t a high-tech process either. As for government or corporate control and surveillance, legislation is already being drafted to address the issue of consent around facial tech. It should also be said that, while admittedly difficult, there will always be traditional ways of living that forgo digital systems altogether, as well as competing services and products that cater to them.

When it comes to AI, however, it all comes down to how we program and teach our machines. We saw a documentary on the topic that we found very useful.

How would you feel about facial recognition tech if it came to a city near you? Do you trust AI with data as personal as what your face looks like? Let us know your thoughts on this topic by reaching out to us at info@innovatusmedia.com.au.
