SQS: Does AI make you WannaCry?


Technology is supposed to help us, not hinder us. Yet according to recent research, there is skepticism across the board. Dik Vos, CEO at SQS, discusses why the public and businesses alike are so reluctant to adopt the latest technologies.

Almost every day we see or hear about major cyber-security threats and software failures with serious repercussions for the daily running of society. The WannaCry ransomware attack in May brought the NHS to its knees, rendering much of its digital capability useless.

Shortly afterwards, a system collapse, reportedly caused by human error, brought down British Airways and forced the company to take the drastic step of cancelling all its flights for almost 24 hours. But, of course, we have been watching these types of catastrophic digital downfalls play out in movies and TV shows for years – computers malfunction, software goes rogue and hackers infiltrate the deepest realms of government, business and our private lives.

Life imitates art

Problematically for adoption rates, as the public becomes more aware of the real scale and danger of cyber-attacks, it is growing increasingly skeptical about embracing emerging technology.

The more life imitates art, the less the public believes in the safety, security and quality of technological advances such as artificial intelligence (AI), autonomous vehicles and smart home devices.

Our recent study shows that almost 80% of UK adults looking to buy AI products in the future may reconsider due to the threat of hackers targeting this technology. And nearly half (48%) claim they would not purchase AI devices at all due to the threat of cyber-attacks.

Indeed, revelations about Orwellian hacking tools developed and used by the CIA and British intelligence to spy on connected household devices highlight how easily smart home systems can be compromised.

The reality of the situation is that consumers are not yet comfortable with embracing the technology they have watched destroy or take over the world in films numerous times. But it is now becoming increasingly obvious that fearmongering is standing in the way of the potential for positive digital transformation and is an issue the industry must tackle head on, if the public are to embrace emerging technology.

Gaining consumer trust

Though understandable, the public’s reluctance to buy into the latest tech products could see the UK left behind by the rest of the world in the race to effectively leverage technological innovations.

The skepticism and concern surrounding recent consumer innovation could severely hamper the UK’s economic growth and further widen the technology skills gap the nation is currently facing. Though this presents a challenge, it is certainly an area businesses have the power to address.

Clearly, consumer trust in technology can be swayed by threats of ever more daring cyber-criminals and safety implications, but consumers can also be influenced by what they see at the cinema and on the TV.

It is up to businesses to prove to the public that doomsday predictions made in films are works of fiction and nothing more. Businesses have an opportunity to show consumers that these products will have a positive societal impact. In turn, it is incumbent on them to prioritise quality in order to protect customers and so win consumer confidence in emerging technologies. Quality must be integral, from concept to finished product.

Ultimately, if innovations such as AI and smart homes are to become part of everyday life, government and businesses have a duty to prove to the public that every precaution has been taken to safeguard and protect human life. Quality is non-negotiable and by proving this is at the core of innovation, businesses will begin to change the current public perception of advanced technology.
