
Nantaba and Nebanda mother’s phone call

Cerinah Nebanda and Rebecca Kadaga

Rise of cybercrimes

A cyber or communication crime involves networked devices such as computers or phones. The criminal first obtains a user’s personal, business, or official information, then hatches a plot to use it to commit a crime. After the impostor called Nantaba pretending to warn her of imminent danger, they might call again later and ‘beg’ for money. They might claim to be very sick in hospital.

Telephone fraud comes in many forms. It could be a solicitation or a bogus offer of opportunity such as a job, scholarship, promotion, investment, or sales deal. Many people are aware of such con acts and no longer easily fall for them.

As telling truth from falsehood becomes tougher in the digital age, many readers are familiar with the term ‘fake news’. But even the smartest are sometimes duped.

On Jan. 29, Africa News ran the following headline: “How Fake News awarded Ugandan politician ‘most arrested’ World Record”. The Ugandan politician referred to was Kizza Besigye, the opposition politician. But the claim, which originated on social media, that the Guinness Book of World Records had named Besigye the “most arrested man” was fake. The story fooled many people, including editors of some major news outlets outside Uganda.

But an even bigger danger comes from cases that are very difficult to uncover: the so-called “deep fakes”. Last year, a Chinese fraudster, Qiaobiluo Dianxia, was busted on Douyu, a Chinese live-streaming app where users are able to donate and receive money. She pretended to be a pretty young woman and offered to meet in real life (IRL) with any of her followers, as long as they paid her 100,000 yuan (Approx. Shs50 million). Of course she never met any, because she was, in fact, a 58-year-old woman who had faked her face using filters.

A deep fake looks to be a real person’s recorded face and voice, but the words they appear to be speaking or the things they are doing were never really uttered or done by them. Many of the most convincing deep fakes have been produced using impersonators capable of mimicking the source’s voice and gestures.

Last year, criminals used Artificial Intelligence (AI)-based software to impersonate a chief executive’s voice and demand a fraudulent transfer of US$243,000 (Approx. Shs890 million). The biggest challenge for the fakers is the voice. But, as possibly in the case of Nantaba and Nebanda’s mother, it is easy to be conned if you are not familiar with the voice.

It’s scary

The BBC recently ran a story titled ‘Deep fakes: A threat to democracy or just a bit of fun?’

It described how deep fakes are created using software to produce computer-manipulated videos of people – often politicians or celebrities – that are designed to look real. It involves “face swapping”, whereby the face of a celebrity is overlaid onto the likeness of someone else. Some deep fakes are used as “fake news” or in online fraud.

The BBC article warned that the greater threat is the potential for deep fakes to be used in political disinformation campaigns, citing Prof. Hao Li of the University of Southern California.

“Elections are already being manipulated with fake news, so imagine what would happen if you added sophisticated deep fakes to the mix?”

People could start to put words into the mouths of politicians and no one would know – or at least by the time it was corrected it would be too late.

“It could be even more dangerous in developing countries where digital literacy is more limited. There you could really impact how society would react. You could even spread stuff that got people killed,” said Prof Li.

According to another report quoting Robert Prigge, CEO of Jumio, deep fakes have become more sophisticated and easier to produce thanks to artificial intelligence and machine learning. These tools can be applied to existing videos quickly and easily, achieving results that once took professional special-effects teams and digital artists hours or days.

But the danger that deep fakes pose to truth goes beyond their ability to create a false reality. The fact that deep fakes are now known to be possible makes it easier for individuals caught out by genuine footage to claim it is fake.

That is possibly what happened in Gabon in early 2019, when people claimed a broadcast video of Gabon’s president Ali Bongo, who had been ill and absent from the public eye for some time, was a deep fake. Bongo, who was recovering from a stroke in Morocco, had authorised the release of the New Year’s video address to reassure the Gabonese that he was alive and fit to govern. Instead, claims that it was a deep fake appear to have provoked an attempted military coup.

“Once again, one time too many, the wielders of power deceptively continue to instrumentalise the person of Ali Bongo Ondimba, a patient devoid of many of his physical and mental faculties,” said attempted coup leader, Lt. Kelly Ondo Obiang.

It is such incidents that possibly prompted Prof. Hao Li to say: “We are already at the point where you can’t tell the difference between deep fakes and the real thing.”

“It’s scary,” Prof. Hao Li told the BBC.

****
