TikTok users in Kenya are being shown videos that are abusive, incite violence or spread outright fake news ahead of the country's August general elections, according to new research.
The research found that the social network failed to flag and moderate content about the election.
The study was conducted by Odanga Madung, a researcher at the Mozilla Foundation.
It identified more than 130 videos, linked to 33 accounts, containing hate speech, manipulated images and sound, and political disinformation.
They include a fake television news bulletin and a fake documentary.
The videos had been viewed more than four million times collectively, yet had still not been flagged as inappropriate content on the app at the time the study was conducted.
The research further points out that TikTok, which is owned by the Chinese company ByteDance, is currently the most downloaded app in Kenya.
Mr Madung said political disinformation on TikTok - in violation of the platform’s own policies - was stirring up a highly volatile political landscape.
Following disputed elections in late 2007, some 1,200 people were killed and more than 500,000 fled their homes.
In its official response, TikTok told the researcher that it prohibits election disinformation.
Fortune Mgwili-Sibanda, the head of public policy and government relations at TikTok in Africa, added that the platform was working with accredited fact-checkers and had partners globally who helped assess content for accuracy in more than 30 languages, including Swahili.
He said the social network had removed some of the highlighted content and suspended some accounts.