Why Nigeria needs digital content moderation – NITDA

The National Information Technology Development Agency (NITDA) has said there is a need for content moderation in the digital space in line with global practices to ensure the safety of users.

Director-General of NITDA, Kashifu Inuwa, stated this on Monday when a team from TikTok, a social media platform, visited him in Abuja.

Inuwa said it is important for stakeholders to collaborate in advancing the Nigerian digital space, as online activities generate a compendium of human and Artificial Intelligence (AI) content that is accessible to the rest of the world.

He called for the implementation of content moderation strategies that would address online issues such as hate speech, misinformation, and cyberbullying, as well as the protection of minors across the country.

Internet Code

NITDA, which introduced a Code of Practice to guide the operations of Twitter, Facebook, WhatsApp, Instagram, Google, TikTok, and other social media platforms in the country in 2022, said Nigeria will be leveraging the code and partnership with stakeholders to achieve digital content moderation.

“With the Code of Practice for Interactive Computer Service Platforms/Internet Intermediaries in place, this will help in ensuring digital safety in accordance with global best practices and content moderation to enhance security.

“No organization can operate in silos. We need each other for the actualization of our goals and objectives towards service delivery and for the advancement of the nation,’’ the DG said.

Highlighting some critical areas, the NITDA DG said that leveraging an interactive computer service platform would advance the country through its Digital Literacy 4 All (DL4ALL) program.

According to him, it will ensure capacity building, knowledge sharing, training, curbing of misinformation, and digital safety to create a safer cyberspace and empower the online environment for Nigerian users.

“The platform also allows for creative expression through filters, stickers, and editing tools, entertainment, and comedy which are dominant themes.

“Other features are informational videos on various topics that are gaining traction, and the platform has become a launchpad for influencers and trends that can go viral,’’ he said.

Inuwa said that NITDA’s Strategic Roadmap and Action Plan 2.0 (SRAP 2024-2027) is structured around eight pillars, some of which provide avenues to safeguard the digital space.

TikTok in Nigeria

Earlier, Mrs Tokunbo Ibrahim, Head of Government Regulation and Public Policy, TikTok Nigeria and West Africa, commended NITDA, saying the agency’s policies align with the vision of TikTok.

“Apart from leveraging the social media platform to market products and services, it has enabled some projects and programs in collaboration with the Africa Creator Hub.

“The Africa Creator Hub has encouraged users to run campaigns for tech creation and has supported, empowered, and educated them on content creation while exploring other sections of TikTok to change the narrative.

“The TikTok platform considers online safety one of its critical areas, securing cyberspace by providing an avenue for users to thrive and be productive in their various activities,” she said.

What you should know

NITDA in 2022 introduced the ‘Code of Practice for Interactive Computer Service Platforms/Internet Intermediaries and Conditions for Operating in Nigeria’ to drive the moderation of all digital content in the country.

Although the Code was heavily criticized by many Nigerians, it remains in place. However, it is unclear whether it is being enforced yet.

Part of the Code dictates that internet platforms including social media must:

  • Act expeditiously upon receiving a notice from a user or an authorised government agency of the presence of unlawful content on its platform. A platform must acknowledge the receipt of the complaint and take down the content within 24 hours.
  • Act expeditiously to remove, disable, or block access to non-consensual content that exposes a person’s private areas, full or partial nudity, sexual acts, deepfakes, or revenge porn, where such content is targeted to harass, disrepute, or intimidate an individual. A platform must acknowledge the receipt of the complaint and take down the content within 24 hours.

Source link: Nairametrics