
Response to the Consultation Paper on proposed amendments to the ICT Act for regulating the use and addressing the abuse and misuse of Social Media in Mauritius

By the Mauritius Labour Party

13 May 2021

1.1 Introduction
On the 14th of April 2021, the Information and Communication Technologies Authority published a consultation paper titled “Consultation Paper on proposed amendments to the ICT Act for regulating the use and addressing the abuse and misuse of Social Media in Mauritius”. The public has until the 20th of May 2021 at 16:00 to submit comments by email ([email protected]).

The paper proposes amendments to the ICT Act to give legal sanctity to a National Digital Ethics Committee, composed of people who will decide whether online content under investigation is harmful and illegal. An Enforcement Unit composed of ICTA personnel will be set up to operate tools that will filter incoming and outgoing Internet traffic in Mauritius and segregate social media traffic, which will be decrypted, archived, and then re-encrypted for onward transit. The decryption and re-encryption of social media traffic will require the installation of a self-signed certificate on the workstations and devices of social media users in Mauritius.
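To make the privacy implication concrete, the minimal Python sketch below (our illustration, not part of the Consultation Paper) shows why such a certificate matters: a client that trusts a regulator's root certificate authority (CA) will accept a certificate an interception proxy forges for any site, which is what allows traffic to be decrypted in transit. The CA filename is a hypothetical placeholder.

```python
import socket
import ssl

# Minimal sketch: a TLS client only accepts server certificates that chain to
# a CA it trusts. Once a self-signed root CA such as "regulator_root_ca.pem"
# (a hypothetical placeholder) is installed as trusted, an interception proxy
# can present certificates it signs for any domain, and the client will accept
# them, enabling the decrypt/archive/re-encrypt flow described above.
ctx = ssl.create_default_context()
ctx.load_verify_locations(cafile="regulator_root_ca.pem")

with socket.create_connection(("www.example.com", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="www.example.com") as tls:
        # Under interception, the issuer printed here would be the regulator's
        # CA rather than a public certificate authority.
        print(tls.getpeercert()["issuer"])
```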

Section 14 of the Consultation Paper contains nine questions posed by the ICT Authority. However, the public is not confined to these questions and is encouraged to raise any issues pertinent to it, as mentioned in Section 13.2 of the Consultation Paper.

We, the Mauritius Labour Party, are participating in this public consultation and are providing our answers to the nine questions along with a thorough analysis of the amendments proposed to the ICT Act.

1.2 Comments & Answers to the consultation paper questions


The vision statement of the ICT Authority, as listed on its website (https://www.icta.mu/about.html), is:

«to play a leading role in the future of ICT in Mauritius contributing to an efficient,
competitive and optimally regulated sector.»
The solution proposed for the regulation of social media in the Consultation Paper issued by the Authority on 14th April 2021 (Consultation Paper on proposed amendments to the ICT Act for regulating the use and addressing the abuse and misuse of Social Media in Mauritius) will certainly not improve the efficiency and competitiveness of the ICT sector as laid down in the Authority's vision. Quality of service will be lower and barriers to entry for new players will be raised.
We shall also consider the adverse effects that the implementation of such an intrusive means of social media regulation, where privacy and freedom of speech are at risk, could have on the reputation of Mauritius on the international scene and in the global media. Three pillars of our economy, the global business (offshore) sector, the BPO sector and the tourism industry, may be seriously affected just when we need these same pillars to help the country recover from the economic downturn caused by the COVID-19 pandemic.
In any case, we agree that some form of social media regulation, in collaboration with social media administrators, is needed to prevent abuse. We would like to stress the importance of all stakeholders being independent in their decision-making process, and the only way we believe this is possible is through transparency at all levels, starting with the composition of the committees, the policies and guidelines, the decisions taken, and so on. We will show in our response the level of transparency that social media platforms already have in place, as well as international trends. The ICTA should definitely take inspiration from these.
We have based our response and recommendations on the above points raised and we hope that these will
be taken into consideration.
We have also taken into consideration in our response the communiqués issued by the ICTA on 19th and 29th April 2021.

2. Reply to Question 14.1 of the Consultation Paper: What are your views on the present approach of self-
regulation of social networks by social media administrators themselves where they decide to remove an
online content or not based on their own usage policy and irrespective of your domestic law?

2.1 Definition of social media and social media administrators


From the outset, it is essential that the ICTA clarifies exactly the definition it will assign to the term “social media”. The term, albeit generally understood, is not defined under any statute in Mauritius. It is present in only two pieces of legislation, namely the Children Act 2020 and the Tourism Authority (Hotel Classification) Regulations 2015, but neither defines it. In recent cases, mainly in relation to the granting of bail, judges and magistrates have been using the following catch-all wording:
“any technology such as phone, email, WhatsApp, Messenger, Facebook, Twitter, or any other social media platform.”
The Consultation Paper uses the terms “social media” and “social network” without providing any guidance as to what exactly is encompassed. We have already seen discrepancies between the Consultation Paper, statements made by the Chairman of the ICTA and the communiqués issued, as to which platforms or social media will be regulated and whose traffic will be intercepted.
It is therefore essential from the outset for the ICTA to define clearly what it means by “social media” and which platforms/networks it will target for its proposed interception. The term “social media administrators” should also be clearly defined, to distinguish between social media platform administrators and page administrators.

2.2 Cases reported to social media administrators


The first paragraph on page 3 of the Consultation Paper mentions: “in Mauritius, when offensive and abusive online content is posted in the native creole language, in the majority of cases, complaints made by local authorities to the social media administrators remain unattended or are not addressed in a timely manner”.
Can the Authority share the data it has to substantiate this statement? If this is the main issue, then why not instead focus on addressing it directly by engaging with the social media administrators to ensure that complaints are better attended to? This could be achieved, for example, by agreeing with the social media administrators on specific protocols, and on having representatives who understand the Creole language posted in the country for timely interventions when required.
In the communiqué dated 29th April, the ICTA states (translated from French):
“As part of this public consultation, the ICTA has also sent its proposal to Facebook's management for their comments and suggestions.”
This gives the impression that this is the first time the Authority is seeking a collaborative approach with Facebook. Also, does the fact that the proposal was sent only to Facebook mean that only content posted on Facebook is perceived as an issue by the ICTA?

2.3 Self-Regulation by social media administrators


Social media platforms already have advanced self-regulation mechanisms. If we look at Facebook, with both human review and artificial intelligence (AI) technologies, nearly 90% of hate speech content is proactively removed before being reported by users. Around three million pieces of hate speech are removed each month, or more than 4,000 per hour. Facebook details this for the Facebook and Instagram platforms at https://transparency.facebook.com/community-standards-enforcement.

Facebook also publishes information on the various actions it is taking to continuously improve its self-regulation mechanisms at https://www.facebook.com/business/news/sharing-actions-on-stopping-hate. All the other platforms have similar mechanisms in place to detect and remove illegal content, using both human capabilities and AI and machine learning technologies.
At a meeting of the European Commission with social media platforms on 9th January 2018 (see https://digital-strategy.ec.europa.eu/en/library/tackling-illegal-content-online-meeting-online-platforms-9-january-2018), it was mentioned that:

There are no simple technical solutions to remove illegal content online. Machine learning algorithms
are currently used and are very effective, but human intervention is always necessary and the degree
to which this is deployed varies a lot.

We do not understand the Authority's position of trying to acquire and spend resources on refining a technical toolset when, firstly, these issues could surely be addressed through interactions with social media administrators and, secondly, there is a high likelihood that human moderators will still be required to review the intercepted content effectively.
The Authority also mentions the issue of the Creole language. We believe that this can be resolved by defining a dictionary of Creole words that appear harmful and/or illegal and communicating it to social media administrators; a minimal sketch of the idea follows. This should not be a big problem to resolve through collaboration.
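Purely as an illustration of how simple such a shared lexicon could be to operationalise, here is a minimal Python sketch; the terms and sample posts are hypothetical placeholders, not a real lexicon.

```python
# Illustrative sketch only: flag a post if it contains any term from an agreed
# Creole lexicon of harmful/illegal expressions. The terms below are
# hypothetical placeholders, not real data.
HARMFUL_CREOLE_TERMS = {"terme-exemple-1", "terme-exemple-2"}

def flag_post(text: str) -> bool:
    """Return True if the post contains a term from the shared lexicon."""
    words = {w.strip(".,;:!?\"'").lower() for w in text.split()}
    return not HARMFUL_CREOLE_TERMS.isdisjoint(words)

print(flag_post("enn post ordiner"))             # False
print(flag_post("sa kontenir terme-exemple-1"))  # True
```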
Facebook mentions that it has over 35,000 people working on safety and security (https://about.fb.com/actions/promoting-safety-and-expression/). Given the current critical state of our economy, we believe it would be a significant waste of resources to invest in such a technical toolset and the associated human resources while we are trying to recover from the COVID-19 pandemic.
In fact, we believe there is a huge opportunity to get social media administrators to set up offices in Mauritius for their human moderation processes; Mauritius has a pool of young, skilled workers who are fluent in English and French, and often in one additional language (Hindi, Urdu or Arabic, amongst others) on top of Creole.

2.4 International Trends in Social Media Regulation


The US, the UK and Australia have been pressing social networks to take responsibility for content on their platforms, besides wanting tighter data-handling practices.
Currently, by way of Directive 2000/31/EC, the EU obliges these companies to self-regulate through a duty to take down illegal/harmful content expeditiously (though the number of hours is not defined) once they have been notified. A post-Brexit UK is still subject to this directive but is drafting legislation that will impose a duty of care on internet companies, including social media platforms (see https://www.ofcom.org.uk/about-ofcom/latest/features-and-news/ofcom-to-regulate-harmful-content-online). The independent regulator, Ofcom, would oversee and enforce compliance. Ofcom has already stated the following:

We won’t be responsible for regulating or moderating individual pieces of online content.


The Government’s intention is that online platforms should have appropriate systems and processes in
place to protect users; and that Ofcom should take action against them if they fall short. We’ll focus
particular attention on tackling the most serious harms, including illegal content and
harms affecting children.

India amended its law in February this year, requiring major social media companies to establish local offices staffed by senior officials to deal with law enforcement and user grievances. They will also be under an obligation to take down harmful/illegal content within 36 hours of a request from government agencies, and within 24 hours in the case of users. They may also have to alter their technology architecture to build in automated tools to weed out content related to rape and child sexual abuse. Local offices will also have to publish monthly blocking and compliance reports. This approach has been well received globally, with countries like the US and New Zealand contemplating similar routes.
These major democratic countries are setting the trend of holding social media companies accountable for the content on their platforms. This is the global trend, and it would be unwise not to learn from these experiences.
The response of social media administrators to the Indian government’s new legislation is worth pointing out:
Facebook: “The details of rules like these matter and we will carefully study the new rules.” (https://www.reuters.com/article/india-tech-regulation-idUSKBN2AP175)

Twitter: “We believe that regulation is beneficial when it safeguards citizens’ fundamental rights and reinforces online freedoms.” (https://www.reuters.com/article/india-tech-regulation-idUSKBN2AP175)

We therefore believe that the responsibility of deleting or blocking access to content on social media platforms
should be with the social media administrators, not with the Government.

2.5 Transparency Reports


It should be pointed out that social media administrators publish transparency reports on requests from governments around the world to remove or block access to content. Some of these are listed here:
https://transparencyreport.google.com/government-removals/overview?hl=en&authority_search=country:mauritius&lu=request_country&request_country=period:;authority:MU
https://transparency.facebook.com/government-data-requests/country/MU/jan-jun-2019
https://about.linkedin.com/transparency/government-requests-report#government-requests-data
Based on these, it would appear that they have not received the large number of requests from the Government of Mauritius, or have not received them in the appropriate format, as the Consultation Paper would have us believe. In fact, no request from the Government is listed by Facebook for 2018 and 2019. The newspaper L’Express provides details on the different requests made by the Government to the social media administrators in an article available at https://www.lexpress.mu/node/392176.

2.6 Consideration for Local Laws by social media administrators


LinkedIn also explains that it considers local laws when it receives requests to remove content (https://www.linkedin.com/help/linkedin/answer/112222?src=or-search&veh=www.google.com%7Cor-search):

“Different countries balance free speech against the potential impact that can come from speech in
various ways. For example, some countries actively enforce laws around hate speech or speech that they
view as threatening social stability, while other countries’ laws are more tolerant of these types of
speech. Governments will ask us to restrict member-posted content that they believe violates their laws.
When we receive a government request to remove content, we carefully review and assess it to
understand the reason for the request, the authority of the requestor, and our applicable policies or
terms. Based on these reviews, we determine whether and to what extent we should take action.
For international requests, we check the validity and ask that the request be properly issued, for
example, through a Mutual Legal Assistance Treaty or a form of international process known as a Letter
Rogatory, except in the case of certain emergencies.”

Facebook also refers to actions that they take based on local laws at the following page:
https://transparency.facebook.com/content-restrictions

“Content Restrictions Based on Local Law


When something on Facebook or Instagram is reported to us as violating local law, but doesn't go
against our Community Standards, we may restrict the content's availability in the country where it is
alleged to be illegal. We receive reports from governments and courts, as well as from non-government
entities such as members of the Facebook community and NGOs. This report details instances where we
limited access to content based on local law.”

We therefore do not agree with the ICTA's statement that social media administrators do not block or remove access to content based on local laws.
The transparency reports also give examples of social media administrators taking action to remove content based on local laws in countries where they do not have an office. Some examples for Facebook are given below:
- In Costa Rica, content was blocked in response to a request from the Ministry of Health.
- In Morocco, access was restricted to one item reported by the Ministry of Interior for content expressing anti-king sentiment.
- In Peru, Egypt and Kenya, access was restricted to content in response to private reports.
- In Indonesia, following requests from the government, access was restricted to 445 items for publishing misinformation such as false COVID-19-related claims, to 90 items for fraudulent investment-related advertisements, to 62 items for illegally advertising or selling regulated goods, to 17 items for religious attacks and blasphemy, and to 17 items for violations of laws related to hate speech, graphic violence, incitement, separatism, extremism, and nudity.
These are only some examples where Facebook takes action even in the absence of an office in the country requesting the blocking of content. We fail to understand the statement in Section 5.1 that social media administrators need to have a physical presence in Mauritius to be effective. A more collaborative approach should be encouraged.

2.7 European Union Stance
On a final point, in a recent article published by the European Parliament (https://www.europarl.europa.eu/news/en/headlines/society/20210204STO97129/social-media-and-democracy-we-need-laws-not-platform-guidelines), the following extract illustrates the approach that we think would work best:

“Speaking on behalf of the Portuguese Council Presidency, Ana Paula Zacarias said: “We expect online
platforms to play their parts in this common fight, but it is up to the democratic institutions, our laws,
our courts to set the rules of the game, to define what is illegal and what is not, what must be removed
and what should not be.””

The local Authority should provide social media platforms in advance with the rules or guidelines on what content is illegal/harmful and should therefore be removed. We would request the Authority to seriously consider this approach and to open a channel of communication with the social media administrators for better collaboration. We would suggest that our embassies assist the Authority in this process, since they are located in countries where the social media administrators are present, e.g. South Africa, India, the US and Europe. We can also leverage the local US Embassy, as it is well aware of the issues our society is facing.

2.8 Online Harms White Paper – UK


We refer to the interview of the part-time chairman of the ICTA, Mr Dick Ng Sui Wah, which appeared in Le Mauricien on 3rd May 2021 (https://www.lemauricien.com/opinions/interview/me-dick-ng-sui-wa-le-president-de-licta-ce-qui-ne-serait-etre-publie-dans-un-journal-ne-peut-etre-ecrit-sur-les-reseaux-sociaux/420474/), and the following extract, translated from French:

Where does this proposal to control the content of social networks come from?

— From European countries and from what was written in the paper prepared by the British
government in April 2019, the “Online Harms White Paper”, of which here is an unequivocal extract:
“existing regulatory and voluntary initiatives has ‘not gone far or fast enough’ to keep the users
safe. This paper proposed a single regulatory framework to tackle a range of harms.” This
concerned proposals to regulate internet companies and social media platforms. A public
consultation on these proposals was held in July 2019 in Great Britain.

Since the ICTA refers to the UK Government both in the Consultation Paper and in the interview above, we believe it would be useful to look at some extracts from the Online Harms White Paper. The version we consulted is available at https://www.gov.uk/government/consultations/online-harms-white-paper and is dated 20 December 2020. We would like to highlight some extracts.

In the Ministerial Foreword by Rt Hon Oliver Dowden CBE MP (Secretary of State for Digital, Culture, Media
and Sport) and Rt Hon Priti Patel MP (Secretary of State for the Home Department), the following is mentioned:

We are taking action to unlock innovation across digital markets, while also ensuring we keep people
safe online and promote a thriving democracy, where pluralism and freedom of expression are
protected.
Alongside tackling harmful content this legislation will protect freedom of expression and uphold media
freedom.
As an independent country, the UK has the opportunity to set the global standard for a risk-based,
proportionate regulatory framework that protects citizens online and upholds their right to freedom of
expression.

It is clear here that the UK Government's focus is to encourage innovation through online technologies while protecting people and safeguarding their right to freedom of expression. We see nothing that encourages innovation in the ICTA's approach in the Consultation Paper. In fact, the opposite is most likely to happen, with international firms seeing this approach as too intrusive.

In the Executive Summary, the following is mentioned:

3. The government has taken a deliberately consultative and iterative approach in developing the
framework, to ensure regulation that is coherent, proportionate and agile in response to advances in
technology.
14. Many of the major social media companies have moved further and faster than ever before to tackle
disinformation and misinformation during the pandemic through technical changes to their products,
including techniques to protect user safety online.
17. The online harms framework will be coherent and comprehensive, bringing much needed clarity to
the regulatory landscape and providing support for both industry and users. It will be proportionate,
risk-based and tightly defined in its scope. The legislation will avoid taking a ‘one size fits all’ approach
to companies and harms in scope, to reflect the diversity of online services. The government has placed
particular emphasis on protecting children, ensuring a pro-innovation approach, and protecting
freedom of expression online. Regulation will safeguard pluralism and ensure internet users can
continue to engage in robust debate online.
22. Stakeholders raised concerns during the consultation about how the legislation will impact
journalistic content online and the importance of upholding media freedom. Content published by a
news publisher on its own site (e.g. on a newspaper or broadcaster’s website) will not be in scope of the
regulatory framework and user comments on that content will be exempted.

The UK Government has taken a consultative and iterative approach in developing the framework and has also stated that the framework will be proportionate to the risk. We strongly recommend that the ICTA follow this approach: more consultation is needed and, because of the lack of detail in the Consultation Paper, a further public consultation should be held once the comments received on the paper issued on 14th April 2021 have been analysed.

Box 1: User-generated content and user interactions
1.4 The White Paper consulted on defining private communications, and what regulatory requirements
should apply to them. It also said that companies would not be required to monitor for illegal content
on these services in order to protect user privacy.

The key takeaway here is that “companies would not be required to monitor for illegal content”. Nowhere in the document does the UK Government mention that its regulator will be intercepting and decrypting traffic.

Harmful content and activity covered by the duty of care


2.3 A limited number of priority categories of harmful content, posing the greatest risk to users, will be
set out in secondary legislation. These will cover (i) priority categories of criminal offences (including
child sexual exploitation and abuse, terrorism, hate crime and sale of illegal drugs and weapons) (ii)
priority categories of harmful content and activity affecting children, such as pornography or violent
content, and (iii) priority categories of harmful content and activity that is legal when accessed by adults,
but which may be harmful to them, such as abuse and content about eating disorders, self-harm or
suicide

Illegal content and activity


2.19 All companies in scope will need to take action to prevent the use of their services for criminal
activity. They will need to ensure that illegal content is removed expeditiously and that the risk of it
appearing and spreading across their services is minimised by effective systems.
2.26 The regulatory framework will also require companies to give users a right to challenge content
removal, as an important protection for freedom of expression

The ICTA should already have defined harmful and illegal content. We have not seen any mention of some of the criminal offences listed in the UK White Paper. Also, the approach of the regulator in the UK is to have the companies (e.g. social media administrators) take action at their level. We do not see why, with the UK being used as a model by the ICTA, we should implement a completely different solution in Mauritius.

Box 12: Collaboration with industry: harmful content including suicide, self-harm, and eating disorder
content
The online harms framework will place regulatory responsibilities on in-scope companies likely to be
accessed by children to protect their child users from harmful content and activity, including suicide, self-
harm and eating disorder content. However there are wider government-led initiatives to develop
voluntary cooperation in this area ahead of legislation.

The Department for Health and Social Care has coordinated a strategic partnership with social media
companies and the Samaritans to set guidance on moderating suicide and self-harm content, and
educating users to stay safe online.

Again, the UK Government is putting the responsibility of moderating content on social media companies.

Interim codes of practice
The government has undertaken an extensive period of engagement across wider government, industry,
international partners and civil society, to ensure the measures set out are proportionate but robust
enough to tackle these most serious and illegal online harms.

Using technology to identify illegal child sexual exploitation and abuse content and activity
Final Policy Position: In light of this, the regulator will have the power to require companies to use
automated technology that is highly accurate to identify illegal child sexual exploitation and abuse
activity or content on their services.
2.60 Robust safeguards will be included in the online harms legislation to govern when the regulator
can require the use of automated technology. The regulator will only be able to require the use of tools
that are highly accurate in identifying only illegal content, minimising the inadvertent flagging of legal
content (‘false positives’) for human review. The regulator will advise the government on the accuracy
of tools and make operational decisions regarding whether or not a specific company should be required
to use them. However, before the regulator can use the power it will need to seek approval from
Ministers on the basis that sufficiently accurate tools exist. The government assesses that currently,
sufficiently accurate tools exist for identifying illegal child sexual exploitation and abuse material that
has previously been assessed as being illegal.

Again, in the UK proposal, it will be the responsibility of the companies (the social media administrators) to identify illegal content on their platforms. In no instance is the regulator itself the enforcer, unlike what the ICTA is planning to do.

Data retention and reporting to law enforcement


Final policy position: The government is minded to introduce a requirement for companies to report child
sexual exploitation and abuse identified on their services, with these reports being made to a designated
body. A requirement to retain child sexual exploitation and abuse data will not be introduced through
this legislation. However, the government is considering introducing this through alternative legislation.
With regards to terrorist content and activity, the government expects companies to report to law
enforcement where they consider there is a threat to life or risk of imminent attack. The legislation will
not introduce a requirement for companies to retain this data.
2.73 Companies will be encouraged to retain child sexual exploitation and abuse data for law
enforcement purposes. The online harms legislation will not introduce a requirement to retain this data
but the government is considering introducing this requirement within alternative legislation.
2.74 The government expects companies to report terrorist content and activity on their services to law
enforcement where they consider there is a threat to life or risk of imminent attack. The government
will work with the regulator to ensure that it encourages this and provides companies with clear
guidance on how this could best be done and information on where to report to. The online harms
legislation will not introduce a legal requirement for companies to report and retain this data.

The UK proposal does not impose data retention as part of the online harms legislation. Again, despite the ICTA (through its chairperson) using the UK as an example, it is putting forward a proposal to archive data which is not in line with what the UK is planning to implement.

2.81 As set out in paragraph 2.2, the duty of care will apply to content or activity which could cause
significant physical or psychological harm to an individual, including disinformation and misinformation.
Where disinformation is unlikely to cause this type of harm it will not fall in scope of regulation. Ofcom
should not be involved in decisions relating to political opinions or campaigning, shared by domestic
actors within the law.
The Regulator
The regulator will be accountable to Parliament. Ofcom as the regulator will lay its annual report and
accounts before Parliament and be subject to Select Committee scrutiny. The annual report will give
details about how it has discharged its functions in relation to online harms.
3.11 The importance of regulators being independent from undue influence - from government, other
political sources, regulated services and organisations with an interest in the regulated area - is an
important element of effective regulation.

Duties on and functions of the regulator


4.1 The regulator will have certain duties and functions under the framework. Its primary duty will be
to improve the safety of users of online services (and that of non-users who may be directly affected by
others’ use of them). Regulatory action should be undertaken in line with the principles of the regulatory
framework (see Annex A), which means being realised in a way that:
• is based on the risk of content or activity online harming individuals, where it gives rise to a
reasonably foreseeable risk of a significant adverse physical or psychological impact on
individuals
• is reasonable and proportionate to the severity of the potential harm and resources available to
companies
• provides a higher level of protection for children than for adults
• protects users’ rights, including to freedom of expression and privacy online, and safeguards
media freedom
• promotes transparency about and accountability for the incidence of and response to harm
• supports innovation and reduces the burden on business, and
• is delivered by putting in place appropriate systems and processes

The UK proposal clearly states that the regulator should be accountable to Parliament and should be independent from undue influence, including political influence. The Consultation Paper of the ICTA does not shed any light on how this will be achieved. The duties of the regulator in the UK are also based on clearly stated principles.

Enforcement in an international context
4.47 The enforcement powers have been designed to be able to be used against companies with and
without a physical or legal presence in the UK. As other countries introduce similar legislation,
international cooperation will become an increasingly important and effective tool for the regulator.
The government expects the regulator to work with equivalent organisations internationally to help
foster collaboration.

International Context
6.4 Countries around the world are grappling with how to make the internet a safer environment for
users. The regulator will take an international approach, working with other international regulators,
to ensure effective enforcement and promote best practice at a global level.
6.5 The government continues to engage with international partners to learn from their experiences and
build consensus around shared approaches to tackling online harms that uphold our democratic values
and promote a free, open and secure internet.
6.6 International collaboration remains vital. The government welcomes international, industry-led,
multi-stakeholder initiatives – including initiatives supported by the UN and other multi-lateral bodies –
such as the Global Internet Forum to Counter Terrorism, the WePROTECT Global Alliance, and wider
initiatives such as the Freedom Online Coalition and the Technology Coalition Fighting Child Sexual
Abuse.

It is very clear that without international collaboration and the sharing of best practices, the enforcement of regulation will be challenging. Could the ICTA disclose whether it has sought collaboration with other regulators around the world, in particular in the countries mentioned in the Consultation Paper?

2.9 Summary
We believe that social media administrators have self-regulation mechanisms in place to remove certain illegal content, and that they also take action to remove illegal content based on local laws when the requests are genuine and follow the correct processes. This is the case even in countries where the social media administrators are not present.
It is clear that it is the social media platform administrators who are responsible for removing content deemed unsuitable. With the proposed amendment, it appears that the Government will be investing a hefty amount in equipment to decrypt and store data of which it is not even the legal owner. We would strongly recommend a more collaborative approach with social media administrators, while ensuring that the ICTA follows the procedures set in place by the social media administrators. Has the ICTA followed the proper channel and used the Attorney General's Office for mutual legal assistance, since the Attorney General's Office is the Central Authority that can request mutual legal assistance?
This is the approach being used by democratic countries, as we have shown in our examples. We have not seen any regulator that is planning to intercept, decrypt and store social media traffic. They prefer to stay at policy level and get social media companies to take action. This is also the case for the UK, which the ICTA has used as an example when talking about social media regulation.
The UK also requires social media administrators to report certain incidents to local authorities.

3. Reply to Question 14.2 of the Consultation Paper: Do you think that the damage caused by the excesses
and abuses of social networks to social cohesion warrants a different approach from the self-regulatory
regime presently being enforced by social media administrators themselves?

3.1 Social Media Offences vs Other Offences


In order to justify a solution, the Authority mentions in the Consultation Paper a total of 2051 incidents reported over a timeframe of 12 to 13 months. These statistics appear to form the very basis on which the proposal sits. However, this figure only represents cases reported. We have no insight into the number of cases investigated or their outcomes; we are not aware of how many of these reported incidents were actual offences that led to prosecution and conviction. No such data has been made available in the Consultation Paper. Relying on this number to justify a technical solution involving such blatant and revolting suppression of our basic human rights defies all logic.
The officer in charge at the ICTA, Jérôme Louis, also states (translated from French):
“But after the far too many abuses on social networks which have impacted our country for years, ...”
The ICTA should provide the actual figures for social media incidents only, so that a more appropriate response can be provided, and should substantiate the actual number of offences and prosecutions resulting from these incidents. Furthermore, the reported cases represent a rate of 1.6 per 1,000 population, or 2.4 per 1,000 users (based on 850k Facebook users). We have to bear in mind that not all the incidents reported (such as phishing and hacking, amongst others) relate to social media, let alone specifically to the social media platforms being addressed in this Consultation Paper. Following the two communiqués issued by the ICTA (19th April and 29th April), it is also important for the ICTA to provide the number of incidents linked only to the social media falling under this project (e.g. excluding incidents on WhatsApp, Messenger and other online messaging platforms).
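As a back-of-envelope check of the rates quoted above, the minimal sketch below reproduces the arithmetic; the population figure of roughly 1.27 million is our assumption, while the 850k Facebook users figure is the one cited.

```python
# Back-of-envelope check of the rates quoted above. The population figure is
# our assumption (~1.27 million); the 850k Facebook users figure is cited above.
incidents = 2051
population = 1_270_000
facebook_users = 850_000

print(f"{incidents / population * 1000:.1f} per 1,000 population")  # 1.6
print(f"{incidents / facebook_users * 1000:.1f} per 1,000 users")   # 2.4
```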
Even if we assume that all of these incidents relate to social media, the occurrence of other offences is far higher than that of social media-related ones, as shown in the table below:

Indicator                                                              2019
Overall offence rate (excluding contraventions) per 1,000 population  35.9
Theft rate per 1,000 population                                        8.9
Drug rate per 1,000 population                                         3.9
Juvenile delinquency per 1,000 population                              6.8
Persons prosecuted per 1,000 population                               11.7

Source: Statistics Mauritius - Crime, Justice and Security Statistics, 2019 (Police, Prosecution, Judiciary, Prisons and Probation)

Theft, drugs and other offences are more prominent and have a greater adverse effect on society than social media incidents, and the priority of the government should be to focus funds, time and energy on addressing these offences instead of those caused by social media.

Also, it should be pointed out that, as per the ICTA communiqué dated 19th April 2021, online messaging services like WhatsApp are not covered by the Authority's proposal. Yet fake news can also be circulated through such applications, as the Authority itself mentions in the Consultation Paper (Section 3.1). Therefore, by considering only certain online services, the Authority is unlikely to address the issues it is trying to resolve. Ill-intentioned people will nevertheless be able to communicate on Facebook over standard VPN software within their own groups, in the way the Dark Web operates.
Are we not, therefore, encouraging other flavours of the social media web which the Government will not be able, financially or technologically, to forcefully break through? Illicit exchanges and transactions could find a honeypot in the fifty shades of local social media.

3.2 Unrest due to social media incidents / offences


Can the ICTA provide supporting information on social media incidents which have caused unrest in Mauritius? The actual number of investigations and of cases which led to prosecution would also be useful in order to provide appropriate comments and recommendations.

3.3 Offences through private channels


The UK Online Harms White Paper mentions at Section 30 of the Executive Summary: “The scale, severity and complexity of child sexual exploitation and abuse is particularly concerning, with private channels being exploited by offenders. For example, 12 million of the 18.4 million worldwide child sexual exploitation and abuse reports made by Facebook in 2019 were for content shared on private channels.”
From the above, two thirds of the abuse reports were for content shared on private channels; the recent Telegram affair should also be borne in mind (https://www.lexpress.mu/article/391224/affaire-telegram-policier-baker-affirme-avoir-lost-or-dropped-son-portable). By trying to take on roles that should be those of social media companies, the ICTA will see only the third of incidents occurring on the social media actually being intercepted. Cases like the Telegram one would still be undetectable under the proposed system.

3.4 Summary
We believe that:
1. The number of incidents reported does not warrant such drastic and intrusive measures from the Government.
2. The Government's priorities should go to offences that are causing more damage to our society rather than to a statistically insignificant and presumed online delinquency.
3. A less costly and more collaborative approach with social media administrators should be sought.
Therefore, we do not think that a different approach to the self-regulatory regime used by social media administrators is needed.

4. Reply to Question 14.3 of the Consultation Paper: What are your views on the overall proposed
operational framework in terms of the
• National Digital Ethics Committee (NDEC)
• Enforcement Division
which is intended to bring more clarity to section 18 (m) of the ICT Act, where the ICTA is mandated to take
steps to regulate or curtail the harmful and illegal content on the Internet and other information and
communication services.
We do not believe that the proposed operational framework, as set out in the Consultation Paper, is the most appropriate approach to address harmful and illegal content on social media. The proposed creation of the NDEC adds another layer to an already complex structure for dealing with online offences.
However, we do agree that there should be a legal entity that comes up with clear guidelines as to what is harmful and illegal under our local laws. We suggest that the Authority take inspiration from the UK approach to defining categories of harmful content, as described in the following document: https://commonslibrary.parliament.uk/research-briefings/cbp-8743/

Priority categories of harmful content would be set out in secondary legislation:

• criminal offences (e.g. child sexual exploitation and abuse, terrorism, hate crime and the sale
of illegal drugs and weapons);
• harmful content and activity affecting children (e.g. pornography); and
• harmful content and activity that is legal when accessed by adults, but which may be harmful
to them (e.g. content about eating disorders, self-harm or suicide).

4.1 NDEC Composition


The Consultation Paper gives no indication as to how the chairperson and other members of the NDEC will be appointed. In the unlikely event that this proposal goes ahead as put forward in the Consultation Paper, we would like to insist on the following:
- The chairperson and members should all be appointed following consultations between the Prime Minister and the Leader of the Opposition.
- The NDEC should be accountable to a Parliamentary Committee comprising members from both the Government and the Opposition.
- At least 50% of the NDEC should be composed of retired judges.
- The NDEC should publish clear guidelines to differentiate between legal/illegal and harmless/harmful content.
- The NDEC should publish a set of clearly defined content and behaviours as a benchmark to assess what behaviour is effectively harmful.

4.2 Existing Provisions in ICT Act
We also question whether the NDEC is needed when there is already an Internet Management Committee (IMC), one of whose roles is “to advise the Authority on Internet and related policies” as per Section 13 (1) (a) of the ICT Act. We would suggest that the composition of the IMC be reinforced to include more retired judges, instead of going through the process of amending the law and creating a new legal entity.
The reasons are that, firstly, the issue is a national one and cannot be seen as partisan at either local or international level and, secondly, retired judges are experienced in deciding whether content is harmful or illegal.

4.3 Separation of Powers


Section 8.3.1 of the Consultation Paper refers to the NDEC being able to carry out investigations “on its own”. There has to be more clarity on how such investigations will be carried out, whether the NDEC will have an investigation team in place, and how such a team will be constituted. If it is just a matter of reassigning police officers from the Cybercrime Unit to the NDEC, it will be a duplication of tasks. Investigative powers to determine possible future prosecution should be handled carefully and within properly set parameters.
The features and powers given to the NDEC as set out in the Consultation Paper risk turning this body into a discriminatory entity (based on political affinity, gender, religious beliefs, etc.) bent on serving the government in power and clamping down on any online content against those in power. We are talking here of setting up a completely new body with extensive powers.
Section 8.3.2 of the Consultation Paper refers to the NDEC deciding “in its own opinion” whether online content under investigation is harmful and illegal. The power being given is highly subjective. The focus should be on first defining what is considered harmful/illegal rather than giving this wholly discretionary power to the NDEC.
Separation of powers will be important, and we feel that if the Technical Enforcement Unit is created, it should be separate from the ICTA and the NDEC, falling instead under the Mauritius Police Force (Cybercrime Unit), which would be more natural. Otherwise, there might be a duplication of tasks if a person makes a complaint to the Police while the NDEC is investigating the issue on its own.
Some questions arise because of the lack of clarity in the Consultation Paper:
- Will the NDEC be a separate legal entity from the ICTA, so that it can be sued or taken to task?
- Will the composition of the NDEC be made public, bearing in mind that the composition of the IMC is currently not public?

4.4 Technical Enforcement Unit


We note with concern the security issues which could arise from the proposed technical solution. There is no information on how the whole technical infrastructure will be secured, considering the highly sensitive and private nature of the information that would be decrypted and archived. What if targeted hacks happen on the servers that hold all the data? People who have access to these servers and data could leak personal information, causing reputational damage both to the persons and organisations concerned and to the Republic of Mauritius. We have witnessed in the past the leakage of official documents to the press and general public. How will security be built in to ensure that such leaks will not be repeated on this new platform?

4.5 Merger between ICTA and Independent Broadcasting Authority (IBA)
The Government has mentioned in Parliament (on 2 April 2019 and 30 June 2020) that the merger between the IBA and the ICTA is in progress.
At the sitting of 2 April 2019 (question B/131), part (c) of the question asked whether the said merger would be proceeded with and, if so, when. The reply given was as follows:

With regard to part (c) of the question, the hon. Member may wish to note that the merger exercise will
be proceeded with as Cabinet, at its meeting of 07 December 2018, approved that the draft Bill be
submitted to the Attorney General’s Office for legal vetting. The Bill is under consideration at the State
Law office and it is expected that the Bill will be finalised and introduced in the National Assembly during
its next session later this year.

Since it was mentioned in April 2019 that the Bill had already been prepared and sent to the Attorney General's Office for consideration, we strongly believe that this merger should take place before any implementation of social media moderation.
This merger would enable the optimisation of resources and competencies, since the IBA has published a code of conduct and a code of ethics for its licensees to adhere to. These codes include provisions such as:

Section 2 of the Code of Conduct


Broadcasting licensees shall -
(a) not broadcast any material which is indecent, obscene or offensive to public morals or offensive to
the religious convictions or feelings of any section of the population or likely to prejudice the safety of
the State or the public order or relations between sections of the population;
(b) not, without due care and sensitivity present material which depicts or relates to brutality,
violence, atrocities, drug abuse and obscenity;
(c) exercise due care and responsibility in the presentation of programmes where a large number of
children are likely to be part of the audience.

Section 4.4 of the Code of Ethics:


4.4 Abusive / Offensive Comments
The broadcaster shall ensure that his or her programming does not contain abusive/offensive
comments and discriminatory remarks on material pertaining to race, colour, age, sex, religion, social
origin, marital status, physical or mental disability. Remarks which are abusive/offensive and risk
exposing an individual or a group to contempt or hatred, contravene the objectives of the
broadcasting policy set out in the Code.

We believe that this is the right approach, as content should be regulated in the same manner regardless of the means used to access it. Having a single body regulating all content would make more sense. In the UK, Ofcom, which regulates the TV, radio and telecommunications sectors, will also be regulating social media. Therefore, the merger between the IBA and the ICTA should be prioritised before any changes to the ICT Act, which would become obsolete if the merger were done afterwards.

4.6 Summary
We would like to highlight the following to conclude:
1. We strongly recommend that the proposed merger between the IBA and the ICTA be completed before such social media regulation is enforced. This would ensure consistency across different media.
2. We agree that there should be a legal entity that decides at the outset which types of content should be classified as harmful and illegal. These should be published with examples, so that the public know what type of content will lead to offences and local social media moderators know what type of content they need to remove from the brand pages they control. Whether it is the NDEC, the IMC or a merged committee following the merger of the IBA and the ICTA, we believe only one body should exist in order not to duplicate roles.
3. We do not see sufficient separation of powers at the level of the ICTA for this proposal to be viewed as a positive measure for Mauritius on the international scene.
4. We do not agree to the setting up of the Technical Enforcement Unit as proposed in the Consultation Paper.

5. Reply to Question 14.4 of the Consultation Paper: What are your views on the proposed legal
amendments to the ICT Act to give legal sanctity and enforcement power to the NDEC?
We understand, from section 8.2 of the Consultation Paper that the changes proposed are:
(a) Nothing in this Act shall prevent NDEC or any of his employees or agents from intercepting, withholding
or otherwise dealing with a message which he has reason to believe is-
(i) indecent or abusive;
(ii) in contravention with this Act
(iii) of a nature likely to endanger or compromise State’s defence, or public safety or public order.

5.1 Lack of Clarity on proposed changes


Firstly, we believe that the Consultation Paper should have been clearer about the proposed changes to existing laws. We have assumed that “Public Operator” is replaced by “NDEC” in the proposed amendment. Section 7.2 of the Consultation Paper mentions “amendments to the ICT Act” but does not provide the details necessary for concrete comments and suggestions on the proposed changes. We believe that the proposed amendments should have been more explicit and detailed in the Consultation Paper, including the composition and role of both the NDEC and the Technical Enforcement Unit, as is the case for the Internet Management Committee, whose composition and roles are spelt out.
The question is also badly phrased. The Consultation Paper mentions that enforcement will be done by the Technical Enforcement Unit, whereas the question mentions giving “enforcement powers” to the NDEC. If this is the case, then it should be clearly spelt out what changes in the Act will enable the Technical Enforcement Unit to be set up and to which body it will report (the ICTA or the Mauritius Police Force).
With the setting up of the NDEC, one would also question whether the Internet Management Committee should still remain, and what its contribution has been so far on the issue of regulating internet content in general.
The amendments to the laws should provide a proper framework for the separation of duties and powers of each stakeholder, as well as the required independence from political and other pressure that may be exercised.

5.2 Access to Data


In order to exercise the powers under 8.2(a)(i), the NDEC will have to carry out a triage in order to identify abusive and indecent content. The triage process involves reading both the good and the bad in order to act on the bad (abusive/indecent). Whether content is abusive, indecent or likely to endanger or compromise the State's defence will only be known after the content has been intercepted and read.
It is therefore clear that the NDEC will also access content that is not abusive or indecent. The law, in itself, is therefore not clear.

5.3 Summary
We do not believe that any changes to the legislation are required; we will expand on this in our concluding remarks. However, it is essential that the proposed merger between the IBA and the ICTA takes place as soon as possible and that any changes in the law be reflected in the Bill that has already been prepared. Should the Bill for the merger be ready, it should be tabled in Parliament at the earliest.

6. Reply to Question 14.5 of the Consultation Paper: What are your views on the proposed modus operandi
of the NDEC?

6.1 Powers Granted to NDEC


The powers being given to the NDEC are too far-reaching and constitute breaches of the fundamental rights of Mauritians as set out under the Constitution and the laws of Mauritius, in particular freedom of expression and the right to privacy.
Intercepting, analysing, investigating and reporting matters for enforcement should only be possible after obtaining a Judge's Order, as currently envisaged by the law. The process of granting such orders can be streamlined with the judiciary. However, in order to ensure that there is no abuse, a Judge's Order must remain a prerequisite for accessing any private content. We believe that in this day and age, a simple platform could be developed to allow judges to issue such orders online.
Judge’s orders are used today to get information from telecommunications operators on call records and other
owners of telephone numbers amongst others. By by-passing a judge’s order in the case of social media, the
Authority is putting transparency and independency at stake. The proposed power would be proactive instead
of being reactive as are most countries’ stand on Legal Interception today, thereby infringing basic human
rights to privacy.
The features and powers given to the NDEC as set out in the Consultation Paper risk turning
this body into a politically motivated or otherwise discriminatory entity bent on serving the government in
power and clamping down on any online content against those in power. We are talking here of setting up a
completely new body with extensive powers.
Section 8.3.2 of the Consultation Paper refers to the NDEC deciding “in its own opinion” whether the content
under investigation is harmful and illegal. Such a power is highly subjective. The focus should first be on
defining what is considered harmful / illegal rather than giving this whole discretionary power to the
NDEC.

6.2 Scope of Investigation


It should also be clarified whether the following non-exhaustive list of scenarios will also be investigated and
whether, in case the content is found to be harmful or illegal, it could be blocked.

Location of Social Media Profile | Location of User | Nationality of User | Location of Social Media Page
Mauritius | Mauritius | Mauritian | Mauritius
Mauritius | Mauritius | Mauritian | Foreign country
Foreign country | Mauritius | Non-Mauritian | Foreign country
Foreign country | Foreign country | Non-Mauritian | Mauritius
Mauritius | Foreign country | Mauritian | Mauritius

The Authority should also clarify which social media platforms will fall under this project, especially in the case
of the Facebook Messenger service (which falls under Facebook but is also a messaging service) and Facebook
private groups.

6.3 Harmful and Illegal Content Definition


The criteria that will be used to classify content as “to be removed or blocked” should also be clearly spelt
out and made public, so that companies that manage social media business pages can also take the necessary
actions to block such content on their brand pages. The Authority has used different qualifying words in the
Consultation Paper and these need to be well spelt out to avoid confusion. We would suggest that the Authority
take inspiration from the format that Facebook uses when showing what is and is not accepted on
its platform: https://about.fb.com/actions/promoting-safety-and-expression/

6.4 Complaints Process


It is also mentioned in the Consultation Paper that the NDEC will act upon complaints, amongst other means;
however, there is no mention of how it will receive complaints. Will it be via an online platform, or will a
person have to go in person to the Authority to make the complaint? Will there be a timeframe for the public to
file complaints following an incident?
Since an offence under the ICT Act is criminal in nature, we understand that it is only when a complaint is
lodged that action can be taken (investigation of complaints and removal where required).
Section 8.3.1 of the Consultation Paper refers to the NDEC being able to carry out investigations “on its own”.
There has to be more clarity as to how such investigations will be carried out, whether the NDEC will have
an investigation team in place and how such a team will be constituted. If it is just a matter of redeploying
police officers from the Cybercrime Unit to the NDEC, it will be a duplication of tasks. Investigative powers to
determine possible future prosecution should be handled carefully and within properly set parameters.
We believe that, apart from the definition of harmful / illegal content on social media, all the pieces of the
puzzle are already in place to block access to such content. Clause 14.1 of the ISP licence clearly requires
ISPs to have the technical capability to prevent its flow:

“The Licensee shall take all the necessary measures to discourage and prevent the flow of content which
is grossly offensive or of an indecent, obscene, or menacing character or infringing intellectual property
rights and international and domestic cyber laws.”

Some countries have imposed deadlines of 12 to 36 hours on stakeholders to remove harmful or illegal content;
we would recommend that a similar timeframe be enforced in Mauritius.

6.5 Compliance with Data Protection Act of 2017 (DPA) and GDPR
Section 8.4 of the Consultation Paper states that the NDEC will comply with the requirements of the Data
Protection Act of 2017 with regards to handling of personal data. We have assumed that the Technical
Enforcement Unit will also be complying with these requirements in our response.

The setting up and operation of the NDEC and the Enforcement Unit are clearly at odds with the provisions of
the Data Protection Act 2017 of Mauritius (the “DPA”). Section 8.4 of the Consultation Paper refers to this
aspect but falls short as to how it is being proposed to comply with the requirements of the DPA in terms of
the handling of personal data.
As set out under the DPA and stated by the Data Protection Office of Mauritius, the key principle underpinning
data protection is to ensure that people can control how personal information about them is used or, at
the very least, know how others use that information. The object of the DPA is to provide for the protection
of the privacy rights of individuals in view of the developments in the techniques used to capture, transmit,
manipulate, record or store data relating to individuals.
The DPA was amended in 2017 in response to the coming into force of the General Data Protection Regulation
(GDPR) which brought radical changes to data privacy laws in the European Union and has a cross-border
impact.
Given the provisions of the DPA and the GDPR, which grant individuals more rights in terms of how
their private data can be handled, there will be a clear issue when such data are intercepted by the NDEC
through the technical toolset. The currently proposed method is that the individual will have to consent to
such interception, but this is likely to raise serious concerns from a data protection perspective.
Data protection laws require the free and unambiguous consent of individuals before their data can be
accessed. Section 11.3 of the Consultation Paper refers to a user “being prompted for the automatic
installation of this self-signed certificate on his workstation/device”. There will need to be proper guidelines
informing all users in Mauritius as to how their data are being intercepted; a mere prompt or pop-up
message as described in the Consultation Paper is likely to fall short of data protection laws.

6.6 Provision for Children under 16 in DPA


Another main concern is the handling of data of users under the age of 16. Section 30 of the DPA
deals with the personal data of children and reads as follows:

(1) No person shall process the personal data of a child below the age of 16 years unless consent is given
by the child’s parent or guardian.

The DPA thus requires that, prior to processing the personal data of a child below the age of 16, the consent
of the child’s parent or guardian be obtained. We do not understand how the proposed solution will be able to
comply with this provision, especially at the point where a child accesses a social media platform.
It would be advisable for the ICTA to consult the Data Protection Office and for clear guidance to be provided
to the public as to how the proposed amendments in the Consultation Paper will be brought in line with data
protection laws in Mauritius.

6.7 Summary
We believe that there are already enough provisions to remove / block access to harmful / illegal content
without the need for new legal entities or a technical toolset. As mentioned before, social media administrators
have the required technical toolset and human resources to perform this. A collaborative approach, which
would be less costly for the country, should be considered by the ICTA.

7. Reply to Question 14.6 of the Consultation Paper: What are your suggestions on the safeguard measures
to be placed for the NDEC?
Our understanding of this question is that it also includes the Technical Enforcement Unit.

7.1 Access to Data and Confidentiality


As per the proposed changes to the ICT Act, with which we do not agree, agents of the NDEC or the Technical
Enforcement Unit will be able to intercept, withhold or otherwise deal with content. It should be clearly
defined who the agents of the NDEC could be, bearing in mind that confidential data will be stored on the
technical toolset. Will the supplier of the system or any of its employees be considered as agents and have
access to the confidential data of the citizens of Mauritius?
Companies also pay social media platforms to promote their posts and input their credit card details in the
process. What provisions will be put in place to protect these types of sensitive data? Who will be liable in
case of hacking or misuse of the technical toolset?

7.2 Safeguards
The Australian government has also set out several safeguards under “The Assistance and Access Act 2018”
on its website: https://www.homeaffairs.gov.au/about-us/our-portfolios/national-security/lawful-access-
telecommunications/data-encryption. Some of those relevant to this Consultation Paper are listed below:

“Engagement between Government and industry is bounded by critical safeguards. All requirements
must be reasonable, proportionate, practical and technically feasible. Government cannot:
• build or implement so-called ‘backdoors’ or do anything that would make the communications
of innocent persons less secure
• build a decryption, interception or data retention capability
• access communications without an existing warrant or authorisation”

We believe that the above safeguards should be taken into consideration by the ICTA.

7.3 Compliance with Data Protection Act of 2017 (DPA)


Mauritius prides itself on having been amongst the first countries to implement the GDPR requirements into its
local laws. The proposed amendments by the ICTA are likely to raise serious issues in terms of data protection
laws, as described above.
The global business sector in Mauritius has already been hit by the country’s listing as a high-risk jurisdiction
in terms of Anti-Money Laundering and Countering the Financing of Terrorism deficiencies. Governments and
development finance institutions today tie their investments, funding and grants to compliance with their
legislative requirements.
We should not run the risk of also being blacklisted in terms of our handling of private data.
Investors are likely to be alarmed as to how their data may be subject to interception in Mauritius.
Transparency, with clear legal provisions, has been at the forefront of our global business sector for decades.

7.4 Social Media and Country Laws
The European Union, in a recent article on this subject, “Social media and democracy: we need laws, not
platform guidelines”, mentions a number of key elements that should guide any action by the ICTA or the
Government:

“MEPs praised these efforts to regulate the online world via laws, not platform guidelines, but said they
must safeguard freedom of expression and fundamental rights, while avoiding censorship.”
Anne-Sophie Pelletier (The Left, France) stressed the need to protect freedom of expression and opinion.
"On the internet, the freedom of one group of people shouldn't stop where the big platform bosses
decide," she said. "We can’t have content being censored without a decision from a judge...censorship
is never the answer."
Speaking on behalf of the Portuguese Council Presidency, Ana Paula Zacarias said: “We expect online
platforms to play their parts in this common fight, but it is up to the democratic institutions, our laws,
our courts to set the rules of the game, to define what is illegal and what is not, what must be removed
and what should not be.”

The article is available at the following link:


https://www.europarl.europa.eu/news/en/headlines/society/20210204STO97129/social-media-and-
democracy-we-need-laws-not-platform-guidelines

The European Union (EU) is an important economic partner for Mauritius. By intercepting incoming
traffic, the business and personal data of EU citizens exchanged on Facebook will also be intercepted and read
in clear text. This compromises the security-of-processing obligations under Article 32 of the GDPR. The
proposed solution would lead the State of Mauritius to institutionally infringe EU legislation.

7.5 Summary
We believe that the NDEC or any other similar legal entity (e.g. the IMC or any committee falling under the
merged IBA and ICTA) should only be working at the policy level and that it is the role of ISPs to enforce the
policies and any directives issued. The Authority should consider the safeguards mentioned by the Australian
Government. Although the latter has been fighting battles with social media administrators, it still maintains
that Government should not build decryption, interception and data retention capabilities. The ICTA is going
against what more mature and experienced democracies are practising.

8. Reply to Question 14.7 of the Consultation Paper: What are your views on the use of the technical toolset,
especially with respect to its privacy and confidentiality implications when enforcing the mandatory need
to decrypt social media traffic?
The Consultation Paper briefly describes the toolset but does not describe the complete infrastructure. This
would have helped in providing more constructive comments.
There are a number of issues and concerns with the technical toolset which are detailed below.

8.1 Cost associated with Toolset


The ICTA did not produce any estimate of the financial implications of the initial setting up, maintenance and
upgrading of the infrastructure for the proposed framework. Evidently, the population, as tax-payers, needs a
proper forecast of and visibility on the financial implications in order to effectively evaluate the financial
worthiness and feasibility of the proposed framework. Some of the cost implications which, in our view, need
to be considered are listed below:
i) Internet service providers will have to connect to the toolset and this will require investment on the
part of these ISPs, as the required link capacity is in the tens of Gbit/s. Unfortunately,
these costs will be passed on to customers.
ii) Because of the criticality of the service, ISPs and the ICTA will have to factor in both redundancy and
route diversity for the connectivity between ISPs location and the technical toolset.
iii) The archiving of the decrypted data is likely to be an issue if all data is archived, as spelt out in
Section 11.1 of the Consultation Paper (“This, in turn, implies that all Facebook traffic (both incoming
and outgoing for Mauritius), will need to be decrypted and archived”). As per Statistics Mauritius, the
total International Bandwidth Usage in 2019 was 101,657 Mbit/s. We estimate the volume of data concerned
to be at least 50% of all traffic if we include YouTube; this will require terabytes of storage daily
(see the back-of-envelope estimate after this list), and it is not clear for how long the data will be
stored. Will the Authority keep the data long enough for complaints to be made and, in the event of court
cases, which data will be retained? We should remember the storage issues that were recently made public
with the Safe City project.
iv) The cost of storing all data sessions will be high. Since Mauritius uses NAT on almost all services, how
will these NAT sessions be stored so as to correctly identify users? We assume that one of the
objectives of setting up the technical toolset is to identify offenders without the need to go
through social media administrators, so that legal action can be taken against them.
v) We believe that proper storage of all audit trails is essential in any project of this scope, e.g. through
the use of an Audit Vault, which is very secure and cannot be tampered with. However, here again, the
cost of this is very high and cannot be justified.
vi) What will the maintenance cost be? We should definitely not expect the same scenario as that of the Safe
City cameras, as this infrastructure will become critical: a failure will cause all social media
platforms to fail at the same time. See the article https://www.lexpress.mu/article/388381/enquete-judiciaire-
debut-declairage-sur-appels-doffres-auxquels-participait-kistnen
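
As a back-of-envelope illustration of point (iii) above, the following Python sketch converts the 2019 bandwidth figure into a daily storage estimate, under our own assumption that half of all traffic would need to be archived:

total_bandwidth_mbps = 101_657   # Mbit/s, Statistics Mauritius, 2019
social_share = 0.5               # our assumption: half of all traffic
seconds_per_day = 86_400

bits_per_day = total_bandwidth_mbps * 1e6 * social_share * seconds_per_day
terabytes_per_day = bits_per_day / 8 / 1e12
print(f"~{terabytes_per_day:,.0f} TB per day")   # ~549 TB per day

On these assumptions, a mere one-month retention window would already approach 17 petabytes, before any redundancy or back-up copies are factored in.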

8.2 Quality of Service


With the proposed set-up, where all social media traffic has to go through the technical toolset, an increase in
round-trip delays is to be expected. This will degrade the consumer experience in many ways, especially
during peak hours. There could be scenarios where a user’s data during a transaction has already been sent
to the technical toolset but, due to heavy traffic, has not been “processed” at that level before being sent on to
the social media network (e.g. failed payment transactions on social media, slow streaming).
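
Purely as an order-of-magnitude illustration (every figure below is an assumption of ours, not a measurement), the added detour and processing compound as follows:

base_rtt_ms = 30          # typical ISP-to-Facebook round trip today (assumed)
extra_hop_ms = 8          # detour via the toolset, both directions (assumed)
proxy_processing_ms = 15  # decrypt, triage queue, re-encrypt (assumed)

new_rtt_ms = base_rtt_ms + extra_hop_ms + proxy_processing_ms
print(f"RTT grows from {base_rtt_ms} ms to {new_rtt_ms} ms "
      f"(+{100 * (new_rtt_ms / base_rtt_ms - 1):.0f}%)")   # +77%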

ISPs will have to inform the Authority well in advance when they plan on upgrading their links, so that the
toolset can be upgraded to cater for the additional capacity before it is put into service by the operators.
What happens if the ICTA servers/systems are not responsive? Most of the digital marketplace on social media
would then be unavailable, causing loss of revenue to SMEs and other companies highly dependent on digital
marketing and social media. Will the ICTA have a team onsite 24x7 to ensure the highest level of service and
provide a guarantee on uptime?
Will the Technical Enforcement Unit work on a 24x7 basis to remove illegal content?

8.3 Self-Signed Digital certificate


As far as we are aware, only Kazakhstan has attempted a similar approach in the past. However, Apple,
Google, Mozilla and Microsoft blocked the use of the Kazakhstan root CA certificate
(https://www.zdnet.com/article/apple-google-microsoft-and-mozilla-ban-kazakhstans-mitm-https-
certificate/), as it undermines the security of users and contradicts individual security and privacy on the
internet. The first attempt was in 2015, when the Government was sued by ISPs and banks, preventing the
project from proceeding. Going ahead will prove risky, as the scheme may not work in practice should the tech
companies above proceed in the same way as they did for Kazakhstan. The installation of the digital certificate
will also prove challenging for the segment of the population not familiar with such procedures.
Users will be prompted to install the certificate; however, the certificate will not be recognised by the
browsers developed by the above companies should they proceed in the same manner as they did for
Kazakhstan. This will lead to a bad experience from the user’s perspective when trying to browse social media
websites.
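
The mechanics can be sketched in a few lines of Python: a TLS client validates the server’s certificate chain against its store of trusted roots, and a chain re-signed by any authority absent from that store (for instance, a hypothetical ICTA self-signed root that the user has declined to install) is rejected outright:

import socket
import ssl

# A client that trusts only the platform's standard root store. A
# connection whose chain is re-signed by a root absent from that store
# (e.g. a hypothetical ICTA self-signed certificate the user has not
# installed) fails verification and the page does not load.
ctx = ssl.create_default_context()

try:
    with socket.create_connection(("www.facebook.com", 443), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname="www.facebook.com") as tls:
            print("certificate chain trusted; issuer:", tls.getpeercert()["issuer"])
except ssl.SSLCertVerificationError as err:
    # This is what browsers would surface if interception were active
    # and the intercepting root were not trusted.
    print("verification failed:", err.verify_message)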
When new social media platforms are created, we assume that there will be a need for these to be moderated as
well. Their domains will have to be added to the list proposed by the ICTA; how does the Authority plan on
informing users in these cases?

8.4 Identifying a user


From Section 11.2.1c of the Consultation Paper, it would seem that the Authority plans on using IP addresses to
identify users. The IP address that the Authority will see is a public IP address. However, since ISPs in
Mauritius use NAT, several users could be seen as using the same public IP from the toolset’s perspective. If the
objective is to identify a user from an IP address, this may prove challenging. It is also highly unlikely that
ISPs keep logs of which user is assigned which public IP for long durations. If the aim is not to identify the
user actually committing the offence, then the rationale of the project should be reviewed.
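
A minimal sketch of the problem (the log entries are invented for illustration) shows why a public IP address alone cannot attribute an action to a subscriber behind carrier-grade NAT, whereas the full (address, port, timestamp) tuple can:

# Hypothetical carrier-grade NAT log: several subscribers share the
# same public address, so an IP address alone identifies no one.
nat_log = [
    {"subscriber": "A", "private": "10.0.0.5:51000", "public": "102.119.1.1:40001", "time": "10:00:01"},
    {"subscriber": "B", "private": "10.0.1.9:51000", "public": "102.119.1.1:40002", "time": "10:00:01"},
    {"subscriber": "C", "private": "10.0.2.7:52344", "public": "102.119.1.1:40001", "time": "11:30:00"},
]

def match(public_endpoint: str, time: str) -> list:
    # Attribution needs the full (public IP, port, timestamp) tuple...
    return [e["subscriber"] for e in nat_log
            if e["public"] == public_endpoint and e["time"] == time]

print(match("102.119.1.1:40001", "10:00:01"))   # ['A'] - unambiguous
# ...whereas the address alone matches every subscriber behind it:
print({e["subscriber"] for e in nat_log if e["public"].startswith("102.119.1.1:")})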

8.5 Competition within the ISP market


The proposed set-up will require significant investment from ISPs to connect to the technical toolset. ISPs that
offer only internet services would have no choice but to increase their prices, while ISPs that also offer other
services may not necessarily do so. The barrier to entry in the ISP market is already perceived to be very high,
and this will raise it even higher.

8.6 Security and Confidentiality


As mentioned in the Consultation Paper, the data will be stored unencrypted. This includes usernames and
passwords as well as credit card details. This represents a significant risk and could make it easier for
criminals to exploit the weakened security.

Apple’s stand in 2018 with regards to decryption (https://www.itnews.com.au/news/apple-says-decryption-
should-alarm-every-australian-513893) was considered by the Australian Government, which led to the
Government adding that it cannot build a decryption, interception or data retention capability, as
previously mentioned. There are also some questions that are not answered in the Consultation Paper.
Does the installation of the certificate not make users more vulnerable to hackers if the certificate issued by
the ICTA is stolen/copied without authorisation, enabling man-in-the-middle attacks? How will the ICTA ensure
that the certificates are rightly and correctly maintained by end users? Where will the back-up / disaster
recovery centre be located? And will it have the same security standards as the main site?
Some social media platforms such as Facebook have different types of pages: personal pages, business / brand
pages and groups (public and private), as well as private conversations. We do not believe that there exists a
toolset that could capture only content that is marked as public, and this will not ensure the confidentiality of
users.
We welcome the fact that, through its communiqué of 29th April 2021, the ICTA states that Messenger traffic
will not be subject to interception, decryption and archiving. The ICTA also implies that all private
conversations would not be subject to the same treatment by stating: “Ces communications privées sont
d’ailleurs inviolables de par la Constitution” (“these private communications are, in any case, inviolable under
the Constitution”). If any traffic from Facebook is intercepted at ISP level and sent to the technical toolset at
the ICTA, we do not see how the proposed set-up will prevent private communications such as Facebook Messenger
and Facebook private groups from being intercepted. The ICTA should disclose whether ISPs have already been
contacted to see whether they can technically re-direct only “public” Facebook data to the technical toolset.
The ICTA should also clearly define what is public and what is private. Our understanding is that ISPs are
currently not equipped to differentiate public from private data on Facebook.
We understand that a number of people use the social media logins available on non-social-media
websites. By having access to a person’s login credentials, the Authority may also gain access to non-social-
media websites.
Section 31(2)(a)(i) of the Data Protection Act of 2017 mentions the need for “the pseudonymisation and
encryption of personal data”. However, section 11.1 of the Consultation Paper does not explicitly state whether,
after the data is decrypted, personal data will be encrypted before being stored so as to comply with the DPA.
We expect that this should be the case.
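
For illustration, encryption at rest of archived records, which is what s.31(2)(a)(i) appears to envisage, can be achieved with standard tooling; the following is a minimal Python sketch using the third-party cryptography package (pip install cryptography), not a description of what the toolset actually does:

from cryptography.fernet import Fernet

key = Fernet.generate_key()   # the key itself must be kept under strict access control
box = Fernet(key)

record = b"decrypted, triaged social media content with personal data"
stored = box.encrypt(record)             # what the archive should hold
assert box.decrypt(stored) == record     # recoverable only with the key
print(stored[:20], b"...")               # ciphertext, not clear text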
Section 11.1 of the proposal document discloses that the most important implication of the technical toolset
lies in the fact that traffic will be intercepted, decrypted, archived and then re-encrypted. Some points need
to be highlighted:
i) The technical toolset will have to intercept all Facebook traffic.
ii) A triage in clear text will need to be carried out, implying that all Facebook traffic will be
decrypted and analysed for the NDEC to identify illegal/harmful content. The data will be archived.
iii) This inevitably implies that the confidentiality of all Facebook content will be compromised.
iv) Any Facebook content in clear text will also contain user IDs and passwords. These will be
de facto compromised.
v) The system may present loopholes exploitable by ill-intentioned parties to edit content
before it is re-encrypted, or to edit content to deliberately incriminate a party. The existence of
such technical loopholes casts doubt on the security of the system and would allow users to
repudiate content, irrespective of whether they authored/shared it or not.
vi) Courts will find it challenging to establish criminal convictions in the face of so many doubts, even
when the alleged offender has effectively posted incriminating content.

8.7 Privacy and Freedom of Expression
The right to privacy is enshrined in the Constitution of Mauritius and the Civil Code. The operation of the
technical toolset is likely to fall foul of the right to freedom of expression and the right to privacy as
described above. At the heart of data privacy laws is the concept that a user should be free to know with
whom his data is being shared and must provide his free and unambiguous consent to such usage.
The mechanism proposed at Section 11.3 may be deemed insufficient when compared to all the
information that will be intercepted through the technical toolset.

Article 12 of the Universal Declaration of Human Rights provides that “No one shall be subjected to
arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour
and reputation.”
Article 19 of the Universal Declaration of Human Rights states that “Everyone has the right to
freedom of opinion and expression; this right includes freedom to hold opinions without interference
and to seek, receive and impart information and ideas through any media and regardless of frontiers.”

The implementation of the proposed technical toolset is a clear abuse of the individual’s right to privacy.
We believe that it is only through the separation of powers that the right to freedom of opinion and expression
(while respecting others) will be upheld. The protection of freedom of expression is also enshrined in Section 12
of our Constitution.
The ICTA in its own document, namely the “ICTA Consumer Guide: The Internet”, mentions an Internet Traffic
Management Practice (ITMP) Framework and states:

The ICTA’s objective in setting up an ITMP Framework is to strike the appropriate balance between the
freedom of expression of Mauritian Internet users to use the Internet for a variety of purposes against
the legitimate interests of ISPs to manage the traffic generated on their networks. The Framework will
also aim to be consistent with legislation, including privacy legislation, and to ensure that net neutrality
is not compromised.
In the same document, the Authority states that “net neutrality principle is based on the premise that
ISPs should treat all data equally and that the data should be equally accessible to the users of the
Internet.”

Clearly the proposal as set out by the Authority in the Consultation Paper goes against the principle of net
neutrality as described by the same Authority.

8.8 Impact on Pillars of our economy


Any non-citizen travelling to Mauritius and using social media will have some of their personal data stored in
the technical toolset. What measures will the Authority take to ensure that this does not contravene the GDPR
with regard to the storage of data of EU nationals? Non-citizen business travellers using social media for
business purposes may find it unattractive to conduct business in Mauritius. We believe that this would
definitely have an impact on our tourism industry.

If we consider a UK-based company operating from Mauritius and dealing with a US-based client, we
believe that the UK-based company may find it risky to continue operating in Mauritius. This could be even
more accentuated by the recent ranking of Mauritius at No. 8 in the V-Dem Institute’s list of autocratising
countries. If the Authority goes ahead with decrypting and storing the data of all social media users, there is a
huge risk that our offshore sector will be negatively impacted.
A number of multinationals currently outsource some of their processes to Mauritius. With the proposed
setting up of the technical toolset as described in the Consultation Paper, we believe that these multinationals
may not take the risk of having confidentiality issues with regard to their data and may move their operations
to countries where there is no such set-up, thereby impacting our BPO sector. Has the ICTA consulted
financial institutions on whether there would be any actual or perceived breach of PCI DSS compliance?
We would strongly suggest that the Authority and the Government carry out an assessment of the impact of
these proposed measures on the above pillars of our economy and publish the report before taking any
action.

8.9 Inconsistencies across Consultation Paper and communiqués issued by ICTA


The Consultation Paper mentions in Section 11.1: “This, in turn, implies that all Facebook traffic (both
incoming and outgoing for Mauritius) will need to be decrypted and archived”, whereas the communiqué
dated 29th April 2021 mentions: “Toutefois le système archiverai que les contenus abusifs sur les pages
publiques incriminées sur les réseaux sociaux, à la suite des plaintes enregistrées en bonne et due forme”
(“However, the system would archive only the abusive content on the incriminated public pages on social
networks, following complaints registered in due form”).
We wonder whether the ICTA is backtracking from its initial position following the public outcry.
Can the ICTA clearly explain how it will archive “contenus abusifs” if, for example, by the time a complaint is
filed, the content is no longer available on the social media platform and not all data is archived?

8.10 Netsweeper
In the communiqué of 29 April 2021, the ICTA mentions: « L’outil technologique que l’ICTA utiliserait est un
dérivé d’une technologie existante dont se sert déjà le régulateur depuis 2012 » (“The technological tool that
the ICTA would use is a derivative of an existing technology which the regulator has already been using since
2012”). We also note from the Netsweeper communiqué of 3rd May 2021 (where they explain the issue with
whatismyip.com) that they were awarded the tender launched during last year’s confinement
(ICTA/OIB/CSA/04-20/04). They also mention in the same communiqué that the filtering servers are based in
the Netherlands.
We also wish to highlight that the CSA filtering works using BGP, i.e. no traffic from ISPs has to transit
through the CSA infrastructure in order to check whether access to a particular website should be blocked; only
traffic to the blocked sites needs to transit through the CSA infrastructure. The proposed technical toolset,
however, will require all social media traffic to be routed through it by ISPs. This is a significant change
from the current CSA filtering and will require huge bandwidth and processing resources to ensure optimum
customer experience.
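
The contrast can be sketched numerically (the traffic figures below are assumed for illustration, not measured): under the BGP model only blocklisted prefixes are redirected, whereas the proposed proxy model carries every social media flow:

# Illustrative flows in Gbit/s (figures assumed, not measured).
flows = {
    "facebook.com": 45.0,               # social media traffic
    "blocked-site.example": 0.01,       # a CSA-blocked site
    "rest-of-internet": 55.0,
}
csa_blocklist = {"blocked-site.example"}
social_media = {"facebook.com"}

# Current CSA model: BGP announces only the blocklisted prefixes, so
# only that traffic is redirected through the filtering infrastructure.
bgp_redirected = sum(v for k, v in flows.items() if k in csa_blocklist)

# Proposed toolset: ISPs must route every social media flow through it.
proxy_redirected = sum(v for k, v in flows.items() if k in social_media)

print(f"BGP-style redirect: {bgp_redirected} Gbit/s")     # 0.01 Gbit/s
print(f"Full-proxy redirect: {proxy_redirected} Gbit/s")  # 45.0 Gbit/s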
Netsweeper also describes, in the Technology Overview on its website
(https://www.netsweeper.com/netsweeper-platform/), the solution which we believe the ICTA is planning to
implement:
HTTPS Selective Filtering with NSProxy
Our advanced high speed NSProxy service is capable of delivering 10Gbps of throughput and up to 6Gbps of
selective SSL decryption throughput. Advanced social media SSL protection is an example of how our
technologies are helping in today’s digital world. With NSProxy, we can allow access to the safe areas of
social media, while restricting access to the potentially dangerous ones.

This raises a number of issues:
i) Will the proposed solution, like the current CSA filtering solution, also be hosted outside Mauritius?
ii) If, for a simple issue of a website being blocked for a non-valid reason, the ICTA needs to seek
clarifications from the supplier, will this still be the case for the new platform being planned?
iii) What is the limit on the traffic-handling capacity of the proposed solution?

8.11 Summary
We would like to reiterate that we have not seen any democracy trying to implement such a technical toolset
at the level of the regulator. The UK Government is promoting a collaborative approach with social media
companies, as we should be doing. International collaboration with other regulators, to see how they are
implementing social media regulation and to learn from their best practices, should have been sought in the
first instance by the ICTA.
We believe that the issues, challenges and cost implications raised by the implementation of a technical toolset
are too wide-ranging, at both the local and international levels, to allow its implementation. A less intrusive
approach is definitely needed. The Authority may have underestimated the international impact of this
intrusive measure, bearing in mind that the tourism industry has a large role to play in the recovery of our
economy post COVID-19. The offshore sector, which is already in jeopardy, could also be affected. The global
business (BPO) sector, which has not been as affected as the other two pillars, would certainly be affected if
the proposed measures are implemented as is.
The technical toolset will only block access to content from Mauritius. The content will still be visible from
outside Mauritius or when using VPNs. In this regard, we re-affirm our position that if content is found to
be harmful and / or illegal, the blocking of the content should be done by the social media administrators,
based on guidelines published by the Government of Mauritius.
Therefore, we do not agree to the implementation of this technical toolset, or any toolset, at the level of the
ICTA, as we do not see technical intervention as being one of the duties of a regulator.

9. Reply to Question 14.8 of the Consultation Paper: Can you propose an alternative technical toolset of a
less intrusive nature which will enable the proposed operational framework to operate in an expeditious,
autonomous and independent manner from the need to request technical data from social media
administrators?
We welcome the fact that the Authority itself describes its proposal as intrusive and is open to smarter
solutions. We shall detail our proposal here and recommend that the Authority consider it.

9.1 Request for technical data


The question itself raises another question: what technical data is requested from social media administrators?
If the technical data is needed to identify the person behind harmful / illegal content, we do not see how the
setting up of a technical toolset would enable the Authority to trace back content posted from outside Mauritius
or using a fake profile created with a foreign email provider where no proper authentication is required. Again,
a more collaborative approach should be used. There are plenty of examples on the internet where users have
requested to have fake profiles blocked and have also obtained information regarding such profiles impersonating
other people. See https://www.engageweb.co.uk/how-to-trace-a-fake-facebook-account-9058.html

9.2 Self-Regulation by social media platforms


The approach adopted by India seems to be the most favoured one, whereby responsibility for the content
rests on the social media administrators, who are also under an obligation to take down content
expeditiously (within 24 hours) of being notified and to alert the Authority for onward action with the relevant
department and the author, including those who share the posts.
As already mentioned in our reply to question 14.1, there are a number of mechanisms that social media
administrators use to self-regulate content on their platforms and to consider requests to remove harmful /
illegal content. A more collaborative approach with social media administrators should be favoured so that we
can use their technical toolsets. As mentioned earlier in our response, these toolsets already use AI and
Machine Learning, and the administrators have significant experience in continuously improving them.

9.3 Summary

Our proposed approach is as follows:

The Government should work in collaboration with the social media administrators so that they improve their
technical toolsets to prevent harmful / illegal content from being posted in the first place, whether through a
dictionary of Creole words to be blocked or through specificities of Mauritian culture which would be classified
as harmful / illegal.

We have shown that social media administrators have put in place processes to have access to harmful / illegal
content blocked, both through their own technical toolset and through reporting mechanisms. This should be
the favoured approach as it would tackle the issue in a more efficient and less costly manner.

We have also shown through examples that it is possible to obtain technical data without the need for an
intrusive toolset. In any case, even with the toolset, this technical data may not be retrievable.

The need to go through a judge’s order, which is essential for the separation of powers and for transparency,
cannot be stressed enough.

10. Reply to Questions 14.9 of the Consultation Paper: Should the Courts be empowered to impose
sentences (which include banning use of social media) on persons convicted of offences relating to misuse
of social media tools?

10.1 Sentences imposed for misuse of social media tools


The Courts are already empowered to impose sentences, including on people convicted of offences
related to social media. Two such examples have been reported by the Press:
https://defimedia.info/messages-sur-facebook-et-whatsapp-deux-hommes-condamnes-pour-chantage-et-
harcelement

https://defimedia.info/propos-communal-sur-facebook-peine-dun-mois-de-prison-maintenue

There are also older cases such as:

- Police v N. S. Mohamed (2008), where the Intermediate Court imposed a fine of Rs 190,000 under the
ICT Act 2001 and the Computer Misuse and Cybercrime Act 2003.
- Police v S. Teeluck (2009), where the sentence was a fine of Rs 150,000 for two counts under the ICT
Act 2001.
- Police v K. Bunwaree & S. Dowlut, where one accused was sentenced by the Intermediate Court to a fine
of Rs 25,000 for one count and the other to a fine of Rs 300,000 for twelve counts.

10.2 Banning use of social media in sentences


In recent cases dealing with the granting of bail, more especially those relating to drug offences, judges and
magistrates have already imposed conditions banning suspects released on bail from accessing or using the
internet for any purpose whatsoever and from using any third party’s smartphone or internet connection. As of
now, there have been no challenges to such limitations being imposed. An example of this is the case Kusraj
Lutchigadoo v/s The Commissioner of Police and Ors 2020 SCJ 160, where the ruling states that:

“The applicant is not entitled to access or use the internet for any purpose whatsoever. He is not entitled
to be allowed to make international phone calls by any means. He is not to make use of any 3rd party’s
smart phone and internet connection. The applicant should not to be in communication with any person,
other than the authorities, in connection with the present case either in person or by means of any
technology such as phone, email, WhatsApp, Messenger, Facebook, Twitter, or any other social media
platform.”

10.3 Summary
In our view, the Courts are already empowered to deliver sentences on people convicted of social media
offences, and these sentences can already include a social media ban, amongst others.

11. Concluding Remarks
The proposal in the Consultation Paper is typical of repressive regimes where internet traffic is heavily
controlled by the government (from the more extreme, like China, to others like Pakistan, Egypt and Tunisia). In
dealing with the internet and social media, all countries face the same dilemma in terms of balancing
freedom of speech and the right to privacy on the one hand against social cohesion and security on the other. But
many countries, namely developed countries such as Australia, the US and the UK, are implementing more balanced
measures after lengthy consultations with the public and private sectors and all stakeholders involved. What
the government here is proposing ranks as one of the most repressive and intrusive measures envisaged, though
disguised as one to which the citizen himself, and even non-citizens travelling to Mauritius, agree.
This will have a negative impact on the image of the country and will not help the country get removed from
the grey / black lists of European institutions if we are perceived as going against the GDPR. Furthermore, it
will reinforce Mauritius’s position as an autocratic regime, which the V-Dem Institute has already
highlighted in its 2021 report.
On top of that, the lack of transparency from the ICTA and the choice of timing (during the lockdowns of last
year and this year) for both the issue of the tender and the public consultation do not inspire confidence in
what the ICTA is planning to implement. The lack of transparency as to when Facebook was consulted, and on
what, does not help the ICTA’s cause.
From our research during the limited time given to respond to this Consultation Paper, it seems that no
country besides Kazakhstan has attempted to implement such an intrusive solution to moderate social media.
The proposals in Kazakhstan were thrown into the limelight, leading to technology companies such as Mozilla
and Google opposing such measures, as they clearly breached the privacy principles that drive their products
and services. India, the largest democracy in the world, is using a collaborative approach with social media
platforms, as are all the other democratic countries trying to address the issue of social media regulation
(https://www.reuters.com/article/india-tech-regulation-idUSKBN2AP175 and
https://prsindia.org/billtrack/the-information-technology-intermediary-guidelines-and-digital-media-ethics-
code-rules-2021).
A number of multinationals operate from Mauritius and also use social media for business purposes where
exchanges are highly confidential. Any threat to this confidentiality, through the decryption of data, may
encourage these multinationals to leave Mauritius and opt for more appropriate countries. Businesses will shy
away from conducting their activities from Mauritius, hence impacting negatively on Foreign Direct
Investment. This is not the image that we want to portray for our country on the international scene.
And with the proposed Data Technology Park at Cote d’Or, we do not think that we will be able to attract key
international players if we go ahead with such a project as described in the Consultation Paper. Some
international organisations (e.g. the Electronic Frontier Foundation) have started criticising the planned
initiative of the ICTA; see https://www.eff.org/deeplinks/2021/04/proposed-new-internet-law-mauritius-raises-
serious-human-rights-concerns and https://portswigger.net/daily-swig/mauritian-governments-plan-to-
intercept-encrypted-web-traffic-marks-death-knell-for-freedom-of-speech. We expect to see more bad
publicity for Mauritius in the international media before the closing of this consultation.
We also believe that more education of internet users should be carried out by both the National Computer Board
and the Cybercrime Unit, for users across the country. Most internet users are simply given access to the
internet without proper guidance and knowledge of the impacts and implications of online activities and actions.
Most people in Mauritius are not fully aware of the existing laws and their enforcement. There should be
simple guidelines on what is allowed and what is not.

Education of parents and of children as part of the ICT-related curriculum should already be part of
the day-to-day life of schools. An educated and informed citizenry will act more responsibly, with due respect
for the applicable laws. The need for coercive measures is inversely proportional to the quality of education
provided to our citizens.
As shown in our response, one of the rationales put forward in the Consultation Paper for implementing the
technical toolset, namely that the social media administrators do not have an office in Mauritius, does not
hold, as we have seen that social media administrators have taken action in countries where they do not
have an office.
However, we do agree that there is a need to improve the regulation of content on social media, but it should be
done in a transparent and non-partisan manner and based on the laws of Mauritius. We strongly believe that all
the provisions are already in place in existing laws and licences to achieve the results that the Authority is
seeking.

Our Proposed Approach


We believe that the following approach is more appropriate for social media regulation and should be
considered by the Authorities:

1. The merger between the IBA and the ICTA should be prioritised by the Government. As mentioned in Parliament,
the Bill has been at the Attorney General’s Office since 2019; any change in legislation now, without
considering the Bill already prepared, would be a huge waste of time and resources. We believe that the
priority of the Government should be to accelerate this merger so that resources can be optimised.
2. There is already an Internet Management Committee as defined in the ICT Act, whose role is to advise
on Internet and related policies. Some work has already been done at IBA where they have drafted a
Code of Conduct and Code of Ethics. Once the two bodies merge, the task of drafting clear definitions
of harmful / illegal content should be easy and standardised across all media. These definitions should
then be publicly available as well as clear guidelines and examples. The composition of committees is
critical and should be properly overseen.
3. The Authorities should work more closely with social media administrators to improve self-regulation for
content posted in Creole and use the proper channels to have access to harmful / illegal content blocked.
This is the approach taken by countries such as the UK, which the ICTA itself uses as a source of inspiration.
At the same time, the Government should encourage social media administrators to set up offices in
Mauritius, as we have the unique offering of people who could moderate content in different
languages such as English, French and additional languages (Hindi, Urdu and Arabic amongst others).
We note that Facebook‘s presence on the whole African continent is limited to only one country,
namely South Africa. It would be a unique proposition for Mauritius to get them to set up an office
in a French-speaking African country, before they decide to set up in another one.
4. We do not believe that it is the role of the Authority or any Government body to intercept any traffic
for decryption and analysis. We refer to the safeguards mentioned by the Australian Government:
“Engagement between Government and industry is bounded by critical safeguards. All requirements
must be reasonable, proportionate, practical and technically feasible. Government cannot build
a decryption, interception or data retention capability”.
5. The Authorities should go through a Judge’s order for every piece of content to which access is to be blocked
at the international level (by social media administrators).

6. We believe that in cases where Legal Interception is required, a reactive approach should be preferred
to the proactive dragnetting of all data, including private data. Interception should be resorted to only
upon a judge’s order, including in cases where data is required from Internet Service Providers or social
media administrators. We therefore do not see the need for a technical toolset to be implemented at
the level of the ICTA.
7. Social Media administrators should block access to the content upon receipt of a request in the
appropriate form (Request for Mutual Assistance from Attorney General Office). We believe that the
collaborative approach which we encourage should be favoured in an era where privacy and freedom
of speech are pillars of any democratic society.
8. We believe that there should be increased education of the population on offences pertaining to the
use of internet and social media platforms.

Finally, we strongly believe that once the ICTA has analysed all responses and come up with the best solution
both for the technical and legal aspects, it should carry out another round of public consultation with a more
detailed proposal.

Way Forward
The first consultation on the UK Online Harms White Paper started on 8th April 2019 and ran until 1st July 2019,
giving a period of 84 days for responses to be submitted. Here in Mauritius, the ICTA initially gave only 3 weeks
and finally, with the extension, 35 days in total to respond.
The final version of the White Paper was published on 20th December 2020, approximately 20 months after
the start of the consultation process. As mentioned in the White Paper, a number of consultations were held
with different stakeholders in order to finalise it, using an iterative approach.
We believe that the ICTA should not rush this project and proceed in the same consultative and iterative
manner as the UK, whilst being transparent along the way.
A number of questions / clarifications have been raised owing to the lack of detailed information provided and
to the inconsistencies between the Consultation Paper and the two communiqués issued by the ICTA. More
consultations and open sessions should be held with all stakeholders, including NGOs, to shed more light on the
project and allow more interaction before a proper solution is found to regulate the abuse and misuse of social
media.

