Mauritius Social Media Regulation Debate
13 May 2021
1.1 Introduction
On the 14th of April 2021, the Information and Communication Technologies Authority published a
consultation paper titled “Consultation Paper on proposed amendments to the ICT Act for regulating the use
and addressing the abuse and misuse of Social Media in Mauritius”. The public has until the 20th of May 2021
at 16:00 to submit their comments to [email protected].
The paper proposes amendments to the ICT Act to provide legal sanctity to a National Digital Ethics
Committee, whose members will decide whether online content under investigation is harmful and illegal. An
Enforcement Unit composed of ICTA personnel will be set up to operate tools that will filter incoming and
outgoing Internet traffic in Mauritius and segregate social media traffic, which will be decrypted, archived,
then re-encrypted for onward transit. The decryption and re-encryption of social media traffic will require the
installation of a self-signed certificate on the workstations and devices of social media users in Mauritius.
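For context, the reason such a certificate must be installed on every device is that TLS clients reject any certificate that does not chain to a root already in their trust store. A minimal sketch of the default client behaviour, using Python's standard `ssl` module (the certificate file name in the comment is hypothetical):

```python
import ssl

# Default client-side TLS settings mirror what browsers do: the peer's
# certificate MUST chain to a trusted root, and the hostname must match.
# A proxy presenting its own self-signed certificate would therefore
# break every TLS handshake -- unless that certificate is first
# installed as a trusted root on each user's device, which is what the
# Consultation Paper proposes.
context = ssl.create_default_context()
print(context.verify_mode == ssl.CERT_REQUIRED)  # True: unverifiable certificates are rejected
print(context.check_hostname)                    # True: hostnames are checked as well

# Trusting an extra root would be a one-line change per device
# (hypothetical file name, for illustration only):
# context.load_verify_locations(cafile="regulator_root.pem")
```

This is why the proposal necessarily touches every device in the country: there is no way to decrypt and re-encrypt TLS traffic in transit that users' software will silently accept.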
Section 14 of the Consultation Paper sets out nine questions. However, the public are not confined to these
questions and are encouraged to raise any other issues pertinent to them, as mentioned in Section 13.2 of the
Consultation Paper.
We, the Mauritius Labour Party, are participating in this public consultation and we are providing our answers
to the nine questions along with a thorough analysis of the proposed amendments to the ICT Act.
The ICT Authority's stated vision is:
«to play a leading role in the future of ICT in Mauritius contributing to an efficient,
competitive and optimally regulated sector.»
The solution proposed for the regulation of social media in the Consultation Paper issued by the Authority
on 14th April 2021 (Consultation Paper on proposed amendments to the ICT Act for regulating the use and
addressing the abuse and misuse of Social Media in Mauritius) will certainly not improve the efficiency and
competitiveness of the ICT sector as laid down in the Authority’s vision. Quality of service will be lower and
barriers to entry for new players will be raised.
We shall also consider the adverse effects that the implementation of such an intrusive means of social media
regulation, in which privacy and freedom of speech are put at risk, could have on the reputation of Mauritius
on the international scene and in the global media. Three of the pillars of our economy, namely the global
business (offshore) sector, the BPO sector and the tourism industry, may be seriously affected just when we
need these same pillars to help the country recover from the economic downturn caused by the COVID-19
pandemic.
In any case, we agree that some form of social media regulation, in collaboration with social media
administrators, is needed to prevent abuse. We would like to stress the importance of all stakeholders being
independent in their decision-making process, and we believe the only way this is possible is through
transparency at all levels, starting with the composition of the committees, the policies and guidelines, the
decisions taken, and so on. We will show in our response the level of transparency that social media platforms
already have in place, as well as international trends; the ICTA should take inspiration from these.
We have based our response and recommendations on the above points raised and we hope that these will
be taken into consideration.
We have also taken into consideration in our response the communiqués issued by the ICTA on 19th and 29th
April 2021.
2. Reply to Question 14.1 of the Consultation Paper: What are your views on the present approach of self-
regulation of social networks by social media administrators themselves where they decide to remove an
online content or not based on their own usage policy and irrespective of your domestic law?
Facebook also publishes information on the various actions it is taking to continuously improve its self-
regulation mechanisms at: https://www.facebook.com/business/news/sharing-actions-on-stopping-hate. All
the other platforms have similar mechanisms in place to detect and remove illegal content, using both human
moderators and AI and machine-learning technologies.
At a meeting of the European Commission with social media platforms on 9th January 2018 (see https://digital-
strategy.ec.europa.eu/en/library/tackling-illegal-content-online-meeting-online-platforms-9-january-2018),
it was mentioned that:
There are no simple technical solutions to remove illegal content online. Machine learning algorithms
are currently used and are very effective, but human intervention is always necessary and the degree
to which this is deployed varies a lot.
We do not understand the position of the Authority in seeking to acquire and spend resources on refining a
technical toolset when, firstly, these issues could surely be addressed through interactions with social media
administrators and, secondly, there is a high likelihood that human moderators will in any case be required to
moderate the intercepted content effectively.
The Authority also mentions the issue of the Creole language. We believe that this can be resolved by defining
a dictionary of Creole words that appear harmful and/or illegal and communicating it to social media
administrators. This should not be a big problem to resolve through collaboration.
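As a minimal illustration (the flagged words below are invented placeholders, not real dictionary entries), such a shared dictionary could be distributed as a plain word list that platform moderation pipelines match against, with basic normalisation to cope with accents, case and punctuation:

```python
import re
import unicodedata

# Hypothetical flagged-term dictionary; real entries would be agreed
# between the regulator and the social media administrators.
FLAGGED_CREOLE_TERMS = {"move-parol", "menas"}  # placeholder words

def normalise(text: str) -> str:
    """Lowercase, strip accents, and collapse punctuation to spaces."""
    text = unicodedata.normalize("NFD", text.lower())
    text = "".join(c for c in text if not unicodedata.combining(c))
    return re.sub(r"[^a-z0-9\s-]", " ", text)

def flag_message(text: str) -> list[str]:
    """Return the flagged dictionary terms found in a message."""
    tokens = set(normalise(text).split())
    return sorted(t for t in FLAGGED_CREOLE_TERMS if t in tokens)

print(flag_message("Sa enn MENAS sa!"))       # ['menas']
print(flag_message("Bonzour tou dimounn"))    # []
```

A real deployment would of course need human review of matches, since exact word matching cannot judge context; the point is only that sharing such a list with administrators is technically straightforward.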
Facebook mentions that it has over 35,000 people working on safety and security
(https://about.fb.com/actions/promoting-safety-and-expression/). We believe that, given the current critical
state of our economy, it would be a significant waste of resources to invest in such a technical toolset and the
associated human resources while we are trying to recover from the COVID-19 pandemic.
In fact, we believe that there is a huge opportunity to get social media administrators to set up offices in
Mauritius for their human moderation processes: Mauritius has a pool of young, skilled workers who are fluent
in English, French and Creole, many of whom also speak at least one additional language such as Hindi, Urdu
or Arabic.
India amended its law in February this year, requiring major social media companies to establish local offices
staffed by senior officials to deal with law enforcement and user grievances. They will also be under an
obligation to take down harmful or illegal content within 36 hours of a request from government agencies and
within 24 hours in the case of requests from users. They may also have to alter their technology architecture
to build in automated tools to weed out content related to rape or child sexual abuse. Local offices will also
have to publish monthly blocking and compliance reports. This approach has been well received globally, with
countries like the US and New Zealand contemplating similar routes.
These major democratic countries are setting the trend of holding social media companies accountable for
the content on their platforms. This is the global trend and it would be unwise not to learn from these
experiences.
The response of social media administrators to the Indian government’s new legislation is worth pointing out:
Facebook: “The details of rules like these matter and we will carefully study the new rules”
(https://www.reuters.com/article/india-tech-regulation-idUSKBN2AP175)
Twitter: “We believe that regulation is beneficial when it safeguards citizens’ fundamental rights and reinforces
online freedoms” (https://www.reuters.com/article/india-tech-regulation-idUSKBN2AP175)
We therefore believe that the responsibility of deleting or blocking access to content on social media platforms
should be with the social media administrators, not with the Government.
“Different countries balance free speech against the potential impact that can come from speech in
various ways. For example, some countries actively enforce laws around hate speech or speech that they
view as threatening social stability, while other countries’ laws are more tolerant of these types of
speech. Governments will ask us to restrict member-posted content that they believe violates their laws.
When we receive a government request to remove content, we carefully review and assess it to
understand the reason for the request, the authority of the requestor, and our applicable policies or
terms. Based on these reviews, we determine whether and to what extent we should take action.
For international requests, we check the validity and ask that the request to be properly issued, for
example, through a Mutual Legal Assistance Treaty or a form of international process known as a Letter
Rogatory, except in the case of certain emergencies.”
Facebook also refers to actions that they take based on local laws at the following page:
https://transparency.facebook.com/content-restrictions
We therefore do not agree with the ICTA’s statement that social media administrators do not block or remove
access to content based on local laws.
The transparency reports also give examples of social media administrators taking action to remove content
based on local laws in countries where they do not have an office. Some examples are given below for Facebook:
- In Costa Rica, content was blocked in response to a request from the Ministry of Health.
- In Morocco, access was restricted to one item reported by the Ministry of Interior of Morocco for
content expressing anti-king sentiment.
- In Peru, Egypt, Kenya, access was restricted to content in response to private reports.
- In Indonesia, access was restricted to 445 items for publishing misinformation such as false COVID-19-
related claims, to 90 items for fraudulent investment-related advertisement, to 62 items for illegally
advertising or selling regulated goods, to 17 items for religious attacks and blasphemy, and to 17 items
for violation of laws related to hate speech, graphic violence, incitement, separatism, extremism, and
nudity following requests from the government.
These are only some examples of Facebook taking action even in the absence of an office in the country
requesting the blocking of content. We fail to understand the statement in Section 5.1 that social media
administrators need to have a physical presence in Mauritius to be effective. A more collaborative approach
should be encouraged.
2.7 European Union Stand
On a final point, a recent article published by the European Parliament,
https://www.europarl.europa.eu/news/en/headlines/society/20210204STO97129/social-media-and-
democracy-we-need-laws-not-platform-guidelines, contains the following extract, which illustrates the
approach that we think would work best:
“Speaking on behalf of the Portuguese Council Presidency, Ana Paula Zacarias said: “We expect online
platforms to play their parts in this common fight, but it is up to the democratic institutions, our laws,
our courts to set the rules of the game, to define what is illegal and what is not, what must be removed
and what should not be.””
The local Authority should provide social media platforms in advance with the rules or guidelines on what
content is illegal or harmful and should therefore be removed. We would request the Authority to seriously
consider this approach and to open a channel of communication with the social media administrators for
better collaboration. We would suggest that our embassies assist the Authority in this process, since they are
located in countries where the social media administrators are present, e.g. in South Africa, India, the US and
Europe. We can also leverage the local US Embassy, as they are very well aware of the issues which our society
is facing.
— From European countries, and from what was written in the paper prepared by the British government
in April 2019, the “Online Harms White Paper”, of which here is an unequivocal extract: “existing
regulatory and voluntary initiatives has “not gone far or fast enough” to keep the users safe. This paper
proposed a single regulatory framework to tackle a range of harms.” This concerned proposals to
regulate internet companies and social media platforms. A public consultation on these proposals was
held in July 2019 in Great Britain.
Since the ICTA refers to the UK Government both in the Consultation Paper and in the interview above, we
believe it would be useful to look at some extracts of the Online Harms White Paper. The version we consulted
is available at: https://www.gov.uk/government/consultations/online-harms-white-paper and is dated 20
December 2020. We would like to highlight the following extracts.
In the Ministerial Foreword by Rt Hon Oliver Dowden CBE MP (Secretary of State for Digital, Culture, Media
and Sport) and Rt Hon Priti Patel MP (Secretary of State for the Home Department), the following is mentioned:
We are taking action to unlock innovation across digital markets, while also ensuring we keep people
safe online and promote a thriving democracy, where pluralism and freedom of expression are
protected.
Alongside tackling harmful content this legislation will protect freedom of expression and uphold media
freedom.
As an independent country, the UK has the opportunity to set the global standard for a risk-based,
proportionate regulatory framework that protects citizens online and upholds their right to freedom of
expression.
It is clear here that the UK Government’s focus is to encourage innovation through online technologies while
protecting people and safeguarding their right to freedom of expression. We see nothing in the ICTA’s
approach in the Consultation Paper that encourages innovation. In fact, the opposite is most likely to happen,
with international firms seeing this approach as too intrusive.
3. The government has taken a deliberately consultative and iterative approach in developing the
framework, to ensure regulation that is coherent, proportionate and agile in response to advances in
technology.
14. Many of the major social media companies have moved further and faster than ever before to tackle
disinformation and misinformation during the pandemic through technical changes to their products,
including techniques to protect user safety online.
17. The online harms framework will be coherent and comprehensive, bringing much needed clarity to
the regulatory landscape and providing support for both industry and users. It will be proportionate,
risk-based and tightly defined in its scope. The legislation will avoid taking a ‘one size fits all’ approach
to companies and harms in scope, to reflect the diversity of online services. The government has placed
particular emphasis on protecting children, ensuring a pro-innovation approach, and protecting freedom
of expression online. Regulation will safeguard pluralism and ensure internet users can continue to
engage in robust debate online.
22. Stakeholders raised concerns during the consultation about how the legislation will impact
journalistic content online and the importance of upholding media freedom. Content published by a
news publisher on its own site (e.g. on a newspaper or broadcaster’s website) will not be in scope of the
regulatory framework and user comments on that content will be exempted.
The UK Government has taken a consultative and iterative approach in developing its framework and has also
said that the framework will be proportionate to the risk. We strongly recommend that the ICTA follow this
approach: more consultation is needed, and given the lack of detail in the Consultation Paper, a further public
consultation should be held once the comments received on the paper issued on 14th April 2021 have been
analysed.
Box 1: User-generated content and user interactions
1.4 The White Paper consulted on defining private communications, and what regulatory requirements
should apply to them. It also said that companies would not be required to monitor for illegal content
on these services in order to protect user privacy.
The key takeaway here is that “companies would not be required to monitor for illegal content”. Nowhere in
the document does the UK Government mention that its regulator will be intercepting or decrypting traffic.
The ICTA should have already defined harmful and illegal content. We have not seen any mention of some of
the criminal offences listed in the UK White Paper. Also, the approach of the regulator in the UK is to have the
companies (e.g. social media administrators) take action at their level. We do not see why, with the UK being
used as a model by the ICTA, we should implement a completely different solution in Mauritius.
Box 12: Collaboration with industry
Harmful content including suicide, self-harm, and eating disorder content
The online harms framework will place regulatory responsibilities on in-scope companies likely to be
accessed by children to protect their child users from harmful content and activity, including suicide, self-
harm and eating disorder content. However there are wider government-led initiatives to develop
voluntary cooperation in this area ahead of legislation.
The Department for Health and Social Care has coordinated a strategic partnership with social media
companies and the Samaritans to set guidance on moderating suicide and self-harm content, and
educating users to stay safe online.
Again, the UK Government is putting the responsibility of moderating content on social media companies.
Interim codes of practice
The government has undertaken an extensive period of engagement across wider government, industry,
international partners and civil society, to ensure the measures set out are proportionate but robust
enough to tackle these most serious and illegal online harms.
Using technology to identify illegal child sexual exploitation and abuse content and activity
Final Policy Position: In light of this, the regulator will have the power to require companies to use
automated technology that is highly accurate to identify illegal child sexual exploitation and abuse
activity or content on their services.
2.60 Robust safeguards will be included in the online harms legislation to govern when the regulator
can require the use of automated technology. The regulator will only be able to require the use of tools
that are highly accurate in identifying only illegal content, minimising the inadvertent flagging of legal
content (‘false positives’) for human review. The regulator will advise the government on the accuracy
of tools and make operational decisions regarding whether or not a specific company should be required
to use them. However, before the regulator can use the power it will need to seek approval from
Ministers on the basis that sufficiently accurate tools exist. The government assesses that currently,
sufficiently accurate tools exist for identifying illegal child sexual exploitation and abuse material that
has previously been assessed as being illegal.
Again, in the UK proposal, it is the responsibility of the companies (social media administrators) to identify
illegal content on their platforms. In no instance does the regulator act as the enforcer, unlike what the ICTA
is planning to do.
The UK proposal does not impose data retention as part of the online harms legislation. Again, despite the
ICTA (through its chairperson) using the UK as an example, it is putting forward a data-archiving proposal that
is not in line with what the UK is planning to implement.
2.81 As set out in paragraph 2.2, the duty of care will apply to content or activity which could cause
significant physical or psychological harm to an individual, including disinformation and misinformation.
Where disinformation is unlikely to cause this type of harm it will not fall in scope of regulation. Ofcom
should not be involved in decisions relating to political opinions or campaigning, shared by domestic
actors within the law.
The Regulator
The regulator will be accountable to Parliament. Ofcom as the regulator will lay its annual report and
accounts before Parliament and be subject to Select Committee scrutiny. The annual report will give
details about how it has discharged its functions in relation to online harms.
3.11 The importance of regulators being independent from undue influence - from government, other
political sources, regulated services and organisations with an interest in the regulated area - is an
important element of effective regulation.
The UK proposal clearly states that the regulator should be accountable to Parliament and should be
independent from undue influence, including political influence. The Consultation Paper of the ICTA does not
shed any light on how this will be achieved. The duties of the regulator in the UK are also based on clearly
stated principles.
Enforcement in an international context
4.47 The enforcement powers have been designed to be able to be used against companies with and
without a physical or legal presence in the UK. As other countries introduce similar legislation,
international cooperation will become an increasingly important and effective tool for the regulator.
The government expects the regulator to work with equivalent organisations internationally to help
foster collaboration.
International Context
6.4 Countries around the world are grappling with how to make the internet a safer environment for
users. The regulator will take an international approach, working with other international regulators,
to ensure effective enforcement and promote best practice at a global level.
6.5 The government continues to engage with international partners to learn from their experiences and
build consensus around shared approaches to tackling online harms that uphold our democratic values
and promote a free, open and secure internet.
6.6 International collaboration remains vital. The government welcomes international, industry-led,
multi-stakeholder initiatives – including initiatives supported by the UN and other multi-lateral bodies –
such as the Global Internet Forum to Counter Terrorism, the WePROTECT Global Alliance, and wider
initiatives such as the Freedom Online Coalition and the Technology Coalition Fighting Child Sexual
Abuse.
It is very clear that without international collaboration and sharing of best practices, the enforcement of
regulation will be challenging. Could the ICTA disclose whether it has sought collaboration with other
regulators around the world, in particular in the countries mentioned in the Consultation Paper?
2.8 Summary
We believe that social media administrators have self-regulation mechanisms in place to remove certain illegal
content, and that they also take action to remove illegal content based on local laws when the requests are
genuine and follow the correct processes. This is the case even in countries where the social media
administrators are not present.
It is clear that it is the social media platform administrators who are responsible for removing content deemed
unsuitable. With the proposed amendment, it appears that the Government will be investing a hefty amount
in equipment to decrypt and store data of which it is not even the legal owner. We would strongly recommend
a more collaborative approach with social media administrators, while ensuring that the ICTA follows the
procedures that the social media administrators have put in place. Has the ICTA followed the proper channel
and used the Attorney General’s Office for mutual legal assistance, given that the Attorney General’s Office is
the Central Authority that can request mutual legal assistance?
This is the approach being used by democratic countries, as we have shown in our examples. We have not
seen any regulator that is planning to intercept, decrypt and store social media traffic. They prefer to stay at
the policy level and get social media companies to take action. This is also the case for the UK, which the ICTA
has used as an example when talking about social media regulation.
The UK also requires social media administrators to report certain incidents to local Authorities.
3. Reply to Question 14.2 of the Consultation Paper: Do you think that the damage caused by the excesses
and abuses of social networks to social cohesion warrants a different approach from the self-regulatory
regime presently being enforced by social media administrators themselves?
[Table: crime statistics by indicator, 2019]
Source: Statistics Mauritius - Crime, Justice and Security Statistics, 2019 (Police, Prosecution, Judiciary,
Prisons and Probation)
Theft, drug and other offences are more prominent and have a greater adverse effect on society than social
media incidents; the Government’s priority should be to focus funds, time and energy on addressing these
offences rather than those arising on social media.
Also, it should be pointed out that, as per the ICTA communiqué dated 19th April 2021, online messaging
services like WhatsApp are not covered by the Authority’s proposal. Yet fake news can also be circulated
through this application, as the Authority itself mentions in the Consultation Paper (Section 3.1). Therefore, by
considering only certain online services, the Authority is unlikely to address the issues it is trying to resolve.
Ill-intentioned people will nevertheless be able to communicate on Facebook within their own groups over
standard VPN software, in much the same way that the Dark Web operates.
Are we not, therefore, encouraging other flavours of social media which the Government will not be able,
financially or technologically, to break into? Illicit exchanges and transactions could find a safe haven in these
darker corners of local social media.
3.3 Summary
We believe that:
1. The number of incidents reported does not warrant such drastic and intrusive measures from the
Government.
2. The Government’s priorities should go to offences that are causing more damage to our society rather
than to a statistically insignificant and presumed online delinquency.
3. A less costly and more collaborative approach with social media administrators should be sought.
Therefore we do not think that a different approach to the self-regulatory regime used by social media
administrators is needed.
4. Reply to Question 14.3 of the Consultation Paper: What are your views on the overall proposed
operational framework in terms of the
• National Digital Ethics Committee (NDEC)
• Enforcement Division
which is intended to bring more clarity to section 18 (m) of the ICT Act, where the ICTA is mandated to take
steps to regulate or curtail the harmful and illegal content on the Internet and other information and
communication services.
We do not believe that the proposed operational framework, as set out in the Consultation Paper, is the most
appropriate approach to address harmful and illegal content on social media. The proposed creation of the
NDEC adds another layer to an already complex structure in terms of dealing with online offences.
However, we do agree that there should be a legal entity that comes up with clear guidelines on what is
harmful and illegal based on our local laws. We suggest that the Authority take inspiration from the UK
approach of defining categories of harmful content, as described in the following document:
https://commonslibrary.parliament.uk/research-briefings/cbp-8743/
• criminal offences (e.g. child sexual exploitation and abuse, terrorism, hate crime and the sale
of illegal drugs and weapons);
• harmful content and activity affecting children (e.g. pornography); and
• harmful content and activity that is legal when accessed by adults, but which may be harmful
to them (e.g. content about eating disorders, self-harm or suicide).
4.2 Existing Provisions in ICT Act
We are also questioning whether the NDEC is needed when there is already an Internet Management
Committee (IMC) whose role, amongst others, is “to advise the Authority on Internet and related policies;” as
per Section 13 (1) (a) of the ICT Act. We would suggest that the composition of the IMC be reinforced to include
more retired judges, instead of going through the process of amending the law and creating a new legal entity.
The reasons are that, firstly, the issue is a national one and must not be seen as partisan at either local or
international level and, secondly, retired judges are experienced in deciding whether content is harmful and
illegal.
4.5 Merger between ICTA and Independent Broadcasting Authority (IBA)
The Government has mentioned that the merger between IBA and ICTA is in progress (in Parliament on 2 April
2019 and 30 June 2020).
At the sitting of 2 April 2019 (question B/131), part (c) of the question asked whether the said merger would
be proceeded with and, if so, when. The reply given was as follows:
With regard to part (c) of the question, the hon. Member may wish to note that the merger exercise will
be proceeded with as Cabinet, at its meeting of 07 December 2018, approved that the draft Bill be
submitted to the Attorney General’s Office for legal vetting. The Bill is under consideration at the State
Law office and it is expected that the Bill will be finalised and introduced in the National Assembly during
its next session later this year.
Since in April 2019 it was stated that the Bill had already been prepared and sent to the Attorney General’s
Office for consideration, we strongly believe that this merger should take place before any implementation of
social media moderation.
This merger would enable optimisation of resources and competencies, since the IBA has published a code of
conduct and a code of ethics for its licensees to adhere to. These codes include references such as:
We believe that this is the right approach, as content should be regulated in the same manner regardless of
the means used to access it. Having a single body regulating all content would make more sense: in the UK,
Ofcom, which regulates the TV, radio and telecommunications sectors, will also be regulating social media.
Therefore the merger between IBA and ICTA should be prioritised ahead of any changes to the ICT Act, which
would become obsolete if the merger were done afterwards.
4.6 Summary
We would like to highlight the following to conclude:
1. We strongly recommend that the proposed merger between IBA and ICTA be completed before such
social media regulation is enforced. This would ensure consistency across different media.
2. We agree that there should be a legal entity that decides at the onset which types of content should
be classified as harmful and illegal. These should be published with examples, so that the public know
what type of content will lead to offences and local social media moderators know what type of
content they need to remove from the brand pages they have control over. Whether it is the NDEC,
the IMC or a merged committee following the merger of IBA and ICTA, we believe only one body
should exist in order not to duplicate roles.
3. We do not see that there is sufficient separation of powers at the level of the ICTA for this proposal to
be viewed as a positive measure for Mauritius on the international scene.
4. We do not agree to the setting up of the Technical Enforcement Unit as proposed in the Consultation
Paper.
5. Reply to Question 14.4 of the Consultation Paper: What are your views on the proposed legal
amendments to the ICT Act to give legal sanctity and enforcement power to the NDEC?
We understand, from section 8.2 of the Consultation Paper that the changes proposed are:
(a) Nothing in this Act shall prevent NDEC or any of his employees or agents from intercepting, withholding
or otherwise dealing with a message which he has reason to believe is-
(i) indecent or abusive;
(ii) in contravention with this Act
(iii) of a nature likely to endanger or compromise State’s defence, or public safety or public order.
5.3 Summary
We do not believe that any change in the legislation is required; we will expand on this in our concluding
remarks. However, it is essential that the proposed merger between the IBA and the ICTA takes place as soon as
possible and that any changes in the law be reflected in the Bill that has already been prepared. Should the
Bill for the merger be ready, it should be tabled in Parliament at the earliest.
6. Reply to Question 14.5 of the Consultation Paper: What are your views on the proposed modus operandi
of the NDEC?
The Authority should also clarify which social media platforms will fall under this project, especially in the case
of the Facebook Messenger service (which falls under Facebook but is also a messaging service) and Facebook
private groups.
“The Licensee shall take all the necessary measures to discourage and prevent the flow of content which
is grossly offensive or of an indecent, obscene, or menacing character or infringing intellectual property
rights and international and domestic cyber laws.”
Some countries have imposed deadlines of 12 to 36 hours on stakeholders to remove harmful or illegal content;
we would recommend that the same timeframe be enforced in Mauritius.
6.5 Compliance with Data Protection Act of 2017 (DPA) and GDPR
Section 8.4 of the Consultation Paper states that the NDEC will comply with the requirements of the Data
Protection Act of 2017 with regards to handling of personal data. We have assumed that the Technical
Enforcement Unit will also be complying with these requirements in our response.
The setting up and operation of the NDEC and the Enforcement Unit are clearly at odds with the provisions of
the Data Protection Act 2017 of Mauritius (the “DPA”). Section 8.4 of the Consultation Paper refers to this
aspect but falls short of explaining how it is proposed to comply with the requirements of the DPA in terms of
the handling of personal data.
As set out under the DPA and stated by the Data Protection Office of Mauritius, the key principle underpinning
data protection is to ensure that people are able to control how personal information about them is used or, at
the very least, to know how others use that information. The object of the DPA is to provide for the protection
of the privacy rights of individuals in view of the developments in the techniques used to capture, transmit,
manipulate, record or store data relating to individuals.
The DPA was amended in 2017 in response to the coming into force of the General Data Protection Regulation
(GDPR) which brought radical changes to data privacy laws in the European Union and has a cross-border
impact.
Given the provisions of the DPA and the GDPR, which grant individuals greater rights over how their private
data may be handled, there will be a clear issue when such data are intercepted by the NDEC through the
technical toolset. The currently proposed method is that the individual will have to consent to such
interception, but this is likely to raise serious concerns from a data protection perspective.
Data protection laws require the free and unambiguous consent of individuals before their data can be
accessed. Section 11.3 of the Consultation Paper refers to a user “being prompted for the automatic
installation of this self-signed certificate on his workstation/device”. There will need to be proper guidelines
informing all users in Mauritius as to how their data are being intercepted; a mere prompt or pop-up
message, as described in the Consultation Paper, is likely to fall short of data protection laws.
(1) No person shall process the personal data of a child below the age of 16 years unless consent is given
by the child’s parent or guardian.
The DPA thus provides that, prior to processing the personal data of a child below the age of 16, the consent
of the child’s parent or guardian must be obtained. We do not understand how the proposed solution will be
able to comply with this provision, especially when a child accesses a social media platform.
It would be advisable for the ICTA to consult the Data Protection Office, and for clear guidance to be provided
to the public as to how the proposed amendments in the Consultation Paper will be in line with data protection
laws in Mauritius.
6.7 Summary
We believe that there are already enough provisions to remove or block access to harmful or illegal content
without the need for new legal entities or a technical toolset. As mentioned before, social media administrators
have the required technical toolset and human resources to perform this. A collaborative approach, which
would be less costly for the country, should be considered by the ICTA.
7. Reply to Question 14.6 of the Consultation Paper: What are your suggestions on the safeguard measures
to be placed for the NDEC?
Our understanding of this question is that it also includes the Technical Enforcement Unit.
7.2 Safeguards
The Australian Government has also set out several safeguards in “The Assistance and Access Act 2018”,
on its website: https://www.homeaffairs.gov.au/about-us/our-portfolios/national-security/lawful-access-
telecommunications/data-encryption. Those relevant to this Consultation Paper are listed below:
“Engagement between Government and industry is bounded by critical safeguards. All requirements
must be reasonable, proportionate, practical and technically feasible. Government cannot:
• build or implement so-called ‘backdoors’ or do anything that would make the communications
of innocent persons less secure
• build a decryption, interception or data retention capability
• access communications without an existing warrant or authorisation “
We believe that the above safeguards should be taken into consideration by the ICTA.
7.4 Social Media and Country Laws
The European Union, in a recent article on this subject, “Social media and democracy: we need laws, not
platform guidelines”, mentions a number of key elements that should frame any action by the ICTA or the
Government:
“MEPs praised these efforts to regulate the online world via laws, not platform guidelines, but said they
must safeguard freedom of expression and fundamental rights, while avoiding censorship.”
Anne-Sophie Pelletier (The Left, France) stressed the need to protect freedom of expression and opinion.
"On the internet, the freedom of one group of people shouldn't stop where the big platform bosses
decide," she said. "We can’t have content being censored without a decision from a judge...censorship
is never the answer."
Speaking on behalf of the Portuguese Council Presidency, Ana Paula Zacarias said: “We expect online
platforms to play their parts in this common fight, but it is up to the democratic institutions, our laws,
our courts to set the rules of the game, to define what is illegal and what is not, what must be removed
and what should not be.”
The European Union (EU) is an important economic partner for Mauritius. By intercepting incoming traffic,
the business and personal data of EU citizens exchanged on Facebook will also be intercepted and read in
clear text. This compromises the confidentiality requirements of the GDPR (Article 32). The proposed
solution would cause the State of Mauritius to institutionally infringe EU legislation.
7.5 Summary
We believe that the NDEC, or any other similar legal entity (e.g. the IMC or a committee falling under the
merged IBA and ICTA), should work only at the policy level, and that it is the role of ISPs to enforce the
policies and any directives issued. The Authority should consider the safeguards mentioned by the Australian
Government: although it has been fighting battles with social media administrators, it still maintains that
Government should not build decryption, interception and data retention capabilities. The ICTA is going
against what more mature and experienced democracies are practising.
8. Reply to Question 14.7 of the Consultation Paper: What are your views on the use of the technical toolset,
especially with respect to its privacy and confidentiality implications when enforcing the mandatory need
to decrypt social media traffic?
The Consultation Paper briefly describes the toolset but does not describe the complete infrastructure. This
would have helped in providing more constructive comments.
There are a number of issues and concerns with the technical toolset which are detailed below.
ISPs will have to inform the Authority when they plan on upgrading their links well in advance so that the
toolset can be upgraded to cater for additional capacity before it is put in service by the operators.
What happens if the ICTA servers/systems are not responsive? Most of the digital marketplace on social media
would become unavailable, causing loss of revenue to SMEs and other companies that depend heavily on
digital marketing and social media. Will the ICTA have a team on site 24x7 to ensure the highest level of
service and provide a guarantee on uptime?
Will the Technical Enforcement Unit work on a 24x7 service to remove illegal content?
Apple’s stand in 2018 (https://www.itnews.com.au/news/apple-says-decryption-should-alarm-every-
australian-513893) with regard to decryption was considered by the Australian Government, which led to
the Government adding that it cannot build a decryption, interception or data retention capability, as
previously mentioned. There are also some questions that are not answered in the Consultation Paper.
Does the installation of the certificate not make users more vulnerable to hackers if the certificate issued by
the ICTA is stolen or copied without authorisation, enabling man-in-the-middle attacks? How will the ICTA
ensure that the certificates are correctly maintained by end users? Where will the back-up / data recovery
centre be located, and will it have the same security standards as the main site?
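The danger of a compromised interception certificate can be illustrated with a toy model of how browsers decide which certificates to trust. This is a simplified sketch, not real PKI (real validation checks cryptographic signatures, not issuer names, and the root CA name below is hypothetical), but it shows why installing an interception root extends trust to anyone holding that root's private key:

```python
# Toy model of certificate trust: a client accepts any certificate whose
# issuer is in its trust store. Once an interception root CA is installed,
# an attacker who steals that CA's private key can mint a "valid"
# certificate for ANY domain. (Illustration only; the CA name and domains
# are hypothetical, and real PKI verifies signatures, not issuer strings.)
from dataclasses import dataclass

@dataclass(frozen=True)
class Certificate:
    domain: str
    issuer: str  # stands in for a cryptographically verified chain

def is_trusted(cert: Certificate, trust_store: set) -> bool:
    """Accept the certificate if its issuing root is trusted."""
    return cert.issuer in trust_store

standard_roots = {"DigiCert", "Let's Encrypt"}
with_interception_ca = standard_roots | {"ICTA-Interception-Root"}

genuine = Certificate("www.facebook.com", "DigiCert")
# A certificate an attacker could forge after stealing the interception key:
forged = Certificate("www.mybank.mu", "ICTA-Interception-Root")

assert is_trusted(genuine, standard_roots)
assert not is_trusted(forged, standard_roots)    # normally rejected
assert is_trusted(forged, with_interception_ca)  # accepted once the root is installed
```

Under this model, every device that installs the interception root becomes exposed to impersonation of any website, not just social media, if the root's private key leaks.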
Some social media platforms such as Facebook have different types of pages: personal pages, business / brand
pages and groups (public and private), as well as private conversations. We do not believe that a toolset exists
that could capture only content marked as public, and this will not ensure the confidentiality of users.
We welcome the fact that, through its communiqué of 29th April 2021, the ICTA states that Messenger traffic
will not be subject to interception, decryption and archiving. The ICTA also implies that all private
conversations would not be subject to the same treatment by stating (in French) that “these private
communications are, moreover, inviolable under the Constitution”. If there is any traffic from Facebook that is intercepted at ISP
level and sent to the technical toolset at ICTA, we do not see how the proposed set-up will prevent private
communications such as Facebook Messenger and Facebook private groups from being intercepted. The ICTA
should disclose whether ISPs have already been contacted to see whether they can technically re-direct only
“public” Facebook data to the technical toolset. ICTA should also clearly define what is public and what is
private. Our understanding is that currently ISPs are not equipped to differentiate public and private data on
Facebook.
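The reason ISPs cannot separate public from private Facebook traffic can be sketched concretely. For an HTTPS connection, the only plaintext an ISP observes is essentially the destination IP and the hostname in the TLS ClientHello's SNI field; the URL path (which distinguishes a public brand page from a private group) travels encrypted. The field values below are illustrative assumptions:

```python
# Sketch of what an ISP can and cannot see in one HTTPS request, modelled
# as a dict. Hostname, IP and path values are hypothetical illustrations.
request = {
    "sni_hostname": "www.facebook.com",  # plaintext in the TLS ClientHello
    "dst_ip": "157.240.0.35",            # plaintext in the IP header
    "path": "/groups/123456789/",        # encrypted: a private group URL
    "cookies": "session=...",            # encrypted
    "body": "private group post text",   # encrypted
}

OBSERVABLE = {"sni_hostname", "dst_ip"}

def isp_view(req: dict) -> dict:
    """The fields an ISP can read without decrypting the TLS session."""
    return {k: v for k, v in req.items() if k in OBSERVABLE}

view = isp_view(request)
assert "path" not in view and "body" not in view
# A public page view and a private-group view produce the SAME ISP view,
# so an ISP cannot redirect only "public" Facebook data to the toolset.
```

This is why, in our understanding, any ISP-level redirection necessarily sweeps up private traffic along with public traffic.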
We understand that a number of people use the social media logins that are available on non-social-media
websites. By having access to a person’s login credentials, the Authority may also gain access to non-social-
media websites.
Clause 31(2)(a)(i) of the Data Protection Act 2017 mentions the need for “the pseudonymisation and
encryption of personal data”. However, section 11.1 of the Consultation Paper does not explicitly state
whether, after data is decrypted, personal data will be re-encrypted before being stored in order to comply
with the DPA. We expect that this should be the case.
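For clarity, the pseudonymisation the DPA clause contemplates can be sketched with a keyed hash: a direct identifier is replaced by a value that cannot be linked back to the person without a secret key held separately. This is a minimal illustration (the key, field names and record are hypothetical), not a description of the ICTA's actual toolset:

```python
# Minimal sketch of pseudonymisation: replace a direct identifier with a
# keyed hash (HMAC-SHA256) so that archived records cannot be re-linked
# to a person without the secret key. Key and record are illustrative.
import hashlib
import hmac

SECRET_KEY = b"held-separately-by-the-controller"  # assumption: stored off the archive

def pseudonymise(identifier: str) -> str:
    """Deterministic keyed hash of a personal identifier."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"user": "[email protected]", "post": "some archived content"}
stored = {"user": pseudonymise(record["user"]), "post": record["post"]}

assert stored["user"] != record["user"]  # direct identifier no longer stored
# Re-linking is possible only for a party holding SECRET_KEY:
assert pseudonymise("[email protected]") == stored["user"]
```

Without a stated mechanism of this kind (or encryption at rest), it is unclear how the archived clear-text data would satisfy Clause 31(2)(a)(i).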
Section 11.1 of the proposal document discloses that the most important implication of the technical toolset
lies in the fact that traffic will be intercepted, decrypted, archived and also re-encrypted. Some points need to
be highlighted:
i) The technical toolset will have to intercept all Facebook traffic.
ii) A triage in clear text will need to be carried out, implying that all Facebook traffic will be
decrypted and analysed for the NDEC to identify illegal/harmful content. Data will be archived.
iii) This inevitably implies that the confidentiality of all Facebook content will be compromised.
iv) Any Facebook content in clear text will also contain user IDs and passwords, which will be
de facto compromised.
v) The system may present loopholes exploitable by ill-intentioned parties to edit the content
before re-encrypting it, or to edit the content to deliberately incriminate a party. The existence
of technical loopholes casts doubt on the security of the system and allows users to repudiate
content, irrespective of whether they authored or shared it.
vi) Courts will find it challenging to establish criminal convictions in the face of so many doubts,
even when the alleged offender has effectively posted incriminating content.
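The intercept-decrypt-archive-re-encrypt pipeline of section 11.1 can be sketched as a toy program. The trivial XOR "cipher" below stands in for TLS and is purely illustrative, as are the function names and message; the point it demonstrates is that between decryption and re-encryption, everything in the traffic, including credentials, exists in clear text inside the toolset and its archive:

```python
# Toy sketch of the section 11.1 pipeline: intercept -> decrypt -> triage/
# archive -> re-encrypt. XOR stands in for TLS; names are illustrative.
KEY = 42

def toy_encrypt(data: bytes, key: int = KEY) -> bytes:
    return bytes(b ^ key for b in data)

toy_decrypt = toy_encrypt  # XOR is its own inverse

archive = []  # what the toolset retains

def toolset(ciphertext: bytes) -> bytes:
    plaintext = toy_decrypt(ciphertext)  # step ii: triage in clear text
    archive.append(plaintext)            # step ii: data archived
    # steps iv/v: at this point credentials are readable, and the payload
    # could in principle be altered before re-encryption
    return toy_encrypt(plaintext)        # re-encrypted for onward transit

msg = b"POST /login user=alice password=hunter2"
forwarded = toolset(toy_encrypt(msg))

assert toy_decrypt(forwarded) == msg      # traffic flows on, apparently intact...
assert b"password=hunter2" in archive[0]  # ...but credentials sit in the archive in clear
```

The two assertions capture points iii) and iv) above: the user notices nothing, yet confidentiality is already lost at the point of triage.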
8.7 Privacy and Freedom of Expression
The right to privacy is enshrined in the Constitution of Mauritius and the Civil Code. The operations of the
technical toolset are likely to fall foul of the rights to the freedom to expression and the rights to privacy as
described above. At the heart of data privacy laws, there is the concept that a user should be free to know as
to whom his data is being shared with and he must provide his free and unambiguous consent to such usage.
The proposed mechanism set out at Section 11.3 may be deemed not to be enough when compared to all the
information that will be intercepted through the technical toolset.
The Universal Declaration of Human Rights mentions in Article 12 that “No one shall be subjected to
arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour
and reputation.”
Article 19 of the Universal Declaration of Human Rights states: “Everyone has the right to freedom of
opinion and expression; this right includes freedom to hold opinions without interference and to seek,
receive and impart information and ideas through any media and regardless of frontiers.”
The implementation of the proposed technical toolset is a clear abuse of the individual’s right to privacy.
We believe that only through separation of powers will the right to freedom of opinion and expression (while
respecting others) be upheld. The protection of freedom of expression is also enshrined in Section 12 of
our Constitution.
The ICTA in its own document, namely the “ICTA Consumer Guide: The Internet”, mentions an Internet Traffic
Management Practice (ITMP) Framework and states:
The ICTA’s objective in setting up an ITMP Framework is to strike the appropriate balance between the
freedom of expression of Mauritian Internet users to use the Internet for a variety of purposes against
the legitimate interests of ISPs to manage the traffic generated on their networks. The Framework will
also aim to be consistent with legislation, including privacy legislation, and to ensure that net neutrality
is not compromised.
In the same document, the Authority states that “net neutrality principle is based on the premise that
ISPs should treat all data equally and that the data should be equally accessible to the users of the
Internet.”
Clearly the proposal as set out by the Authority in the Consultation Paper goes against the principle of net
neutrality as described by the same Authority.
If we consider a UK-based company operating from Mauritius and dealing with a US-based client, we
believe that the UK-based company may find it risky to continue operating in Mauritius. This could be
accentuated further by the recent ranking of Mauritius at No. 8 among autocracies by the V-Dem
Institute. If the Authority goes ahead with decrypting and storing data from all social media users, there is a
huge risk that our offshore sector will be negatively impacted.
A number of multinationals currently outsource some of their processes to Mauritius. With the proposed
setting up of the technical toolset as described in the Consultation Paper, we believe that these multinationals
may not take the risk of confidentiality issues with regard to their data and may move their operations to
countries where there is no such set-up, thereby impacting our BPO sector. Has the ICTA consulted
financial institutions on whether there would be any potential or perceived breach of PCI DSS compliance?
We would strongly suggest that the Authority and the Government carry out an assessment of the impact of
these proposed measures on the above pillars of our economy and publish the report before taking any
action.
8.10 Netsweeper
In the communiqué of 29 April 2021, the ICTA states (in French) that “the technological tool that the ICTA
would use is a derivative of an existing technology which the regulator has already been using since 2012”. We
also note from Netsweeper’s communiqué of 3rd May 2021 (where they explain the issue with whatismyip.com)
that they were awarded the tender launched during last year’s confinement (ICTA/OIB/CSA/04-20/04). They
also mention in the same communiqué that the filtering servers are based in the Netherlands.
We also wish to highlight that CSA filtering works using BGP, i.e. no traffic from ISPs has to transit through the
CSA infrastructure to check whether access to a particular website should be blocked; only traffic to the
blocked sites needs to transit through the CSA infrastructure. However, the proposed technical toolset will
require that all social media traffic be routed through it by ISPs. This is a significant change from the current
CSA filtering, and it will require huge bandwidth and processing resources to ensure optimum customer
experience.
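The difference in load between BGP-based selective redirection and full social-media proxying can be made concrete with rough arithmetic. All traffic figures below are hypothetical assumptions for illustration; only the 10Gbps/6Gbps figures come from Netsweeper's own Technology Overview quoted further on:

```python
# Rough, illustrative capacity arithmetic (all national-traffic figures are
# hypothetical assumptions) comparing the current BGP-based CSA filter
# with a proxy that must carry ALL social media traffic.
peak_national_traffic_gbps = 100.0  # assumption for illustration
social_media_share = 0.30           # assumption: 30% of peak traffic
blocked_sites_share = 0.001         # assumption: 0.1% of traffic hits blocked sites

bgp_filter_load = peak_national_traffic_gbps * blocked_sites_share   # current model
full_proxy_load = peak_national_traffic_gbps * social_media_share    # proposed model

ssl_decrypt_gbps = 6.0  # selective SSL decryption throughput quoted by Netsweeper

proxies_needed = -(-full_proxy_load // ssl_decrypt_gbps)  # ceiling division

assert bgp_filter_load < 1      # negligible transit today
assert full_proxy_load == 30.0  # two orders of magnitude more under these assumptions
assert proxies_needed == 5      # decryption units required at peak
```

Even under these deliberately modest assumptions, the proposed model multiplies the transiting load by orders of magnitude, and the required capacity grows with every ISP link upgrade.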
Netsweeper also states in the Technology Overview on their website
(https://www.netsweeper.com/netsweeper-platform/ ) the solution which we believe the ICTA is planning to
implement:
HTTPS Selective Filtering with NSProxy
Our advanced high speed NSProxy service is capable of delivering 10Gbps of throughput and up to 6Gbps of
selective SSL decryption throughput. Advanced social media SSL protection is an example of how our
technologies are helping in today’s digital world. With NSProxy, we can allow access to the safe areas of
social media, while restricting access to the potentially dangerous ones.
This raises a number of issues:
i) Will the proposed solution, like the current CSA filtering solution, not be hosted in Mauritius?
ii) If, today, the ICTA needs to seek clarifications from the supplier over something as simple as a
website being blocked for an invalid reason, will this still be the case for the new platform being planned?
iii) What is the limitation on traffic handling of the proposed solution?
8.11 Summary
We would like to reiterate that we have not seen any democracy trying to implement such a technical toolset
at the level of the regulator. The UK Government is promoting a collaborative approach with social media
companies, as we should be doing. International collaboration with other regulators, to see how they are
implementing social media regulation and to learn from their best practices, should have been sought in the
first instance by the ICTA.
We believe that the issues, challenges and cost implications raised by the implementation of a technical toolset
are too wide-ranging, at both local and international levels, to allow its implementation. A more non-intrusive
approach is definitely needed. The Authority may have underestimated the international impact of this
intrusive measure, bearing in mind that the tourism industry has a large role to play in the recovery of our
economy post COVID-19. The offshore sector, which is already in jeopardy, could also be affected. The global
business (BPO) sector, which has not been as affected as the other two pillars, would certainly be affected if
the proposed measures are implemented as is.
The technical toolset will only block access to content from Mauritius; the content will still be visible from
outside Mauritius or when using VPNs. In this regard, we reaffirm our position that if content is found to
be harmful and/or illegal, the blocking of that content should be done by social media administrators based
on guidelines published by the Government of Mauritius.
Therefore, we do not agree to the implementation of this technical toolset, or any toolset, at the level of the
ICTA, as we do not see technical intervention as being one of the duties of a regulator.
9. Reply to Question 14.8 of the Consultation Paper: Can you propose an alternative technical toolset of a
less intrusive nature which will enable the proposed operational framework to operate in an expeditious,
autonomous and independent manner from the need to request technical data from social media
administrators?
We welcome the fact that the Authority itself defines its proposal as intrusive and is open to smarter solutions.
We shall detail our proposal here and recommend that the Authority consider it.
9.3 Summary
The Government should work in collaboration with social media administrators so that they improve their
technical toolsets to prevent harmful or illegal content from being posted in the first place, whether through a
dictionary of Creole words to be blocked or through specificities of Mauritian culture that would be classified
as harmful or illegal.
We have shown that social media administrators have put in place processes to have access to harmful / illegal
content blocked, both through their own technical toolset and through reporting mechanisms. This should be
the favoured approach as it would tackle the issue in a more efficient and less costly manner.
We have also shown through examples that it is possible to obtain technical data, without the need for an
intrusive toolset. In any case, even with the toolset, this technical data may not be retrievable.
The need to go through a judge’s order, which is essential for separation of powers and transparency, cannot
be stressed enough.
10. Reply to Questions 14.9 of the Consultation Paper: Should the Courts be empowered to impose
sentences (which include banning use of social media) on persons convicted of offences relating to misuse
of social media tools?
https://defimedia.info/propos-communal-sur-facebook-peine-dun-mois-de-prison-maintenue
- Police v N. S. Mohamed (2008) where the sentence by the Intermediate Court was a fine of Rs.
190,000 under the ICT Act 2001 and Computer Misuse and Cybercrime Act 2003.
- Police v S.Teeluck (2009) where the sentence was a fine of Rs. 150,000 for two counts under the ICT
Act 2001
- Police v K. Bunwaree & S. Dowlut, where one accused was sentenced by the Intermediate Court to a fine
of Rs. 25,000 for one count and the other to a fine of Rs. 300,000 for twelve counts.
“The applicant is not entitled to access or use the internet for any purpose whatsoever. He is not entitled
to be allowed to make international phone calls by any means. He is not to make use of any 3rd party’s
smart phone and internet connection. The applicant should not to be in communication with any person,
other than the authorities, in connection with the present case either in person or by means of any
technology such as phone, email, WhatsApp, Messenger, Facebook, Twitter, or any other social media
platform.”
10.3 Summary
In our view, the Courts are already empowered to impose sentences on people convicted of social media
offences, and these sentences already include social media bans, amongst others.
11. Concluding Remarks
The proposal in the Consultation paper is typical of repressive regimes where internet traffic is heavily
controlled by government, (from the more extreme like China to others like Pakistan, Egypt and Tunisia). In
dealing with the internet and social media, all countries are facing the same dilemma in terms of balancing
freedom of speech and the right to privacy on the one hand, and social cohesion and security on the other. But
many countries, notably developed countries such as Australia, the US and the UK, are implementing more
balanced measures after lengthy consultations with the public and private sectors and all stakeholders involved.
What the Government here is proposing ranks as one of the most repressive and intrusive measures envisaged,
though disguised as one to which the citizen himself, and even non-citizens travelling to Mauritius, agree.
This will have a negative impact on the image of the country and will not help the country get removed from
being grey / blacklisted by European institutions if we are perceived as going against GDPR. Furthermore, it
will reinforce Mauritius’s position as an autocratic regime, which the German V-Dem institute has already
highlighted in its 2021 report.
On top of that, the lack of transparency from the ICTA and the choice of timing (during lockdown last year and
this year) for both the tender issuance and the public consultation do not inspire confidence in what the ICTA
is planning to implement. The lack of transparency regarding when Facebook was consulted, and on what,
does not help the ICTA’s cause.
From our research during the limited time given to respond to this Consultation Paper, it seems no other
country besides Kazakhstan has attempted to implement such an intrusive solution to moderate social media.
The proposals in Kazakhstan were thrust into the limelight, leading technology companies Mozilla and Google
to oppose such measures as clear breaches of the privacy principles that drive their products and services.
India, the largest democracy in the world, is using a collaborative approach with social media platforms; as are
all the other democratic countries who are trying to address the issue of social media regulation.
(https://www.reuters.com/article/india-tech-regulation-idUSKBN2AP175 and
https://prsindia.org/billtrack/the-information-technology-intermediary-guidelines-and-digital-media-ethics-
code-rules-2021 ).
A number of multinationals operate from Mauritius and also use social media for business purposes where
exchanges are highly confidential. Any threat to this confidentiality, through decryption of data, may
encourage these multinationals to leave Mauritius and opt for more appropriate countries. Businesses will shy
away from conducting their operations from Mauritius, negatively impacting Foreign Direct Investment. This
is not the image that we want to portray for our country on the international scene.
And with the proposed Data Technology Park at Cote d’Or, we do not think that we will be able to attract key
international players if we go ahead with such a project as described in the Consultation Paper. Some
international organisations (e.g. Electronic Frontier Foundation) have started criticising the planned initiative
of the ICTA; see https://www.eff.org/deeplinks/2021/04/proposed-new-internet-law-mauritius-raises-
serious-human-rights-concerns and https://portswigger.net/daily-swig/mauritian-governments-plan-to-
intercept-encrypted-web-traffic-marks-death-knell-for-freedom-of-speech. We expect to see more bad
publicity for Mauritius on international media before the closing of this consultation.
We also believe that more education of internet users should be carried out by both the National Computer
Board and the Cybercrime Unit for all users across the country. Most internet users are simply given access to
the internet without proper guidance and knowledge of the impacts and implications of online activities and
actions. Most people in Mauritius are not fully aware of the existing laws and their enforcement. There should
be simple guidelines on what is allowed and what is not.
Education of parents and of children, as part of the ICT-related curriculum, should already be part of the
day-to-day life of schools. An educated and informed citizenry will act more responsibly, with due respect for
the applicable laws. The need for coercive measures is inversely proportional to the quality of education
provided to our citizens.
As shown in our response, one of the rationales put forward in the Consultation Paper for implementing the
technical toolset, namely that the social media administrators do not have an office in Mauritius, does not
hold, as we have seen that the social media administrators have taken action in countries where they do not
have an office.
However, we do agree that there is a need to improve regulation of content on social media, but it should be
done in a transparent and non-partisan manner and based on the laws of Mauritius. We strongly believe there
are already all the provisions in place in existing laws and licences to achieve the same results that the
Authority is trying to achieve.
1. The merger between the IBA and the ICTA should be prioritised by the Government. As mentioned in
Parliament, the Bill has been at the Attorney General’s Office since 2019; any change in legislation now,
without considering the Bill already prepared, would be a huge waste of time and resources. We believe that
the priority of the Government should be to accelerate this merger so that resources can be optimised.
2. There is already an Internet Management Committee as defined in the ICT Act, whose role is to advise
on Internet and related policies. Some work has already been done at IBA where they have drafted a
Code of Conduct and Code of Ethics. Once the two bodies merge, the task of drafting clear definitions
of harmful / illegal content should be easy and standardised across all media. These definitions should
then be publicly available as well as clear guidelines and examples. The composition of committees is
critical and should be properly overseen.
3. The Authorities should work closer with social media administrators to improve self-regulation for
content posted in Creole and use proper channels to have access to harmful / illegal content blocked.
This is the approach taken by countries such as UK which is used as a source of inspiration by the ICTA.
At the same time the Government should encourage social media administrators to set up offices in
Mauritius as we have the unique offering of having people who could moderate content in different
languages such as English, French and additional languages (Hindi, Urdu and Arabic amongst others).
We note that Facebook’s presence on the African continent is limited to only one country, namely South
Africa. It would be a unique proposition for Mauritius to get them to set up an office in a French-speaking
African country, before they decide to set up in another one.
4. We do not believe that it is the role of the Authority or any Government body to intercept any traffic
for decryption and analysis. We refer to the safeguards mentioned by the Australian Government:
“Engagement between Government and industry is bounded by critical safeguards. All requirements
must be reasonable, proportionate, practical and technically feasible. Government cannot build
a decryption, interception or data retention capability”.
5. The Authorities should obtain a judge’s order for every piece of content to which access should be blocked
at the international level (by social media administrators).
6. We believe that, in cases where legal interception is required, a reactive approach should be preferred to
the proactive dragnetting of all data, including private data. It should be resorted to only upon a judge’s
order, including in cases where data is required from Internet Service Providers or social media
administrators. Therefore, we do not see the need for a technical toolset to be implemented at the level of
the ICTA.
7. Social media administrators should block access to content upon receipt of a request in the
appropriate form (a Request for Mutual Assistance from the Attorney General's Office). We believe that
the collaborative approach we encourage should be favoured in an era where privacy and freedom
of speech are pillars of any democratic society.
8. We believe that the population should be better educated on offences pertaining to the
use of the internet and social media platforms.
Finally, we strongly believe that once the ICTA has analysed all responses and arrived at the best solution
on both the technical and legal fronts, it should carry out another round of public consultation with a more
detailed proposal.
Way Forward
The first consultation on the UK Online Harms White Paper opened on 8th April 2019 and ran until 1st July 2019,
giving respondents 84 days to submit their views. Here in Mauritius, the ICTA initially allowed only 3 weeks and,
even with the extension, 35 days in total.
The final version of the White Paper was published on 20th December 2020, approximately 20 months after
the start of the consultation process. As mentioned in the White Paper, a number of consultations were held
with different stakeholders in order to finalise it through an iterative approach.
We believe that the ICTA should not rush this project and proceed in the same consultative and iterative
manner as the UK, whilst being transparent along the way.
A number of questions and requests for clarification have been raised because of the lack of detail provided
and the inconsistencies between the consultation paper and the two communiqués issued by the ICTA. More
consultations and open sessions should be held with all stakeholders, including NGOs, to shed more light on
the project and allow further interaction before a proper solution to regulate the abuse and misuse of social
media is settled upon.