Publication date: Oct 6, 2022, 10:07 AM UTC
Reporting period: 1 January – 30 June 2021
Which of the following best describes the primary type(s) of online service(s) that you provide?
- Cloud-based storage and sharing
- Messaging
- Video chat
- Video sharing
If you would like to add a comment or other information, please do so here.
MEGA provides end-to-end encrypted cloud storage and text/voice/video chat.
Do you prohibit terrorist and/or violent extremist content on your service?
Yes
Do you use one or more specific, publicly available definitions or understandings of terrorist and/or violent extremist content?
Yes
Please provide the publicly available definition(s) or understanding(s) that you use, along with your relevant terms of service or policies.
MEGA's Terms of Service and Takedown Guidance Policy make clear that MEGA has zero tolerance for violent extremism, and reference section 3 of the Films, Videos, and Publications Classification Act 1993.
See https://mega.nz/takedown
https://mega.nz/terms
If you would like to add a comment or other information, please do so here.
Violent extremist content is handled in the same manner as other objectionable material as defined in section 3 of the New Zealand Films, Videos, and Publications Classification Act 1993, or other harmful Internet material, including as defined by the Harmful Digital Communications Act 2015: the folder/file links are immediately deactivated, the user's account is closed, and the details are provided to the New Zealand Government authorities for investigation and prosecution.
Do you use any of the following methods to detect terrorist and/or violent extremist content on your platform or service?
- Flagging by individual users or entities
- Government referrals
- Government legal requirements
- Cross-company shared databases or tooling
- Trusted notifiers
Can you determine the total amount of content that was flagged or reported as terrorist and/or violent extremist content on your service during the reporting period?
Yes
Are you willing and able to disclose it?
Yes
How much content, in total, was flagged or reported as terrorist and/or violent extremist content on your service during the reporting period?
| 2021 | File links | Folder links | Total |
|------|-----------:|-------------:|------:|
| Q1   | 419        | 9            | 428   |
| Q2   | 347        | 15           | 362   |
Can you determine the amounts of content that are flagged or reported as terrorist content separately from the amounts of content that are flagged or reported as violent extremist content on your service?
No
If you would like to add a comment or other information, please do so here.
We remove duplicate reports from our public statistics.
Can you determine the total amount of content that is flagged or reported as terrorist and/or violent extremist content according to the method of detection?
Yes
If you would like to add any comments or you can provide any relevant data, please do so here.
The totals provided here are counts of reports, not URLs as in Section 3, because we do not track URLs in disaggregated format.
Please select all interim or final actioning methods that you use on terrorist and/or violent extremist content.
- Content removal
- Suspension/removal of account
- Content blocking
Can you determine the total amount of terrorist and/or violent extremist content on which you took action during the reporting period?
Yes
Are you willing and able to disclose it?
Yes
Please provide that amount, along with any breakdowns that are available.
All links are immediately deactivated, the user's account is closed, and details are provided to the New Zealand authorities.
Can you determine the total number of accounts on which you took action during the reporting period for violations of your policies against the use of your service for terrorist and/or violent extremist purposes as a percentage of the average number of monthly active accounts during the reporting period?
Yes
Are you willing and able to disclose it?
Yes
Please provide that percentage, along with any breakdowns that are available.
The accounts that were closed because they shared violent extremist files represented 0.0001% of MEGA's total registered users.
If you would like to add other comments or information, please do so here.
We took action on 100% of the content that was flagged as terrorist and/or violent extremist content (TVEC).
If your service includes livestreaming functionality (even if it is not among what you consider to be the primary functionalities), then given the potential for terrorists and violent extremists to exploit livestreaming in ways that could promote, cause, or publicize imminent violence or physical harm, do you implement controls or proactive risk parameters on livestreaming to reduce misuse?
No livestreaming functionality
Please provide details on how you balance the need to action terrorist and/or violent extremist content with the risk that such content can be mislabelled and may actually be denouncing and documenting human rights abuses, or that it does not otherwise violate your terms of service.
If a user disputes the closure of their account for alleged sharing of violent extremist content, MEGA asks the relevant New Zealand Government agency to review the content and advise whether it is violent extremist material or was mis-reported. In the latter case, the user's account can be reopened.
Do you have an appeal or redress process for content and/or account actioning decisions made under your terms of service on terrorist and/or violent extremist content?
Yes
Please provide a detailed overview of those processes.
If a user disputes the closure of their account for alleged sharing of violent extremist content, MEGA asks the relevant New Zealand Government agency to review the content and advise whether it is violent extremist material or was mis-reported. In the latter case, the user's account can be reopened.
Is your appeal or redress process available to the user who posted the content or owns the account in question?
Yes
Is the outcome of your appeal and redress process available to the user who posted the content or owns the account in question?
Yes
Is your appeal or redress process available to the person or entity who requested actioning?
No
What is the total number of appeals received from all sources, during the reporting period, following content or account actioning decisions under your policies against terrorist and/or violent extremist content?
29 appeals across all types of actioning were received during the reporting period. The number relating just to violent extremism is not available.
How many such appeals were decided during this reporting period (regardless of when those appeals were received)?
29 appeals across all types of actioning were decided during the reporting period. The number relating just to violent extremism is not available.
Of those, how many were granted?
1
If you can break these numbers (appeals received, decided and granted) down with any more detail, please do so here.
N/A
How, and how often, do you measure and evaluate the efficacy and/or room for improvement of your policies in each of the following areas?
MEGA monitors the type of reports being received and has a continuous improvement process.
Do you have a point of contact (such as a dedicated email alias, desk or department) that can be contacted during a real-world or viral event with direct online implications and which works to address harmful content on your service?
Yes
Are you a member of an international crisis protocol aimed at reducing the volume and impact of terrorist and/or violent extremist content online during a crisis?
Yes
Please identify the protocol.
GIFCT Content Incident Protocol (CIP)
Did your company participate in or benefit from the activation of a crisis protocol aimed at reducing the volume and impact of terrorist and/or violent extremist content during the reporting period?
No such crisis protocol was activated during the reporting period.