The Terrorist Content Regulation (TERREG)

The push for the Terrorist Content Regulation (TERREG) gained urgency after the murder of 51 people in a shooting, live-streamed as it happened, at two mosques in Christchurch, New Zealand, in March 2019. The video spread rapidly across the Internet. The European Parliament wanted a set of laws, backed by enforcement powers, aimed at preventing such content, especially online. The European Commission had proposed TERREG in 2018, and discussions began. The EP wants to keep the propaganda of terrorist organizations such as ISIS out of public view and to regulate technology and social media providers such as Facebook and Google.

Under TERREG, technology giants will receive removal orders. By law, these companies will have only one hour to remove or filter the flagged content. A state may fine an organization that fails to remove or filter the content within one hour up to 4% of its annual income, or take decisions leading to its closure. For Facebook, with an annual income of 17 billion dollars, a 4% fine would amount to roughly 680 million dollars. The EP believes such penalties will deter propaganda. While some institutions and individuals defend this approach, big tech companies and think tanks see the law as a blow to freedom of expression on the internet. Moreover, TERREG would restrict the activities of some groups engaged in anti-terrorism work.

Terrorist groups like ISIS want to use social media as a propaganda tool. When ISIS first emerged, it used social media actively to recruit militants and to instill fear through displays of its power. Although ISIS's physical presence diminished over time, it kept trying to exert intellectual influence through such media. Defenders of TERREG cite exactly these points. However, it should not be forgotten that the concept of terrorism is not precisely defined and can easily be distorted by the authorities.

Julian King is one of the leading figures defending this law. "We see the reflection of every attack of the past 18 months on digital platforms," he said. King is one of the driving forces behind the law and argues that it is necessary for European countries.

The obligation to remove unwanted content within one hour is one of the companies' two main responsibilities. Under the proposal, each EU member state will appoint a so-called competent authority; exactly how this body works will be up to each member state. The legislation makes these authorities responsible for identifying problematic content: videos and images that incite terrorism, give instructions on how to carry out an attack, or otherwise encourage participation in a terrorist group. Once such content is identified, the authority sends a removal order to the platform hosting it and the clock starts. The content must then either be deleted or made unreachable for any user within the EU, and whichever method is chosen, it must happen within an hour. Through this time limit, the EP hopes to stop a video from spreading as quickly as possible.

The obligation resembles the voluntary rules currently in practice, under which tech companies remove content flagged by law enforcement and other trusted entities within one hour. What is new is a legally mandated upload filter intended to technically stop the same content from being re-uploaded after it has been flagged and removed. In other words, once content is designated, it must be removed from the internet in a way that ensures it never comes back. The content is hashed and assigned an ID, and that ID is stored in a database; the next time the same content goes online, it is automatically banned. Similar fingerprinting methods are already applied to disturbing material such as child abuse videos. There is, however, a big difference between that material and the content TERREG targets: so-called terrorist content may sometimes serve as a news source.
This automatic filtering and fingerprinting will also ban legitimate images and accurate information, much of it news content. Critics say the upload filter could be used by governments to censor their citizens, and that the abrupt removal of extremist content could prevent NGOs from documenting events in war-torn parts of the world. This is a heavy blow to journalism and journalists. Bodies under government control would be able to remove any content they wanted from the internet within one hour, following their governments' directives, and they would be able to invoke TERREG as legal cover. It also means the law could be exploited by governments that do not want their corruption exposed, because it cannot be determined precisely which content is terrorist and which is not. Inciting citizens to revolt and overthrow the government may also fall under the terrorism classification, and officials wanting to hide corruption could use such pretexts to have inconvenient content destroyed. During Syria's civil war, one of the best ways to prove that human rights violations had occurred was to watch footage of the conflict. Such images would also be in danger of being deleted from the internet at some government's behest. Between 2012 and 2018, Google removed more than 100,000 videos documenting attacks that provided evidence of events in the Syrian civil war. The Syrian Archive, an organization dedicated to verifying and preserving footage of the conflict, had to back up the material on its own site to keep the record from being lost.

Applying the one-hour limit to small and medium-sized websites is a serious problem for them: most lack the infrastructure to remove or filter content within that limit. As a result, the internet would cease to be free and become a monopoly, with irreversible consequences. As long as the channels for sharing knowledge are monopolized, the validity of information will go unquestioned, and the monopolization of knowledge will harm every part of the world. Existing information providers and distributors would be affected as well: images and statements published by WikiLeaks exposing war crimes could easily be banned under TERREG. With TERREG, governments will put more pressure on their citizens. One of the tragicomic aspects of the legislation is that one European country is entitled to ban content originating in another European country. In other words, European countries could take unwelcome actions against each other's domestic policies.

The Center for Democracy and Technology (CDT), funded in part by Amazon, Apple, Facebook, Google, and Microsoft, wrote an open letter to the European Parliament earlier this year stating that the law would push internet platforms to adopt untested and poorly understood technologies to restrict online expression. The letter was co-signed by 41 campaigners and organizations, including the Electronic Frontier Foundation, Digital Rights Watch, and Open Rights Group. Opponents such as CDT also argue that the filters would end up behaving like YouTube's frequently criticized Content ID system. That matching algorithm allows copyright holders to take down videos that use their material, but it sometimes also removes videos posted by their original owners, declaring the original clips to be copyrighted. Since such matching algorithms cannot be fully controlled, TERREG also becomes an obstacle to the free flow of information. There is even a danger that thousands of years of historical documents and symbols, literature, and musical works could be damaged under the heading of terrorist content.
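The false-positive problem critics point to can be illustrated with a toy fuzzy matcher. The 16-bit fingerprints and the threshold below are invented for illustration only; real systems such as Content ID use far more sophisticated audio and video fingerprinting.

```python
def hamming_distance(a: int, b: int) -> int:
    """Count the differing bits between two content fingerprints."""
    return bin(a ^ b).count("1")

def matches_blocklist(fingerprint: int, blocklist: list[int], threshold: int = 4) -> bool:
    """Flag content whose fingerprint is 'close enough' to any blocked entry."""
    return any(hamming_distance(fingerprint, f) <= threshold for f in blocklist)

blocked = [0b1011_0110_1100_0011]        # hypothetical flagged clip
reencoded   = 0b1011_0110_1100_0111      # 1 bit differs: caught, as intended
news_report = 0b1011_0010_1100_1011      # quotes the same footage: also caught
unrelated   = 0b0100_1001_0011_1100      # far away: allowed through

print(matches_blocklist(reencoded, blocked))    # True
print(matches_blocklist(news_report, blocked))  # True: a false positive for journalism
print(matches_blocklist(unrelated, blocked))    # False
```

The design tension is visible in the threshold: set it tight and re-encoded copies slip through; set it loose and legitimate uses of the same footage, such as news reports, get swept up with no human in the loop.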

Another fact arguing against TERREG is that the major social media platforms already remove videos and articles containing terrorist material. Every big technology company has its own rules on the subject and is, to varying degrees, sensitive to the issue. Under TERREG, every technology company may be forced to use the same filtering technology, which would mean sharing user data and content between platforms, between EU member states, and with law enforcement agencies such as Europol. It also means that when decision makers classify an opinion as terrorist content, the person who expressed that opinion will be punished as well. In this respect TERREG strikes at the right to free thought: it will make people more reluctant to share their views, and freedom on the Internet will be severely restricted. And although TERREG is said to assist law enforcement, removing evidential videos would in fact create a huge obstacle to law enforcement and its investigations.

It is useful to state, point by point, the damage TERREG would cause:

1) Knowledge will be monopolized.

2) As power changes hands, freedom on the internet will be redefined according to whoever is in charge.

3) A few major internet providers and platforms will own the internet.

4) Small and medium-sized websites will be banned or forced to shut down.

5) Even important sources, including digital versions of historical artifacts, established symbols, and historical literature, may be banned over time.

6) With the loss of accurate information, terrorists will become more active, and citizens will grow more vulnerable to them over time.

7) Upload filters put journalistic platforms at great risk. Neutral, investigative journalism needs to document and question what is happening, especially where corruption is concerned; content deemed terrorism-related will be removed even when it contains documents exposing corruption.

The Swiss Journalists Association, the Transatlantic Working Group, the International Association of Journalists, the Polish Coalition, and several United Nations special rapporteurs oppose the TERREG legislation.

In addition, the digital activists behind the worldwide campaign against ACTA2, including the STOPACTA2 Initiative and Anonymous, continue working to stop this law. You can follow their work and commentary on Twitter under the hashtags #TERREG and #stopACTA2.

You can access the Transatlantic Working Group Report here.

You can access the United Nations Report here.

It is useful to share some important links for further investigation:

https://saveyourinternet.eu

https://www.patrick-breyer.de/?page_id=588862&lang=en

https://www.accessnow.org/terrorist-content-regulation-the-fight-for-fundamental-rights-isnt-over/

https://www.accessnow.org/automation-and-illegal-content-can-we-rely-on-machines-making-decisions-for-us/

https://edri.org/our-work/terrorist-content-regulation-document-pool/

https://cdn.netzpolitik.org/wp-upload/2020/09/2018-0331-TerroristContentOnline-StateOfPlay-st09379-en.pdf

The legislation is currently making its way through the European Parliament, and the exact wording of its text may still change. Whether TERREG comes into force will become clear once the three-way talks known as the trilogue are complete. A trilogue is a decision-making process involving representatives of the Parliament, the Council, and the Commission; the decision is taken after these tripartite meetings. The meetings on TERREG have included very heated debates: even though many countries want this legislation, many influential countries oppose it. The regulation itself may even violate some of the laws the European Union has set for itself. For all the reasons discussed above, TERREG has to be reconsidered.

Investigative Journalist & Independent Scientist #Anonymous #hacktheplanet #FreeSpeech #FreeWorld