Ahead of the final proposal for the Digital Services Act (DSA) Implementation Act, which must pass its second reading and be voted on in Parliament by March 26, 2025, Gong has sent parliamentary deputies and committee members a proposal for amendments that would speed up the implementation of the Act in Croatia.
The Digital Services Act, better known as the DSA, is the first law in the world that systematically attempts to regulate the operations of large digital platforms and online services in order to ensure the safety and protection of their users.
Regarding the implementing law, Gong believes it is important to ensure transparency in content removal. We therefore advocate that HAKOM - the Croatian Regulatory Agency for Network Activities, which has been given the key role of Digital Services Coordinator - maintain and publish, in real time, a database of all content-removal requests that Croatian authorities send to large platforms. The DSA already stipulates that coordinators must report on all orders received from national authorities, but only once a year.
Under the DSA, platforms can remove illegal content in several ways. They can do so on their own initiative, based on their algorithms, user reports, and measures they take after assessing systemic risks. Another way is through requests from competent national institutions. Platforms can also act upon reports from trusted flaggers, who flag potentially illegal content.
As the Digital Services Coordinator, HAKOM is also authorised to grant, suspend, or revoke trusted flagger status. The DSA also provides for annual reports by trusted flaggers on the notices they submit to platforms, and Gong recommends that these reports, too, be published in real time.
The law states that the bodies responsible for issuing orders to take action against illegal content are the State Attorney's Office, the Ministry of the Interior, the Personal Data Protection Agency, the Customs Administration, the State Inspectorate, and the Ministry of Health. Gong advocates that the State Electoral Commission be included in the list of bodies responsible for issuing orders to take action against illegal content, for content that constitutes violations of electoral legislation.
We also believe it is important to educate citizens - for whose online safety the Digital Services Act was passed - about the reporting mechanisms and the rights they now have vis-à-vis large platforms under the DSA. Establishing a contact point, a website that clearly presents and promotes all the options and tools available to citizens for safer use of the internet, would greatly benefit the implementation of the DSA in citizens' everyday lives.
Gong's proposals for amendments to the proposed DSA Implementation Act
Below are Gong's proposed amendments to the final draft of the Act on the Implementation of Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on the Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act):
In Article 6, paragraph 1, point 6 is added:
The State Electoral Commission, for illegal content that constitutes a violation of regulations in the field of electoral legislation.
Article proposal: Article 6
(1) The authorities competent to issue an order to take action against illegal content referred to in Article 9 and an order to provide information referred to in Article 10 of Regulation (EU) 2022/2065 are:
1. The State Attorney's Office and the Ministry of the Interior for illegal content constituting a criminal offence or a misdemeanour
2. The Personal Data Protection Agency for illegal content constituting a breach of regulations governing the protection of personal data
3. The Customs Administration of the Ministry of Finance for illegal content constituting a breach of intellectual property rights
4. The State Inspectorate for illegal content constituting a breach of regulations within the scope of inspections of the State Inspectorate, under the powers laid down in special regulations
5. The Ministry of Health for illegal content constituting a breach in the field of healthcare, medicines and medical devices, and biomedicine, in accordance with the powers laid down in special legislation
6. The State Electoral Commission for illegal content constituting a violation of regulations in the field of electoral legislation.
Explanation:
In Article 6, paragraph 1, point 6 is added to separately cover offences in the field of electoral legislation, given their special importance for the democratic order.
Paragraph 2 is added to Article 9 and reads:
(2) The coordinator for digital services maintains and publishes a database of all orders to act against illegal content and orders to provide information from national competent authorities, as well as the effects of these orders, in real time.
Article proposal:
Monitoring of implementation and reporting
Article 9
(1) National competent authorities shall submit their reports on the activities carried out to the Digital Services Coordinator, including the number and subject of the orders for action against illegal content and information orders and the effects of those orders, in accordance with Article 55 of Regulation (EU) 2022/2065, no later than 1 February of the current calendar year, for the previous year.
(2) The Digital Services Coordinator shall maintain and publish a database of all orders for action against illegal content and information orders issued by national competent authorities, as well as the effects of those orders, in real time.
(3) To fulfil the requirements for cooperation with other Digital Services Coordinators, the European Commission, and the Board, the Digital Services Coordinator may request a report referred to in paragraph 1, irrespective of the deadline referred to in paragraph 1 of this Article.
(4) The Digital Services Coordinator shall ensure the submission of consolidated annual reports to the European Commission and the Board by 1 March of the current calendar year for the previous year.
(5) ADR entities certified according to Article 21 of Regulation (EU) 2022/2065 and trusted flaggers referred to in Article 22 of Regulation (EU) 2022/2065 shall be obliged to submit to the Digital Services Coordinator the reports referred to in Article 21(4) and Article 22(3) of Regulation (EU) 2022/2065 by 1 February of the current calendar year.
Explanation:
Paragraph 2 has been added to Article 9 to strengthen the transparency of content removal under the Digital Services Act.
Furthermore, the regulations that will elaborate on the monitoring of implementation and reporting should include: