CC and Communia Statement on Transparency in the EU AI Act

“EU Flag Neural Network” by Creative Commons was cropped from an image generated by the DALL-E 2 AI platform with the text prompt “European Union flag neural network.” CC dedicates any rights it holds to the image to the public domain via CC0.

The European Union’s Artificial Intelligence (AI) Act will be discussed at a key trilogue meeting on 24 October 2023. A trilogue is a meeting for the final phase of negotiations that brings together the three bodies of the European Union: the European Commission, the Council of the European Union, and the European Parliament. CC collaborated with Communia to summarize our views, emphasizing the importance of a balanced and tailored approach to regulating foundation models and of transparency in general. Additional organizations that support public interest AI policy have also signed on in support of these positions.

Statement on Transparency in the AI Act

The undersigned are civil society organizations advocating in the public interest and representing knowledge users and creative communities. We are encouraged that the Spanish Presidency is considering how to tailor its approach to foundation models more carefully, including an emphasis on transparency. We reiterate that copyright is not the only prism through which reporting and transparency requirements in the AI Act should be seen.

General transparency responsibilities for training data

Greater openness and transparency in the development of AI models can serve the public interest and facilitate better sharing by building trust among creators and users. As such, we generally support more transparency around the training data for regulated AI systems, not only the training data that is protected by copyright.

Copyright balance

We also believe that the existing copyright flexibilities for the use of copyrighted materials as training data must be upheld.
The 2019 Directive on Copyright in the Digital Single Market, and specifically its provisions on text-and-data mining exceptions for scientific research purposes and for general purposes, provides a suitable framework for AI training. These provisions offer legal certainty and strike the right balance between the rights of rightsholders and the freedoms necessary to stimulate scientific research and further creativity and innovation.

Proportionate approach

We support a proportionate, realistic, and practical approach to meeting the transparency obligation, one that places less onerous burdens on smaller players, including non-commercial players and SMEs, as well as on models developed using free and open-source software (FOSS), in order not to stifle innovation in AI development. Too burdensome an obligation on such players may create significant barriers to innovation and drive market concentration, leaving AI development to occur only within a small number of large, well-resourced commercial operators.

Lack of clarity on copyright transparency obligation

We welcome the proposal to require AI developers to disclose the copyright compliance policies they followed during the training of regulated AI systems. However, we remain concerned about the lack of clarity on the scope and content of the obligation to provide a detailed summary of the training data. AI developers should not be expected to literally list every item in the training data. We maintain that such a level of detail is not practical, nor is it necessary for implementing opt-outs and assessing compliance with the general-purpose text-and-data mining exception. We would welcome further clarification from the co-legislators on this obligation. In addition, an independent and accountable entity, such as the foreseen AI Office, should develop processes to implement it.
Signatories

Creative Commons
Communia Association
Creative Commons Italy Chapter
Centrum Cyfrowe

The post CC and Communia Statement on Transparency in the EU AI Act appeared first on Creative Commons.