Open source has been a key issue in the policy debates around the AI Act, which includes provisions regulating the development and sharing of AI models.
We've been following the proposed rules as they meandered through several different approaches. Now elements of the final version have been made public.

The big question was whether transparency and other obligations need to be mandated for open source AI, or whether they can be self-regulated – under the assumption that open source developers ensure these elements based on principles of open development.

The agreed-upon wording of the AI Act assumes the latter, and makes open source exempt from the regulation of general-purpose AI models, including transparency obligations.

We think that this is a problem, especially because the lack of agreed standards defining open source AI means there is a risk of open washing.

You can read more about this on our blog: Paul Keller wrote a detailed analysis of the provisions:
openfuture.eu/blog/a-frankenst

Open Future: "A Frankenstein-like approach: open source in the AI Act" – Late last week, the European Commission, the Member States, and the European Parliament reached a deal on the AI Act. The current compromise is a combination of tiered obligations and a limited open source exemption, which creates a situation where open source AI models can get away with being less transparent and less well-documented than proprietary GPAI models.
@tarkowski That is, it can be expected that companies will circumvent the limitations of the #AIAct by, for example, calling AI open source when the code is open but the training data is closed.
Alek

@miklo this depends on how the AI Act will be enforced; let's hope that the exemption will be protected from such circumvention. And a clearer, more precise definition is the starting point.