The Cyber Resilience Act: what's so bad about it?

We need to start talking about the Cyber Resilience Act, because according to all major Open Source organizations the CRA is a threat to Free Software itself - and it has just been approved by the European Parliament committee that was working on it.

This is going to be a complex story of laws and bureaucracy, and even though it might seem like Europe accidentally hurt Open Source while writing an otherwise great bill... this is very much intentional.

Let's get started. Everything revolves around cybersecurity. Yes, as users, it's terrible when a vulnerability is found in a product we are using, as it might put our sensitive data at risk; but did you know that the EU Parliament estimated that better cybersecurity would save European companies between one hundred eighty billion and two hundred ninety billion euros, every year? So, not only is security essential for us consumers, but even from an "economic" point of view, there's a very strong incentive for the EU to require a higher level of cybersecurity.

But how? Well, the proposal is roughly what you would expect: they made a list of software products that are potentially more at risk than others (such as network managers, server operating systems, password managers, and so on) and they outlined a list of security requirements for these products. These requirements cover everything that should be done before publishing the product (such as a full security audit) but also whatever must happen when a critical vulnerability is discovered (such as telling ENISA).

Who's ENISA? It's the European Union Agency for Cybersecurity - the part of the EU that deals with, well, cybersecurity.

So, roughly speaking, this is the spirit of the CRA: if you have a potentially vulnerable product, you have to self-certify that it's secure enough, and if there's a vulnerability, please tell us ASAP. All of this sounds amazingly reasonable, as an idea; and, in fact, pretty much all the big Open Source organizations that speak against the CRA start off by saying: "we do like the act, we think it's a good idea, but". But? What's the big deal?

First of all: people are not happy that they have to report all vulnerabilities to ENISA as soon as they are exploited (in fact, the act requires doing it within hours). The established policy for unpatched vulnerabilities is: only tell the people who can actually contribute to fixing the security vulnerability. Quoting GitHub: "a wide disclosure of unpatched vulnerabilities does not make the open source ecosystem more resilient - it makes it more perilous".

Personally, I have no idea how valid that criticism is; however, there's a second, obvious problem. I've talked about a self-certification to attest the security of a product before releasing it; the obvious question is: how much work is that? Well, the EU estimates that the work needed will result in a 25% cost overhead, which is a significant amount. Thus, an obvious question is raised: who, in volunteer-based Open Source communities, is going to volunteer to do this work?

Let's take Apache as an example. They're clearly the kind of software these requirements were thought up for. If Apache were developed by a big commercial company, well, no big deal: they would have to pay the extra 25% to hire a team dedicated to this. But Apache is not that; in fact, it's volunteer-based. They say: hey, some of these CRA obligations are virtually impossible to meet. For example, there is an obligation to deliver a product only if it has no known exploitable vulnerabilities. This is an almost impossible bar to set, especially as open source authors neither know, nor have control over, how their code is integrated downstream.

Or, take the Eclipse Foundation: they say right away, hey, we're in a better situation compared to others. We have a staff, we have infrastructure, we have a security team with security policies; and yet, the CRA requires that if there is a vulnerability, the organization has to immediately address it through automatic updates, and it has to notify users of those updates. Eclipse can't do that: because of privacy, their projects do not call home or require user registration, and there's no mechanism to notify all users of an update. This would require completely rethinking the entire Eclipse infrastructure.

Even worse, the CRA puts strict requirements on what it calls "incomplete" releases, that is, beta and/or nightly releases. Either you certify every beta release - every single one - against all of those security requirements (which is crazy: usually you do a lot of beta releases to iterate quickly before the stable one), or you only make the incomplete release available (1) for a short period of time, (2) clearly stating that it does not comply with the CRA, and (3) only and exclusively for testing. Here's the issue: even if you write on the release page "hey, this is only for testing", that doesn't matter, because the licence of the beta software says: do whatever you want with it. This is open source; we cannot publish something "just for testing".

The thing is: this is worse than you think. It doesn't affect, you know, just a couple of open source projects. It affects a lot of them, and the list of products in the CRA is quite long. Think of KDE, which mostly makes a desktop environment and applications. KDE would be affected, firstly because of KDE Neon - which is technically an operating system - but even KDE's network management tool, the one that sets up your wifi in Plasma, falls into the category of "network management tools". GNOME would be affected, too. A lot of Open Source projects would be affected, and I'll be honest: KDE does not have the money to hire somebody just to do CRA compliance, and I really don't think we can get enough skilled people to volunteer to certify every release of every piece of software affected by the CRA.

This is why, immediately, everybody started asking: hey, Europe, did you know that roughly 90% of the software out there is Open Source? I mean, Open Source is really a big deal. And we can't really comply with the CRA as it is now. Would you please say that all of these requirements only apply to commercial entities? This would make sense: if you are a big company making money with your product, you can afford security audits and everything. If you're maybe even just a single developer doing free software... not quite. Remember that the first lines of pretty much every open source licence say: this software is granted as is, and the authors take no responsibility for it. Take it, use it, fork it, do not blame me.
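
To make that concrete, here is the warranty disclaimer from the MIT License, one of the most widely used open source licences (the GPL and Apache licences contain equivalent clauses):

"THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED [...]. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY [...] ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE."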

Now, luckily, Europe did listen to all of this; but to explain that, I have to give a quick timeline of the CRA. The Act was proposed last September; it was discussed in a committee of the European Parliament called ITRE, which drafted a proposal to give to the entire Parliament. The committee voted on this draft in July, and it was approved; so the next step is for it to be approved by the entire Parliament. At the same time, the European Council is developing its own version of the Act, and, when the Parliament votes in favour of the Act, Europe will take the original proposal, the draft accepted by the Parliament, and the one drafted by the Council, and merge the three into a single Act. This is expected to happen early next year, like, nine months from now. When that happens, we will have two years to comply with it.

This means that the Act is really, really close to being approved, and it might not change that much before it is. However, it's still roughly three years away from actually taking effect. It's a... ticking time bomb, ain't it?

So, what did we manage to get out of the negotiations within ITRE? There is an exemption for open source, non-commercial use. The only issue is: it's a mess. It's just a mess.

Firstly: if you take a recurring donation from a company, that is considered commercial. This immediately rules out a lot of the Open Source world. Take, again, KDE as an example. Yes, KDE is a non-profit. We don't make money. However, we do have Patrons, which you can check at KDE.org. The Patrons of KDE do actually donate money to KDE on a recurring basis. And, hey, would you look at all of these beautiful commercial companies. Now, I'm not saying that KDE would definitely be considered commercial because of those donations - I am not a lawyer - but the best interpretation I can give of the Act says: yeah, it would. And this worries me.

But forget KDE. Do you know just how many small GitHub open source projects, maintained by just one person, receive recurring donations from companies using that software? It's a lot. Just this little sentence in the bill - "recurring donations from companies make you commercial" - creates a lot of issues for both small one-person Open Source projects and bigger ones.

And it's not just that. In order to count as non-commercial, the CRA requires that the project is completely decentralized and that there is no single company deciding what gets into the project and what doesn't. This means that if you give git access to anyone working for a commercial company, boom, you have to follow the CRA. Any corporate developer makes you a commercial project.

Even worse - and, here, I really hope that I, and all of the articles I've read, are just... missing the point, because this is terrifying - the exact wording here is weird. It reads: "Similarly, where the main contributors to free and open-source projects are developers employed by commercial entities and when such developers or the employer can exercise control as to which modifications are accepted in the code base, the project should generally be considered to be of a commercial nature".

Do you see the big, big, big loophole? It says: if the main developers are employed, and the main developers exercise control over the project, the project is commercial. It never actually states that the company employing those developers has to be... relevant to the project. Taking this literally, if I were employed in, I don't know, a pizza restaurant, doing free software in my spare time, that would still make the project commercial, because I'm employed. What the fuck?

It would be really, really easy now to just say: well, oops, ITRE just did a pretty bad job with the definitions. We just have to tell them to fix these couple of sentences and it's basically okay, isn't it?

No. This is on purpose. This is not some random mistake. They meant to do this, and for a quite simple reason. They noticed that the typical European Small and Medium Enterprise (acronym: SME) builds on roughly 95% open source code, and then just adds their own 5% on top. And they say: well, if we ask SMEs to certify the 5% they make, they can do that; it's not a lot. But if we ask them to certify the whole thing, they don't have the resources to do that. And somebody has to certify the open source part of the stack, because, well, it's 95% of the whole thing. Because of that, they want most open source organizations to provide these security certifications, because they don't think SMEs would be able to handle it.

To quote the Apache Software Foundation: "For this reason, the policy makers have made it crystal clear to the ASF that they intend to have the CRA apply to open source foundations; the current exemptions for open source are either for pure hobbyists, code that is not used in real life, or for things such as mirrors and package repositories like NPM". All of this is on purpose.

Ultimately, all of this is a... clash of ideologies. The free software ideology is: we are volunteers, we provide you with software that's freely available to everyone, you can do whatever you want with it, but you take responsibility for it. Instead, Europe wants to save billions of euros in cybersecurity, and to do that, they need all open source software to be certified; and, right now, they believe that the only way to achieve that is to throw away the "don't blame the author" principle and actually put the cybersecurity blame on Open Source foundations.

This is scary. It feels like it could be really dangerous. Luckily, it might actually work out in the end; maybe, in practice, only a few projects will be considered commercial, and the requirements won't be that strict. A lot of people were terrified of the GDPR, and it didn't end up being that bad. However, to the best of my knowledge after hours of research: yeah, this does not look good.