Hey everyone, does anyone have opinions on AI regulation or the EU AI Act? I'm meeting a major policymaker tomorrow who's looking to talk to technical folks! Some context here.
07/26/2023, 4:58 AM
I'll jot down a couple possibly contrarian opinions:
1) We don't know enough about AI yet to get the regulations right.
We don't even understand the legality of training AI on diverse datasets yet (everyone has opinions, but the key courts have not yet delivered theirs), much less whether these systems are going to drive economic and social prosperity or collapse.
If legislatures make a big push to regulate now, they are unlikely to be willing to expend the energy required to significantly revisit such a large and supposedly "handled" topic 24 months later. That could leave the world stuck, for perhaps five or more years during a critical period, with a set of laws that were well intentioned but that no one on either side of any of the debates thinks are what we need as we learn more about the capabilities and impacts of these systems.
2) "Slowing the pace of development" at the national or regional level to buy time to "get the legislation right" sounds like an appealing work-around to the problem of not knowing what legislation to produce, but it's demonstrably neither realistic nor effective.
We saw, for example, in the early 2000's how efforts at the national level to restrict the development and proliferation of encryption technology simply caused the nations that had been on the leading edge of that development to fall behind, as innovation and innovators moved to other nations that did not introduce such rules. The pace of innovation did not slow, and those attempting to regulate it ended up with less ability to influence development than they had before the legislation existed. (There are certainly those who think that's a good thing, but it's unlikely that those who wrote the legislation feel that way.)
3) Open Source has been an incredibly important force in the software industry, but LLMs are unlikely to follow the same model, at least for the next few years (with the caveat called out above that no one really knows what to expect).
The incredibly low cost of modern software development is responsible for the economic viability of most Open Source projects. One, or a few, or sometimes many passionate individuals investing their disposable time and/or small amounts of seed capital to build a thing they care about has led to the lion's share of Open Source projects, including eventual giants like Linux.
Open Source will obviously remain a powerful force in software development, but training competitive, state-of-the-art LLMs and similar models from scratch currently exhibits none of the low-cost, disposable-time, passion-project economics that fed the Open Source movement we're all familiar with. Algorithmic advances may well make a big dent in those costs, but it's equally likely that legal issues, including liability questions and source-rights clearances, will raise the cost of training as rapidly as silicon and algorithmic advances lower it.
4) It's incredibly important for legislators to be thinking about AI.
I want legislators working to understand what they should do about AI, what they should do with AI, and what they should do for AI. But I'm not nearly as optimistic today about codifying those thoughts into laws as I hope to be in the future.