The government recently launched its first National Artificial Intelligence (AI) Strategy, a ten-year plan designed to “help strengthen [the UK’s] position as a global science superpower” and “seize the potential of modern technology to improve people’s lives and solve global challenges such as climate change and public health”.
Among its recommendations, the strategy includes a consultation on copyright and patents for ideas generated by AI, with a particular focus on protecting AI-generated inventions which currently wouldn’t meet traditional inventorship criteria. However, the publication of the strategy came at the same time the Court of Appeal ruled that an AI system couldn’t be named as an inventor on patent applications because it isn’t a person.
So, while the government’s strategy is likely to be of considerable benefit to the UK’s tech sector, and to the development of AI technologies in particular, greater clarity is required around protection for AI-generated inventions and AI-generated copyright works.
The DABUS case
Tony Wood, SVP of Medicinal Science and Technology at GSK, believes there is potential for AI and machine learning in uncovering “insights from human genetics and genomics … to develop more and better medicines and vaccines.” Fully capitalising on this potential may have to wait though.
In a recent ruling, the Court of Appeal seemed reluctant to create a precedent to allow for “autonomous” AI inventions, preferring instead – and as argued by the UK Intellectual Property Office (IPO) – to wait for the proposed consultation and for new legislation to be introduced if necessary.
In the original 2019 case, a team of academics filed patent applications with an AI system called DABUS named as the sole inventor. DABUS is an AI system designed to devise and develop new ideas, a process its creator Dr Stephen Thaler says is “traditionally considered the mental part of the creative act”. The IPO rejected the UK application, arguing that under the terms of the UK Patents Act 1977, which restricts inventorship to “natural persons”, DABUS couldn’t be considered an inventor, and so Dr Thaler had no standing to apply for a patent.
Dr Thaler subsequently brought a legal challenge, and the Court of Appeal upheld the IPO’s decision, with two of the three senior judges again declaring that DABUS didn’t qualify as an inventor because “it wasn’t a person.”
Assistant or inventor?
Patents play a vital role in the way pharmaceutical companies operate, allowing them to protect their considerable investments in long and complex R&D cycles, and enabling them to continue producing innovative new drugs. But, in cases where those drugs are “invented” using AI and machine learning technologies applied to large data sets, it is possible (if not now, then in the future) that there could be a serious question of inventorship in circumstances where it isn’t plausible to name a natural person as the inventor.
If, following the DABUS decision, pharma companies can’t protect their discoveries with a patent, this runs the risk they’ll be disincentivised to invest in AI-based research and many important new drugs may go undiscovered – or, as suggested in the Court of Appeal, they will simply lie about how the inventions were made.
Technically, the way AI is used at present means such discoveries are AI-assisted rather than AI-generated. There’s yet to be a completely compelling argument that the use of AI in the pharmaceutical industry amounts to anything more than sophisticated number crunching in a data-driven approach to innovation. A significant amount of human effort goes into training these systems and turning their outputs into sensible, meaningful solutions. If AI only assists in a discovery, it shouldn’t be considered the inventor – that title should go to the human who programmed and trained the AI system or performed the related lab work.
Move with the times
The problem is that there’s a disconnect in the UK’s approach to intellectual property. As things stand in copyright law, the author of a computer-generated work is deemed to be the person who programmed and trained the AI or technology platform involved, and the work in question is automatically afforded copyright protection.
But in the case of technical innovation, as the DABUS case illustrates, the law views the absence of a human inventor as a reason not to grant protection. Indeed, there are many who don’t think computer-generated works, for example computer-generated songs, music, art, and literature, should be given copyright protection for the very same reason, arguing that there’s no creative input from a human.
The fact is, the digital transformation of ideation, innovation and invention using data, AI and machine learning algorithms is happening – and it’s evolving quickly. The UK simply can’t afford to be left behind by IP laws drafted in the 1970s and 80s, when these technologies were little more than futuristic concepts. It’s time the law was updated to reflect recent developments.
The government is arguably to be applauded for its ambition. DCMS Minister Chris Philp describes the strategy as “laying the foundations for the next ten years’ growth … to help us seize the potential of artificial intelligence.” But it seems there’s some way to go before AI-generated inventions can be protected and a better balance is found for the copyright protection of computer-generated works.