The accepted stance of many regulators is that they do not regulate technology itself but rather those who use it, and the outcomes and actions that result. However, Nick Cook, Director of Innovation at the FCA in the UK, has raised the question: “Should we talk solely in terms of ‘outcomes’ while remaining ‘agnostic’, or can we show a preference for certain technologies? Can we remain ‘technology-neutral’ in a world where technology is so embedded in the delivery of financial services and so fundamental a driver of consumer outcomes?” Interestingly, Cook continued in the same speech: “It seems untenable for regulators and central banks to not have an opinion on technology given it is so embedded in the markets we regulate”.
It is not only within financial services that regulating technology is being discussed. In Europe there has been considerable debate about how to regulate Artificial Intelligence (AI), as the Europeans look to take a lead on regulating this wide-ranging branch of computer science and to address the challenges around protection, liability and discrimination. Unsurprisingly, this leads to the question of ‘algorithmic accountability’ - i.e. should companies that operate AI platforms be held liable for the results of their algorithms? A further challenge is that even some of the brightest ‘techies’ cannot foresee the potential consequences of what they have created. A prime example occurred when Facebook had to shut down an AI project after its bots (computer software applications) were discovered to be ‘talking to each other’ in a language not understood by their programmers. Interestingly, Elon Musk, despite being a great enthusiast for new technology and innovation, is very circumspect about AI, claiming: “AI is a rare case where we need to be proactive in regulation instead of reactive because if we’re reactive in AI regulation it’s too late,” adding that “AI is the biggest risk we face as a civilisation”.
When it comes to Blockchain and Digital Assets, a number of challenges and questions have clearly emerged that will need addressing, such as:
• Data security and access
On a public Blockchain, such as Bitcoin’s, once a record has been created it is not possible to amend or delete that data, potentially contravening the General Data Protection Regulation (GDPR), which became law across Europe on 25th May 2018. GDPR grants a “right to be forgotten”, but how can your information be removed from a Blockchain that is immutable? (A short sketch following this list illustrates why on-chain records cannot quietly be altered.)
• DeFi
Software developers, such as Andre Cronje, are creating products and services that are made available on DeFi platforms and are then accessible globally. As recently as December 2020 Cronje released his ‘yCredit’ protocol, having already created various other DeFi applications - Deriswap, Keep3r Network, StableCredit and yInsure.Finance. These software tools have been released, but users will have no redress against Cronje, as he is not regulated, and the buyers of these DeFi tokens will often reside in jurisdictions far from wherever Cronje himself lives.
• Cryptos - unregulated assets
To date, many cryptocurrencies have not been regulated, so how does one regulate the likes of Bitcoin, which was allegedly invented by someone called Satoshi Nakamoto, whose whereabouts are unknown? But how long Bitcoin will remain unregulated is up for debate, as the head of the European Central Bank, Christine Lagarde, is now calling for Bitcoin’s “funny business” to be regulated.
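Returning to the data-access point above, the immutability that collides with GDPR can be shown with a minimal sketch in Python (the records and names are invented purely for illustration; this is not Bitcoin’s actual data format). Each block’s hash covers both its own record and the previous block’s hash, so editing a record is immediately detectable:

import hashlib, json

def block_hash(record, prev_hash):
    # Hash the record together with the previous block's hash.
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

# Build a tiny three-block chain (toy data).
chain, prev = [], "0" * 64
for record in ({"user": "alice", "tx": 1},
               {"user": "bob", "tx": 2},
               {"user": "alice", "tx": 3}):
    h = block_hash(record, prev)
    chain.append({"record": record, "prev_hash": prev, "hash": h})
    prev = h

# Attempt to honour a "right to be forgotten" by editing the first record...
chain[0]["record"]["user"] = "REDACTED"

# ...and any node re-checking the hashes sees the tampering at once.
for i, block in enumerate(chain):
    if block_hash(block["record"], block["prev_hash"]) != block["hash"]:
        print(f"block {i} has been tampered with")

The only way to truly ‘remove’ the record would be to re-hash that block and every block after it, a rewrite that a decentralised network of nodes will simply reject - hence the tension with GDPR.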
The need for regulation in other areas that are likely to use Blockchain technology will also have to be considered, such as driverless cars. A ‘heady cocktail’ of technologies, including AI, Big Data, Blockchain, the Internet of Things (IoT) and Machine Learning, to name just a few, is likely to be the backbone of driverless cars. Indeed, Google’s driverless vehicles have already covered over 20 million miles and, while it took Google 10 years to drive the first 10 million miles, it has taken only one year to complete the second 10 million. So, how long will it be before driverless cars are on sale to the public? Furthermore, how will they be regulated, and who will be liable for any insurance claims? Since the cars are driven by computers, how can the owner be responsible in the event of an accident? Reassuringly, in the UK such questions are already being addressed by the Law Commission, but the situation is far from certain. Nor is it just a matter for the drivers of cars: the highway authorities and/or local councils are responsible for the roads and traffic lights. Should the IoT devices or algorithms that control them fail, who is then responsible - the local authority/highway commission, the manufacturers, or the software developer?
Such legal conundrums are already being addressed. With the greater deployment of smart contracts, there was a need for some legal certainty around them: many lawyers derided smart contracts, claiming they were neither smart nor capable of forming part of a contract. Therefore, in the UK, the Law Commission launched a consultation paper in December 2020 to look at this topic, having previously asked the Chancellor of the High Court, Sir Geoffrey Vos, to look at smart contracts; he concluded that: “Smart contracts are capable of satisfying the requirements of contracts in English law and are thus enforceable by the courts”.
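To make concrete what Sir Geoffrey Vos was asked to assess, the sketch below shows the sense in which a smart contract is a ‘contract’: the agreed terms execute automatically from the recorded state, with no intermediary exercising discretion. It is written in plain Python rather than an on-chain language, and the escrow scenario and all names are invented for illustration:

class EscrowContract:
    """Hypothetical escrow: funds release to the seller only once
    delivery is confirmed; otherwise they remain refundable."""

    def __init__(self, buyer, seller, price):
        self.buyer, self.seller, self.price = buyer, seller, price
        self.funded = False
        self.delivered = False

    def deposit(self, payer, amount):
        # Consideration: the buyer locks the agreed sum into the contract.
        if payer == self.buyer and amount == self.price:
            self.funded = True

    def confirm_delivery(self):
        self.delivered = True

    def settle(self):
        # Settlement follows mechanically from the recorded state.
        if self.funded and self.delivered:
            return f"release {self.price} to {self.seller}"
        if self.funded:
            return f"refund {self.price} to {self.buyer}"
        return "nothing to settle"

contract = EscrowContract("alice", "bob", 100)
contract.deposit("alice", 100)
contract.confirm_delivery()
print(contract.settle())  # prints: release 100 to bob

Once the deposit and delivery are recorded, settlement is automatic - precisely the property that led lawyers to ask whether such self-executing code could satisfy the requirements of a contract at all.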
There is certainly likely to be more pressure to regulate technology itself, as opposed to regulating those who use it, but in an increasingly global economic environment this could prove problematic. It is worth remembering that Blockchain technology was originally created to offer a decentralised way of doing business - DeFi, after all, stands for Decentralised Finance - and this makes it a challenge to create a set of enforceable regulations. It may be almost impossible to know who wrote the code that powers a protocol, or who created the cryptocurrency that has transgressed.