ChatGPT developer OpenAI announced last week that it had fired CEO Sam Altman due to a loss of confidence by the board — only to see him return to the company after 90% of OpenAI staffers threatened to resign. The firing caused a flurry of excitement from companies offering to match OpenAI salaries in an attempt to lure top-tier talent.
The debacle — and its associated lack of transparency — highlighted the need to regulate AI development, particularly when it comes to security and privacy. Companies are rapidly expanding their artificial intelligence divisions, and a reshuffling of talent could propel one company ahead of its rivals and of existing laws. While President Joe Biden has taken steps to that effect, he has been relying on executive orders, which do not require input from Congress. Instead, they rely on agency bureaucrats to interpret them — and could change when a new president is inaugurated.
Biden this year signed an executive order on the “safe, secure, and trustworthy” development of artificial intelligence. It commanded AI companies to “protect” workers from “harm,” presumably in reference to the potential loss of their jobs. It also tasked the Office of Management and Budget (OMB) and the Equal Employment Opportunity Commission (EEOC) with, in part, establishing governing structures within federal agencies. And it asked the Federal Trade Commission (FTC) to self-evaluate and determine whether it has the authority “to ensure fair competition in the AI marketplace and to ensure that consumers and workers are protected from harms that may be enabled by the use of AI.”
Biden’s executive orders are not going to last long
The fundamental problem with an approach driven by executive fiat is its fragility and limited scope. As evidenced by the SEC’s and CFTC’s (largely unsuccessful) attempts to classify cryptocurrencies as securities, tasking agencies with promulgating rules can cause confusion and apprehension among investors, and those rules are ultimately open to interpretation by the courts.
Policies developed by agencies without legislative support also lack permanence. While public input is necessary for the passing of agency-backed regulations, the legislative process allows consumers of artificial intelligence and digital assets to have a stronger voice and assist with the passage of laws that deal with actual problems users face — instead of problems invented by often ambitious bureaucrats.
Biden’s failure to address the complex ethical implications of AI implementation on a mass scale is dangerous; concerns such as bias in algorithms, surveillance and privacy invasion are barely being addressed. Those issues should be addressed by Congress, made up of officials elected by the people, rather than agencies composed of appointees.
Without the rigorous debate required for Congress to pass a law, there is no guarantee of a law that promotes security and privacy for everyday users. Specifically, users of artificial intelligence need to have control over how this automated technology uses and stores personal data. This concern is particularly acute in the field of AI, where many users fail to understand the underlying technology and the severe security concerns that come with sharing personal information. Furthermore, we need laws that ensure companies are conducting risk assessments and maintaining their automated systems in a responsible manner.
Reliance on regulations enacted by federal agencies will ultimately lead to confusion and to consumers distrusting artificial intelligence. This precise scenario played out with digital assets after the SEC’s lawsuits against Coinbase, Ripple Labs, and other crypto-involved institutions, which made some investors apprehensive about their involvement with crypto companies. A similar scenario could play out in the field of AI, with the FTC and other agencies suing AI companies and tying vital issues up in the court system for years to come.
It’s imperative that Biden engage Congress on these issues instead of hiding behind the executive branch. Congress, in turn, must rise to the occasion, crafting legislation that encapsulates the concerns and aspirations of a diverse set of stakeholders. Without such collaborative efforts, the United States risks repeating the pitfalls experienced in the digital assets domain, potentially lagging behind other nations and driving innovation elsewhere. More importantly, the security and privacy of American citizens — as well as many around the globe — is in jeopardy.
John Cahill is an associate in national law firm Wilson Elser’s White Plains, N.Y., office. John focuses his practice on digital assets, and ensures that clients comply with current and developing laws and regulations. He received a B.A. from St. Louis University and a J.D. from New York Law School.
This article is for general information purposes and is not intended to be and should not be taken as legal or investment advice. The views, thoughts, and opinions expressed here are the author’s alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.
Jess Phillips has said “there is no place” where violence against women and girls “doesn’t happen” – as a new law is set to make spiking a criminal offence.
Earlier on Friday, the government said spiking will now be its own offence with a possible 10-year prison sentence as part of the Crime and Policing Bill, which will be introduced in parliament next week.
It also announced a nationwide training programme to help workers spot and prevent attacks.
Speaking to Sky News correspondent Ashna Hurynag, the safeguarding minister said that while spiking is already illegal under existing laws, the new classification will simplify reporting the act for victims.
“Spiking is illegal – that isn’t in question, but what victims and campaigners who have tried to use the legislation as it currently is have told us is that it’s unclear,” Ms Phillips said.
Image: Spiking will be made a criminal offence, carrying a sentence of up to 10 years. Pic: iStock
UK ‘was never safe’ for women
When asked if the UK is becoming a less safe place for women, the minister for safeguarding and violence against women and girls said: “I don’t think it’s becoming less safe, if I’m being honest. I think it was never safe.”
Speaking about a rise in coverage, Ms Phillips said: “We have a real opportunity to use that, the sense of feeling [built by campaigners] in the country, to really push forward political change in this space.”
“The reality is that it doesn’t matter whether it’s the House of Commons or any pub in your local high street – there is no place where violence against women and girls doesn’t happen, I’m afraid,” she added.
Spiking is when someone is given drugs or alcohol without them knowing or consenting, either by someone putting something in their drink or using a needle.
Police in England and Wales received 6,732 reports of spiking in the year up to April 2023 – with 957 of those relating to needle spiking.
London’s Metropolitan Police added that reports of spiking had increased by 13% in 2023, with 1,383 allegations.
As part of the nationwide training programme, a £250,000 government-funded scheme was started last week to teach staff how to spot warning signs of spiking crimes, prevent incidents and gather evidence.
It aims to train 10,000 staff at pubs, clubs and bars for free by April this year.
Alex Davies-Jones, minister for victims and violence against women and girls, said in a statement that “no one should feel afraid to go out at night” or “have to take extreme precautions to keep themselves safe when they do”.
“To perpetrators, my message is clear: spiking is vile and illegal and we will stop you,” she said. “To victims or those at risk, we want you to know: the law is on your side. Come forward and help us catch these criminals.”
Colin Mackie, founder of Spike Aware UK, also said the charity is “delighted with the steps being taken by the government to combat spiking”.
He added: “Spiking can happen anywhere, but these new initiatives are the first steps to making it socially unacceptable and we urge anyone that suspects or sees it happening, not to remain silent.”