A woman walks past tents for the homeless lining a street in Los Angeles, Calif. on Feb. 1, 2021.

FREDERIC J. BROWN | AFP | Getty Images

In December of last year, single mom Courtney Peterson was laid off from her job working for a now-shuttered inpatient transitional living program. Aside from the flexibility it allowed her to sometimes bring her seven-year-old son to work, it paid enough to cover rent in a studio apartment in the Van Nuys neighborhood in Los Angeles, where they had lived for a year and a half. 

Peterson said she began to research potential avenues for help, immediately concerned about making January’s rent. When her son was an infant, they lived in a travel trailer, she said, a situation she did not want to return to.

“I started to reach out to local churches or places that said they offered rent assistance,” Peterson told CNBC. “But a lot of them wanted me to have active eviction notices in order to give me assistance. I felt like I was running out of options. I’d reached out to pretty much everyone I could possibly think of with no luck.”

Instead of an eviction notice, Peterson received a letter from the Homelessness Prevention Unit within the Los Angeles County Department of Health Services, offering a lifeline. The pilot program uses predictive artificial intelligence to identify individuals and families at risk of becoming homeless, offering aid to help them stabilize and remain housed.

In 2023, California had more than 181,000 homeless individuals, up more than 30 percent since 2007, according to data from the U.S. Department of Housing and Urban Development. A report from the Auditor of the State of California found the state spent $24 billion on homelessness from 2018 through 2023.

Launched in 2021, the technology has helped the department serve nearly 800 individuals and families at risk of becoming homeless, with 86 percent of participants retaining permanent housing when they leave the program, according to Dana Vanderford, associate director of homelessness prevention at the county’s Department of Health Services. 

Individuals and families have access to between $4,000 and $8,000, she said, with the majority of the funding for the program coming from the American Rescue Plan Act. Tracking down individuals to help and convincing them that the offer is real and not a scam can be a challenge, but once contact is established, aid is quickly put into motion.

“We often meet our clients within days of a loss of housing, or days after they’ve had a medical emergency. The timing with which we meet people feels critical,” Vanderford said. “Our ability to appear out of nowhere, cold-call a person, provide them with resources and prevent that imminent loss of housing for 86 percent of the people that we’ve worked with feels remarkable.”

Peterson said she and her son received some $8,000 to cover rent, utilities and basic needs, allowing her to stay put in her apartment while she looks for a new job. The program works with clients for four months and then follows up with them at the six-month mark and the 12-month mark, as well as 18 months after discharge. Case workers like Amber Lung, who helped Peterson, say they can see how important preventative work is firsthand.

“Once folks do lose that housing, it feels like there’s so many more hurdles to get back to [being] housed, and so if we can fill in just a little bit of a gap there might be to help them retain that housing, I think it’s much easier to stabilize things than if folks end up in a shelter or on the streets to get them back into that position,” Lung said.


Predicting Risk

The AI model was developed by the California Policy Lab at UCLA over the course of several years, using data provided by Los Angeles County’s Chief Information Office. The CIO integrated data from seven different county departments, de-identified for privacy, including emergency room visits, behavioral health care, and large public benefits programs from food stamps to income support and homeless services, according to Janey Rountree, executive director of the California Policy Lab. The program also pulled data from the criminal justice system.

Those data, linked together over many years, were used to predict who would go on to experience homelessness. The model was developed on a historical period for which the policy lab already knew the outcomes, allowing it to test the model’s accuracy.

Once the model identified patterns in who experienced homelessness, the lab used it to attempt to make predictions about the future, creating an anonymized list of individuals ranked from highest risk to lowest. The lab provided the list to the county so it could reach out to people who may be at risk of losing housing before it happened.
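The Policy Lab has not published the model itself, so the sketch below is only an illustration of the general workflow the article describes: train a classifier on linked, de-identified historical records where the outcome is already known, then score current records and rank them from highest to lowest predicted risk for outreach. The feature names, model class and synthetic data are assumptions, not details of the county’s actual system.

```python
# Illustrative sketch only: features, model choice and data are made up to
# mirror the article's description of the Policy Lab's approach.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5_000

# Synthetic stand-in for de-identified, cross-department county records.
records = pd.DataFrame({
    "er_visits_12mo": rng.poisson(0.5, n),            # emergency room visits
    "behavioral_health_visits": rng.poisson(0.3, n),
    "months_on_food_stamps": rng.integers(0, 24, n),
    "income_support": rng.integers(0, 2, n),
    "prior_homeless_services": rng.integers(0, 2, n),
    "justice_contacts_12mo": rng.poisson(0.2, n),
})

# Synthetic historical outcome: whether the person later experienced homelessness.
logits = (0.4 * records["er_visits_12mo"]
          + 0.5 * records["prior_homeless_services"]
          + 0.3 * records["justice_contacts_12mo"]
          + 0.05 * records["months_on_food_stamps"] - 3.0)
records["became_homeless"] = rng.binomial(1, 1 / (1 + np.exp(-logits)))

X = records.drop(columns="became_homeless")
y = records["became_homeless"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Fit on the historical period, check accuracy on held-out records.
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
print("held-out AUC:", round(roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]), 3))

# Rank (anonymized) individuals from highest to lowest predicted risk,
# the kind of list the county receives for outreach.
ranked = X_test.assign(risk=model.predict_proba(X_test)[:, 1]).sort_values("risk", ascending=False)
print(ranked.head(10))
```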

However, past research has found that anonymized data can be traced back to individuals based on demographic information. A sweeping study on data privacy, based on 1990 U.S. Census data, found that 87% of Americans could be identified using only ZIP code, birth date and gender.
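To see why “anonymized” records can still point back to a person, here is a toy calculation of how often the quasi-identifier combination of ZIP code, birth date and gender is unique in a population; this is the mechanism behind the 87% estimate cited above. The population size and ZIP codes are invented for illustration.

```python
# Toy re-identification sketch: count how many people are unique on
# (ZIP, birth date, gender) in a made-up population.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 100_000

people = pd.DataFrame({
    "zip": rng.choice([f"9{z:04d}" for z in range(400)], n),   # ~400 hypothetical ZIPs
    "birth_date": pd.to_datetime("1940-01-01")
                  + pd.to_timedelta(rng.integers(0, 60 * 365, n), unit="D"),
    "gender": rng.choice(["F", "M"], n),
})

# A record is re-identifiable from "anonymized" data if no one else in the
# population shares its (zip, birth_date, gender) combination.
group_sizes = people.groupby(["zip", "birth_date", "gender"])["zip"].transform("size")
unique_share = (group_sizes == 1).mean()
print(f"share of people unique on (ZIP, birth date, gender): {unique_share:.1%}")
```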

“We have a deep, multi-decade long housing shortage in California, and the cost of housing is going up, increasingly, and that is the cause of our people experiencing homelessness,” Rountree said. “The biggest misperception is that homelessness is caused by individual risk factors, when in fact it’s very clear that the root cause of this is a structural economic issue.”

The Policy Lab provided the software to the county for free, Rountree said, and does not plan to monetize it. Using AI in close partnership with people who have relevant subject matter expertise, from teachers to social workers, can help promote positive social outcomes, she said.

“I just want to emphasize how important it is for every community experiencing homelessness to test and innovate around prevention,” she said. “It’s a relatively new strategy in the lifespan of homeless services. We need more evidence. We need to do more experiments around how to find people at risk. I think this is just one way to do that.”

The National Alliance to End Homelessness found in 2017 that a chronically homeless person costs taxpayers an average of $35,578 per year, and that those costs fall by an average of nearly half when the person is placed in supportive housing.

Los Angeles County has had initial conversations with Santa Clara County about the program, and San Diego County is also exploring a similar approach, Vanderford said.

Government Use of Artificial Intelligence

AI in the hands of government agencies has faced scrutiny over potentially harmful outcomes. Police reliance on AI technology has led to wrongful arrests, and in 2020 California voters rejected a plan to replace the state’s cash bail system with an algorithm for determining individual risk, over concerns it would increase bias in the justice system.

Broadly speaking, Margaret Mitchell, chief ethics scientist at AI startup Hugging Face, said ethics around the government use of AI hinge on context of use and safety of identifiable information, even if anonymized. Mitchell also points to how important it is to receive informed consent from people seeking help from government programs.

 “Are the people aware of all the signals that are being collected and the risk of it being associated to them and then the dual use concerns for malicious use against them?” Mitchell said. “There’s also the issue of how long this data is being kept and who might eventually see it.”

While the technology aims to provide aid to those in need before their housing is lost in Los Angeles County, which Mitchell said is a positive thing to do from a “virtue ethics” perspective, there are broader questions from a utilitarian viewpoint.

 “Those would be concerns like, ‘What is the cost to the taxpayer and how likely is this system to actually avoid houselessness?'” she said.

As for Peterson, she’s in the process of looking for work, hoping for a remote position that will allow her flexibility. Down the road, she’s hoping to obtain her licensed vocational nursing certification and one day buy a home where her son has his own room.

“It has meant a lot just because you know my son hasn’t always had that stability. I haven’t always had that stability,” she said of the aid from the program. “To be able to call this place home and know that I’m not going to have to move out tomorrow, my son’s not going to have to find new friends right away… It’s meant a lot to both me and my son.”


Oracle says there have been ‘no delays’ in OpenAI arrangement after stock slide


Oracle CEO Clay Magouyrk appears on a media tour of the Stargate AI data center in Abilene, Texas, on Sept. 23, 2025.

Kyle Grillot | Bloomberg | Getty Images

Oracle on Friday pushed back against a report that said the company will complete data centers for OpenAI, one of its major customers, in 2028, rather than 2027.

The delay is due to a shortage of labor and materials, according to the Friday report from Bloomberg, which cited unnamed people. Oracle shares fell to a session low of $185.98, down 6.5% from Thursday’s close.

“Site selection and delivery timelines were established in close coordination with OpenAI following execution of the agreement and were jointly agreed,” an Oracle spokesperson said in an email to CNBC. “There have been no delays to any sites required to meet our contractual commitments, and all milestones remain on track.”

The Oracle spokesperson did not specify a timeline for turning on cloud computing infrastructure for OpenAI. In September, OpenAI said it had a partnership with Oracle worth more than $300 billion over the next five years.

“We have a good relationship with OpenAI,” Clay Magouyrk, one of Oracle’s two newly appointed CEOs, said at an October analyst meeting.

Doing business with OpenAI is relatively new to 48-year-old Oracle. Historically, Oracle grew through sales of its database software and business applications. Its cloud infrastructure business now contributes over one-fourth of revenue, although Oracle remains a smaller hyperscaler than Amazon, Microsoft and Google.

OpenAI has also made commitments to other companies as it looks to meet expected capacity needs.

In September, Nvidia said it had signed a letter of intent with OpenAI to deploy at least 10 gigawatts of Nvidia equipment for the San Francisco artificial intelligence startup. The first phase of that project is expected in the second half of 2026.

Nvidia and OpenAI said in a September statement that they “look forward to finalizing the details of this new phase of strategic partnership in the coming weeks.”

But no announcement has come yet.

In a November filing, Nvidia said “there is no assurance that we will enter into definitive agreements with respect to the OpenAI opportunity.”

OpenAI has historically relied on Nvidia graphics processing units to operate ChatGPT and other products, and now it’s also looking at designing custom chips in a collaboration with Broadcom.

On Thursday, Broadcom CEO Hock Tan laid out a timeline for the OpenAI work, which was announced in October. Broadcom and OpenAI said they had signed a term sheet.

“It’s more like 2027, 2028, 2029, 10 gigawatts, that was the OpenAI discussion,” Tan said on Broadcom’s earnings call. “And that’s, I call it, an agreement, an alignment of where we’re headed with respect to a very respected and valued customer, OpenAI. But we do not expect much in 2026.”

OpenAI declined to comment.


AI order from Trump might be ‘illegal,’ Democrats and consumer advocacy groups claim


“This is the wrong approach — and most likely illegal,” Sen. Amy Klobuchar, D-Minn., said in a post on X Thursday.

“We need a strong federal safety standard, but we should not remove the few protections Americans currently have from the downsides of AI,” Klobuchar said.

Trump’s executive order directs Attorney General Pam Bondi to create a task force to challenge state laws regulating AI.

The Commerce Department was also directed to identify “onerous” state regulations aimed at AI.

The order is a win for tech companies such as OpenAI and Google and the venture firm Andreessen Horowitz, which have all lobbied against state regulations they view as burdensome. 

It follows a push by some Republicans in Congress to impose a moratorium on state AI laws. A recent plan to tack on that moratorium to the National Defense Authorization Act was scuttled.

Collin McCune, head of government affairs at Andreessen Horowitz, celebrated Trump’s order, calling it “an important first step” to boost American competition and innovation. But McCune urged Congress to codify a national AI framework.

“States have an important role in addressing harms and protecting people, but they can’t provide the long-term clarity or national direction that only Congress can deliver,” McCune said in a statement.

Sriram Krishnan, a White House AI advisor and former general partner at Andreessen Horowitz, said during an interview Friday on CNBC’s “Squawk Box” that Trump is looking to partner with Congress to pass such legislation.

“The White House is now taking a firm stance where we want to push back on ‘doomer’ laws that exist in a bunch of states around the country,” Krishnan said.

He also said that the goal of the executive order is to give the White House tools to go after state laws that it believes make America less competitive, such as recently passed legislation in Democratic-led states like California and Colorado.

The White House will not use the executive order to target state laws that protect the safety of children, Krishnan said.

Robert Weissman, co-president of the consumer advocacy group Public Citizen, called Trump’s order “mostly bluster” and said the president “cannot unilaterally preempt state law.”

“We expect the EO to be challenged in court and defeated,” Weissman said in a statement. “In the meantime, states should continue their efforts to protect their residents from the mounting dangers of unregulated AI.”

Weissman said about the order, “This reward to Big Tech is a disgraceful invitation to reckless behavior by the world’s largest corporations and a complete override of the federalist principles that Trump and MAGA claim to venerate.”

In the short term, the order could affect a handful of states that have already passed legislation targeting AI. The order says that states whose laws are considered onerous could lose federal funding.

One Colorado law, set to take effect in June, will require AI developers to protect consumers from reasonably foreseeable risks of algorithmic discrimination.

Some say Trump’s order will have no real impact on that law or other state regulations.

“I’m pretty much ignoring it, because an executive order cannot tell a state what to do,” said Colorado state Rep. Brianna Titone, a Democrat who co-sponsored the anti-discrimination law.

In California, Gov. Gavin Newsom recently signed a law that, starting in January, will require major AI companies to publicly disclose their safety protocols. 

That law’s author, state Sen. Scott Wiener, said that Trump’s stated goal of having the United States dominate the AI sector is undercut by his recent moves. 

“Of course, he just authorized chip sales to China & Saudi Arabia: the exact opposite of ensuring U.S. dominance,” Wiener wrote in an X post on Thursday night. The Bay Area Democrat is seeking to succeed Speaker-emerita Nancy Pelosi in the U.S. House of Representatives.

Trump on Monday said he will allow Nvidia to sell its advanced H200 chips to “approved customers” in China, provided that the U.S. gets a 25% cut of revenues.
