Inflection point: The generative AI revolution has begun. How did we get here? A new class of incredibly powerful AI models has made recent breakthroughs possible.

Haomiao Huang – Jan 30, 2023 12:00 pm UTC

This image was partially AI-generated with the prompt “a pair of robot hands holding pencils drawing a pair of human hands, oil painting, colorful,” inspired by the classic M.C. Escher drawing. Watching AI mangle drawing hands helps us feel superior to the machines… for now. Aurich Lawson | Stable Diffusion

Progress in AI systems often feels cyclical. Every few years, computers can suddenly do something they've never been able to do before. “Behold!” the AI true believers proclaim, “the age of artificial general intelligence is at hand!” “Nonsense!” the skeptics say. “Remember self-driving cars?”

The truth usually lies somewhere in between.

We're in another cycle, this time with generative AI. Media headlines are dominated by news about AI art, but there's also unprecedented progress in many widely disparate fields. Everything from videos to biology, programming, writing, translation, and more is seeing AI progress at the same incredible pace. Why is all this happening now?

Further Reading: The basics of modern AI - how does it work, and will it destroy society this year?

You may be familiar with the latest happenings in the world of AI. You've seen the prize-winning artwork, heard the interviews between dead people, and read about the protein-folding breakthroughs. But these new AI systems aren't just producing cool demos in research labs. They're quickly being turned into practical tools and real commercial products that anyone can use.

There's a reason all of this has come at once. The breakthroughs are all underpinned by a new class of AI models that are more flexible and powerful than anything that has come before. Because they were first used for language tasks like answering questions and writing essays, they're often known as large language models (LLMs). OpenAI's GPT-3, Google's BERT, and so on are all LLMs.

But these models are extremely flexible and adaptable. The same mathematical structures have been so useful in computer vision, biology, and more that some researchers have taken to calling them “foundation models” to better articulate their role in modern AI.

Where did these foundation models come from, and how have they broken out beyond language to drive so much of what we see in AI today?

The foundation of foundation models

There's a holy trinity in machine learning: models, data, and compute. Models are algorithms that take inputs and produce outputs. Data refers to the examples the algorithms are trained on. To learn something, there must be enough data with enough richness that the algorithms can produce useful output. Models must be flexible enough to capture the complexity in the data. And finally, there has to be enough computing power to run the algorithms.
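As a toy illustration of that trinity (not any specific system described in the article), here is a minimal sketch in Python: the "model" is a function with two adjustable parameters, the "data" is a set of input/output examples, and the "compute" is the loop that fits one to the other.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))          # data: 100 examples, 1 feature each
y = 3.0 * X[:, 0] + 1.0                # the true relationship to recover

w, b = 0.0, 0.0                        # model: y ≈ w*x + b, parameters start at zero
for _ in range(500):                   # compute: plain gradient descent
    pred = w * X[:, 0] + b
    err = pred - y
    w -= 0.1 * (err * X[:, 0]).mean()  # gradient of the squared-error loss w.r.t. w
    b -= 0.1 * err.mean()              # gradient w.r.t. b

print(round(w, 2), round(b, 2))        # recovers roughly 3.0 and 1.0
```

Modern models differ from this sketch mainly in scale: billions of parameters instead of two, and far more flexible functions than a straight line, which is exactly why they need so much data and compute.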

The first modern AI revolution took place with deep learning in 2012, when solving computer vision problems with convolutional neural networks (CNNs) took off. CNNs are similar in structure to the brain's visual cortex. They'd been around since the 1990s but weren't yet practical due to their intense computing power requirements.
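The core operation a CNN performs can be sketched in a few lines: slide a small filter across an image and record how strongly each patch matches it. In this hypothetical example the filter is hand-picked to respond to vertical edges; in a real CNN, the filter weights are what get learned from data.

```python
import numpy as np

def conv2d(image, kernel):
    """Slide `kernel` over `image`, summing elementwise products at each position."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = (image[i:i+kh, j:j+kw] * kernel).sum()
    return out

image = np.zeros((5, 5))
image[:, 2:] = 1.0                       # right half bright: a vertical edge
edge_filter = np.array([[-1.0, 1.0]])    # responds where brightness jumps left-to-right
response = conv2d(image, edge_filter)
print(response)                          # nonzero only in the column at the edge
```

Because the same small filter is reused at every position, a CNN detects a feature (an edge, a texture, eventually an eye or a wheel) wherever it appears in the image, with far fewer parameters than a fully connected network would need.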

In 2006, though, Nvidia released CUDA, a programming platform that allowed GPUs to be used as general-purpose supercomputers. In 2009, Stanford AI researchers introduced ImageNet, a collection of labeled images used to train computer vision algorithms. In 2012, AlexNet combined CNNs trained on GPUs with ImageNet data to create the best visual classifier the world had ever seen. Deep learning and AI exploded from there.

CNNs, the ImageNet data set, and GPUs were a magic combination that unlocked tremendous progress in computer vision. 2012 set off a boom of excitement around deep learning and spawned whole industries, like those involved in autonomous driving. But we quickly learned there were limits to that generation of deep learning. CNNs were great for vision, but other areas didn't have their model breakthrough. One huge gap was in natural language processing (NLP), i.e., getting computers to understand and work with normal human language rather than code.

The problem of understanding and working with language is fundamentally different from that of working with images. Processing language requires working with sequences of words, where order matters. A cat is a cat no matter where it is in an image, but there's a big difference between “this reader is learning about AI” and “AI is learning about this reader.”
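The two sentences from the text make the point concretely: they contain exactly the same words, so any order-blind representation (a "bag of words") cannot tell them apart, even though their meanings are opposite.

```python
from collections import Counter

a = "this reader is learning about AI".split()
b = "AI is learning about this reader".split()

print(Counter(a) == Counter(b))   # True: identical word counts
print(a == b)                     # False: the order, and the meaning, differ
```

Any model that hopes to understand language therefore has to represent word order somehow, which is exactly what the sequence models discussed next were built to do.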

Until recently, researchers relied on models like recurrent neural networks (RNNs) and long short-term memory (LSTM) networks to process and analyze sequential data. These models were effective at recognizing short sequences, like spoken words from short phrases, but they struggled to handle longer sentences and paragraphs. The memory of these models was just not sophisticated enough to capture the complexity and richness of ideas and concepts that arise when sentences are combined into paragraphs and essays. They were great for simple Siri- and Alexa-style voice assistants but not for much else.
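A minimal sketch (with made-up, untrained random weights) shows the recurrence these models share, and hints at the limitation: the entire history of the sequence must be squeezed into one fixed-size hidden state, updated strictly one token at a time.

```python
import numpy as np

rng = np.random.default_rng(0)
hidden_size, input_size = 8, 4
W_h = rng.normal(scale=0.3, size=(hidden_size, hidden_size))  # state-to-state weights
W_x = rng.normal(scale=0.3, size=(hidden_size, input_size))   # input-to-state weights

def rnn(sequence):
    h = np.zeros(hidden_size)          # the model's entire "memory"
    for x in sequence:                 # strictly sequential: no parallelism
        h = np.tanh(W_h @ h + W_x @ x) # new state mixes old state and new input
    return h

seq = [rng.normal(size=input_size) for _ in range(20)]
print(rnn(seq).shape)                  # (8,) no matter how long the sequence is
```

Whether the input is one sentence or a whole essay, everything funnels through that same small vector, so the influence of early words tends to wash out over long sequences. LSTMs added gating to preserve information longer, but only partially solved the problem, and the one-step-at-a-time loop also made these models slow to train on large corpora.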

Getting the right training data was another challenge. ImageNet was a collection of more than a million labeled images that required significant human effort to generate, mostly by grad students and Amazon Mechanical Turk workers. And ImageNet was actually inspired by and modeled on an older project called WordNet, which tried to create a labeled data set for English vocabulary. While there is no shortage of text on the Internet, creating a meaningful data set to teach a computer to work with human language beyond individual words is incredibly time-consuming. And the labels you create for one application on the same data might not apply to another task.


Sports

Mets sit banged-up McNeil, Nimmo vs. Nationals


WASHINGTON — Jeff McNeil has a sore right shoulder, the latest nagging injury for the New York Mets as they try to recover from a late-summer swoon.

McNeil was out of the lineup for Thursday’s series finale at Washington, with Brett Baty starting at second base. One of the Mets’ most consistent hitters, McNeil went 4 for 8 with a homer, two doubles and five RBI in the previous two games against the Nationals.

“It doesn’t bother him to swing the bat. It’s just more the throwing,” manager Carlos Mendoza said.

The shoulder problem began late last week, Mendoza said, which is why McNeil started at designated hitter on Saturday and Sunday.

Brandon Nimmo was also out of the lineup Thursday with the stiff neck that forced him to leave Wednesday night’s game in the second inning. Tyrone Taylor started in left field.

“We didn’t see much improvement overnight,” Mendoza said of Nimmo.

McNeil has experience in left, but the shoulder problem means he’s not an option there for now.

New York’s series at Washington began Tuesday with the news that catcher Francisco Alvarez has a sprained ligament in his right thumb that will require surgery. Alvarez is hoping he can play through the pain after a stint on the injured list.

Backup catcher Luis Torrens had a rough night Wednesday that included getting hit in his receiving hand by a bat on a catcher’s interference play, but Mendoza said Thursday that Torrens was “fine.”

The Mets had a three-game winning streak before Wednesday night’s loss, but the team with the biggest payroll in the majors is just 5-15 since July 28. New York entered Thursday trailing Philadelphia by 6 1/2 games in the NL East and was one game ahead of Cincinnati for the final wild-card spot.


Science

Rice University Scientists Confirm Flatband Discovery in Kagome Superconductor


Rice University scientists have confirmed flatband states in CsCr₃Sb₅, a kagome superconductor. The experimental validation connects lattice geometry with emergent superconductivity, opening new pathways for engineered quantum materials, superconductors, and advanced electronics.


World

Israel maintains pressure on Gaza City as ‘first stages of attack begin’


Gaza City residents say Israel carried out intense overnight bombardments as it prepares a controversial offensive to take control of the area.

Sixty thousand reservists are being called up after Benjamin Netanyahu‘s security cabinet approved the plan earlier this month.

UN chief Antonio Guterres has warned of more “death and destruction” if Israel tries to seize the city, while France’s Emmanuel Macron said it would be a “disaster” that would lead to “permanent war”.

Hundreds of thousands of people could end up being forcibly displaced – a potential war crime, according to the UN’s human rights office.

Gaza’s health ministry said at least 70 people had been killed in Israeli attacks in the past 24 hours, including eight people in a house in the Sabra suburb of Gaza City.

Israel currently controls about 75% of the Gaza Strip, but Prime Minister Netanyahu has said Israel must take Gaza City to “finish the job” and defeat Hamas.

Mr Netanyahu and his ministers are due to meet on Thursday to discuss the plans, according to Israeli media.

Military spokesperson Effie Defrin said earlier that “preliminary operations and the first stages of the attack” had begun – with troops operating on the outskirts of Gaza City.

Israel has said it will order evacuation notices before troops move in but satellite images show thousands of people have already left.


Aftermath of fresh Israeli strikes on Gaza

Residents said shelling has intensified in the Sabra and Tuffah neighbourhoods and that those fleeing have gone to coastal shelters or to central and southern parts of the Strip.

The decision to stay or leave is an agonising choice for many.

“We are facing a bitter-bitter situation, to die at home or leave and die somewhere else, as long as this war continues, survival is uncertain,” said father of seven Rabah Abu Elias.

“In the news, they speak about a possible truce; on the ground, we only hear explosions and see deaths. To leave Gaza City or not isn’t an easy decision to make.”


Sky’s Adam Parsons explains what is in the new Israel-Hamas ceasefire deal.

Most of the Israeli reservists being summoned are not expected to be in a frontline combat role and the call-up is set to take a while.

The window could give mediators more time to convince Israel to accept a temporary ceasefire.

Hamas has already agreed to the proposal – envisaging 10 living hostages and 18 bodies being released in return for a 60-day truce and the freedom of about 200 Palestinian prisoners.

Israel hasn’t officially responded, but insists it wants all 50 remaining hostages released at once. Only 20 of them are still believed to be alive.

The war started nearly two years ago with a Hamas terror attack in which about 1,200 people were killed and around 250 kidnapped.




More than 62,000 Palestinians have been killed in the war, according to Gaza’s health ministry.

The figure doesn’t break down how many were Hamas members, but it says women and children make up more than half.

Two more people also died of starvation and malnutrition in the past 24 hours, the ministry said on Thursday, taking the total to 271, including 112 children.

COGAT, the body controlling aid into Gaza, said 250 aid trucks entered on Wednesday, with 154 pallets air-dropped.
