
You are free to share this article under the Attribution 4.0 International license.


A new study of American Sign Language suggests one potential reason why languages change over time: we just want to make our lives a little easier.

Speakers, ideas, and technologies all seem to play a role in shifting the ways we communicate with each other, but linguists don’t always agree on how and why languages change.

The new study used artificial intelligence to find that American Sign Language (ASL) signs that are challenging to perceive—those that are rare or have uncommon handshapes—are made closer to the signer’s face, where people often look during sign perception.

By contrast, common ones, and those with more routine handshapes, are made further away from the face, in the perceiver’s peripheral vision.

Researchers used artificial intelligence to analyze thousands of videos from a database of American Sign Language—this clip shows the sign for “believe”—to figure out where the hands are in relation to the face and body. (Credit: ASL-LEX.org)

The findings, published in the journal Cognition, suggest that ASL has evolved so that signs are easier for people to recognize.

“Every time we use a word, it changes just a little bit,” says Naomi Caselli, a Deaf studies scholar who is also codirector of the Boston University Rafik B. Hariri Institute for Computing and Computational Science & Engineering’s AI and Education Initiative. “Over long periods of time, words with uncommon handshapes have evolved to be produced closer to the face and, therefore, are easier for the perceiver to see and recognize.”

Although studying the evolution of language is complex, “you can make predictions about how languages might change over time, and test those predictions with a current snapshot of the language,” Caselli says.

For the study, researchers used an artificial intelligence tool to look at the evolution of ASL. The tool analyzed videos of more than 2,500 signs from ASL-LEX, the world’s largest interactive ASL database.

Caselli says they began by using the AI algorithm to estimate the position of the signer’s body and limbs.

“We feed the video into a machine learning algorithm that uses computer vision to figure out where key points on the body are,” says Caselli. “We can then figure out where the hands are relative to the face in each sign.”
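The article does not name the specific pose-estimation software the team used, but the pipeline it describes can be sketched with an off-the-shelf tool. The snippet below is an illustrative assumption, not the study's actual code: it uses MediaPipe Pose to pull key points from each video frame and measures how far the signing hand is from the face.

```python
# Illustrative sketch only: the article does not specify the pose-estimation
# library; MediaPipe Pose is assumed here to show the general idea of
# extracting body key points and measuring hand-to-face distance.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

def mean_hand_to_face_distance(video_path: str) -> float:
    """Average normalized distance between the right wrist and the nose
    across the frames of one sign video (a hypothetical measure)."""
    distances = []
    pose = mp_pose.Pose(static_image_mode=False)
    cap = cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV reads frames as BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks is None:
            continue
        lm = results.pose_landmarks.landmark
        nose = lm[mp_pose.PoseLandmark.NOSE]
        wrist = lm[mp_pose.PoseLandmark.RIGHT_WRIST]
        # Euclidean distance in normalized image coordinates.
        distances.append(((nose.x - wrist.x) ** 2 + (nose.y - wrist.y) ** 2) ** 0.5)
    cap.release()
    pose.close()
    return sum(distances) / len(distances) if distances else float("nan")
```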

The researchers then match that with data from ASL-LEX—which was created with help from the Hariri Institute’s Software & Application Innovation Lab—about how often the signs and handshapes are used.
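A rough sense of that matching step, again as a hedged sketch rather than the team's actual analysis: the file and column names below ("sign_id", "handshape_freq", "hand_to_face_dist") are placeholders, not real ASL-LEX field names. The article's prediction is that signs with rarer handshapes sit closer to the face, which would show up as a positive correlation between handshape frequency and hand-to-face distance.

```python
# Hypothetical follow-on analysis; column and file names are placeholders.
import pandas as pd
from scipy.stats import spearmanr

lexicon = pd.read_csv("asl_lex_frequencies.csv")       # sign/handshape frequency data
distances = pd.read_csv("hand_to_face_distances.csv")  # output of the pose-estimation step

merged = lexicon.merge(distances, on="sign_id")

# Positive rho would mean rarer handshapes are produced closer to the face.
rho, p = spearmanr(merged["handshape_freq"], merged["hand_to_face_dist"])
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")
```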

They found, for example, that many signs that use common handshapes, such as the sign for children—which uses a flat, open hand—are produced further from the face than signs that use …….

Source: https://www.futurity.org/american-sign-language-communication-artificial-intelligence-2719222/