Wednesday, October 20

Artificial Intelligence explains our beauty patterns



A new Artificial Intelligence system “reads” from people’s brains the features that make a face seem particularly beautiful, and then reproduces those patterns of beauty in new, different faces.

Researchers from the universities of Helsinki and Copenhagen have developed an Artificial Intelligence system capable of generating virtual faces that match our subjective patterns of beauty.

The AI performs three functions: first, it learns people’s facial features and their characteristics; second, it identifies a person’s aesthetic preferences; and finally, it generates virtual images that match those preferences.

The technology can be used, for example, to model preferences and decision-making, and potentially to identify unconscious attitudes.

According to a press release, the tests carried out verified the effectiveness of the system: the device managed to independently create new portraits that connected directly with people’s preferences.

As the scientists behind the new study, published in IEEE Transactions on Affective Computing, explain, the innovation builds on previous work in which Artificial Intelligence was used to identify basic aspects of faces, such as eye color or hair. Now, the scientists sought to delve into more subjective and complex topics, such as beauty preferences and patterns.

For Michiel Spapé, lead author of the study, “attractiveness is associated with cultural and psychological factors that probably play unconscious roles in our individual preferences. In fact, we often find it very difficult to explain why something or someone seems particularly beautiful to us. Apparently, beauty is in the eye of the beholder,” he said.

Deciphering minds

How can such a human and subjective question be mastered by Artificial Intelligence? How can a machine capture the beauty that only “the eyes of the beholder” can see?

Facing that challenge, the specialists trained an artificial neural network to create virtual portraits, which were then shown to a group of volunteers. The participants’ brain responses were recorded using electroencephalography (EEG) and analyzed.

Subsequently, the brain activity data were processed with machine learning techniques and integrated into a brain-computer interface. As a result of this cross-referencing of information, the image-generating artificial neural network incorporated a great diversity of individual preferences, tastes and patterns of beauty.
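The pipeline described above can be sketched in code. This is a minimal, hedged illustration using synthetic data: the feature shapes, the use of linear discriminant analysis as the EEG decoder, and the preference-weighted averaging of latent vectors are assumptions for the sake of the example, not details taken from the paper.

```python
# Sketch of the EEG-to-image-preference pipeline, with synthetic data.
# All shapes and the choice of LDA as decoder are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

n_images, n_eeg_features, latent_dim = 200, 64, 512

# Latent vectors of the generated portraits (one per image shown).
latents = rng.normal(size=(n_images, latent_dim))

# Synthetic EEG feature vectors and "liked / not liked" labels; liked
# trials are shifted so the decoder has a detectable signature to learn.
eeg = rng.normal(size=(n_images, n_eeg_features))
liked = rng.integers(0, 2, size=n_images)
eeg[liked == 1] += 0.5

# 1) Learn to decode preference from the brain responses.
clf = LinearDiscriminantAnalysis().fit(eeg, liked)

# 2) Score each shown image by the decoded probability it was liked.
p_liked = clf.predict_proba(eeg)[:, 1]

# 3) Combine latents into a single "personal" point in latent space:
#    an average weighted by the decoded preference for each portrait.
personal_latent = (p_liked[:, None] * latents).sum(axis=0) / p_liked.sum()

# A pretrained generative network (not included here) would then render
# personal_latent as a new, individually tailored portrait.
print(personal_latent.shape)  # (512,)
```

The key idea is that no one ever states their preferences explicitly: the decoded EEG responses do the rating, and the generator is steered toward the region of latent space those ratings favor.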

Fed with this new data set, the Artificial Intelligence device was now able to produce images automatically tailored to each person’s preferences. In new tests carried out after enriching the artificial neural network, the researchers found that the generated images matched people’s preferences with an accuracy of more than 80%.
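One simple way a matching-accuracy figure like this could be computed is a forced-choice check: show each participant their personalized portrait alongside a control portrait and count how often the personalized one is chosen. The forced-choice setup and the simulated 85% pick rate below are assumptions for illustration, not the paper’s actual validation protocol.

```python
# Illustrative accuracy computation for a hypothetical forced-choice test.
# Real data would be participants' choices; here they are simulated.
import random

random.seed(1)

n_trials = 100
# 1 = participant picked the personalized portrait over the control.
# The 0.85 pick rate is a stand-in for real behavioural data.
choices = [1 if random.random() < 0.85 else 0 for _ in range(n_trials)]

accuracy = sum(choices) / n_trials
print(f"matching accuracy: {accuracy:.0%}")
```

An accuracy well above the 50% chance level in such a test would indicate that the personalized portraits genuinely reflect individual taste rather than generic attractiveness.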

Other applications

In other words, by integrating individual preferences, the Artificial Intelligence model that interprets brain responses, and the neural network that models facial images, the system produces a completely new portrait that “predicts” the particular taste of each user.

Given the effectiveness achieved, the scientists believe that this same scheme could be useful for analyzing other cognitive functions, such as perception or decision-making.

At the same time, the device could be oriented towards identifying stereotypes, prejudices and other issues, allowing a better understanding of individual differences between people.

How far will Artificial Intelligence go in its effort to read the human mind, replicate the functioning of our brain and even try to explain our deepest emotions? Will it be able to interpret even the most subjective, contradictory and complex aspects that determine the human essence?

Reference

Brain-computer interface for generating personally attractive images. M. Spapé, K. Davis, L. Kangassalo, N. Ravaja, Z. Sovijärvi-Spapé and T. Ruotsalo. IEEE Transactions on Affective Computing (2021). DOI: https://doi.ieeecomputersociety.org/10.1109/TAFFC.2021.3059043

Photo: Rafaella Mendes Diniz on Unsplash.

Video: Michiel Spapé.


www.informacion.es
