Machines still can’t think, but now they can validate your feelings, according to new research from New Jersey Institute of Technology Assistant Professor Jorge Fresneda.
Fresneda started his career as a chemist and then became an expert in neuroanalytics. He studies how measurements of brain activity and skin conductance can predict a person’s emotions with high accuracy, and how such information can be used in fields such as entertainment, management, marketing and well-being.
“Neuromarketing is a subfield within marketing that uses sensors for marketing purposes, to inform managers and make better marketing decisions,” Fresneda explained. He collaborated with colleagues in NJIT’s Martin Tuchman School of Management — Professor Jerry Fjermestad and doctoral student David Eisenberg — along with Virginia Tech graduate research assistant Tanmoy Sarkar Pias, to publish “Neuromarketing Techniques to Enhance Consumer Preference Prediction” earlier this year at the 57th Hawaii International Conference on System Sciences.
Currently, most marketing research relies on people self-reporting their responses to anything from sale prices to dramatic videos. Fresneda found that adding electroencephalogram (EEG) probes, which detect brain waves, and galvanic skin response (GSR) sensors, which measure the skin’s electrical conductance, makes it possible to predict people’s feelings about marketing stimuli more accurately than their own self-reports. The predictions come from feeding the sensor readings through graphing algorithms and then comparing the results against existing academic databases.
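The article doesn’t detail the underlying pipeline, but the basic idea, pooling EEG and GSR features to train a preference classifier, can be illustrated with a short Python sketch. Everything below (the feature names, units, synthetic data and the logistic-regression model) is a hypothetical stand-in rather than the team’s actual method.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for per-stimulus features a study like this
    # might extract: EEG band power in three bands, plus a GSR peak.
    n_trials = 400
    eeg_alpha = rng.normal(10.0, 2.0, n_trials)  # hypothetical band power
    eeg_beta = rng.normal(18.0, 3.0, n_trials)
    eeg_gamma = rng.normal(30.0, 5.0, n_trials)
    gsr_peak = rng.normal(0.6, 0.2, n_trials)    # hypothetical microsiemens

    # Toy "liked it" labels: high beta power and strong GSR responses are
    # wired to preference here, purely so the model has something to learn.
    liked = ((eeg_beta - eeg_beta.mean())
             + 5.0 * (gsr_peak - gsr_peak.mean())
             + rng.normal(0.0, 1.0, n_trials)) > 0

    X = np.column_stack([eeg_alpha, eeg_beta, eeg_gamma, gsr_peak])
    X_train, X_test, y_train, y_test = train_test_split(
        X, liked, test_size=0.25, random_state=0)

    # A plain logistic regression as a placeholder for whatever model the
    # researchers actually benchmarked against self-reports.
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(f"Held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")

On real data, the interesting comparison would be this classifier’s accuracy against participants’ own self-reported preferences, which is the gap the researchers say the sensor approach closes.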
The field isn’t as far-fetched as it may sound. Modern consumer-grade EEG equipment is small enough to blend into an ordinary Bluetooth headset, and GSR sensors, despite their Frankensteinian ring, are already built into the latest Samsung smartwatches. Fresneda added that one of the most impressive sensor networks may be the one deployed at North Jersey’s American Dream mall. Although not fully in use, he said, it is potentially capable of collecting GSR data from smart devices, or from radio-frequency signals transmitted by smart shopping bags, and linking that information to social media profiles.
At a demonstration for retail store managers, opinions were mixed. Fresneda said all but two appreciated the technology’s potential to give managers and sales staff feedback on their own performance; those opposed voiced strong privacy concerns. As a consumer, Fresneda said he would be willing to wear such technology, since most shoppers already allow companies like Amazon, Google and Facebook to track them. “If I get value in return, yes, of course. But you have to show me,” he said. “Otherwise, I would be legitimately scared.”
“Furthermore, the same algorithms can be used to measure emotional reactions, such as calmness or fear, which can potentially measure people’s reactions to various emotionally charged experiences,” their paper stated. “Future research could use the same algorithm to anticipate liking or choices of consumers for different kinds of products, as well as in new product development, beyond music videos. Moreover, the same neural analytics could be applied to tracking people’s emotional states in other contexts, including customer satisfaction, worker satisfaction, or even employee productivity tracking.”
Fresneda and colleagues are now working on a follow-up journal article based on tests applying the technology to fields such as consumer-oriented finance. It has been submitted for fast-track consideration to AIS Transactions on Human-Computer Interaction. The team is also in the early stages of developing patents that apply their research to healthcare and video games.
Funding to date was provided by the Martin Tuchman School of Management, a National Science Foundation Innovation Corps mini-grant and the team’s own contributions.