Eugina Leung
Assistant Professor

Biography
Professor Leung's research explores how technology influences consumer judgment and how it can hinder identity-based consumption. She also examines the role of language and culture in shaping consumer responses. Her work has been published in the Journal of Marketing Research and the Journal of Consumer Psychology.
She received her PhD in marketing from Erasmus University in the Netherlands and served as a visiting research scholar at the University of Chicago and the University of Zurich. She holds a master's degree in economics from Universitat Pompeu Fabra in Spain and a bachelor's degree in business administration from the Hong Kong University of Science and Technology.
Education
Erasmus University
Universitat Pompeu Fabra
Hong Kong University of Science and Technology
Articles
The narrow search effect and how broadening search promotes belief updating
In a time of societal polarization, the interplay between people's search habits and search tools optimized for relevance may perpetuate echo chambers. We document this effect across diverse studies spanning health, finance, societal, and political topics on platforms including Google, ChatGPT, AI-powered Bing, and our custom-designed search engine and AI chatbot platforms. Users' biased search behaviors and the narrow optimization of search algorithms can combine to reinforce existing beliefs. We find that algorithm-based interventions are more effective than user-based interventions in mitigating these effects. Our findings demonstrate the potential for behaviorally informed search algorithms to serve as a better tool for retrieving information, promoting the shared factual understanding necessary for social cohesion.
Media Appearances
Internet search results often reinforce a person's existing beliefs
It’s easy to believe that the internet is an open door to knowledge. A place where a user’s search leads to discovery. A tool for understanding new ideas, correcting mistakes, or seeing both sides of a debate. But in practice, something very different is happening.
How Online Searches Reinforce Beliefs and Divide Us
In an era marked by polarization, researchers have found that even neutral search engines can lead people deeper into digital echo chambers. It’s not necessarily the technology that’s biased—it’s the way we use it. The study, published in the Proceedings of the National Academy of Sciences, shows that when people look for information online, they tend to type search terms that reflect what they already believe. This subtle habit, combined with algorithms designed to deliver the most “relevant” results, reinforces existing views rather than challenging them.
The Scientific Reason Why ChatGPT Leads You Down Rabbit Holes
"When people look up information, whether it's Google or ChatGPT, they actually use search terms that reflect what they already believe," Eugina Leung, an assistant professor at Tulane University and lead author of the study, told me.
How You Search the Internet Can Reinforce Your Beliefs—Without You Realizing It
These narrow terms tended to align with participants’ existing beliefs, and fewer than 10 percent of participants did so knowingly. “People often pick search terms that reflect what they believe, without realizing it,” says Eugina Leung of Tulane University’s business school, who led the study. “Search algorithms are designed to give the most relevant answers for whatever we type, which ends up reinforcing what we already thought.” The same was true when participants used ChatGPT and Bing for searches aided by artificial intelligence.
Your online searches might be biased from the start. A Tulane professor studied the reason.
After studying nearly 10,000 participants on a variety of subjects from gas prices to caffeine, Eugina Leung, an assistant professor at Tulane’s A.B. Freeman School of Business, has concluded we need to reexamine the way we search for information.
Tulane Study Reveals Online Searches Reinforce Bias
A study co-authored by Tulane University professor Eugina Leung has found that ninety percent of people unknowingly phrase online searches in ways that reinforce what they already believe—even when they’re trying to remain neutral.
Scientists show how you’re unknowingly sealing yourself in an information bubble
“The inspiration for this research came from a personal experience,” said study author Eugina Leung, an assistant professor of marketing at the Freeman School of Business at Tulane University. “I was visiting my co-author, Oleg Urminsky, at the University of Chicago. It was at the end of November, and I came down with a cold. As I was searching Google for cold medicine, I started paying close attention to the exact words I was using.”
“I noticed that searching for ‘cold medicine side effects’ gave me a very different, and much more alarming, set of results than searching for ‘best medicine for cold symptoms.’ It became clear how my own framing of the search was dramatically shaping the information I received, and that led us to investigate this phenomenon more systematically.”
Why Searching for Truth Online Might Be Making Us More Biased
What if your next Google search helped reinforce a false belief—without you even realizing it? In a sweeping set of 21 studies involving thousands of participants, cognitive scientist Eugina Leung and her research team uncovered a subtle but powerful psychological phenomenon: the very act of typing a question into a search engine can entrench existing biases, even when people believe they’re being objective.
Search, and You Shall Find What You Believe
Search “why is science wrong so often” and you’re apt to see links to stories that explain the worst aspects of science, outlier cases when research went awry. Search “how does science work” and you’ll see a totally different mix of stories that explain that science is often wrong, sure, and that’s part of the process of eventually getting it right.
The Scientific Reason Why ChatGPT Leads You Down Rabbit Holes
That chatbot is only telling you what you want to believe, according to a new study.
The Hidden Power Fueling Online Echo Chambers: What Your Google Searches Reveal
Lead author Eugina Leung emphasizes that “as AI and large-scale search are embedded in our daily lives, integrating a broader-search approach could reduce echo chambers for millions (if not billions) of users.” This research underscores the immense power that design choices wield in steering public knowledge frameworks and offers a roadmap for mitigating the fragmentation of modern discourse through intentional search architecture.