On secondhand marketplaces like eBay, people trust online sellers who post their own high-quality photos of items for sale more than they trust those who use stock images or poor-quality photos, a Cornell Tech study has found.

The findings could help online marketplaces improve trust in their sites by offering guidelines on how to take better photos or by introducing augmented reality features that instruct users to change lighting or camera angles, the researchers said.

This is particularly important for new or growing sites that are working to establish trust, said Xiao Ma, lead author of “Understanding Image Quality and Trust in Peer-to-Peer Marketplaces,” to be presented at WACV 2019, Jan. 7-11 in Waikoloa Village, Hawaii.  

“The high-quality product images selected by our model automatically outperform stock images in generating perceptions of trustworthiness,” said Ma, a doctoral student in the field of information science at Cornell Tech. “People believe the user-generated images represent the actual condition of the product better. Stock images present more uncertainty and raise questions such as whether they are too good to be true.”

The study, co-authored with Mor Naaman, associate professor of information science at the Jacobs Technion-Cornell Institute at Cornell Tech, and Serge Belongie, professor of computer science at Cornell Tech, as well as colleagues at École Polytechnique, Google Research and eBay, grew out of previous work on trust in Cornell Tech’s Connected Experiences Lab. Trust is essential for society to function, but research shows levels of trust have been declining in recent years. Establishing and building trust in digital environments is even more complex.

“Without face-to-face interactions there is a lot of uncertainty,” Ma said. “You don’t know what’s going on on the other side of the screen. People could lie; they could post different images. In the beginning of e-commerce there were a lot of studies on trust, but that was limited because our ability to understand images and language computationally has been limited. Now, because of computer vision and natural language processing, we’re able to understand a lot of these online interactions better, and there’s an opportunity to revisit these questions of online trust.”

For this study, the researchers used publicly available data from the mobile classifieds app Letgo.com and private data from eBay. They focused on shoes and handbags because they’re among the most popular goods found on secondhand marketplaces, and because they are visually distinctive enough to pose an interesting computer vision challenge.

Using the images, they developed a deep learning algorithm – a kind of artificial intelligence frequently used for classification tasks – to predict image quality. They found the algorithm to be around 87 percent accurate, but because of the way deep learning works, researchers could not tell how the model arrived at its decisions. To learn more about which elements improve an image, they also analyzed images using classic computer vision methods and linear regression, a kind of statistical modeling.
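To illustrate the kind of model described (this is a sketch only, not the study's published architecture; the framework, network and hyperparameters here are assumptions), a binary image-quality classifier could be built by fine-tuning a pretrained convolutional network:

```python
# Illustrative sketch, not the researchers' actual model: fine-tune an
# ImageNet-pretrained ResNet to label listing photos as high or low quality.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(pretrained=True)
model.fc = nn.Linear(model.fc.in_features, 1)  # single "high quality" logit

criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images, labels):
    """One gradient step on a batch of image tensors and 0/1 quality labels."""
    optimizer.zero_grad()
    loss = criterion(model(images).squeeze(1), labels.float())
    loss.backward()
    optimizer.step()
    return loss.item()
```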

Among their findings: Images are more likely to be labeled high quality if they are brighter, and less likely if they have a high foreground-to-background ratio. A good-quality image should have high contrast for the product and low contrast for the background, they found.
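As a rough sketch of how such hand-crafted features could be computed (an assumption about implementation, since the article does not give one), the helper below measures brightness, the share of the frame occupied by the product, and contrast for product and background separately; it assumes a foreground mask is already available from some segmentation step:

```python
# Illustrative only: simple hand-crafted features of the kind the article
# mentions. Assumes a boolean foreground mask separating product from
# background (how that mask is obtained is not specified in the article).
import cv2
import numpy as np

def quality_features(image_bgr, fg_mask):
    """Return brightness, foreground share, and per-region contrast."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    fg, bg = gray[fg_mask], gray[~fg_mask]
    return {
        "brightness": gray.mean(),                    # brighter tends to rate higher
        "fg_bg_ratio": fg_mask.mean(),                # share of pixels showing the product
        "fg_contrast": fg.std() if fg.size else 0.0,  # high contrast on the product is good
        "bg_contrast": bg.std() if bg.size else 0.0,  # low contrast in the background is good
    }
```

Features like these can then be related to quality labels with linear regression, the interpretable modeling step the article describes.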

Once they had established methods of predicting quality, the researchers investigated the impact of image quality on sales and consumer trust. Shoes with higher-quality images were 1.17 times more likely to sell than those with lower-quality images, and handbags with better photos were 1.25 times more likely to sell. But because sales also depend on factors such as price, further research is needed, Ma said.
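The "times more likely to sell" figures are relative sale rates. With hypothetical counts (the actual listing counts are not given in the article), the calculation looks like this:

```python
# Hypothetical counts, for illustration only: the relative likelihood of sale
# is the sale rate of listings with high-quality photos divided by the sale
# rate of listings with low-quality photos.
hi_sold, hi_total = 585, 1000
lo_sold, lo_total = 500, 1000

relative_likelihood = (hi_sold / hi_total) / (lo_sold / lo_total)
print(f"{relative_likelihood:.2f}x more likely to sell")  # 1.17x with these made-up numbers
```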

To test trustworthiness, the researchers designed three hypothetical marketplaces, populating one with high-quality images, one with low-quality images and one with stock photos. They recruited 300 people to rate each marketplace on a scale of one to five across a series of statements gauging trust.

The site with the better images scored the highest, with participants rating it around 3.8 for the statement “I believe that the products from these sellers will meet my expectations when delivered,” compared with around 3.7 for the site using stock imagery and 3.4 for the site with low-quality images.

The results – which surprised the researchers, who did not expect high-quality user-generated images to outperform stock imagery – could be especially helpful to new sites, which might introduce features to improve the quality of users’ photos. For example, instead of automatically using the first uploaded image as a thumbnail, apps could use an algorithm to choose the best-quality image. The research could also apply to other kinds of sites, such as real estate or dating platforms.
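A minimal sketch of that thumbnail idea, assuming a quality-scoring function is available (for example, a classifier like the one sketched earlier):

```python
# Illustrative sketch: use the uploaded photo with the highest predicted
# quality score as the listing thumbnail, instead of simply using the first one.
def pick_thumbnail(uploaded_images, score_quality):
    """score_quality maps an image to a higher-is-better quality score."""
    return max(uploaded_images, key=score_quality)
```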

“Digital environments create new challenges and opportunities for different types of trust,” Ma said. “A lot of the challenge in starting online platforms is in gaining the users’ trust to have people adopt it.”

The study was partly funded by a Facebook equipment donation, and by Oath (part of Verizon) and Yahoo Research through the Connected Experiences Lab.

This article originally appeared in the Cornell Chronicle.