Publications

6 Results
Characterizing Human Performance: Detecting Targets at High False Alarm Rates [Slides]

Speed, Ann S.; Wheeler, Jason W.; Russell, John L.; Oppel, Fred O.; Sanchez, Danielle; Silva, Austin R.; Chavez, Anna C.

Analysts develop a “no threat” bias under high false alarm rates; if alarms accompanied only actual attacks, an analyst might never see a true alarm at all. This effect is well documented in the laboratory but rarely studied in applied environments (TSA screening is an exception). In this work we found that near-operational paradigms are useful but difficult to construct well: pilot testing is critical before engaging time-limited professionals, and experimental control is hard to balance against operational realism. Grounding near-operational experiments in basic research paradigms has both advantages and disadvantages. Despite shortcomings in our second experiment, we now have a platform for experimental investigations into the human element of physical security systems.
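The base-rate logic behind the “no threat” bias can be made concrete with Bayes' rule. The numbers below are illustrative only, not from the study: when real attacks are rare and the false alarm rate is high, almost every alarm an analyst sees is false, which trains the bias.

```python
# Illustrative sketch; the prevalence and rates are hypothetical, not from the study.
def p_attack_given_alarm(p_attack, hit_rate, false_alarm_rate):
    """Bayes' rule: probability that an alarm reflects a real attack."""
    p_alarm = hit_rate * p_attack + false_alarm_rate * (1 - p_attack)
    return hit_rate * p_attack / p_alarm

# With attacks at 1 in 10,000 events and a 5% false-alarm rate, even a
# perfect detector (hit rate 1.0) produces alarms that are almost never real.
print(p_attack_given_alarm(1e-4, 1.0, 0.05))
```

Under these assumed numbers, fewer than 1 in 500 alarms corresponds to a real attack, so an analyst who responds “no threat” is almost always right, until the one time it matters.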

Use of a controlled experiment and computational models to measure the impact of sequential peer exposures on decision making

PLoS ONE

Sarkar, Soumajyoti; Shakarian, Paulo; Sanchez, Danielle; Armenta, Mika; Lakkaraju, Kiran L.

It is widely believed that one's peers influence product adoption behaviors. This relationship has been linked to the number of signals a decision-maker receives in a social network. But it is unclear whether these same principles hold when the "pattern" by which the decision-maker receives these signals varies, and when peer influence is directed toward choices that are not optimal. To investigate this, we manipulate social signal exposure in an online controlled experiment using a game with human participants. Each participant in the game decides among choices with differing utilities. We observe the following: (1) even in the presence of monetary risks and previously acquired knowledge of the choices, decision-makers tend to deviate from the obvious optimal decision when their peers make a similar decision, which we call the influence decision; (2) when the quantity of social signals varies over time, the forwarding probability of the influence decision, and hence responsiveness to social influence, does not necessarily correlate proportionally with the absolute quantity of signals. To better understand how these rules of peer influence could be used in modeling applications of real-world diffusion and in networked environments, we use our behavioral findings to simulate spreading dynamics in real-world case studies. We specifically examine how cumulative influence plays out in the presence of user uncertainty and measure its outcome on rumor diffusion, which we model as an example of sub-optimal choice diffusion. Together, our simulation results indicate that sequential peer effects from the influence decision overcome individual uncertainty to guide faster rumor diffusion over time. However, when the rate of diffusion is slow in the beginning, user uncertainty can have a substantial role compared to peer influence in deciding the adoption trajectory of a piece of questionable information.
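The interplay the abstract describes between peer signals and individual uncertainty can be sketched with a toy diffusion simulation. This is a minimal illustration, not the paper's model: agents on a random graph adopt a suboptimal "rumor" choice with a probability that grows with the number of adopting neighbors and is damped by individual uncertainty.

```python
import random

def simulate_diffusion(n=200, p_edge=0.05, base=0.02, peer_weight=0.15,
                       uncertainty=0.5, steps=20, seed=0):
    """Count adopters of a suboptimal choice after `steps` rounds.

    All parameters are hypothetical; peer influence is damped by uncertainty.
    """
    rng = random.Random(seed)
    neighbors = {i: set() for i in range(n)}
    for i in range(n):                      # build a random (Erdos-Renyi) graph
        for j in range(i + 1, n):
            if rng.random() < p_edge:
                neighbors[i].add(j)
                neighbors[j].add(i)
    adopted = {0}                           # a single seed node spreads the rumor
    for _ in range(steps):
        new = set()
        for i in range(n):
            if i in adopted:
                continue
            k = len(neighbors[i] & adopted)  # peer signals received this round
            p = base + peer_weight * k * (1 - uncertainty)
            if rng.random() < min(p, 1.0):
                new.add(i)
        adopted |= new
    return len(adopted)

# Higher individual uncertainty damps peer influence and slows adoption:
fast = simulate_diffusion(uncertainty=0.2)
slow = simulate_diffusion(uncertainty=0.9)
print(fast, slow)
```

With these assumed parameters the low-uncertainty population adopts the suboptimal choice far faster, echoing the abstract's finding that sequential peer effects can overcome individual uncertainty.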

Can social influence be exploited to compromise security: An online experimental evaluation

Proceedings of the 2019 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining, ASONAM 2019

Sarkar, Soumajyoti; Shakarian, Paulo; Armenta, Mika; Sanchez, Danielle; Lakkaraju, Kiran L.

While social media enables users and organizations to obtain useful information about technology, such as software and security feature usage, it can also allow an adversary to exploit users by obtaining information from them or influencing them toward injurious decisions. Prior research indicates that security technology choices are subject to social influence and that these decisions are often shaped by peer decisions and the number of peers in a user's network. In this study we investigated whether peer influence dictates users' decisions by manipulating social signals from peers in an online, controlled experiment. Human participants recruited from Amazon Mechanical Turk played a multi-round game in which they selected a security technology from among six options of differing utility. We observe that a strategy that exposes users to a high quantity of peer signals reflecting suboptimal choices in the later stages of the game successfully influences users to deviate from the optimal security technology, affecting almost 1.5 times as many users as a strategy in which users receive a constant low quantity of similar peer signals in every round of the game.
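The trade-off the experiment probes, utility-maximizing choice versus conformity to peer signals, can be sketched in a few lines. The utilities and influence weight below are hypothetical illustrations, not the paper's actual design: enough peer signals for a lower-utility option can tip the choice away from the optimum.

```python
# Hypothetical sketch of a six-option security-technology choice.
# Option 0 has the highest utility; values and weights are illustrative only.
UTILITIES = [1.0, 0.9, 0.8, 0.7, 0.6, 0.5]

def choose(peer_signals, influence=0.1):
    """Pick the option maximizing utility plus a bonus per peer signal."""
    scores = [u + influence * peer_signals[i] for i, u in enumerate(UTILITIES)]
    return max(range(len(scores)), key=scores.__getitem__)

# With no peer signals, the utility-optimal option wins:
assert choose([0, 0, 0, 0, 0, 0]) == 0
# Four signals favoring option 3 overcome its utility deficit (0.7 + 0.4 > 1.0):
assert choose([0, 0, 0, 4, 0, 0]) == 3
```

Under this assumed linear model, a burst of late-round signals is exactly the kind of manipulation that could pull participants off the optimal security technology.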
