Persuasion and Phishing: Analysing the Interplay of Persuasion Tactics in Cyber Threats
Kalam Khadka
27 November 2024
This study extends the research of Ferreira and Teles (2019), who synthesized works by Cialdini (2007), Gragg (2003), and Stajano and Wilson (2011) to propose a unique list of persuasion principles in social engineering. While Ferreira and Teles focused on email subject lines, this research analyzed entire email contents to identify principles of human persuasion in phishing emails. This study also examined the goals and targets of phishing emails, providing a novel contribution to the field. Applying these findings to the ontological model by Mouton et al. (2014) reveals that when social engineers use email for phishing, individuals are the primary targets. The goals are typically unauthorized access, followed by financial gain and service disruption, with Distraction as the most commonly used compliance principle. This research highlights the importance of understanding human persuasion in technology-mediated interactions in order to develop methods for detecting and preventing phishing emails before they reach users. Although earlier work has identified the luring elements used in phishing emails, its empirical findings have been inconsistent: Akbar (2014) found 'Authority' and 'Scarcity' most common, while Ferreira et al. (2015) identified 'Liking' and 'Similarity.' In this study, 'Distraction' was the most frequently used principle, followed by 'Deception,' 'Integrity,' and 'Authority.' This paper offers additional insights into phishing email tactics and suggests that future solutions should leverage socio-technical principles. Future work will apply this methodology to social engineering techniques beyond phishing emails, using the ontological model to further inform the research community.
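To make the detection idea concrete, the following is a minimal sketch (not the authors' method) of how persuasion-principle cues could be flagged in an email body before it reaches a user. The principle labels follow the compliance principles named in the study (Distraction, Deception, Integrity, Authority), but the keyword lists, function names, and sample message below are hypothetical placeholders; a real detector would rely on a validated coding scheme and more robust text analysis.

import re

# Hypothetical cue phrases per persuasion principle; these are illustrative
# placeholders, not the coding scheme used in the study.
PRINCIPLE_CUES = {
    "Distraction": ["urgent", "act now", "limited time", "expires", "immediately"],
    "Deception": ["verify your account", "confirm your details", "unusual activity"],
    "Integrity": ["we value your security", "trusted", "for your protection"],
    "Authority": ["bank", "administrator", "it department", "official notice"],
}


def tag_principles(email_body: str) -> dict[str, list[str]]:
    """Return the cue phrases found in the email body, grouped by principle."""
    text = email_body.lower()
    hits: dict[str, list[str]] = {}
    for principle, cues in PRINCIPLE_CUES.items():
        found = [cue for cue in cues if re.search(re.escape(cue), text)]
        if found:
            hits[principle] = found
    return hits


if __name__ == "__main__":
    # Hypothetical sample email used only to exercise the tagger.
    sample = (
        "Official notice from your bank: unusual activity detected. "
        "Act now and verify your account immediately, for your protection."
    )
    for principle, cues in tag_principles(sample).items():
        print(f"{principle}: {cues}")

A socio-technical pipeline of this kind would surface the flagged principles to users or downstream filters rather than relying on URL or sender reputation alone.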