Archives

Presentations

Presentation media provided by presenting authors and speakers.
This paper, a scenario-based teaching case study, aims to introduce students in a Cybersecurity Risk Management course to advanced quantitative risk assessment techniques. The case study utilizes a fictitious company for which a risk assessment is underway. Assuming the role of the company's Cybersecurity Risk Team, the students are tasked with determining the risk exposure the company faces from a threat scenario against one of its mission-critical information resources. Specifically, the students are required to (1) quantify the monetary losses that could result from a threat scenario, (2) compute the inherent risk exposure from the threat scenario, (3) compute the residual risk given the implementation of certain security controls, and (4) compute returns on security controls. The case study holds the promise of enhancing the overall learning of the students and boosting their marketability as future cybersecurity professionals.
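The four tasks above map onto the standard quantitative risk formulas (single loss expectancy, annualized loss expectancy, residual risk, and return on security investment). A minimal sketch of those calculations follows; all dollar amounts, rates, and control effects are hypothetical illustrations, not figures from the case study itself:

```python
# Sketch of standard quantitative risk formulas (SLE, ALE, ROSI).
# All input figures below are hypothetical, chosen only to illustrate the math.

asset_value = 500_000       # value of the mission-critical resource ($)
exposure_factor = 0.40      # fraction of asset value lost per incident
aro = 2.0                   # annualized rate of occurrence (incidents/year)

# (1) Monetary loss per incident: single loss expectancy
sle = asset_value * exposure_factor

# (2) Inherent risk exposure: annualized loss expectancy before controls
ale_inherent = sle * aro

# (3) Residual risk: suppose a control costing $50,000/year cuts the ARO by 75%
control_cost = 50_000
aro_residual = aro * (1 - 0.75)
ale_residual = sle * aro_residual

# (4) Return on security investment (ROSI)
rosi = (ale_inherent - ale_residual - control_cost) / control_cost

print(f"SLE:          ${sle:,.0f}")        # $200,000 per incident
print(f"Inherent ALE: ${ale_inherent:,.0f}")  # $400,000 per year
print(f"Residual ALE: ${ale_residual:,.0f}")  # $100,000 per year
print(f"ROSI:         {rosi:.0%}")            # 500%
```

With these illustrative inputs, the control reduces annualized losses from $400,000 to $100,000 at a cost of $50,000, for a ROSI of 500%.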
Computer Science as a subject is now appearing in more school curricula for GCSE and A level, with a growing demand for cyber security to be embedded within this teaching. Yet teachers face challenges with limited time and resources for preparing practical materials to effectively convey the subject matter. We hosted a series of workshops designed to understand the challenges that teachers face in delivering cyber security education. We then worked with teachers to co-create practical learning resources that could be further developed as tailored lesson plans, as required for their students. In this paper, we report on the challenges highlighted by teachers, and we present a portable and isolated infrastructure for teaching the basics of offensive and defensive cyber security, as a co-created activity based on the teacher workshops. Whilst we present an example case study for red and blue team student engagement, we also reflect on the wide scope of topics and tools that students would be exposed to through this activity, and how this platform could then be generalised for further cyber security teaching.
With cybercrime increasing by 600% during the COVID-19 pandemic, the demand for cybersecurity professionals has also risen significantly. There are roughly 700,000 unfilled cybersecurity positions that continue to affect businesses and have the potential to cause significant problems. Education for novice cybersecurity students suffers from teaching materials that are not practical, modern, or intuitive enough to inspire these students to pursue a career in the cybersecurity field. In this paper, we present our methodology and create a module for teaching the basics of software security using Armitage and Metasploit. We design our module and hands-on labs using a preconfigured Windows 10 VM, a Metasploitable VM, and a Kali Linux VM with custom-made tools. Our methodology and module are validated through the results of a high school cybersecurity camp. The module is available on GitHub.
In this session, the panelists will discuss their observations and experiences of cybersecurity myths across academia, industry, and government. They will draw on their decades of experience to discuss pitfalls they've encountered and examples of folk wisdom including: Is the user the weakest link? Is more security always better? Is cyber offense easier than defense? This will also touch on some of the biases humans bring to decision-making, and how those may negatively influence good security practices. These include the action and conformity biases. The panel will illuminate opportunities for education to help dispel prevalent and widespread myths that can be avoided or mitigated for the benefit of more effective cybersecurity. Portions of this presentation are drawn from personal experience and courses taught by the panelists, including a regular course offered at Purdue University as part of the graduate cybersecurity curriculum.
A coalition of Virginia universities, in partnership with the Virginia Department of Elections (ELECT), launched the Virginia Cyber Navigator Internship Program (VA-CNIP) - an innovative educational program to develop future cybersecurity professionals to protect the election infrastructure. The program addresses the need for more skilled cybersecurity professionals, and those who are supporting public services such as elections. This paper provides an overview of the key components of the program: a full semester gateway course covering sociotechnical election topics, a two-day kickoff bootcamp to prepare students for their internship, an internship with an election office, and a one-day debrief and assessment at the end of the internship.
Research indicates that deceitful videos tend to spread rapidly online and influence people's opinions and ideas. Because of this, video misinformation via deepfake video manipulation poses a significant online threat. This study aims to discover what factors can influence viewers' capability of distinguishing deepfake videos from genuine video footage. This work explores deepfake videos' potential use for deception and misinformation by examining people's ability to determine whether videos are deepfakes in a survey consisting of deepfake videos and original unedited videos. The participants viewed a set of four videos and were asked to judge whether the videos shown were deepfakes or originals. The survey varied the familiarity that the viewers had with the subjects of the videos, and the number of videos shown at one time was also manipulated. This survey showed that familiarity with the subject(s) depicted in a deepfake video has a statistically significant impact on how well people can determine that it is a deepfake. Notably, however, almost two-thirds of study participants (102 out of 154, or 66.23%) were unable to correctly identify a sequence of just four videos as either genuine or deepfake. The potential for deepfakes to confuse or misinform a majority of the public via social media should not be underestimated. This study provides insights into possible methods for countering disinformation and deception resulting from the misuse of deepfakes. Familiarity with the target individual depicted in a deepfake video improved viewers' accuracy in distinguishing deepfakes more than showing unaltered authentic source videos side-by-side with the deepfakes.
Organizations, governments, and individuals seeking to contain or counter deepfake deception will need to consider two main factors in their operational planning: 1) a swift, near-real-time response to deepfake disinformation videos, and 2) creating more familiarity through additional, preferably live video footage of the target of the deepfake responding to and refuting the disinformation personally.