Leveraging AI for Cybersecurity in Universities
Bad actors take in about $1.5 trillion a year, according to research from the University of Surrey. The recent Marriott breach caused millions of dollars' worth of data loss, and the Facebook and Cambridge Analytica episode forced companies to re-evaluate how they handle data breaches. These incidents confirm that hackers engage in data breaches deliberately, with the intent to profit from the infringement. As a result of these events, security has become an important commodity.
Colleges, universities, and research institutions have increased their investment in cybersecurity. The cost of security has risen because advances in AI tools have made attacks more lethal, and defensive solutions have had to keep pace. Institutions can fight data breaches by adopting the same capabilities that hackers use.
However, a novel way to approach cybercrime is to fix vulnerabilities and analyze user behavior to develop ideal security settings. This can be done by training machine learning algorithms on datasets that contain current and potential bugs.
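As a minimal sketch of the idea, the toy classifier below is "trained" on a handful of labeled code snippets and flags new ones as vulnerable or safe by keyword frequency. The function names and the tiny dataset are illustrative assumptions; a real system would use a proper ML library and a curated vulnerability dataset.

```python
from collections import Counter

def train(samples):
    """samples: list of (tokens, label). Returns per-label keyword counts."""
    model = {"vulnerable": Counter(), "safe": Counter()}
    for tokens, label in samples:
        model[label].update(tokens)
    return model

def classify(model, tokens):
    """Score each label by how often its known keywords appear in the input."""
    scores = {
        label: sum(counts[t] for t in tokens)
        for label, counts in model.items()
    }
    return max(scores, key=scores.get)

# Toy training data (illustrative only, not a real vulnerability dataset)
data = [
    (["strcpy", "unchecked", "buffer"], "vulnerable"),
    (["eval", "user", "input"], "vulnerable"),
    (["parameterized", "query", "validated"], "safe"),
    (["bounds", "checked", "copy"], "safe"),
]
model = train(data)
print(classify(model, ["unchecked", "eval"]))  # "vulnerable"
```

The same pattern scales up: replace the keyword counts with learned features and the toy labels with data from real bug trackers and CVE feeds.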
The University of California, San Diego recently created the Halıcıoğlu Data Science Institute, which aims to explore the scientific foundations of data science. Experts at the institute suggest that organizations must use people and computing power in concert: computers can detect suspicious data, but only humans have the agency to act on it. That leaves one lingering question: what should colleges and companies do to fight cybercrime?
The answer lies in training staff and students on the landscape of threats and solutions. Education about the tools and patterns of cybercrime is essential to help them verify what is fake and what is real.
But colleges can get a head start by implementing cybersecurity principles: tightening their environments, resources, storage, networking, and monitoring. This helps prevent resource leakage. Institutions should also ensure that attackers never have greater capabilities than legitimate users. In addition, universities should deploy a secure runtime monitor, which helps when systems do get compromised: the runtime monitor ensures that the attacker cannot reach the data.
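The runtime-monitor idea can be sketched as a mediator that sits between every session and the data it requests, and denies access the moment a session is flagged. The class and method names (`SessionMonitor`, `flag_compromised`, `request_data`) are hypothetical, chosen only to illustrate the pattern.

```python
class SessionMonitor:
    """Mediates all data requests; flagged sessions are cut off from data."""

    def __init__(self):
        self._compromised = set()

    def flag_compromised(self, session_id):
        # Called by a detector (e.g. an anomaly model) when it fires.
        self._compromised.add(session_id)

    def request_data(self, session_id, resource):
        # Every access passes through this check, so a compromised
        # session cannot reach the data even if the host is breached.
        if session_id in self._compromised:
            return (False, "access denied: session flagged")
        return (True, f"granted: {resource}")

monitor = SessionMonitor()
ok, _ = monitor.request_data("s1", "grades.db")        # normal access
monitor.flag_compromised("s1")                          # detector fires
denied, msg = monitor.request_data("s1", "grades.db")   # now blocked
```

The key design choice is that the monitor, not the application, owns the allow/deny decision, so compromising the application alone does not grant data access.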
Machine learning systems can also be leveraged to evaluate user-application interactions and set session durations accordingly, limiting the window of opportunity for cybercriminals.
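One simple way to do this, sketched below under assumed parameters, is to derive each user's session timeout from their observed activity: a session expires after roughly the user's typical active duration, clamped to a floor and cap, so an attacker who hijacks an idle session gets only a short window. The floor and cap values here are illustrative.

```python
import statistics

def session_timeout(past_durations_min, floor=5, cap=30):
    """Timeout (minutes) = median observed activity, clamped to [floor, cap]."""
    if not past_durations_min:
        return floor  # no history yet: be conservative
    typical = statistics.median(past_durations_min)
    return max(floor, min(cap, round(typical)))

print(session_timeout([12, 15, 9, 14]))  # 13 — matches this user's habits
print(session_timeout([90, 120]))        # 30 — capped for long sessions
print(session_timeout([]))               # 5  — new user, shortest window
```

A production system would replace the median with a learned model of interaction patterns, but the principle is the same: the session lives only as long as legitimate use requires.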