Auditing access to sensitive data has been an important concern for companies that need to protect intellectual property and security-sensitive information. This concern became even more pressing over the last decade, as the rise in cybercrime drove an array of regulations, security frameworks and associated standards that enforce this need and establish penalties for non-compliance. The ultimate goals are to protect certain types of information from exposure, to detect data breaches and to provide accountability for incidents related to data loss, so companies need to implement processes and adopt solutions that help them achieve these goals.
The issue is that, when auditing access to data, irrespective of the technology being used (a filesystem driver, or Windows audit logs for object access events), the OS only signals basic file operations: file create, file read, file write, and so on. That is all current solutions are able to audit. There is no indication of complex file operations such as file copy, file rename or file archive.
Yet it is precisely these complex file operations (copy, rename, archive) that data leakage incidents rely on in order to occur.
When it comes to Windows logs, object access events only prove intent, not action. An application can request a handle to a file, specifying certain types of access, then do nothing with it (not even display the file) and close the handle. There will still be an object access event in the logs for that application and that user. Moreover, most applications request most of the access rights when opening a file, even if they do not need them, just in case they might at a later stage. This means that someone who opens an Excel spreadsheet and closes it may come out as someone who wrote to the file, because the system logs an object access event with an access type that may include write. The existence of these scenarios seriously damages the reliability of OS logs for auditing access to files. Furthermore, when relying on Windows logs and object access events, it is impossible to accurately correlate file read and file write events (in order to detect copy operations), because the logging system simply does not deliver enough information about what happens to the data in memory as part of the copy operation. When multiple computers are involved, the task of correlating this information becomes unrealistic.
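To illustrate the intent-versus-action problem, here is a minimal sketch in C, assuming object access auditing (a SACL) is enabled on the file; the path is hypothetical. The program requests a handle with broad access rights and immediately closes it, without reading, writing or displaying anything, yet it is that broad access request that ends up in the audit trail.

```c
/* Minimal sketch: open a file requesting broad access, then close the handle
 * without touching the contents. With object-access auditing enabled on the
 * file, this alone produces an audit event whose access mask includes write,
 * even though no data was read or written. The path is hypothetical. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE h = CreateFileW(L"C:\\Reports\\q3.xlsx",
                           GENERIC_READ | GENERIC_WRITE,   /* "just in case" access */
                           FILE_SHARE_READ,
                           NULL,
                           OPEN_EXISTING,
                           FILE_ATTRIBUTE_NORMAL,
                           NULL);
    if (h == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "CreateFileW failed: %lu\n", GetLastError());
        return 1;
    }

    /* No ReadFile, no WriteFile, nothing displayed to the user. */
    CloseHandle(h);
    return 0;
}
```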
Obviously, from a data security perspective, there is a big difference between someone reading a file in order to display it on the screen (which may be part of their everyday job) and reading the file in order to copy it to a removable device or to a cloud-sync folder (which exposes the information and may lead to data leakage). When it comes to auditing, however, it is all the same: a file read event (or several file read events, depending on the technology being used). Some experts would say that monitoring file reads is enough, because simply reading the file makes the user accountable for its contents and is proof enough when delivering accountability for an incident involving that particular file. But is it?
What if, over a short period of time, multiple users read from the compromised resource? Are they all accountable, just because we cannot know for sure whether the file read also resulted in a file write on another computer or device? Obviously, in this scenario, what we have right now in terms of file access auditing is not enough, so investigators would need to look elsewhere in order to draw a conclusion.
Simply hooking the system file-copy APIs in order to audit file copy operations does not work either. There is no guarantee that an application performing a file copy uses a system file-copy API. A file copy is a read-write process, and many applications perform the copy by reading and writing themselves, without calling the system file-copy routines.
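As a simple illustration (a sketch with hypothetical paths, not taken from any particular application), the following C program copies a file using nothing but ReadFile and WriteFile. A hook placed on CopyFile or CopyFileEx never sees this operation; at the OS level it appears only as a series of reads on the source and writes on the destination.

```c
/* Sketch: copying a file with a plain read/write loop, without calling
 * CopyFile/CopyFileEx. A hook on the system copy APIs misses this entirely.
 * Source and destination paths are hypothetical. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE src = CreateFileW(L"C:\\Reports\\q3.xlsx", GENERIC_READ,
                             FILE_SHARE_READ, NULL, OPEN_EXISTING,
                             FILE_ATTRIBUTE_NORMAL, NULL);
    HANDLE dst = CreateFileW(L"E:\\q3-copy.xlsx", GENERIC_WRITE,
                             0, NULL, CREATE_ALWAYS,
                             FILE_ATTRIBUTE_NORMAL, NULL);
    if (src == INVALID_HANDLE_VALUE || dst == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "open failed: %lu\n", GetLastError());
        return 1;
    }

    BYTE  buffer[64 * 1024];
    DWORD bytesRead = 0, bytesWritten = 0;

    /* Read from the source and write to the destination until end of file. */
    while (ReadFile(src, buffer, sizeof(buffer), &bytesRead, NULL) && bytesRead > 0) {
        if (!WriteFile(dst, buffer, bytesRead, &bytesWritten, NULL)) {
            fprintf(stderr, "WriteFile failed: %lu\n", GetLastError());
            break;
        }
    }

    CloseHandle(src);
    CloseHandle(dst);
    return 0;
}
```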
Since the OS does not report on complex file operations, a solution is to use a technology capable of tracking and correlating operations that span multiple files and involve the same content, security context and other parameters, in order to determine the real actions that occur.
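As a purely illustrative sketch (not the algorithm of any specific product), the snippet below correlates a hypothetical, hard-coded stream of file events: a read on one file followed shortly by a write of the same number of bytes, by the same user, is flagged as a suspected copy. A real solution would also compare content, security context and other parameters, as described above.

```c
/* Illustrative correlation sketch over hypothetical, hard-coded events:
 * pair a read on one file with a write of the same size, by the same user,
 * within a short time window, and flag the pair as a suspected copy. */
#include <stdio.h>
#include <string.h>

typedef enum { OP_READ, OP_WRITE } Op;

typedef struct {
    const char *user;
    const char *path;
    Op          op;
    long long   bytes;
    long long   timestamp;   /* seconds since an arbitrary epoch */
} FileEvent;

#define MAX_GAP_SECONDS 5

int main(void)
{
    /* Hypothetical event stream, as a file-system monitor might report it. */
    FileEvent events[] = {
        { "alice", "C:\\Reports\\q3.xlsx", OP_READ,  1048576, 1000 },
        { "alice", "E:\\q3-copy.xlsx",     OP_WRITE, 1048576, 1002 },
        { "bob",   "C:\\Reports\\q3.xlsx", OP_READ,  1048576, 1050 },
    };
    int count = (int)(sizeof(events) / sizeof(events[0]));

    for (int i = 0; i < count; i++) {
        if (events[i].op != OP_READ)
            continue;
        for (int j = i + 1; j < count; j++) {
            if (events[j].op == OP_WRITE &&
                strcmp(events[i].user, events[j].user) == 0 &&
                events[j].bytes == events[i].bytes &&
                events[j].timestamp - events[i].timestamp <= MAX_GAP_SECONDS) {
                printf("Suspected copy by %s: %s -> %s\n",
                       events[i].user, events[i].path, events[j].path);
            }
        }
    }
    return 0;
}
```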
If interested, you can find out more by reading this whitepaper: Going Beyond Basic File Auditing for Data Protection