In a recent NetClean survey of IT experts, a fifth of those surveyed revealed that someone had downloaded child sexual abuse (CSA) material at work. Of those incidents, just 3.5% led to criminal investigations, and in most cases (69%) nothing happened at all.
We know from experience that CSA material is not an issue that stops at the entrance to the workplace. Part of the problem in tackling its spread, however, is that people still underestimate the scale of the issue. There is an inherent belief that it is the 'local weirdo' accessing illicit images, not the person sitting opposite them in their day-to-day job.
A third (33.3%) of respondents believe that just one in every 10,000 people look at child sexual abuse sites at work, while a further 34% estimate that it is just one in every million. This is simply not true. From our experience, one in every 1,000 employees will look at CSA material at work, so the problem is far more commonplace than most people assume.
The growth in portable USB devices and mobile storage means there is a disturbing trend of offenders increasingly bringing illegal images or videos into the workplace. In fact, many businesses are already unwittingly storing, and allowing the movement of, illegal images and videos across their networks.
Organisations have started proactively introducing measures to prevent the spread of CSA content in the workplace. The majority of those surveyed (78.7%) have an internet use policy in place that covers child sexual abuse sites and in a quarter of businesses (24.1%) the drive to purchase blocking software is coming from the board of directors.
However, more still needs to be done. Just 9.2% of those surveyed believe that employers have a responsibility to stop child sexual abuse content. Instead, the majority of respondents believe that responsibility for tackling the issue lies with individuals (34.8%), Government (29%) or internet service providers (22%).
Today's employers have an ethical duty to tackle child abuse images on corporate networks. The people who view these images are participating in a cycle of abuse, perpetuating a market that encourages ringleaders to keep producing material that makes more children suffer. Relying on web filters alone won't solve the problem. Organisations need to go a step further and use proven methods, such as file matching, to flag indecent images by cross-referencing them against existing images in police databases, keeping corporate networks free of illegal content.
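The file-matching approach mentioned above can be illustrated in simplified form: a file's hash is computed and compared against a database of hashes of known illegal material. The sketch below uses a plain SHA-256 lookup; the `KNOWN_HASHES` set and function names are hypothetical stand-ins for a curated police hash list, and production systems typically use robust or perceptual hashing so that matches survive re-encoding or resizing.

```python
import hashlib

# Hypothetical stand-in for a police database of known-bad file hashes.
# (This entry is simply the SHA-256 of the bytes b"test", for illustration.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def flag_if_known(data: bytes, known: set = KNOWN_HASHES) -> bool:
    """True if the file's hash matches an entry in the known-hash set."""
    return sha256_of(data) in known
```

In a real deployment this check would run on files traversing the corporate network or stored on endpoints, with matches escalated to law enforcement rather than handled internally.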
NetClean surveyed 141 senior executives, directors, IT managers, technical specialists and IT consultants at a security trade show earlier this year.