Big data and the fight against cyber-bullying

Investigations into allegations of cyber-bullying or harassment can be time-consuming and delicate. A piece of data analysis software might be able to help, discovers Cath Everett. 

While a quarter of employees claim to have been bullied at work, it seems that a growing number of workers are experiencing such harassment in the form of cyber-bullying.

Cyber-bullying is mistreatment that takes place via any form of electronic communications channel, ranging from a mobile phone to social media. It can be as distressing as its “real-world” counterpart.

According to James Blake, chief executive of big data start-up Hello Soda, employee complaints typically range from impolite or inappropriate emails from a manager or colleague, to a full-scale campaign waged against an unfortunate individual.

“It’s often very subjective, but if something has upset someone, HR has to take it seriously,” he says. “It’s often just misunderstandings and personality clashes, but if you can move people into one-on-one conflict resolution effectively, it speeds up the process rather than having it fester.”

This is where the firm’s Profile system comes in. Originally developed as a risk management and identity verification platform for the financial services, insurance and online gaming markets, it was beginning to broaden out into the HR sphere when a customer spotted its potential.

The organisation in question was experiencing a cyber-bullying problem among staff in its call centre, and asked whether the Profile software engine could be used to support investigations.

Profile searches

The engine works by collecting huge amounts of text-based information from websites, social media and blogs and, using psycholinguistic and deep learning techniques, evaluating intent based on the written word.

At the same time, it also examines personality traits revealed by each individual writer’s style.

Hello Soda then runs a Profile search within stated parameters. These include high-risk words, for example those of a sexist or racist nature, as well as terms that refer to the kind of bullying being experienced.

The report then provides risk indicators, such as how many times people used inappropriate words within a given timeframe, and in what context, as well as how many times they interacted with colleagues and whether those interactions were positive or negative.
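Hello Soda’s actual models and term lists are proprietary, but the kind of report described above can be illustrated with a toy sketch: count flagged terms per sender within a timeframe and tally the tone of interactions. The term lists, message format and function name below are hypothetical stand-ins, not the real Profile engine.

```python
from collections import Counter
from datetime import datetime

# Illustrative stand-in term lists -- NOT Hello Soda's real lists.
HIGH_RISK_TERMS = {"idiot", "useless"}
POSITIVE_TERMS = {"thanks", "great"}
NEGATIVE_TERMS = {"idiot", "useless", "stupid"}

def risk_report(messages, start, end):
    """Toy version of the report described: per-sender counts of
    high-risk words and a positive/negative tally of interactions
    within a given timeframe."""
    flagged = Counter()
    tone = Counter()
    for msg in messages:
        if not (start <= msg["sent"] <= end):
            continue  # outside the requested timeframe
        words = set(msg["text"].lower().split())
        flagged[msg["sender"]] += len(words & HIGH_RISK_TERMS)
        if words & NEGATIVE_TERMS:
            tone["negative"] += 1
        elif words & POSITIVE_TERMS:
            tone["positive"] += 1
    return {"flagged_terms": dict(flagged), "interaction_tone": dict(tone)}

msgs = [
    {"sender": "A", "text": "You are useless", "sent": datetime(2024, 1, 2)},
    {"sender": "B", "text": "Thanks for the help", "sent": datetime(2024, 1, 3)},
]
report = risk_report(msgs, datetime(2024, 1, 1), datetime(2024, 1, 31))
```

A real system would of course use context-aware language models rather than bag-of-words matching; the sketch only shows the shape of the output an investigator would receive.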

It also provides insights into whether an individual’s state of mind or behaviour has changed, and can therefore help uncover any issues they may be facing.

“The client said that they saved at least two weeks going through websites and social media pages to try and find out context, so it was a huge process saving for them,” says Blake.

The HR module, which costs around £15,000 depending on the depth and breadth of proposed searches, can also be used for a range of other activities, such as recruitment.

“We basically help the investigation process by speeding it up, but we won’t eradicate it. HR still has an active part to play,” Blake concludes.
