Blog Archives

Making an online killing: A brief look at “suicide fetishes” and “addiction” to suicide websites

Back in March 2011, a then 46-year-old American ex-nurse, William Melchert-Dinkel from Minnesota, was convicted of persuading two people he met online to commit suicide. Melchert-Dinkel was accused of having a “suicide fetish” because he got his kicks from frequenting online suicide chat rooms. Posing as a female nurse, he would chat online, feign compassion towards depressed individuals, and encourage them to commit suicide.

More specifically, a US court found him guilty of aiding the suicides of 18-year-old Canadian student Nadia Kajouji (who jumped into a river and drowned), and 32-year-old British IT technician Mark Drybrough (who hanged himself). During the trial, Nadia’s mother shared extracts of the online chats that took place between her daughter and Melchert-Dinkel (who was using various aliases including ‘Cami’, ‘Falcon Girl’ and ‘Li Dao’). A Minnesotan Internet crimes task force forensically examined Melchert-Dinkel’s computer and located the online chats he had with the Canadian teenager. The conversations demonstrated that Melchert-Dinkel had urged Nadia to hang herself (rather than kill herself by drowning) and provided detailed instructions on how to do so:

“If you wanted to do hanging we could have done it together online so it would not have been so scary for you…Most important is the placement of the noose on the neck…knot behind the left ear and rope across the carotid is very important for instant unconsciousness and death…I’m just trying to help you do what is best for you not me”.

Melchert-Dinkel even urged Nadia to kill herself while they were chatting online. A few hours after chatting with Melchert-Dinkel, Nadia emailed her roommate and told her she was going to “brave the weather and go ice skating” (in an effort to make her death look like an accident). Nadia jumped into a frozen river, but her body was not found until 11 days later. In Mark’s case, Melchert-Dinkel replied to a question Mark had posted online about how he could hang himself if he didn’t have a high ceiling. Following a long email conversation, Melchert-Dinkel instructed him on what to do and convinced Mark that ‘she’ was suicidal too. Melchert-Dinkel wrote:

“I keep holding on to the hope that things might change. Caught between being suicidal and considering it. Same old story!…I don’t want to waste anyone’s time. If you want someone who’s suicidal, I’m just not there yet…Sorry. I admire your courage. I wish I had it”. 

Mark killed himself a few days later. Mark’s mother Elaine called Melchert-Dinkel her son’s “executioner”. She also told the Daily Mail in the UK:

“Mark had had a nervous breakdown and he was depressed and incredibly susceptible. [Melchert-Dinkel ]was there whispering in his ear every time he logged on. In the last email, [he] claimed to be a nurse, saying he had medical training, and proposed a suicide pact”

With the help of Celia Blay (a youth worker from Wiltshire in the UK), Mark’s mother managed to track down Melchert-Dinkel. It was during their own investigation that they discovered dozens of people had received emails similar to Mark’s:

“We found out everything about him on Google, including where he lived in Minnesota. He befriended them using a female identity, was very loving and sympathetic, but never suggested an alternative to death, even when they were only teenagers. He’d tell them that he intended to kill himself too, and said they should set up a web camera and he would do the same thing so they could watch each other die over the internet”.

During his testimony, Melchert-Dinkel admitted that he had asked between 15 and 20 people to commit suicide on camera while he watched (although when he was first caught, he claimed the online chatting must have been carried out by his teenage daughters). One report on Melchert-Dinkel’s case noted:

“While he never actually witnessed a suicide, he did believe that at least five of the people he had talked to were successful in taking their own lives. He also entered into around 10 ‘suicide pacts’ where he promised to kill himself simultaneously with the person he had been chatting with…Melchert-Dinkel was admitted to a hospital where he told doctors he had a ‘suicide fetish’ and an addiction to suicide websites”.

Before the trial, the Associated Press had interviewed Professor Jonathan Turley (George Washington University Law School), an expert on doctor-assisted suicide. It was reported that:

“[Professor Turley has] never heard of anyone being prosecuted for encouraging a suicide over the Internet. Typically, people are prosecuted only if they physically help someone end it all – for example, by giving the victim a gun, a noose or drugs. Last month, a Florida man was charged in his wife’s suicide after allegedly tossing several loaded guns onto their bed. Turley said if prosecutors file charges against Melchert-Dinkel, convicting him will be difficult – especially if the defense claims freedom of speech. The law professor said efforts to make it illegal to shout ‘Jump!’ to someone on a bridge have not survived constitutional challenges. ‘What’s the difference between calling for someone to jump off a bridge and e-mailing the same exhortation?’ he said”.

This line of defence was used by Melchert-Dinkel’s legal team. His own lawyer, Terry Watkins, described his behaviour as “abhorrent” but argued in court that his client’s actions were protected by freedom of speech. Watkins said in court that:

“Freedom means you have to allow things to happen that some would find disgusting and completely unacceptable from a community or moral standpoint”.

However, the presiding judge (Thomas Neuville) said that the accused had “imminently incited the victims to commit suicide” and described Melchert-Dinkel’s online written comments as “unprotected speech”. He was sentenced to almost a year in prison (360 days), but the sentence was stayed pending a ruling from the Minnesota Supreme Court. Earlier this year, the Supreme Court overturned Melchert-Dinkel’s conviction and ruled that Minnesota’s law prohibiting the “encouraging” of suicide was unconstitutional because (as Professor Turley had claimed) it violated a person’s freedom of speech. However, the case (as far as I am aware) is still continuing because the original state prosecutors are trying to argue that Melchert-Dinkel “assisted” (rather than “encouraged”) people’s suicides.

My own take on this case is that Melchert-Dinkel committed a criminal act and that his claim to medics that he was addicted to encouraging people to commit suicide was made as a way of absolving himself of responsibility for what he did. There was nothing about his online behaviour to suggest it was in any way an addiction (at least not by my own criteria). Also, his use of the word ‘fetish’ is inappropriate in this instance. Although he did appear to get some kind of kick from his activity, there was nothing sexual in it. Again, his use of the word ‘fetish’ to describe his behaviour appears to be another linguistic device to distance himself from taking the blame for his actions.

Dr. Mark Griffiths, Professor of Gambling Studies, International Gaming Research Unit, Nottingham Trent University, Nottingham, UK

Further reading

Associated Press (2011). Nurse William Melchert-Dinkel had ‘suicide fetish’, went online to provoke two people’s deaths: cops. New York Daily News, October 17. Located at:

Caulfield, P. (2011). ‘Suicide fetish’ nurse found guilty of provoking people he found online to kill themselves. Daily News, March 16. Located at:

Firth, N. (2010). Revealed: The suicide voyeur nurse who ‘encouraged people to kill themselves online’. Daily Mail, March 20. Located at:

Guariglia, M. (2014). William Melchert-Dinkel: 5 Fast facts you need to know. Heavy News, March 19. Located at:

Murray, R. (2008). A search for death: How the internet is used as a suicide cookbook. Chrestomathy, 7, 142-156.

Yount, K. (2014). Minnesota Supreme Court turns its back on mentally ill. (i)Pinion, March 27. Located at:

For whom the hell trolls: Harassment in online gaming

Trolling is an online phenomenon that people may witness without necessarily knowing what it is. The term “troll” appears to have originated from a method of fishing in which a baited line is trailed behind a boat. However, many internet users think of a troll as the mythological creature that hides under bridges, waiting for an opportunity to pounce. With the latter definition, the comparison with the modern-day online world is clear: the troll lurks unseen, waiting for an opportunity that may warrant taking action. With the former definition, the troll casts a baited line as a way of provoking individuals into some form of emotional response.

Trolling appears to be a variably defined concept, with multiple definitions existing. It appears to have been first described in 1999 by Dr. Judith Donath, who argued that “trolling is a game about identity deception”, which suggests that a troll’s personal opinion is often concealed during the act. According to Dr. Susan Herring and her colleagues, trolling comprises “luring others into often pointless and time-consuming discussions”. In a 2010 paper, Lochlan Morrissey expanded this even further by saying that trolling is “an utterer producing an intentionally false or incorrect utterance with high order intention [the plan] to elicit from recipient a particular response, generally negative or violent”. Thus, it appears trolling is an act of intentionally provoking and/or antagonising users in an online environment to create an often desirable, sometimes predictable, outcome for the troll. Morrissey also states that trolling is a complex intentional act that some may consider an art. Others, on the other hand, have classed trolling as a form of cyberbullying.

To date, there has been very little empirical research into online trolling, with only two key studies documented before we carried out our own research (but more of that later). The first of these was published by Dr. Pnina Shachaf and Dr. Noriko Hara in the Journal of Information Science, and examined trolling in the context of Wikipedia. The second study, by Susan Herring and her colleagues, focused on trolling in feminist forums. Despite the lack of research, some key findings have emerged. Firstly, Herring’s study identified three types of messages sent by trolls: (i) messages from a sender who appears outwardly sincere, (ii) messages designed to attract predictable responses or flames, and (iii) messages that waste a group’s time by provoking futile argument. From this, it is apparent that trolling often merges with several other online behaviours. Herring and colleagues pointed out that a troll is an online user who can be uncooperative, who seeks to confuse and deceive, and who can be a flamer by using insults.

Shachaf and Hara’s study of trolling within Wikipedia revealed that the main reasons for trolling were boredom, attention seeking, and revenge. Furthermore, the trolls regarded Wikipedia as an entertainment venue, and found pleasure in causing damage to it and to the people who used the site. Herring’s paper argued that it is non-mainstream environments (such as forums) that are especially vulnerable, as they “provide a new arena for the enactment of power inequities such as those motivated by sexism, racism, and heterosexism”. From this, one could suggest that trolling is a behaviour that is facilitated and possibly exacerbated by the anonymity of the internet.

Many authors have argued that relative anonymity facilitates disinhibition, resulting in flaming and harassment. This online disinhibition effect is well established in the literature (particularly in a 2004 paper in the journal CyberPsychology and Behavior by Dr. John Suler). As a 2011 paper on internet addiction by Dr. Laura Widyanto and myself noted, the internet “might lead to disinhibition, whereby individuals feel more confident as they are protected by their anonymity”. Therefore, internet users have an opportunity to present themselves differently online. From this, the opportunity for trolling is undeniably present as Widyanto and myself make clear: “the internet provides anonymity, which removes the threat of confrontation, rejection and other consequences of behaviour”. This allows individuals to behave online in ways that they would not normally do in the offline world.

Research suggests that anonymity, a natural characteristic of the internet, may affect a person’s self-esteem. Self-esteem has been consistently identified as an important determinant of adolescent mental health, with lower self-esteem being linked to depression and increased levels of anxiety. Therefore, it has been claimed that high self-esteem is psychologically healthy. Online interactions allow individuals to present a different self, potentially leading to increased feelings of self-worth and therefore greater psychological health.

However, research into online trolling had not established any association between the effects of trolling and self-esteem, and this was one of the main reasons we carried out our own research into the topic. There is quite a lot of research into self-esteem and more general internet use. For instance, research indicates that individuals with low self-esteem prefer to communicate with others through the internet (such as via email) rather than face-to-face. It has also been found that general internet use increases self-esteem, while some research has indicated that video game use decreases self-esteem. This suggests that the internet can be used as a form of social interaction that positively affects self-esteem among those with considerably low self-esteem. However, given the evolution of online gaming in recent years, the effect on self-esteem of playing online video games where social interaction (including trolling) can occur is relatively unknown.

Until recently, trolling had never been studied in an online video game context, and little is still known empirically about it in the most general sense of the term. Trolling often merges with other online behaviours such as flaming. Dr. Angela Adrian (in a 2010 issue of Computer Law and Security Review) offers limited, albeit useful, insight into how an individual may troll during online gaming. Adrian calls those who enact such behaviour “griefers”, a term used for those who try to ruin a gaming experience, often by team-killing or obstructing objectives. It could be that griefing is one such behaviour used during trolling in the context of an online video game. Furthermore, given the evolution of online gaming, it is possible that trolling has evolved to fit the context in which it is used (e.g., online forums, Wikipedia, video games), and therefore comprises many other online behaviours that are used to disrupt others’ gaming enjoyment.

Given how little psychological research had been conducted beyond establishing that the phenomenon exists, Scott Thacker and I carried out a study to examine (i) the frequency of trolling, (ii) the types of and reasons for trolling, and (iii) the effects trolling may have on self-esteem. Using an online survey, a self-selected sample of 125 gamers participated in our study. Our results showed that trolls tended to play longer gaming sessions. Frequent trolls were significantly younger and male. Types of trolling included griefing, sexism/racism, and faking/intentional fallacy. Reasons for trolling included amusement, boredom, and revenge. Witnessing trolling was positively associated with self-esteem, whereas experiencing trolling was negatively associated with it. Experience of trolling was positively correlated with frequency of trolling. Although the study used a self-selecting sample, the results appear to provide a tentative benchmark for video game trolling and its potential effects on self-esteem.

Our study has many limitations that need to be taken into account. Firstly, because the questionnaire relied on self-report, it may be open to social desirability effects (i.e., participants may answer in ways that present a different self) and any of the other known problems with self-report methods (e.g., unreliable memory, recall biases, etc.). Another major limitation was that the sample was self-selecting and modest in size, which raises questions about the generalizability of the findings. Despite these limitations, our exploratory study appears to provide several key findings that offer a preliminary benchmark for video game trolling where there was no previous research. Moreover, it expands the neglected research into online trolling and offers areas and directions for future research.

Dr Mark Griffiths, Professor of Gambling Studies, International Gaming Research Unit, Nottingham Trent University, Nottingham, UK

Additional input: Scott Thacker

Further reading

Adrian, A. (2010). Beyond griefing: Virtual crime. Computer Law and Security Review, 26, 640-648.

Donath, J. S. (1999). Identity and deception in the virtual community. In M. A. Smith and P. Kollock (Eds.), Communities in cyberspace (pp. 29–59). London: Routledge.

Herring, S., Job-Sluder, K., Scheckler, R. & Barab, S. (2002). Searching for safety online: Managing “Trolling” in a feminist forum. The Information Society, 18, 371-384.

Morrissey, L. (2010). Trolling is a art: Towards a schematic classification of intention in internet trolling. Griffith Working Papers in Pragmatics and Intercultural Communications, 3(2), 75-82.

Shachaf, P. & Hara, N. (2010). Beyond vandalism: Wikipedia trolls. Journal of Information Science, 36(3), 357-370.

Suler, J. R. (2004). The online disinhibition effect. CyberPsychology and Behavior, 7, 321-326.

Thacker, S. & Griffiths, M.D. (2012). An exploratory study of trolling in online video gaming. International Journal of Cyber Behavior, Psychology and Learning, 2(4), 17-33.

Widyanto, L. & Griffiths, M. D. (2011). An empirical study of problematic internet use and self-esteem. International Journal of Cyber Behavior, Psychology and Learning, 1(1), 13-24.

Willard, N. (2006). Cyberbullying and cyberthreats: Responding to the challenge of online social cruelty, threats, and distress. Eugene, OR: Center for Safe and Responsible Internet Use.