The medical profession has an ethic: First, do no harm.  


Silicon Valley has an ethos: Build it first and ask for forgiveness later.


Just as Twitter, when left almost entirely to its own devices, has revealed itself as a haven for online harassment and Nazism, so the central spirit of Silicon Valley, the internet, and tech – their collective raging “id” – tends toward darkness.


To guard against the downsides of technology plainly requires vigilance and regulation. But no matter what regulations are imposed, they can’t be enforced worldwide any more than drug laws can. Whatever can be done will be done by someone, somewhere.


This is an anthology documentary series about the dark side of technology and science.  If Black Mirror is an allegory about what could happen… this is a series about what is happening… whether we like it or not.  


The organizing principle for this series is to dramatically distinguish itself from the stale objectivity of TV news magazine segments (think Vice News) and journalists who either become the story or editorialize the experience. Instead, our goal is to present each episode through the immersive and subjective POV of its featured subject.  


Specifically, these films will seduce viewers into a real-life twilight zone through the extremely intimate inner monologues of the people who live there – the technology’s inventor, or the technology’s consumer, or even, in rare but potent cases… the technology itself.  (Imagine a futuristic, but very real sex doll using A.I. to narrate how it came to meet its human partner.)


Inventor Hopes to Father Children With His Sex Robot
Spanish scientist and sex robot inventor Dr. Sergio Santos claims it’s only a matter of time before marriages between humans and robots become the norm and that the next logical step would be to have children with these robots.


Experiencing each story in such an intimate and personal way will powerfully frame the dehumanizing effect of each “innovation” in terms that are both unflinchingly real and elegantly cinematic. The result is that each episode will be a self-contained essay about a technological “advancement” that is also a flashpoint in the battle for our souls.  




*** While some of these stories have rightly attracted attention from the tech press already, NONE have been presented using the immersive storytelling envisioned above.***




Ishii Yuichi is the founder of Family Romance, a Japanese company that hires actors to pose as people’s fake wedding dates, fake boyfriends to bring home, fake colleagues for conferences, and even fake newborn grandchildren for a dying grandparent.


So far, Yuichi has amassed about 800 employees to fulfill just about every role imaginable. Yuichi himself has been keeping up elaborate lies for years. Since the early days of the business, he has been pretending to be the father of a young girl, whose biological father left when she was just a baby. The girl believes Yuichi is her real dad.


Prices can vary based on the role the actor will fill, and for how long. Weddings are among the priciest occasions, as Family Romance charges approximately $88 per actor, per hour, to attend the wedding. They can pose as a fake guest, fake coworker, or fake relative. A fake parent costs slightly more, at $132 an hour.


Yuichi, 37, will also pretend to be people’s boyfriend. He said the typical clients are women in their 30s to 50s who are strictly looking to have a younger presence around, generally without any sexual involvement. Putting on the act has made it exhausting to go on actual dates or think about his romantic future.


Though it began with small-scale intentions, Family Romance has become popular in Japan’s insular society, which has struggled to stay socially connected. Over the past decade, a high-pressure work culture and tightening economy have forced some people into more reclusive lifestyles. In certain cases, Family Romance has become the solution for a shrinking social circle.


Japan is not alone. In China, women dread reaching the age at which unmarried women are labeled “sheng nu,” or “leftover women.” It’s a stigma that carries deep ramifications in social life, in the workplace, and especially within the family.




What if you had a credit score that measured the kind of citizen you are? What if that score tracked your behavior wherever you went with an app? With a concept straight out of a cyberpunk dystopia, China has gamified obedience to the State. The Chinese government just launched its Social Credit System. The aim? To judge the trustworthiness – or otherwise – of its 1.3 billion residents.


Starting just this past May, Chinese citizens who rank low on the country’s burgeoning “social credit” system are now in danger of slower internet speeds, restricted access to restaurants, and the removal of the right to travel, according to statements recently released by the country’s National Development and Reform Commission.



And there’s a broad range when it comes to who can be flagged. Citizens who have spread “false information about terrorism,” caused “trouble” on flights, used expired tickets, or were caught smoking on trains could all be banned.


But the system, as it stands, is opaque; citizens are seemingly just as likely to be flagged for minor infractions like leaving bikes parked in a footpath or issuing apologies that are deemed “insincere.” Even after your score drops dangerously low, the app does not bother to warn you that you are now on a blacklist. And if you believe a mistake has been made, there is no recourse.

This Kafkaesque murkiness seems designed to instill fear. If virtually anything can affect your fortunes in life, you will do anything and everything to live a life that is virtuous in the eyes of the state. Mobile gaming has become an instrument of authoritarian conditioning.



Kurt Eichenwald is an American journalist and a New York Times bestselling author of four books, one of which, The Informant (2000), was made into a movie in 2009. He has made numerous appearances on TV discussing his pieces on issues ranging from politics to terrorism. In December of 2016, he appeared on Tucker Carlson’s show on Fox. That night, a man by the name of John Rivello was watching. He did not like what he heard. So he sent a message with a GIF attached.



Rivello sent the GIF to Eichenwald — who has epilepsy and has written publicly about his condition — in December, while posing as one “Ari Goldstein,” under the username @jew_goldstein. The strobing GIF was sent in a bid to trigger an epileptic seizure. 



Rivello sent several tweets and messages about his intentions to cause Eichenwald to have a seizure — including the text “You deserve a seizure for your post.” According to NBC News, other messages specifically say that “I hope this sends him into a seizure,” while others read “Spammed this at [Eichenwald] let’s see if he dies.”


The GIF did indeed cause him to seize, theoretically making the moving image a potentially lethal object deployed to cause intentional harm in much the same way as a gun, a knife, or a letter bomb. In response, a grand jury ruled last year that a GIF counts as a deadly weapon. The indictment breaks down the tools used in Rivello’s alleged offense, specifying that he used “a Tweet, a Graphics Interchange Format (GIF) and an Electronic Device and Hands” to commit assault with a deadly weapon. The document also alleges that Rivello was anti-Semitic in his choice of target, picking Eichenwald not just because of his views on Donald Trump, but because he was Jewish.



It is a game that does not physically exist. A game, hidden in a world that cannot be found, except to those who know how to find it. Imagine being trapped in a world with no escape. A world that lasts for 50 days, ending with you taking your own life.


Blue Whale, also known as the “Blue Whale Challenge,” is a social network phenomenon dating from 2016 that is claimed to exist in several countries. The “game” reportedly consists of a series of tasks assigned to players by administrators over a 50-day period. The sinister game instructs teens to self-harm and, in the final challenge, to commit suicide.



The game’s name echoes the beaching behavior of blue whales: it remains one of the ocean’s great mysteries why some blue whales strand themselves, of their own accord, and die. Blue Whale made its first appearance in May 2016, in an article in a Russian newspaper, and a wave of moral panic swept Russia.



The Blue Whale has reportedly surfaced in the US within the last year. Luckily, there have been no confirmed deaths. But for the survivors, the experience is terrifying.



The deadly social media game gives young players a ‘mentor’, who leads them through fifty dangerous and soul-destroying tasks designed to break their spirit and encourage them to take their own lives in the final challenge. Blue Whale plants suicidal thoughts into malleable minds with pictures of an approaching train captioned, “This world is not for us” and photographs of teens on a roof, claiming “We are children of the dead generation”. They are also urged to watch horror movies all day, and to wake themselves at 4.20 am, exhausting the children into submission.


Last year, an alleged ringleader, 21-year-old Philipp Budeikin, was detained and charged with organizing eight groups between 2013 and 2016 that “promote suicide.”




Abyss Creations is a company best known for making strikingly realistic silicone sex dolls. Preconfigured models start at a few thousand dollars, while a highly customized doll with a talking, animatronic head and built-in AI costs nearly $17,000.


Abyss’ effort to sell synthetic companionship is straight out of Westworld. The company’s CEO, founder, and chief designer, Matt McMullen, says his team can make just about anything to order for the right price. But the company draws the line at animals, children, and re-creations of people who haven’t given their permission to be replicated, celebrity or otherwise.


Many stories have been done about the manufacturing of lifelike sex dolls, but none have focused on how users are taking those “relationships” to the next level.



“Having another human-shaped person sitting on the sofa watching TV helps give the impression that I’m not the only person at home,” one Abyss Creations customer says. “My favorite thing about her is the way she makes me feel when we share a simple hug,” he says. “It feels real. Like the butterflies you get in your stomach when you first kissed your high school girlfriend. I also enjoy buying her clothes, as well as making costumes and weapons for her. It is like a hobby within a hobby, and it is a lot of fun.”


It’s almost real. And this year, Abyss has made a business partnership that goes one step further.


CamSoda is an adult-oriented live “camming” site dubbed “a virtual strip club with no cover” by its creators. At any time, visitors can log on and view the public live feed of the model of their choice, tipping her with the site’s digital currency if they enjoy the show. Should they want something more intimate, users can request a private, one-on-one show — the online equivalent of a back-room lap dance.


As of this week, those users have an even more intimate and interactive option.



The big idea might sound like something straight out of sci-fi, but here it is: Watch a private, one-on-one feed in virtual reality as you have sex with a “teledildonic” sex doll that transmits your tactile data to the model’s matching vibrator (and vice versa). It’s virtual sex, and both parties will “feel” it. CamSoda calls the integration “virtual intercourse with real people,” or VIRP.



“You’ll feel what the model is doing and she’ll feel what you’re doing,” says Daryn Parker, VP at CamSoda. “By putting on the goggles and having that live model available to you, you now are totally engrossed in this space.”


“I don’t do anything unless I’m OK with it,” says Charley Hart, an adult entertainer and CamSoda model. “It’s all about what we’re comfortable with. I think all of us girls are really excited to reach out to our fans in a new and different way.”


“We realize that we’re ushering in a new dawn,” she says, “so there’s that excitedness of, ‘Oh my gosh we’re advancing so far, and how cool that we can do it without doing it? Everyone gets what they want,’ along with, ‘Oh my gosh, how scary and how disconnected it is.'”




It was Barry Diller who inspired Streisand to opt for cloning after the death of her beloved Sammie. Now, at a South Korean laboratory, a once-disgraced doctor is replicating hundreds of deceased pets for rich and famous clients like Barbra Streisand. It has raised more than a few questions of bioethics.


The surgeon is a showman. Scrubbed in and surrounded by his surgical team, a lavalier mike clipped to his mask, he gestures broadly as he describes the C-section he is about to perform to a handful of rapt students watching from behind a plexiglass wall. Still narrating, he steps over to a steel operating table where the expectant mother is stretched out, fully anesthetized. All but her lower stomach is discreetly covered by a crisp green cloth. The surgeon makes a quick incision in her belly. His assistants tug gingerly on clamps that pull back the flaps of tissue on either side of the cut. The surgeon slips two gloved fingers inside the widening hole, then his entire hand. An EKG monitor shows the mother’s heart beating in steady pulses.


Just like that the baby’s head pops out, followed by its tiny body. Nurses soak up fluids filling its mouth so the tyke can breathe. The surgeon cuts the umbilical cord. After some tender shaking, the little one moves his head and starts to cry. Looking triumphant, the surgeon holds up the newborn for the students to see—a baby boy that isn’t given a name but a number: 1108. That’s because he is a clone.


This is not some sci-fi, futuristic scenario—it’s happening right now, in Seoul, South Korea. The newborn, however, is not a human. It’s a puppy, a breed called Central Asian Ovcharka. He weighs only a few ounces, and his fur, slickened by fluid, is covered in black and white splotches, like a miniature Holstein. His eyes are not yet open. When he cries, it’s a barely perceptible squeak. The surgeon, Hwang Woo-suk, unclips his microphone and holds it close to little 1108’s mouth, amplifying its mewling over a loudspeaker so the students can hear its plaintive, what-the-hell-just-happened whine—eeee, eeee, eeee.


Ethicists from the White House to the Vatican have long debated the morality of cloning. Do we have the right to bioengineer a copy of a living creature, especially given the pain and suffering that the process requires? It can take a dozen or more embryos to produce a single healthy dog. Along the way, the surrogate mothers may be treated with hormones that, over time, can be dangerous, and many of the babies are miscarried, born dead, or deformed. When a dog was first cloned, in 2005—a scientific achievement that Time hailed as one of the breakthrough inventions of the year—it took more than 100 borrowed wombs, and more than 1,000 embryos. “Surrogate mothers are a little bit like The Handmaid’s Tale,” says Jessica Pierce, an ethicist and dog expert who teaches at the Center for Bioethics and Humanities at the University of Colorado. “It’s a canine version of reproductive machines.”



Yet here in the operating room at Sooam, everyone is all smiles—especially the veterinarian representing the customer who paid for Clone 1108. A slender man whose employer is Middle Eastern royalty, he stands in scrubs next to Dr. Hwang, posing for photos with the newborn pup. It’s a moment that has become almost as routine as it is lucrative for Sooam: over the past decade, the company has cloned more than 1,000 dogs, at up to $100,000 per birth. “Yes, cloning has become a business,” says Wang, a researcher at Sooam. If a dog owner provides DNA from a deceased pet quickly enough—usually within five days of its death—Sooam promises a speedy replacement. “If the cells from the dead dog are not compromised,” Wang explains, “we guarantee you will get a dog within five months.”



It’s fitting, perhaps, that the man at the center of the controversy over canine cloning is Hwang Woo-suk. The surgeon was, briefly, a hero of South Korea. In 2004, while serving on the faculty at Seoul National University, he co-authored a story in the prestigious journal Science asserting that he and his team had successfully cloned a human embryo. A year later, he created the world’s first cloned dog. Using a cell from the ear of an Afghan hound, Hwang impregnated 123 surrogate mothers, only one of which gave birth to a pup that survived. He named it Snuppy—an amalgam of “Seoul National University” and “puppy.” In 2006, however, Hwang was kicked off the faculty when it was revealed that his claim to have cloned a human embryo was a spectacular hoax. The university determined that Hwang had fabricated evidence, embezzled government funds, and illegally paid for donor eggs from female researchers in his lab. After tearfully apologizing, he was sentenced to two years in prison, but escaped serving time when a judge suspended the sentence, writing in the verdict that Hwang “has shown he has truly repented for his crime.”


Undeterred, Hwang founded Sooam to continue his research. At first, he concentrated on cloning pigs and cows, which still makes up a sizable part of the company’s business. Then, in 2007, he was contacted by a representative of John Sperling, the billionaire founder of the University of Phoenix. Sperling had a girlfriend whose dog, Missy, had died a few years earlier. “She wanted to see Missy again,” says Wang, the Sooam researcher. Hwang cloned Missy in 2009, launching the lab’s foray into the commercial duplication of dogs.


But questions abound…


Cloning is highly inefficient; at least two out of three attempts fail. Are the puppies delivered deformed or stillborn? Are they born in pain? What makes cloning dogs unethical, experts say, is when it causes more suffering than natural reproduction. During the process, critics say, surrogate mothers often receive injections of hormones to make them receptive to the embryos. “It’s the same hormones used in humans going through I.V.F.,” says CheMyong Jay Ko, who directs a research lab on reproduction and stem cells at the University of Illinois at Urbana-Champaign. “Injecting these hormones is not good for the dogs, particularly when it’s repeated over and over again.”


After Streisand revealed the origins of Miss Scarlett and Miss Violet, animal-rights activists launched a Twitter campaign called #adoptdontclone, urging people who lose their pets to choose a dog from among the millions of natural-borns that have no home. “People who pay $100,000 to create a new dog seem to forget that there are so many that have no one who cares about them,” says Vicki Katrinak, head of animal-research issues for the Humane Society. “We’re opposed to cloning of any animal for profit.”


The clone researchers at Sooam insist that they provide a necessary service for grieving dog lovers. “After death, it’s hard for people who were really close to their dogs,” says Wang. “For those people, a clone is the alternative to a funeral. Some people taxidermy their dogs, others cremate them. Cloning is another way of dealing with death—the closest thing to getting back the lost dog, or a part of it.”



#FakeNews is not isolated to American politics.  In the developing world, the phenomenon can unleash deadly consequences in real life. False information has flooded social media in recent years, inciting violence from Brazil to Sri Lanka.



In India, false rumors about child kidnappers have gone viral on WhatsApp, prompting fearful mobs to kill two dozen innocent people since April. One of the first to be killed was a 65-year-old woman named Rukmani. She and four family members were driving to a temple in May when a mob on this road mistook them for “child lifters” and assaulted them.



WhatsApp, which is owned by Facebook, has a quarter billion users in India alone. Some of the false messages on the app describe gangs of kidnappers on the prowl. Others include videos showing people driving up and snatching children. This clip went viral. It was produced as part of a public service announcement in Pakistan, but it was edited to look like a real kidnapping. The authorities don’t know who altered the video.







In the wake of the viral video, Rukmani’s family was targeted seemingly at random. As they got close to their temple, the family stopped to ask for directions. A grandmother nearby grew suspicious and called her son, who raised the alarm. The family became nervous and decided to turn back. By the time they got to the next village, a crowd was waiting for them. They were stripped naked and beaten with iron rods, wooden sticks, bare hands and feet. Videos of the attack were circulated widely online. When it was over, Rukmani was limp and lifeless. The others were left for dead. Their red sedan was crushed, and their belongings were stolen. The region’s top government official said the police had gone around for weeks before the attack warning people not to believe the false kidnapping rumors. But they were no match for WhatsApp. “We could not compete,” he said.



The messages in India have preyed on a universal fear: harm coming to a child. And the millions of poorly educated Indians coming online for the first time mean many are quick to believe what is on their phones.



WhatsApp’s design makes it easy to spread false information. Many messages are shared in groups, and when they are forwarded, there is no indication of their origin. The kidnap warnings have often appeared to come from friends and family. WhatsApp said it was horrified by the killings. Last week, it took out newspaper ads to educate people about misinformation and pledged to work more closely with police and independent fact-checkers.


Authorities across India have tried curbing the attacks. Besides warning people of the false rumors, they have arrested some who spread them. In a few places, they briefly shut down the internet. On Tuesday, the Supreme Court urged the government to use “an iron hand” against mob violence.



The police have arrested 46 people for the attack on Rukmani and her family and are pursuing 74 more. The most recent mob attack was on July 12. A software engineer was killed and three companions were injured after giving chocolates to children outside a school.




After his father died, James Vlahos found himself reaching for his phone, aching to speak with his father. But instead of it being a futile gesture, when Vlahos said, “Hey, Dad!” his father responded, “How the hell are you?”


Using off-the-shelf artificial intelligence software and his father’s oral history, Vlahos built a Dadbot — a virtual version of his father that he carries with him on his mobile phone. It’s a glimpse of things to come, of a future in which we’ll all talk to dead people.


Long the stuff of science fiction, the idea of communicating with the cyber version of a dead person has become real. By combining someone’s digital footprint — all their emails, tweets, videos, likes and dislikes, product reviews and so on — with natural language processing and AI algorithms, it’s now possible to extend someone’s life into a virtual eternity.



Instead of starting with a dead person’s digital footprint, Palo Alto-based startup Eternime “collects your thoughts, stories and memories and stores them forever into an artificial intelligence avatar that looks and talks like you,” according to project manager Dora Halás. In development since 2014, the service is in private beta and may not be ready for several more years.


In an era when texting with chatbots and talking to Siri and Alexa have become mainstream, the idea of interacting with a virtual dead relative or friend seems less unusual. Hossein Rahnama, a visiting scholar at the MIT Media Lab, predicts that communicating with the digital avatars of dead people will be commonplace within two to five years. “We’re already seeing glimpses of this, and the technology is maturing,” says Rahnama, who is leading a project in what he calls Augmented Eternity. “It’s just a question of how it gets rolled out in order to go mainstream.”


But even as chatbots offer consolation, they’re still only a facsimile of the deceased. When Vlahos asked his Dadbot, “Where are you now?” its reply was preprogrammed: “As a bot I suppose I exist somewhere on a computer server in San Francisco.” But when Vlahos asked, “Do you love me?” his Dadbot didn’t seem to understand the question, replying only, “Whoops, I missed you there.”




As Bitcoin becomes an increasingly popular form of digital cash, the cryptocurrency is being accepted in exchange for everything from socks to sushi to heroin. But if one anarchist has his way, it’ll soon be used to buy murder, too.


Last month Forbes magazine received an encrypted email from someone using the pseudonym Kuwabatake Sanjuro, who pointed them toward his recent creation: the website Assassination Market, a crowdfunding service that lets anyone anonymously contribute bitcoins toward a bounty on the head of any government official–a kind of Kickstarter for political assassinations. According to Assassination Market’s rules, if someone on its hit list is killed–and yes, Sanjuro hopes that many targets will be–any hitman who can prove he or she was responsible receives the collected funds.


For now, the site’s rewards are small but not insignificant. In the four months that Assassination Market has been online, six targets have been submitted by users, and bounties have been collected ranging from ten bitcoins for the murder of NSA director Keith Alexander, to 40 bitcoins for the assassination of President Barack Obama, to 124.14 bitcoins–the largest current bounty on the site–targeting Ben Bernanke, chairman of the Federal Reserve and public enemy number one for many of Bitcoin’s anti-banking-system users. At Bitcoin’s current, rapidly rising exchange rate, that’s nearly $75,000 for Bernanke’s would-be killer.



Sanjuro’s grisly ambitions go beyond raising the funds to bankroll a few political killings. He believes that if Assassination Market can persist and gain enough users, it will eventually enable the assassinations of enough politicians that no one would dare to hold office. He says he intends Assassination Market to destroy “all governments, everywhere.”




“Last Moments” is a virtual reality video that simulates one’s final minutes before committing assisted suicide. London-based filmmaker Avril Furness was inspired to create the video after visiting Dignitas, an assisted suicide clinic in Switzerland, in the hope of demystifying the procedure.


Furness came to create the assisted-suicide film thanks to Charlie Brooker’s Black Mirror. A former advertising creative, Furness was interested in becoming a filmmaker. She had written a Black Mirror-esque script about a dystopian future with a “one-in-one-out policy,” where, to have children, couples must convince one of their relatives to kill themselves. To research the subject, she went to an exhibition at Bristol Museum. There, she found a full-scale model inspired by the room at Dignitas, where, since 1998, 310 Britons had traveled to end their lives.


Sitting on the “Ikea-looking couch”, listening to recorded testimonials from people who had died, Furness was spellbound. “Everything was just so bleak and ordinary,” she says. “I imagined how I’d feel if this was the last space I’d ever see.” Her fictional dystopia seemed thin and false in comparison to this real-life drama. And then a thought came to her: using virtual reality, she could show people what it was like by putting them in the shoes of a person undergoing an assisted suicide.


Shot from the perspective of the viewer, the film allows a person wearing a VR headset to see the room as if they’re really in it. The trailer focuses on two characters apart from the viewer – a crying loved one, and the woman who presents you with the ultimate choice.


When the headset goes on, you find yourself sitting across from a blonde woman with a tear-streaked face; she tries to feign a smile. ‘Are there any last words?’ a second woman asks, as she sets a tray of prescription bottles down on the table beside you. As you will see, the experience of dying can have a profound effect on the living.


Attendees at the Amsterdam Funeral Fair were able to take things one step further by hopping inside a capsule, known as the “Sarco”, and taking it for a test drive. The brainchild of 70-year-old euthanasia activist Doctor Philip Nitschke (aka the “Australian Doctor Death”), the 3D-printed device fills with gas to end a person’s life quickly. The media has taken to calling it “The Tesla Of Death.”


 Members of the public can try out a virtual death simulation inside the "Sarco" suicide machine at Amsterdam's funeral fair


The prospective user passes an online test to show that they are sane and want to die of their own will, after which they receive a capsule access code that’s valid for 24 hours. Fortunately, the machine’s unveiling doesn’t offer the full experience. Instead, funeral fair-goers will slip on virtual reality glasses while in the Sarco “to see if this could be a preferred life ending for them”. Through the VR glasses visitors will be able to choose a view of the Alps or the sea as the last thing they see, before pressing the suicide button, which will turn everything black.


Nitschke –  who assisted in four suicides in Australia in the late 1990s under the short-lived Rights of the Terminally Ill Act of 1995 – said: “The Sarco makes it possible to die with elegance and style.” The device, which was officially announced by Nitschke’s Exit International foundation in February, comes in two parts: a reusable machine base and a capsule that can be detached and used as a coffin.




Increasingly, people who call into the help hotlines and domestic violence shelters say they feel as if they are going crazy.


One woman had turned on her air-conditioner, but said it then switched off without her touching it. Another said the code numbers of the digital lock at her front door changed every day and she could not figure out why. Still another told an abuse help line that she kept hearing the doorbell ring, but no one was there.


Their stories are part of a new pattern of behavior in domestic abuse cases tied to the rise of smart home technology. Billions of connected devices are playing a frightening new role in domestic abuse, helping perpetrators harass their victims at any hour of the day, in any corner of the world. While smartphones, cameras and social media have broken down barriers to communication, they’re also erasing the physical distance between abusers and their targets, allowing them to track and torment their victims in terrifying new ways. Internet-connected locks, speakers, thermostats, lights and cameras that have been marketed as the newest conveniences are now also being used as a means for harassment, monitoring, revenge and control.



Before, to abuse someone emotionally or physically would require access to them. Now that everyone has an iPhone and social media, it has become possible for abusers to pretty much torment victims or survivors any time of the day.  People get very brave when they’re miles away from someone and behind a keyboard.


Out for dinner on an overseas business trip thousands of miles from the UK, Isobel answered the call on her mobile expecting to speak to her children. Instead, she heard the voice of her estranged husband, whom she was in the process of divorcing after years of violence in which she had been punched, kicked, strangled, pulled around by her hair and thrown down the stairs.


“Before the children came on the line, he told me exactly where I was – which city, in which country, and which restaurant I was sitting in,” she says. “I was absolutely beside myself. I was just so overwhelmed with fear, wondering how the hell he could pinpoint me like this. I asked how he knew and he said: ‘I can find you on your iPhone.’”



Striding through the snow-covered fortress, shooting zombies with her bow and arrow, Jordan Belamire felt like a god – right up until the moment someone named BigBro442 decided to “virtually rub my chest”. “Even when I turned away from him, he chased me around. Emboldened, he even shoved his hand toward my virtual crotch and began rubbing.”


With the hand of the #MeToo movement coming down heavy on sexual predators, many have found a safe haven for their old and noxious behaviors in virtual reality.


A new study conducted by the Portland-based research group The Extended Mind shows that almost 50% of women who engage in social virtual reality spaces on a regular basis experience sexual harassment–a number that’s on par with 2016 percentages of women who experience harassment at work. Far from being a utopian technology, free from social norms, VR is just as sexist as meatspace.


Sexual harassment has been a feature of online and gaming communities from the earliest days of the internet. Until now, the abuse has been largely limited to verbal and visual messages, but as virtual reality technology becomes more immersive, the line between our real bodies and our digital bodies begins to blur.


Are we doomed to build virtual worlds that are as hostile to women as the real one? And what will we do when virtual abuse feels as real as a physical assault?




As technology becomes an ever-greater part of daily life, the debate over how much of it we should embrace remains unresolved. To dramatize that question, artist Mark Farid will spend one month living within virtual reality (VR) to explore its impact on the human condition. This October, he'll spend 28 days in a gallery, wearing a VR headset and a pair of noise-canceling headphones. For the duration of the show, all he'll experience will be video and audio captured by a complete stranger going about their daily life. When they eat, he'll eat. When they sleep, he'll sleep. As much as modern technology permits, he will let his individual identity evaporate.



His project, called Seeing I, isn’t just meant to test the limits of artificiality; it’s also meant to bring him as close as possible to looking through someone else’s eyes. An “Input,” as Farid calls the person whose life he’ll see, will spend a month wearing a pair of glasses that can capture 180-degree video and 3D audio. The data for each day will be looked over for glitches or gaps, then sent to Farid six days later, giving assistants time to prepare the right meals and gather other materials they’ll need to make the experience as realistic as possible. Visitors might see him at the gallery, but he won’t know they’re there.



When he comes out, it will be with someone else’s recent memories and, he hopes, a little of their perspective on life. The four-week timeframe was chosen in part because “it’s not proven, but well-documented, that we lose habits and develop new habits after three weeks.”  Farid says he’s spoken to numerous psychiatrists, psychologists, and neuroscientists about the risks and possibilities. They suggested a few possibilities: the project might make him more empathetic by focusing his attention on others, cause delusions, or prove that the brain is surprisingly capable of accommodating new kinds of input. It’s a voyage inside the life of another and he will be going where no man has gone before.






There is a cultural crisis emboldening the misogyny and violence of the little-known "incel" movement (an abbreviation for the self-professed "involuntary celibate" community of men).


The so-called Manosphere – from which Incel and Volcel (those who describe themselves as voluntarily celibate) groups as well as Pick Up Artists (PUAs) and Men’s Rights Activists (MRAs) stem – is not a place of harmonious agreement, other than in the shared desire to blame women for all ills.


Those who identify as Incel bitterly call people who do have sex "Chads and Staceys," reportedly sharing cartoons and memes that present women as shallow seekers of hard-bodied perfection. They vociferously deny that the rejection by women and wider society that they see as so unfair is down to their own behaviour or choices. All of this would be merely pathetic were it not for the violence.


In 2014, a self-proclaimed "kissless virgin" named Elliot Rodger, who was active in the online "incel" community and felt rejected by women, drove to a sorority house in Santa Barbara and opened fire, leaving behind a YouTube video in which he proclaimed, "I don't know why you girls have never been attracted to me, but I will punish you all for it. It's an injustice, a crime. I'm a perfect guy."


On April 23rd of this year, a man named Alek Minassian, who posted on Facebook that "the Incel Rebellion has already begun!" and "All hail the Supreme Gentleman Elliot Rodger," drove a van into crowds of pedestrians in Toronto, killing 10 people.


Incel is misogyny weaponized. It turns self-pity into a lifestyle, an identifier that puts its adherents at direct odds with the societies they live in. We are the “normies” and therefore, in the minds of the most extreme Incel individuals, we deserve to die. We are to blame for their suffering and our apparent happiness – as demonstrated by having relationships – is rubbed in their faces constantly. You may not think you are a Chad or a Stacey but if you have sex, you’re the enemy.


The inclination to dismiss these men as sad losers dwelling in their parents' basements is understandable, but in a post-Elliot Rodger world, it's clear that we are not simply dealing with a movement of miscreant misogynists but with a cult, one where the boiling rage contained within these forums can and does spill out into terrorist acts of monstrous ferocity.


Dressed in leopard-print pyjamas and sunglasses, and clutching a Bengal cat, Russian hacker Evgeniy Bogachev looks like the archetypal Bond villain.


The man behind the infamous GameOver ZeuS malware, which he used to siphon hundreds of millions of dollars from victims’ bank accounts, Mr. Bogachev became extremely wealthy. At one point, he owned two villas in France and kept a fleet of cars parked around Europe so he would never have to rent a vehicle while on vacation.


To the F.B.I., Evgeniy M. Bogachev is the most wanted cybercriminal in the world. The bureau has announced a $3 million bounty for his capture, the most ever for computer crimes, and has been trying to track his movements in hopes of grabbing him if he strays outside his home turf in Russia.


He has been indicted in the United States, accused of creating a sprawling network of virus-infected computers to siphon hundreds of millions of dollars from bank accounts around the world, targeting anyone with money worth stealing — from a pest control company in North Carolina to a police department in Massachusetts to a Native American tribe in Washington.


In December of 2016, the Obama administration announced sanctions against Mr. Bogachev and five others in response to intelligence agencies’ conclusions that Russia had meddled in the presidential elections.


But it is clear that for Russia, he is more than just a criminal. At one point, Mr. Bogachev had control over as many as a million computers in multiple countries, with possible access to everything from family vacation photographs and term papers to business proposals and highly confidential personal information. It is almost certain that computers belonging to government officials and contractors in a number of countries were among the infected devices. For Russia’s surveillance-obsessed intelligence community, Mr. Bogachev’s exploits may have created an irresistible opportunity for espionage.


While Mr. Bogachev was draining bank accounts, it appears that the Russian authorities were looking over his shoulder, searching the same computers for files and emails. In effect, they were grafting an intelligence operation onto a far-reaching cybercriminal scheme, sparing themselves the hard work of hacking into the computers themselves. His involvement with Russian intelligence may help explain why Mr. Bogachev, 33, is hardly a man on the run. F.B.I. officials say he lives openly in Anapa, a run-down resort town on the Black Sea in southern Russia. He has a large apartment near the shore and enjoys sailing his own yacht.


These days, officials believe Mr. Bogachev is living under his own name in Anapa and occasionally takes boat trips to Crimea, the Ukrainian peninsula that Russia occupied in 2014.



There is a new Los Angeles mural designed to be a picture-perfect spot for Instagram users, but you can only snap a picture at the mural if you're "Insta-famous." For the hundreds who flock to the site, it is a certification of fame.


The exclusive art is located on Melrose near Fairfax, but it is only available to verified social media influencers and people with over 20,000 followers.


A sign posted at the mural reads: "For verified influencers and people with over 20,000 followers only." Organizers have even hired security to verify the credentials of people coming to take pictures.


This episode will track the pilgrimage of the wannabe famous across the country to Los Angeles to certify their Insta-famous status. Along the way, we’ll try to de-code what, if anything, that means.



Dutch company LegalThings announced last January that it plans to launch a mobile app called LegalFling, which will “verify explicit consent before having sex,” according to its website. “With the click of a button […] LegalFling allows you to request consent from any of your contacts. Sit back and relax while your fling confirms.”


Using the LegalFling app, when two or more people want to have consensual sex, they each digitally "sign" a contract in the app. LegalFling then attaches a cryptographic hash of the interaction (a string of characters that represents the text) to a small amount of cryptocurrency sent through the Waves network, a blockchain platform in the same family as Bitcoin and Ethereum. The hash is then permanently placed on the Waves blockchain for anyone to see. The app promises that consent can be revoked with a tap.
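To make the mechanism concrete: a cryptographic hash is a one-way fingerprint of a document, so only the fingerprint — not the agreement itself — needs to be published. A minimal sketch in Python, using a hypothetical JSON record (LegalFling's actual data format and its Waves integration are not described here, so the field names below are assumptions for illustration):

```python
import hashlib
import json

def consent_fingerprint(parties, terms, timestamp):
    """Serialize an agreement deterministically, then hash it.

    Only this 64-character hex digest would be anchored to a public
    blockchain; the contents of the agreement stay private, since a
    hash cannot be reversed to recover the original text.
    """
    record = json.dumps(
        {"parties": sorted(parties), "terms": terms, "timestamp": timestamp},
        sort_keys=True,  # deterministic key order => reproducible hash
    )
    return hashlib.sha256(record.encode("utf-8")).hexdigest()

# The same agreement always yields the same fingerprint,
# regardless of the order the parties are listed in...
a = consent_fingerprint(["alice", "bob"], {"explicit_consent": True}, "2018-01-10T21:00:00Z")
b = consent_fingerprint(["bob", "alice"], {"explicit_consent": True}, "2018-01-10T21:00:00Z")
assert a == b

# ...while any change to the terms produces an entirely different hash,
# which is what makes the on-chain record tamper-evident.
c = consent_fingerprint(["alice", "bob"], {"explicit_consent": False}, "2018-01-10T21:00:00Z")
assert a != c
```

This is why the app can claim a permanent, public record without exposing anyone's private details: verifying the record later means re-hashing the stored agreement and comparing fingerprints.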


“Revoking consent is as simple as saying ‘No,’” Arnold Daniels, a spokesperson for the company, told me in an email. “The app will remind both parties of the rules of the game in advance. This is a general concern, so we’re making sure that the app reminds you that ‘no’ really means ‘no.’ The benefit of LegalFling is that this can be done at the moment it counts most, right before engaging in sex.”