A reminder that moderation in daily life is good for everyone. Even secondhand barking can be moderated so that all can be happy. Thanks to The New York Times for featuring a great article that gives us hope.
The dinosaurs didn’t know it, but their world might have narrowly averted upheaval this month.
Employees of the Metaverse Mod Squad monitor an array of Web sites from its headquarters in Sacramento. Many of its moderators work from home. (Photo: Jim Wilson/The New York Times)
In 2007, Metaverse provided virtual crowd-control services for a Second Life speech by an avatar of Newt Gingrich.
In the N.F.L. Rush Zone, the league’s virtual world for children, a Metaverse member played the role of referee. (Image: NFL Rush Zone)
For two years, all the denizens of Webosaurs, an online virtual world for children 5 to 12, could customize their dinosaur avatars with leather armor and other whimsical outfits.
Recently, though, the Webosaurs founder, Jacques Panis, decided that leather armor should be available only to premium members, who pay about $6 a month. Players with free membership would be denied that attire.
Then the Metaverse Mod Squad stepped in. The company employs moderators around the country who monitor the Webosaurs site to keep its users safe and happy.
In this instance, it told Webosaurs that if the change were made, the free users might abandon the Webosaurs world or turn on one another. In the end, the dinosaurs kept their armor, and Webosaurs avoided the possibility of alienating some of its 1.5 million registered users.
“I’m running a business, but Metaverse Mod Squad, as the moderators and community managers, is the voice of the kids,” Mr. Panis says.
Since starting Metaverse in 2007, Amy Pritchard, its chief executive, has emerged as an industry expert in creating safe, engaging online communities for both children and grown-ups.
Metaverse has a client list that includes the Cartoon Network, the National Football League, Nickelodeon and the State Department. It employs an army of workers — often stay-at-home moms — to monitor and moderate Web sites where children create their own characters, or avatars, and can interact with thousands of other users. Metaverse’s employees frequently create their own avatars to help maintain the peace.
Ms. Pritchard says the stakes are higher in online worlds intended for children, like Webosaurs. In more adult-oriented sites like Second Life, users must be at least 16 and are presumably more equipped to deal with the threats of online interaction.
She has found that keeping children safe has a lot to do with keeping them entertained. “If you just release kids into these online playgrounds with no one to monitor them and no rules, it’s ‘Lord of the Flies,’ ” she says. “But if you can balance safety with fun and engage the kids, I guarantee you’ll have a site with a great group of kids and no cyberbullying.”
In three and a half years, Metaverse has grown from a whimsical idea hatched in a Second Life virtual bar into an agency that has been profitable since 2009 and had revenue “in the millions” last year, she says, declining to be more specific. The company is private.
WITH her sensible bob and librarian glasses, Ms. Pritchard, 42, looks like a typical suburban mom, until you see her shoes. Her chunky Mary Janes, with oversized stitching, give away her less conventional side. So do the skateboard stickers slapped on the back of her PowerBook. One sticker is from a surf shop, another is from a punk band and two are from “Sesame Street,” added by her 5-year-old daughter, Mary, who in some ways is responsible for the existence of her mother’s company.
A lawyer by training, Ms. Pritchard expected to continue as a commercial litigator when she became a mother. But “having Mary changed everything,” she says.
She stumbled onto a business idea while exploring the virtual world of Second Life with her husband, Ron, who had taken a job at Linden Lab, Second Life’s creator. Ms. Pritchard was taken with the breathtaking landscapes, elaborate buildings and whimsical avatars — from long-legged blond bombshells to blue giraffes — that users created for themselves. But she says she noticed that few users visited some of the elaborate environments created by major corporations because the companies offered nothing to do there.
“Companies had no idea how to create relationships in 3-D,” she says.
Ms. Pritchard, however, knew exactly how to make friends online. As a side job, she had moderated message boards for the WB television network and had struck up close friendships with several other moderators.
After introducing them to Second Life, she persuaded five of her moderator friends to create avatars and join her regularly at a Second Life virtual sports bar called the Thirsty Tiger.
There, Ms. Pritchard struck up a friendship with the bar’s creator, Mike Pinkerton, a real-life lawyer in New Orleans. One night in July 2007, she ran this idea past him: What about a virtual company, providing remote moderators to staff Second Life sites for corporations, and to moderate Web forums? Mr. Pinkerton signed on as chief operating officer of the fledgling business.
By the end of the summer, the company had a name, a pool of on-call moderators drawn from Ms. Pritchard’s network, and a client: Newt Gingrich. Ms. Pritchard had learned that Mr. Gingrich, the former House speaker, was planning to create an avatar and to give a speech in Second Life in an area designed to look like the United States Capitol, created by an interactive marketing firm as a space for public debate.
Knowing that Second Life events like this were sometimes plagued by “griefers”— troublemakers who might appear in outlandish or offensive attire, create protest signs that could fill the entire screen, or otherwise disrupt activities — Ms. Pritchard called Mr. Gingrich’s office. She offered her company’s virtual security services to a confused aide.
“No one had ever heard of avatar bodyguards before,” Ms. Pritchard says.
When Mr. Gingrich’s avatar delivered his speech in September 2007, a bevy of Metaverse bodyguards, clad in go-go boots and 1960s-style mini-trench coats, surrounded him. They peacefully resolved the only security breach, asking one guest to cover her bikini-clad avatar in more seemly attire.
The potent combination of surveillance and fun turned out to work in all kinds of online spaces. When the CW network started a Second Life site for its series “Gossip Girl,” Metaverse staffed the site, a re-creation of the Upper East Side of Manhattan, with greeters and party planners who ran daily events and contests to engage visitors. “We were like social directors,” Ms. Pritchard says. “We found if we greeted people, told them what they could do, gave them an event card and introduced them to other people, they had more fun.”
Ms. Pritchard soon had a chance to test the method against a far more demanding audience: children. While Second Life thrives on a free-wheeling, anything-goes culture, a different breed of virtual world began to proliferate soon after Ms. Pritchard started her company.
The sudden growth of Club Penguin, acquired by the Walt Disney Company in late 2007, spawned a galaxy of virtual worlds for children. Suddenly Barbie, Build-a-Bear, Webkinz and countless other toys, games and entertainment properties had their own mini-universes, where children could create avatars, play with one another, care for virtual pets and furnish virtual dream homes. The rise of social media, meanwhile, produced another explosion in social games like Zynga’s Farmville, where players create characters and play cooperatively.
Putting children into these social environments raises risks of predators, privacy breaches, inappropriate conversations and bullying.
“Anybody can buy a profanity filter, but kids have all kinds of work-arounds,” says Anne Collier, co-director of connectsafely.org, which promotes the well-being of children. “There really is no substitute for human moderation.” But not every company can afford, or has the expertise, to hire an in-house moderation team; many prefer to outsource to Metaverse or to a handful of similar firms, including LiveWorld and I.C.U.C.
Ms. Pritchard’s approach to child safety is more camp counselor than cop. When children misbehave, Metaverse moderators send a private message to the miscreant, with a warning. Repeat offenders may receive a five-minute muted timeout or can be ejected from a site.
“Our policy is firm forgiveness,” Ms. Pritchard says. “Sometimes kids, and adults, too, come into a new environment and feel nervous or scared, and get attention by saying something inappropriate. By giving a warning or turning it into a joke and saying, ‘Come join us,’ you’ve given them a second chance to be part of the community.”
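The “firm forgiveness” approach amounts to a simple escalation policy: a private warning for a first offense, a five-minute muted timeout for a repeat, and ejection only after that. A minimal sketch of that policy might look like the following (the class and names here are illustrative, not Metaverse’s actual system):

```python
# Illustrative sketch of a "firm forgiveness" escalation policy:
# first offense -> private warning, second -> short muted timeout,
# further offenses -> ejection from the site.
from collections import defaultdict

WARN, MUTE, EJECT = "warn", "mute", "eject"
MUTE_SECONDS = 5 * 60  # the five-minute timeout described in the article

class ModerationPolicy:
    def __init__(self):
        # Track how many offenses each user has accumulated.
        self.offenses = defaultdict(int)

    def record_offense(self, user_id):
        """Record one offense and return the action to take."""
        self.offenses[user_id] += 1
        count = self.offenses[user_id]
        if count == 1:
            return WARN   # private message with a warning
        elif count == 2:
            return MUTE   # five-minute muted timeout
        else:
            return EJECT  # removed from the site

policy = ModerationPolicy()
print(policy.record_offense("rex42"))  # warn
print(policy.record_offense("rex42"))  # mute
print(policy.record_offense("rex42"))  # eject
```

The point of the graduated response, as Ms. Pritchard describes it, is that the first two steps leave the door open for the child to rejoin the community.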
MANY companies have found Metaverse’s combination of surveillance and social direction appealing, from both a safety and a brand-management perspective.
In the N.F.L. Rush Zone, the league’s virtual world, Metaverse avatars in striped referee shirts greet children with high-fives and hand out pigskins, the game’s virtual currency.
“For many of the kids, that conversation between their avatar and the referee is their first connection with the N.F.L.,” says Peter O’Reilly, the league’s vice president for fan strategy and marketing. “We needed a safe space that promoted the values of the N.F.L. and moderators who were passionate about the teams.”
Recently, when a live chat with Drew Brees, the quarterback of the New Orleans Saints, was delayed by 45 minutes, Metaverse referees pacified some 10,000 restless children with trivia contests and games, then rewarded them with pigskins for waiting patiently.
Still, outsourcing moderation does not work well for every company. Melissa Parrish, an analyst at Forrester Research specializing in interactive marketing, said a possible drawback of outsourced moderating was that “you have someone who’s not embedded in your company talking as if they are.”
To avoid losing touch with their users, some clients, like Webosaurs, insist on having a Metaverse manager working in their own offices, rather than managing the moderation remotely. “The manager sits right here, and is involved in our ongoing development efforts,” says Mr. Panis of Webosaurs, which is based in Dallas and owned by Reel FX. Employing an entire team of its own in-house moderators, he says, would not be cost-effective.
To staff a project, Metaverse assigns a manager, one of the company’s 115 regular employees, to oversee it. Managers then draw on a pool of 500 prescreened moderators around the country, many of whom are stay-at-home parents, students and others with flexible schedules.
The pool gives Metaverse quick access to moderators with expertise in a wide array of subjects, from the N.F.L. to Harry Potter. For one project, the company had to find people to judge user-submitted rap videos for a contest sponsored by a major record label. “We needed people who knew specifically about East Coast and West Coast rap, and would recognize gang signs” so they would not be shown, Ms. Pritchard says.
BECAUSE she started Metaverse as a way to spend more time with her daughter, the company endorses a family-friendly culture, and does not require specific hours, even for its regular employees. Until a year ago, the company didn’t even have an office. Instead, the staff met regularly in its swanky virtual headquarters in Second Life London. Now, 35 employees work out of Metaverse’s brick-walled studio in Sacramento or from a small office in Brooklyn. The rest work from home.
While moderators are often paid by the post — Ms. Pritchard herself was paid 3 cents a post for moderating a WB chat board — Metaverse typically pays contractors $8 to $25 an hour, with no benefits.
“For the service level most of our clients want,” she says, “we need to make sure that there’s a dedicated person moderating, even if there’s nothing to do.”
While some companies might have doubts about using contractors working from home, Ms. Pritchard is proud of her company’s model. What she pays may not be a lot to a professional in California, but it’s a decent salary for a stay-at-home mom in Wisconsin, she says.
In the last year, Ms. Pritchard has found that Metaverse’s approach to dealing with children also works for customer service, which companies increasingly provide via corporate Facebook pages, Twitter feeds or other social media forums.
Companies including Kabam, the social game developer, and Horizon DataSys, the data recovery firm, have hired Metaverse to provide online customer service. For social media support, interactions between moderators and customers occur in text, via instant messages, Facebook or e-mail. That makes these exchanges easy to monitor, says Charlene Li, founder of Altimeter, the technology research firm, and author of “Open Leadership: How Social Technology Can Transform the Way You Lead.”
“With a call center, you can only monitor about 5 percent of your calls,” she says. “Here you can monitor every single one, and if the tone isn’t quite right, you can correct it immediately.”
While customer service and children’s virtual play may seem worlds apart, both ultimately come down to respectful communication in a social environment.
“They hire us,” Ms. Pritchard says, “because we know how to have conversations when millions of people may be listening.”