Apple Had To Pull This Video-Sharing App For Having Too Much Porn
by Steve Kovach on Feb 24, 2012, 4:01 PM
Viddy, an app that lets you share short videos shot on your iPhone with the public, was pulled from the App Store last night after Apple discovered it contained a large amount of user-uploaded, sexually charged content.

We first heard about Viddy yesterday from an anonymous tipster and were able to download the app before Apple got wind of the problem and pulled it.

Viddy does have plenty of innocuous videos (animals, dancing kids, etc.), and some celebrities even endorse the service. Bill Cosby has an account, as does the band Linkin Park. But there is also an overwhelming number of clips featuring partially or fully nude people. Some are especially graphic, with women dancing in the shower, touching themselves, and engaging in other activities too raunchy to mention here.

We scrolled through the most popular users on the Viddy network, and many are young women posting revealing videos like the ones described above. The whole experience is reminiscent of Chatroulette, the service that became a cesspool of live video chats full of nudity and sexually charged content.

Viddy users have also created hashtags that make the explicit videos even easier to find. For example, the #sexy hashtag is usually trending, and it leads to public videos of genitalia, nude dancing, and other graphic content.

We spoke to Viddy's CEO, Brett O'Brien, who said Apple pulled the app last night after a user complained. O'Brien says Viddy has a public flagging system that alerts the team when inappropriate videos go up, but there's a lag between a report and the video being taken down. He also said it's not the company's vision to become a haven for explicit content; his explanation was that Viddy's user base grew too quickly for the team to keep track of all the offensive videos.

But it's still hard to imagine the problem went unnoticed until yesterday. It's clear from the moment you launch Viddy that a huge portion of its users are there to share sexually explicit content.

O'Brien said Viddy is now working with Apple and that the app will be available for download again once all the offensive content is removed and a new system is in place to prevent more from going up.

We were also pointed to Viddy's terms of service, which state:

You acknowledge that creating, submitting or sharing your User Content may give rise to various types of legal liabilities and you represent that your User Content (whether or not you are the author of such content) complies at all times (both when first submitted and throughout its accessibility on the Service) with the TOS and all applicable laws. You understand that Viddy does not pre-screen User Content and is not liable for the content (including User Content) transmitted by users.

In the meantime, anyone who already has Viddy installed can still post and view new content.

Before being pulled, Viddy was a top-rated, heavily downloaded app rated for ages nine and up. For context, Instapaper, an app that lets you save online articles to read later on your iPhone or iPad, is rated for ages 17 and up.

Apple did not respond when reached for comment.