User-generated content (UGC) is one of the most powerful marketing techniques used today. UGC includes blog or forum comments, reviews, photos, videos, and tweets. However, putting your brand’s content solely in the hands of the public can be dangerous to your reputation, consumers, and site users. Nonetheless, due to its popularity and far reach, most brands have implemented UGC into their marketing efforts.
Case in point: millennials. Crowdtap, a social influence marketing platform, released a research report in 2014 on millennials’ media consumption habits, how they perceive information, and how these same media sources impact purchasing decisions. This report found that millennials, who are forecasted to have record-breaking purchasing power, are more influenced by UGC than by any other marketing technique. Additionally:
· Millennials spend roughly 18 hours with different types of media per day.
· UGC takes up the majority of millennial time, with 5.4 hours a day (about 30%), versus all traditional media types (print, radio, and television) combined at 33%.
· 71% of millennials say they engage with social media daily.
· Millennials trust the information received through UGC 50% more than information from other media sources (e.g., TV, newspapers, and magazines).
· 35% of millennials find UGC more memorable than content from other sources.
As social media grows into a multibillion-dollar industry and connects more people, companies have been challenged with the exposure of their online users to the internet’s vast array of jerks—as well as becoming largely dependent on their ability to police UGC.
With that in mind… What is Content Moderation?
Content moderation is the scanning of UGC for text, video, or images that violate your brand’s values (e.g., racism, nudity, or violence), as well as unwanted ads or spam clogging up your forum posts. The goal is to protect your users and brand without affecting user experience. But for many companies, content moderation is often reactive instead of proactive. To ensure the successful application of content moderation, start with these considerations:
Determine Your Criteria
You need a content moderation plan, but what will your criteria be? Where will you draw the line? Some sites allow heated, aggressive conversations (such as sports or political forums), as opposed to children’s or advocacy sites, where the language must remain welcoming and encouraging at all times. The success of your company’s branding campaign is important, but so is the safety of your brand’s reputation and users.
Determine Your Expected Volume and Peak UGC Time
Before determining the best method of content moderation for your site, look at your current metrics and consider the projections in volume of your UGC. Know your site’s audience and when you expect to receive content from them. Some sites experience a heavy amount of UGC traffic at certain times of the day. For example, a social media forum may have a high volume of UGC in the evening hours, but very little in the morning.
Timing is critical: when an ad appears determines whether it sparks discussion on the site’s forum, draws comments on content posts, or, if the site is retail, attracts shoppers. So, if you are thinking about placing an ad, how much traffic are you expecting the ad to bring to your site?
Determine Your Process
Content moderation can be done in-house or by a content moderation service. Hiring outside help can be expensive, especially if the expected UGC volume is high and requires consistent coverage. If you decide to do internal content moderation, you will need to determine how many employees will be needed to ensure effective moderation. You will also have to be prepared to expand the moderation team if UGC volume spikes unexpectedly.
If you decide to outsource UGC moderation, be sure to use a reputable service provider that understands the role it plays in your business’ practices and campaign. Whether you choose to outsource or keep your content moderation in-house, recognize that these moderators will be accountable for the safety of your brand and company’s image, so select wisely.
The Six Types
Moderation, in the context of community members’ content, refers to the practice of monitoring submissions and applying a set of rules which define what is and is not acceptable content to be viewed on the site—followed by the removal of unacceptable content. There are six common types of moderation that can be used to maintain some sense of order within your brand’s community.
Pre-moderation provides the benefit of ensuring that inappropriate content is removed and kept off the visible sections of your website. To do so, when someone submits content to your website, have it placed in a queue to be checked or approved by a moderator before it goes visible.
· This type of moderation is a popular choice for online communities that serve high-risk audiences, such as those with child users (e.g., online gaming), as a way to prevent bullying or sexual grooming.
· However, while pre-moderation provides tight control of UGC, it is a common cause of death for online communities: it denies participants instant gratification, leaving them waiting for a moderator to clear their submission.
From a user-experience perspective, in an online environment where active moderation is necessary, post-moderation is a better practice than pre-moderation; content is immediately posted on the website, but replicated in a queue for a moderator to examine later. The main benefit of this for users is that conversations take place immediately, making for a faster-paced community.
· Unfortunately, because each piece of UGC is reviewed, the website operator may legally become the publisher of the content as the community grows, which can be a risk for some communities, such as celebrity-based news sites.
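The difference between pre- and post-moderation comes down to when a submission enters the review queue relative to when it becomes visible. Here is a minimal sketch of that flow; the class and method names are illustrative, not a real product’s API:

```python
from collections import deque

class ModerationQueue:
    """Illustrative sketch of pre- vs. post-moderation queues."""

    def __init__(self, pre_moderate=True):
        self.pre_moderate = pre_moderate
        self.visible = []        # content currently live on the site
        self.pending = deque()   # content awaiting a moderator's decision

    def submit(self, post):
        if self.pre_moderate:
            # Pre-moderation: hold the post until a moderator approves it.
            self.pending.append(post)
        else:
            # Post-moderation: publish immediately, but copy the post
            # into the review queue for a moderator to examine later.
            self.visible.append(post)
            self.pending.append(post)

    def review_next(self, approve):
        post = self.pending.popleft()
        if self.pre_moderate and approve:
            self.visible.append(post)          # clear it for publication
        elif not self.pre_moderate and not approve:
            self.visible.remove(post)          # take down rejected content
        return post

# Pre-moderation: nothing is visible until a moderator approves it.
pre = ModerationQueue(pre_moderate=True)
pre.submit("hello")
assert pre.visible == []
pre.review_next(approve=True)
assert pre.visible == ["hello"]

# Post-moderation: visible immediately, removable on later review.
post = ModerationQueue(pre_moderate=False)
post.submit("spam")
assert post.visible == ["spam"]
post.review_next(approve=False)
assert post.visible == []
```

Note how the pre-moderation path is the one that introduces the waiting period discussed above, while the post-moderation path keeps conversation instant at the cost of briefly exposing unreviewed content.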
Reactive moderation relies on users to flag content that is either in breach of the company’s rules or that members deem inappropriate. Here, community members themselves become responsible for reporting content.
· This type of moderation can be applied alongside other types as a safety net in case unsuitable posts get past the moderators, or, more commonly, as the only moderation method.
· However, if your company is chiefly concerned about how your brand is viewed, you might not be willing to take the risk that undesirable content will be visible on your site for any period of time.
Distributed moderation is still considered an uncommon method, relying on a system in which community members vote on whether submissions are in line with the site’s rules of use. This allows control of UGC posts to remain within the community, with supervision from senior moderators.
· Keep in mind that relying on the online community to police itself is a direction companies are rarely willing to take, for legal and branding reasons similar to those outlined under reactive moderation above.
· For this reason, a distributed moderating system can also be devised within the company, using staff to process submissions and determine whether posts should be removed.
In addition to human-powered UGC moderation systems, automated moderation is another option. Automated moderation uses technical tools to process content and apply defined rules to reject or approve user submissions.
· The most common tool used is the word filter, in which a list of banned words is entered and the tool either flags the word for review by a moderator, replaces it with an alternative, or rejects the post altogether.
· If you decide to use automated moderation, the best practice is to apply it in conjunction with one of the human-powered types of moderation discussed above.
· Another practice that helps the process is adding a CAPTCHA, a system designed to establish that a user is human in order to protect the website from bots.
Other types of auto moderation include:
· Block User: rejects all incoming posts from a specific user
· Whitelist User: approves incoming posts from a specific user, bypassing the moderator’s queue
· Block Keyword: rejects posts that contain specific banned words
· Image Filter: removes posts with banned images (such as nudity or violence)
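The rules above can be sketched as a simple decision function. This is an illustrative toy, not a production filter: the word lists and usernames are made up, and a real system would also cover images, rate limits, and CAPTCHA challenges:

```python
BANNED_WORDS = {"spamword", "slur"}        # illustrative Block Keyword list
BLOCKED_USERS = {"troll42"}                # Block User: reject everything from these users
WHITELISTED_USERS = {"staff_editor"}       # Whitelist User: bypass the review queue

def moderate(user, text):
    """Return 'approve', 'reject', or 'review' for one submission."""
    if user in BLOCKED_USERS:
        return "reject"                    # Block User rule fires first
    if user in WHITELISTED_USERS:
        return "approve"                   # trusted users skip filtering
    # Normalize words (strip punctuation, lowercase) before matching.
    words = {w.strip(".,!?").lower() for w in text.split()}
    if words & BANNED_WORDS:
        return "review"                    # Block Keyword: flag for a human moderator
    return "approve"

assert moderate("troll42", "hi there") == "reject"
assert moderate("staff_editor", "spamword") == "approve"
assert moderate("alice", "buy spamword now!") == "review"
assert moderate("alice", "nice post") == "approve"
```

Flagging banned-word hits for human review, rather than rejecting them outright, reflects the pairing advice above: automation narrows the queue, and a moderator makes the final call.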
Choosing not to moderate your business’ community at all is an option, but a dangerous one. Still, there are some valid reasons a company might choose not to regulate its user content: some businesses simply don’t have the resources or finances to monitor user content, while others may feel their community is too small to be affected by it.
· However, there are more benefits to using one of the moderation types discussed than using none at all. Without some form of content moderation, the community atmosphere of your business can deteriorate and turn off potential members.
With today’s technology, anything can go viral in the blink of an eye, so any company that allows UGC needs to have some form of content moderation. Take the necessary steps to ensure your website and business is protected so you may bask in the results of a successful UGC campaign, rather than be left trying to clean up a mess that harms your brand’s values.
If you are looking for a reputable content moderation service, contact Divergent Web Solutions today at 1-800-806-5661 or firstname.lastname@example.org
Divergent Web Solutions is a full-service web presence firm that creates high-quality websites and supports small businesses by handling their design, marketing, social, and/or development platforms, so they do not have to pay the price of hiring a custom developer. Contact us for a free consultation, and we’ll develop a customized quote tailored to your unique needs and budget at email@example.com