Latest SEO Link Building Sites


The Importance of Proper Site Updating and Maintenance to Achieve Success

An SEO and content strategy includes a dissemination phase: maximizing each piece of content's potential for effective distribution through social networks.


Monday, December 30, 2013

WordPress Plugins to Improve SEO & Usability


1. WordPress SEO by Yoast

Download this plugin now and use it on every WordPress site you own. This plugin, in my opinion, is as important as the WordPress installation itself. It's also extremely easy to use.
Also, if you haven't already, make sure to use Yoast's Google Analytics plugin.

2. Simple URLs

This plugin is great: you can create, track, and manage outbound links entirely within the WordPress backend. If you add Disallow: /go/ to your robots.txt file, search engines are also blocked from crawling the redirect URLs, which stops any authority from passing through the links themselves.
You can use this plugin to keep track of those outbound links. For example, if you have affiliate links on your site, you can calculate a conversion rate by comparing the number of clicks to the number of people who purchase via the affiliate link.
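A minimal robots.txt sketch, assuming the plugin's redirects live under the default /go/ path:

    User-agent: *
    Disallow: /go/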

3. RB Internal Links

Although this plugin hasn't been updated in more than two years, it should still be included in every WordPress installation. This plugin helps with internal linking.
RB Internal Links is great because it uses the post ID to link internally rather than the URL itself. This means that if you change the URL of a page or post, every internal link pointing to it updates dynamically.
This cuts the risk of internal 404 pages that can harm SEO, and it ensures no visitor lands on a page that doesn't exist.
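As an illustration (the post ID and anchor text here are placeholders, and the exact syntax can vary by plugin version), linking by ID instead of URL looks something like this:

    [intlink id="42"]our on-page SEO checklist[/intlink]

WordPress then resolves ID 42 to that post's current permalink at render time.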

4. Widget Logic

This plugin works specifically with your widgets. When installed, an extra option is added into each widget where you can define exactly where that widget should/shouldn't appear. This is great when you want to control what content appears in which sections of the site.
Here are a few examples of widget logic you can use:
  • is_category(X) || (is_single() && in_category(X)) - if viewing Category X or a post within Category X.
  • is_archive() - if viewing any archive page.
  • is_page() - if viewing a page.
  • !is_page() - if viewing anything other than a page. The ! operator negates the condition ("if not").
Only use this plugin if you're comfortable with basic PHP, as an incorrect condition can break widget display or throw errors.
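For example, to show a widget on the blog home page and on single posts, but never on static pages, you could enter a combined condition like this into the widget's logic field (these are standard WordPress conditional tags):

    ( is_home() || is_single() ) && !is_page()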

5. Members

You may find that the default user roles provided by WordPress aren't enough for you to control the access that you want. This plugin adds flexibility to edit existing user roles as well as adding additional user roles. The plugin also comes with easy to use widgets and shortcodes so you can limit content based on the user's role.

6. Use Google Libraries

This is a very simple plugin that substitutes JavaScript libraries loaded from your own server with the same versions served from Google's CDN. This saves bandwidth, keeps you on compressed versions of the scripts, and increases the chance that a visitor already has the files cached, which improves the general performance of your site.
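Conceptually, the plugin does something like the following simplified sketch (not its actual code; the jQuery version shown is illustrative):

    // In a theme's functions.php: swap local jQuery for Google's CDN copy
    function swap_local_jquery_for_cdn() {
        if ( ! is_admin() ) {
            wp_deregister_script( 'jquery' );
            wp_register_script(
                'jquery',
                'https://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js',
                array(),
                '1.10.2'
            );
            wp_enqueue_script( 'jquery' );
        }
    }
    add_action( 'wp_enqueue_scripts', 'swap_local_jquery_for_cdn' );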

7. W3 Total Cache

While we're on the subject of performance, W3 Total Cache is the most powerful and comprehensive caching plugin available. It handles everything from combining and minifying CSS and JS to stripping HTML line breaks and comments, plus disk caching, browser caching, and more.
Test your settings before deploying to ensure there are no issues once caching is enabled, but most of the time W3 plays ball with your WordPress installation.

8. Gravity Forms

This is the best plugin available. Although paid (from $39), it is a must-have on any WordPress installation and quickly pays for itself.
This plugin handles all kind of form generation and management from basic contact forms to complete content management. There is so much you can do within Gravity Forms that I can't cover it in this post alone. Some examples of how you can use Gravity Forms:
  • Basic contact forms: Also includes seamless integration with CAPTCHA
  • Contact forms with email routing: This is great for larger companies. Based on the options filled out in the form, email is routed to a different address, saving time sifting through a generic inbox.
  • MailChimp integration: You can take any form's email input and send it straight to MailChimp through an add-on (available only with the developer license) that uses MailChimp's API.
  • Creating content: Forms can actually generate posts or pages within your site. You can create a form that populates all the data needed to publish a new post or page, including the title, excerpt, body, featured image and more.
Another great thing about Gravity Forms is that all entries are stored and viewable in the site's backend, meaning some forms don't even need email notifications on submission. Not enough for you? You can also export all entries as a CSV file, and import/export entire forms to back them up or move them to other sites.
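For instance, dropping a form into any post or page is a one-line shortcode (the form ID 1 here is a placeholder):

    [gravityform id="1" title="false" description="false"]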

9. Twitter Feed Pro

(Full disclosure: This is my own plugin.)
This paid plugin ($19.99) outputs a Twitter feed based on a number of settings, using a shortcode. You can output your latest tweets or someone else's (or a combination), view replies and public mentions, view any username's favorites, or search for any term or hashtag. There are also many options for customizing the look and feel of the output.
There are two reasons I mention this plugin.
First, it outputs tweets as plain HTML rather than rendering them with jQuery, which is otherwise the only way to do it via Twitter's official embedded timeline widget.
Additionally, this plugin is fully compatible with Twitter API v1.1. Some other plugins and Twitter feed options within WordPress themes use v1.0 of the API, which retires on June 11. If you want to know more about this issue, I have written about Twitter API v1.1 and its implications.

Wednesday, December 18, 2013

A Little Duplicate Content Won't Hurt Your Rankings


Duplicate content is always a concern for webmasters. Whether it's a site stealing content from another site, or a site that hasn't taken an active role in getting unique, high-quality content, having pages filtered out of the Google index over duplication is a real problem.
In the latest webmaster help video from Google's Matt Cutts, he addresses how Google handles duplicate content, and when it can negatively impact your search rankings.
Cutts started by explaining what duplicate content is and why duplicate content isn't always a problem, especially when it comes to quoting parts of other web pages.
It's important to realize that if you look at content on the web, something like 25 or 30 percent of all of the web's content is duplicate content. … People will quote a paragraph of a blog and then link to the blog, that sort of thing. So it's not the case that every single time there's duplicate content it's spam, and if we made that assumption the changes that happened as a result would end up probably hurting our search quality rather than helping our search quality.
For several years, Google's stance has been that they try to find the originating source and give that result the top billing, so to speak. After all, Google doesn't want to serve up masses of identical pages to a searcher because it doesn't provide a very good user experience if they click on one page, didn't find what they're looking for, and then go back and click the next result only to discover the identical page, just merely on a different site.
Google looks for duplicate content and where we can find it, we often try to group it all together and treat it as if it's just one piece of content. So most of the time, suppose we're starting to return a set of search results and we've got two pages that are actually kind of identical. Typically we would say, "OK, rather than show both of those pages since they're duplicates, let's just show one of those pages and we'll crowd the other result out," and then if you get to the bottom of the search results and you really want to do an exhaustive search, you can change the filtering so that you can say, "OK, I want to see every single page" and then you'd see that other page. But for the most part, duplicate content isn't really treated as spam. It's just treated as something we need to cluster appropriately and we need to make sure that it ranks correctly, but duplicate content does happen.
Next, Cutts tackles the issue of where duplicate content is spam, such as websites that scrape content from the original sites, or sites that republish masses of "free articles" that appear on countless other websites. These types of sites have the biggest duplicate content problem because they merely copy content created elsewhere.
It's certainly the case that if you do nothing but duplicate content, and you are doing it in an abusive, deceptive, malicious, or a manipulative way, we do reserve the right to take action on spam. So someone on Twitter was asking a question about "how can I do an RSS auto blog to a blog site and not have that be viewed as spam," and the problem is that if you are automatically generating stuff that is coming from nothing but an RSS feed, you're not adding a lot of value, so that duplicate content might be a little bit more likely to be viewed as spam.
There are also cases where businesses legitimately end up with duplicate content that won't necessarily be viewed as spam. In some cases, websites end up with duplicate content for usability reasons rather than SEO. For the most part, those websites shouldn't worry either.
But if you're just making a regular website and you're worried about whether you'd have something on the .com and the .co.uk, or you might have two versions of your terms and conditions, an older version and a newer version, that sort of duplicate content happens all the time on the web and I really wouldn't get stressed out about the notion that you might have a little bit of duplicate content.
Cutts does caution against local directory types of websites that list masses of cities but serve up empty listings with no true content about what the user might be looking for, as well as sites that create individual pages for every neighborhood they service, even though the content is the same as what's on main city web page.
As long as you're not trying to massively copy for every city in every state in the entire United States and show the same boilerplate text which is, "no dentists found in this city either," for the most part you should be in very good shape and not have anything to worry about.
Bottom line: as long as your duplicate content is there for legitimate reasons (e.g., you're quoting another website, or you have two versions of your terms and conditions), you really shouldn't be concerned. However, Google certainly can and will take action against sites utilizing duplicate content in a spammy fashion, because those sites aren't adding value to the search results.

Tuesday, December 17, 2013

6 Ways to Accelerate Your Local SEO Success in 2014



Forecasting SEO trends for local is quite difficult – primarily because I know in my heart of hearts that local SEO isn't a trend, phenomenon, or fad. Local – along with personalized search – is a necessary evolution to truly optimize the search experience for users across the globe. It is a mindset that SEO professionals can't afford to arrive at late.
Early adoption of best practices, voracious reading of case studies, and experimenting on your own are mission critical to surviving this new era of marketing. Join me for a digestible overview of what has happened already and what's coming next.
This article will cover Google Hummingbird; the overlap between mobile and local; a SoLoMo case study using Pinterest; turnkey local SEO strategies; tools; and a few important infographics.

The Hummingbird

Frank Chimero wrote an essay called "What Screens Want". Along with crystallizing the very nature of our relationship to screens – and hinting at the importance of cross-device consumption and consumerism – he talks about the language designers are using to, well, design.
In it, he rejects the current state of the web – one that is built around ideals such as privatization and power. Instead Chimero begs for an Internet that celebrates community and wildness. When it comes to local marketing, there seems no better starting place than here.
When the Hummingbird algorithm dropped, the majority of SEO professionals hardly noticed a difference. While Google itself said that the algorithm affected upwards of 90 percent of the queries, many rankings across keywords stayed the same.
Due to the interaction between Hummingbird and the Venice update – a tweak that led to more localized organic results for unbranded, non-geo-modified keywords – local SEO pros should celebrate this new algorithm. It means there are even more opportunities to capture local traffic, for queries such as [seo agency montreal] and [seo agency] alike, as more keywords now trigger local results.
More than ever, it is important to build regional-vertical pages for your local business. Hummingbird – likely the first of many algorithmic updates that will prefer context to content – pushes local businesses to eliminate catch-all "Our Locations" pages and instead publish content specific to each place of business.

Mobility and Locality

One of my favorite "year in review" posts for digital marketing came from Karen McGrane, a brilliant content strategist. She compiled a list of mobile web statistics (sources found within) that are sure to knock the socks off of digital marketing managers across the globe. A sample:
  • 91 percent of American adults own a mobile phone.
  • 56 percent of American adults own a smartphone.
  • 63 percent of mobile phone owners use their phones to access the Internet.
  • Amazon, Wikipedia, and Facebook all see about 20 percent mobile traffic.
  • 77 percent of mobile searches take place at home or at work.
For the purposes of this article, one statistic that stood out in particular:
  • 46 percent of shoppers report using their phone to research local products and services.
Nearly one in two shoppers for local products and services is using their phone. If your mobile game isn't on lock, you are essentially neglecting – or potentially insulting – half of your target demographic.
Making a mobile-friendly website in 2014 isn't simply a good idea, a high-priority item, or a mission-critical task. It is essential to the longevity and profitability of your business. Period. With statistics showing most B2B, Fortune 100, and consumer brands failing to stay up to snuff, mobile could be a fantastic opening for your business to slay the competition.
In November, I had the opportunity to hear author and Google Digital Marketing Evangelist Avinash Kaushik speak at Think Quebec about the impact of mobile on search marketing. He pulled up example after example of terrible mobile search experiences, drawing attention to huge brands that were throwing away search traffic – often after creating demand for a particular product on a different marketing channel, and then not showing up in search.
"The web is so good at destroying things. If you suck – you die," he said.
Kaushik's words ring no less true for local SEO. If you aren't performing for 46 percent of your potential customers, you suck. And you will die – or at least your business might.

On-Site Quick Technical Fixes

"Share of voice" is becoming an increasingly interesting application to search engine marketing, and verges on being the most inclusionary digital marketing trend of 2013.
Share of voice addresses the entirety of the search engine results page, which as we know, is getting more complicated by the week. The crew at IGO Mobile Marketing has put together a digest of technical fixes to dominate search share of voice, including:
  • New advances in meta data for local marketing.
  • Local caps on sitelinks.
  • De-indexing and demoting useless pages.
  • Maximizing presence in IYP or local directories.
  • Optimizing review management processes.
  • Rich snippets – sentiments and testimonials (see the markup sketch after this list).
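As one illustration of that last item (the business name, reviewer, and rating values are hypothetical), a testimonial can be marked up with schema.org microdata so it becomes eligible for review rich snippets:

    <div itemscope itemtype="http://schema.org/Review">
      <span itemprop="itemReviewed" itemscope itemtype="http://schema.org/LocalBusiness">
        <span itemprop="name">Acme Dental</span>
      </span>
      Reviewed by <span itemprop="author">Jane D.</span>:
      <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
        <span itemprop="ratingValue">5</span> out of <span itemprop="bestRating">5</span>
      </div>
      <p itemprop="reviewBody">Friendly staff and painless cleanings.</p>
    </div>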

Off-Site: Organic vs. Local Strategies

Examining the crossover between organic and local results can be a difficult mental exercise for many. Thankfully, Adam Steele at Lean Marketing put together an extensive, step-by-step guide to figuring out whether your organic efforts – on-site optimization and off-site outreach – make an impact on the particular local search results you are aiming to optimize for.
In brief, his findings are:
  • A "supermajority" of pack results relied on being picked up in Google Maps.
  • There is a strong correlation between ranking first in organic results and first in the pack.
  • Local factors have a strong influence over the first position in organic results.
This type of analysis is hugely beneficial because if the local results – or "pack results" – are being heavily influenced by strictly organic search signals, it may be enough for you to focus your SEO efforts on solidifying your placement in organic, using traditional SEO methodologies. In fact, Steele concludes, "your ability to crush it with organic SEO may just make or break your [local] campaign."

Local Citation Building Checklist

Citation building is a practice quite distinct from link building, but they share one definite similarity: if you abuse citation building, you will get burned. In this section we'll give some quick insight into what citations are, exactly, and how to leverage them to influence your rankings. Before we get started, however, it's important to note that your listings matter not only on Google Maps, but also on Bing, Yahoo, and Apple's maps.
There are five categories of directories that you'll be looking at in terms of citations:
  • Data-aggregators (LocalEze)
  • Horizontal directories (Yelp)
  • Industry-specific directories (Avvo)
  • Region-specific directories (Denver.com/places)
  • Unstructured citations (blogs)
Rather than focusing on the MozRank or PageRank values of these directories, look for opportunities to place structured citations for your business online. Structured citations commonly consist of NAP (name, address, phone number) information. The quality of the website, the accuracy of the citation, and the relevancy of the directory are all essential to executing a successful citation building campaign.
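For illustration (the business details are made up), the same NAP data can also be marked up on your own site with schema.org's LocalBusiness vocabulary, here as JSON-LD:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "LocalBusiness",
      "name": "Acme Dental",
      "telephone": "+1-303-555-0123",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Denver",
        "addressRegion": "CO",
        "postalCode": "80202"
      }
    }
    </script>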

Sunday, December 8, 2013

Matt Cutts Discusses Duplicate Meta Descriptions


“The way I would think of it is, you can either have a unique metatag description, or you can choose to have no metatag description, but I wouldn’t have duplicate metatag description[s],” Cutts says. “In fact, if you register and verify your site in our free Google Webmaster Tools console, we will tell you if we see duplicate metatag descriptions, so that is something that I would avoid.”
“In general, it’s probably not worth your time to come up with a unique meta description for every single page on your site,” he adds. “Like when I blog, I don’t bother to do that. Don’t tell anybody. Ooh. I told everybody. But if there are some pages that really matter, like your homepage or pages that have really important return on investment – you know, your most featured products or something like that – or maybe you’ve looked at the search results and there’s a few pages on your site that just have really bad automatically generated snippets. We try to do our best, but we wouldn’t claim that we have perfect snippets all the time.”
No, believe it or not Google is not perfect (as Executive Chairman Eric Schmidt also reminded us).
Cutts concludes, “You know, in those kinds of situations, then it might make sense to go in, and make sure you have a unique handcrafted, lovingly-made metatag description, but in general, rather than have one metatag description repeated over and over and over again for every page on your site, I would either go ahead and make sure that there is a unique one for the pages that really matter or just leave it off, and Google will generate the snippet for you. But I wouldn’t have the duplicate ones if you can help it.”
Some will probably take Matt's advice and start spending a lot less time on meta descriptions. Just remember the part about checking the search results and making sure Google isn't displaying something too weird, particularly for an important page.
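For a page that matters, the handcrafted description is just the standard meta tag in the page's head (the content shown is a hypothetical example):

    <meta name="description" content="A hand-written summary unique to this page, not repeated anywhere else on the site.">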

Friday, December 6, 2013

Surprise! Google Updates PageRank


Christmas has come early for webmasters eagerly waiting to see if PageRank would ever update again. Twitter is abuzz with webmasters who noticed PageRank was updated early this morning, for the first time since February 2013. It’s surprising to many, since Matt Cutts had said there were no plans for another 2013 PageRank update due to technical issues.
At Pubcon Las Vegas in October, Cutts said we hadn’t seen a recent PageRank update because the pipeline that pushes PageRank data from the internal Google servers to the toolbar broke. There were no plans to fix it, he said, at least not for the remainder of the year. However, it appears that they decided getting this information out to the public was useful enough that they fixed the problem.
It seems the majority of sites that are new since the February update are starting out at PR1 or PR2. However, sites that already had PageRank in the PR4-PR6 neighborhood didn't see much improvement this time.
“We looked at hundreds of sites and 90 percent dropped,” says Dave Naylor of Bronco. “We’ve not seen many gain PageRank in big leaps this time. We saw PR6s drop to PR1s, but not many PR1s rise to a PR6.”
There is also a lot of speculation that this data is not fresh and is in fact several months old. Since Google's internal PageRank tool is updated daily, it seems a bit mysterious that Google would push out stale data – unless there is a reason for it.
“My gut feeling is that the PageRank is from when the Google PageRank system broke, so it’s not new PageRank; it’s second hand PageRank from September-ish,” says Naylor.
A PageRank update in the few weeks leading up to Christmas (especially as the first since February) is also a bit unusual. Arriving during a busy shopping season, it will definitely have an impact on those who have purchased links from high-PR sites, if those sites have seen a drop in PR.
It is also very important to remember that Google updates PageRank internally on a daily basis, and that internal data is what feeds the algorithm – not the PageRank publicly visible through the Google toolbar or similar tools. However, the perception of a site's quality, particularly when it comes to buying links or advertisements, still draws much of its weight from the site's current toolbar PageRank, even when that data is months old.

Google Webmaster Tools Now Finds Smartphone Crawl Errors


It can be a bit complicated for websites with a high number of smartphone visitors to figure out issues like 404 errors when only smartphone or only desktop users are affected. Often, site owners don't realize there is a problem because they do most of their troubleshooting and maintenance from a desktop computer; they simply don't notice mobile issues unless someone specifically alerts them.
Google Webmaster Tools recognizes this is an issue, especially with mobile traffic increasing at such a rapid rate. The crawl errors page has been updated to include smartphone-specific crawl errors that Googlebot-Mobile discovers while crawling the web with a smartphone user agent.
Pierre Far, a Webmaster Trends Analyst at Google, has announced that webmasters can now find a wide range of crawl information and errors for smartphones:
  • Server errors: A server error is when Googlebot got an HTTP error status code when it crawled the page.
  • Not found errors and soft 404s: A page can show a "not found" message to Googlebot, either by returning an HTTP 404 status code or when the page is detected as a soft error page.
  • Faulty redirects: A faulty redirect is a smartphone-specific error that occurs when a desktop page redirects smartphone users to a page that is not relevant to their query. A typical example is when all pages on the desktop site redirect smartphone users to the homepage of the smartphone-optimized site.
  • Blocked URLs: A blocked URL is when the site's robots.txt explicitly disallows crawling by Googlebot for smartphones. Typically, such smartphone-specific robots.txt disallow directives are erroneous. You should investigate your server configuration if you see blocked URLs reported in Webmaster Tools.
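To illustrate that last category, a hypothetical (and deliberately erroneous) robots.txt like the following would block Google's smartphone crawler from the entire site and surface as blocked URLs in the report:

    User-agent: Googlebot-Mobile
    Disallow: /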
The smartphone crawl errors are already live in Webmaster Tools. Simply log into your account, click "Crawl Errors" under the "Crawl" submenu, and select the Smartphone tab to view any crawl errors for your website.

Facebook Developers Gets A New Web Site

For the longest time, developers of Facebook apps were met with the same ol’ Web site when they visited the official Facebook Developers portal. Now those same developers are in for a treat as Facebook has decided to give its developer portal a complete makeover.
Facebook Developers announced today on its blog that it completely remodeled its Web site to make it easier for you, the developer, to find what you’re looking for. The services they offer, including API documentation and app submissions, have also been completely revamped.
Here’s what the new landing page looks like:
Seo Services
Facebook says the new design will make the following easier:
  • Manage your apps and configure Facebook integrations
  • Navigate our improved documentation
  • Submit your app to Facebook App Center with a simplified flow
  • Find and report bugs with a faster response turnaround
  • Learn about the latest updates and news relevant to you on our homepage
Oh, and before you go running off to check it out for yourself: you can find the new Web site here. If you don't see it yet, Facebook says it's slowly rolling out right now, and everybody should start seeing the new site within the next few weeks. When you do get the new site, you'll be greeted with a tour of all the changes. Afterwards, Facebook welcomes any and all feedback developers may have.

Thursday, December 5, 2013

How to Use Images in Your Link Building Campaigns


It's no secret that content with images attracts more attention. I've seen statistics saying that content with images can get well over 90 percent more total views than content without, and that paid social posts with images generate more engagement.
I also prefer content that includes images. It breaks up the tedium of reading an article, sometimes provides a laugh, and helps make the content stick in my mind.
However, too few people seem to actively market their images in their link building campaigns. It's been a while since we've been asked to build image links, and that's a shame. I've also noticed that the pages we're building links to, for the most part, don't make good use of images either.

Easy Ways to Use Images in Link Campaigns

Let's look at a few easy ways that images can be used in your link campaigns.
1. Use them to link back to your site from another site. We've had site owners say they don't do text links but would happily do an image link (see the sketch after this list).
2. Use them in your content to naturally attract more links. Anything that makes users happier makes them more likely to share, and that ups the odds for more links.
3. Use them in your content to increase user engagement. Maybe you've used an image that is just too hysterical not to comment on, whereas a visitor wouldn't have commented on your text alone? Yes, it happens.
4. Gain traffic from them when they're found in an image search. Maybe a person finds your site for the first time through an image search, and that leads to a new customer, a new email subscriber, some social love, or a new link.
5. Use them to promote your site on social media (not just Pinterest), and build links to those social profiles, too. According to a Buffer blog post, "tweets with image links get 2x the engagement rate of those without." More engagement equals the potential for more links.
6. Create infographics and comics.
7. Use them to help promote your seasonal content. If you're running a company-wide donation program for the holidays, a photo of all the donations you've gathered is much more interesting than just saying "yeah, we have a lot of donations."
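As a quick sketch for the first point above (all URLs and the alt text are hypothetical), an image link is simply an anchor on the linking site wrapped around an image, with the alt attribute acting as anchor text for search engines:

    <a href="http://www.example.com/local-seo-guide/">
      <img src="http://www.example.com/images/local-seo-chart.png"
           alt="Local SEO traffic chart from Example.com" />
    </a>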

Friday, November 29, 2013

How to Prepare for Google's 2014 Algorithm Updates



It has been an incredibly eventful year in terms of updates from Google. Major 2013 changes included further releases of Penguin and Panda, Hummingbird taking flight, and the shift away from providing keyword data thanks to encrypted search.
Many have gone so far as to ask whether SEO as a profession is dead: for one interesting perspective, see my recent Forbes interview with Sam Roberts of VUDU Marketing. My own take is less alarmist: Google has taken major spam-fighting steps that have shifted the playing field for SEO professionals and anyone trying to get their site on the map in the year ahead.
At the same time, the need for an online presence has never been stronger, while the landscape has never been more competitive. The potential to make a real ROI impact with your company's online marketing initiative is greater than ever. But defaulting to so-called "gray hat" tactics no longer works. Instead, SEO professionals need to step up and embrace a more robust vision of our area of expertise.

Content Marketing is Bigger than Ever

Content marketing will move from buzzword to mature marketing movement in 2014. From an SEO perspective, Google will be looking at companies that have robust content marketing efforts as a sign that they're the kind of business Google wants to support.
Think of all the advantages of a good content strategy:
  • Regular, helpful content targeted at your audience.
  • Social signals from regular sharing and engagement.
  • Freshness, or signs that your site is alive and growing.
  • Increasing authority connected to your body of work.
Sound familiar? It's the very approach to SEO that all of Google's recent updates have been designed to shape.
What changes you need to make in 2014 depends largely on where your company stands now in relation to an active content marketing strategy. Companies with existing content strategies will specifically need to assess the role of mobile.
If you've just begun to move in the direction of content marketing, it's time to really commit and diversify. If you haven't started yet, it's time to take the plunge.

     

Invest in Google+

In addition to strengthening your overall social media marketing position, it's going to be absolutely critical that you invest in your Google+ presence.
Moz's most recent study of ranking factors confirms that Google+ is playing an increasingly significant role in solid SEO rankings. The immediate areas to focus on include:
  • Establishing Google Authorship of your content and tying it to your Google+ account (see the snippet after this list). Authorship, which brings your body of content together, will play an important role in the SERPs as well as strengthening your Author Rank.
  • Those +1's add up. It isn't clear exactly how much Google +1's directly contribute, but it's fair to say they're a major factor in the "social signals" component of Google's algorithm. I expect this to increase in the year ahead.
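As a quick illustration (the profile URL is a made-up placeholder), authorship was typically established by adding a link from each page to your Google+ profile, with the profile linking back to the site from its "Contributor to" section:

    <link rel="author" href="https://plus.google.com/110123456789012345678/">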



Use the Link Disavow Tool Even if Your Site Hasn't Been Penalized


When Google launched the disavow tool, there was lots of discussion about whether people should use it even if Google hadn't taken a manual action against their sites. Some people made blacklists of sites they considered low-quality that were linking out to many different sites, and would mass-submit those lists to Google to disavow them across entire networks of sites.
But there was really no consensus on whether webmasters needed to do anything with the disavow tool if they didn't have a manual action, or whether they should leave it alone until a warning popped up.
Google's Distinguished Engineer Matt Cutts discussed the disavow tool, and whether it should be used as a pre-emptive measure, in a recent webmaster help video.
Cutts started by describing when the disavow tool should be used: after a webmaster has gone through the usual routes of trying to get links removed from sites they see as low-quality – whether those links came from poor SEO decisions or a bad SEO company – that is when to use the disavow tool, to disavow the links you haven't been able to get removed.
What about if you don't have a warning? Cutts recommended using the disavow tool if you have identified any suspicious or low-quality backlinks you believe might hurt you.
"If you are at all worried about someone trying to do negative SEO or it looks like there's some weird bot that's building up a bunch of links to your site and you have no idea where it came from, that's the perfect time to use disavow as well," Cutts said. "I wouldn't worry about going ahead and disavowing links even if you don't have a message in your webmaster console."
Clearly, Google is hinting that webmasters should keep a pretty close eye on their backlink profile and disavow based on any suspicious changes they see, rather than waiting to be penalized and see a message in Webmaster Tools.
"So if you've done the work to keep an active look on your backlinks and you see something strange going on, you don't have to wait around," Cutts said. "Feel free to just go ahead and pre-emptively say, you know what, this is a weird domain, I have nothing to do with it, and no idea what this particular bot is doing in terms of making links, so go ahead and do disavows even on a domain level."
It is worth noting that this is a different stance on the disavow tool than Google has previously taken. The disavow tool page states that it should be used only "if you are confident that the links are causing issues for you." However, Cutts is now saying something different: that the disavow tool can be used even when those links aren't yet causing issues.
Cutts also recognized that webmasters tend to get really stressed out about being penalized by Google. There is significant concern among many webmasters that links – something often out of their control – could negatively impact them, and that if they do get penalized, they will suffer the traffic consequences while the disavow is processed and the penalty eventually lifted.
"If you're at all stressed, if you're worried, if you're not able to sleep at night because you think Google might have something, or might see it, or we might get a spam report about you, or there might be some misunderstanding or an algorithm might rank your site lower, I would feel free to just go ahead and disavow those links as well," Cutts said.
So definitely keep a close eye on your backlinks, particularly new ones, if you aren't already, and make a point of disavowing poor-quality or suspicious backlinks before they can have any impact on your site.
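If you do decide to disavow pre-emptively, the file itself is simple (the domain and URL below are invented): a plain text list where # starts a comment, bare URLs disavow single pages, and domain: entries disavow at the domain level:

    # Weird bot network first noticed December 2013
    domain:spam-network.example
    http://low-quality-directory.example/listing/123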