Latest SEO Link Building Sites

SEO Updates: Latest Google Updates

The Importance of Proper Site Updating and Maintenance to Achieve Success

An SEO update and content strategy includes a dissemination phase: maximizing content's potential for effective distribution through social networks.


Saturday, January 18, 2014

PPC Management Operations




When your PPC operations are set up well, account management should run like a finely tuned machine. Just like a car's engine, PPC management needs regular tune-ups and maintenance in order to perform at the optimal level.
When you have regularly scheduled checkups daily, monthly, quarterly, and semi-annually, you're less likely to miss opportunities to respond to change and increase your return on investment.
Planning is half the battle, so here's your starter calendar for your PPC management at the start of a new year.

Daily PPC Tasks

Every business is different and the level of attention needed for certain accounts will vary. But in general, one of the first things you should do is check your accounts each day to see what happened yesterday.
  1. For ecommerce, check ad spend and revenue.
  2. For lead gen sites, check conversions.
Those are the two basic performance indicators – what everyone cares about anyway. But look for standouts – did any one account spend way more or less than usual? Did it have a fantastic sales day? Track the ups and downs.
And don't get too freaked out if something looks out of the ordinary. Unless the client is used to making $10,000 per day and it dropped down to $1,000, it's likely part of the inconsistencies businesses sometimes see.
Pro tip: If you're an agency or consultant and you notice something is off, call up the client. They might be able to explain the odd behavior, for example, the shopping cart was buggy or someone internally clicked on the ad and put items in the shopping cart to test the process, thus making a purchase. (I've seen both of those scenarios and more). If you're running your own PPC, talk to others on your team or dig deeper to uncover reasons why you might be seeing this behavior.

Monthly, Quarterly & Semi-Annual PPC Tasks

These tasks within your PPC calendar will once again depend on the individual business and its advertising needs. For example, changing ad messaging could be a quarterly task, but depends on how much data you have about your ad performance, or if you have special campaigns that run consistently on a shorter timeline.
A good rule of thumb is: the newer the account, the more attention it will need – and some of the tasks we're going to go over may occur more frequently in that case.

5 Monthly PPC Tasks

1. Negative Keyword Management
The simplest and easiest way to avoid wasted money is managing negative keywords. If you stay on top of this, squandered ad spend is practically non-existent. For newer accounts or existing accounts that need a revamp, doing this daily is wise.
2. Review Ad Positions and Manage Bids
You want to make sure you're claiming top spots in paid search placement, just like in organic search. Bid management is key, and you can help streamline this with rules and settings in advertising platforms like AdWords.
3. Communications With Clients or the Team
If you're a party of one running your own campaigns, this doesn't really apply. But if you're managing PPC on behalf of a company, a monthly check-in is a must, and can go a long way when you learn about something the client or team is working on that you can assist with.
4. Watch for Good and Bad Performers
View this at the campaign level down to the ad group and keyword level. It's sometimes beneficial to do this in reverse as well.
5. Google Display Network Management
Site exclusions are to the Display Network what negative keywords are to paid search ads. Make sure you're managing which sites your ads are featured on, so you're not inadvertently showing up on weather sites if you're a cloud computing company (true story!).

12 Quarterly or Semi-Annual PPC Tasks

Many of the following are interchangeable (quarterly or semi-annual), and depending on the account, you could do them on a monthly basis. I recommend semi-annually at the very least.
1. Click-Through Rate Review
This is at the keyword level, and the rule of thumb is to flag anything below a 1 percent CTR. However, it's not uncommon to see high-volume keywords with a CTR under 1 percent.
If the ROI and quality score for the keyword are healthy, you can often just leave it be. Otherwise, consider pausing it and weighing your options for tweaking the messaging or something else.
2. Sitelinks Audit
It's a good idea to check in and see what sort of sitelinks are set up. When things change on the website (for example, sales or product pages are added), the sitelinks can also change to drive traffic to the new pages. You can also review how the sitelinks are performing, and axe the links that don't get clicks.
3. Quality Score Reports
Here, you're looking for low quality scores, between 1 and 3, that should be addressed immediately. Sometimes when accounts have been poorly managed, the quality score will remain low for some time – even if you're doing everything right – until the improved practices have been in place for a while. You might even decide to pause these keywords and reinstate them at a later date.
Quality scores of 4 and 5 are also worth flagging to investigate.
4. Keyword Trends and Program Development
You can sometimes find keyword trends when you're managing negative keywords. However, longer data sets like three or six months can help you better identify them. These trends can forge new campaigns you may not have thought of before.
5. Consider Remarketing and Product Listing Campaigns
These advertising options can fall by the wayside in favor of regular search network campaigns. Have a close look to see if remarketing or PLAs might be an option at this time. Similarly, if they are already set up, review their performance to look for improvements.
6. Campaign Performance Review
Starting at the campaign level, and working down to the keyword level, isolate the "darlings" of the campaigns and optimize for best performance. Also work on improving the worst. With poor-performing campaigns, make a decision about what you want to do – pause it or work on it more.
7. Ad Messaging
At this point in the campaign, it might be a good time to refine and test the ad copy.
8. Settings Audit
Sometimes campaign settings "mysteriously" change. It happens to the best of us, so it's always good to do a review of every setting to check for anything that may have been turned on, off, or modified accidentally.
9. Geo "Hot Spots"
Are there any cities, regions, or states that show the most ROI? Decide if you'll isolate those; it's always worth trying. Don't forget about international opportunities as well.
10. Search Partner Review
When you advertise on the Google Search Network, your ads can also appear on Google's search partner sites. Check the ROI for those partner placements and, if it makes sense, turn off the search partners to save money (especially if you're working with a smaller budget).
11. Competitor Review
This one can be challenging if you don't have good tools. It's sometimes hard to trust the data coming from providers that claim they have it. You can start with Google Auction Insights.
12. Day Parting
Check the campaigns against times of day or days of the week to see if anything may need adjusting. If you have a tight budget, you can save a little money by only running when you know your audience is online, for example, during the week for some B2B businesses.

Ongoing PPC Management

Advertising platforms change all the time with new features. Staying on top of this can be a full-time job, but build it into your calendar. You don't want to miss important updates that can enhance your ROI.
Try reviewing the latest AdWords and Bing Ads feature announcements once a month when you're performing some of your other reviews.
But keep in mind that not every new advertising product fits every account, so use discretion about what you pursue; testing new features is good, but you don't want to waste resources on features that aren't a good fit for your account.

Google Images Makes it Easier to Search by Usage Rights



Google Images has made a great change to its search results. Users can now search for images with specific usage rights more easily. This will be extremely helpful for webmasters and others looking for images they can use for publishing on their own sites.
While Google has actually offered filtering based on photo licenses since 2009, it was a little-known search feature buried in the advanced search options. With the change, users can easily see it and filter the results accordingly.
To access it, simply click "Search Tools" on the image results page, and along with the usual search settings such as size and date, there is now a new drop-down for usage rights. The default is set to "not filtered by license," but users can change it to "labeled for reuse", "labeled for commercial reuse", "labeled for reuse with modification", or "labeled for commercial reuse with modification".
Sites such as Flickr, as well as stock photography sites that offer a variety of photo rights, have long had this type of filtering in their own search results. Bing began offering its own license search filter last summer, so it's surprising that Google took so long to make the change obvious to the average searcher.
As a word of caution, as with any image search, be aware that some sites republish photos labeled for reuse even though they are not the original owner. Using Google's reverse image search can help determine the originating owner of an image and what the correct licensing on the photo is.

Monday, December 30, 2013

Tips for Using Google Webmaster Tools



Google Webmaster Tools is a free toolset that's absolutely invaluable for SEO troubleshooting.
It's pretty simple to set up: you just need to verify that you're the site owner (there are a number of ways to do this, so use whichever is best for you) and you'll have instant access to an abundance of useful information that will help you improve your website and your search engine optimisation (SEO).
Here are five tips that will get you started:

1. Crawl Stats

Crawl Stats give you information on Google's crawling activity over the last 90 days. When you click into this report, located under Diagnostics, you'll see three graphs:
Pages crawled per day: Overall, it’s a good sign to see this graph going up. Whilst there are peaks and troughs, you’ll be able to see if there is a steady incline, decline or no change at all. Spikes in this report are often due to the introduction of new pages or an increase in inbound links.
Kilobytes crawled per day: This graph should bear some resemblance to the Pages crawled per day graph in terms of its peaks and troughs.
Time spent downloading a page: This graph will be different from the above two and is likely (hopefully) not to show as many peaks. Peaks on this graph could indicate a server problem; normally, Google should not take very long to download your pages.
These stats are useful for diagnosing problems and gauging performance issues.

2. Not Found Errors

Not found crawl errors are very useful for usability and SEO. If customers are browsing around your site and finding that links are not taking them anywhere, they're likely to get annoyed and go elsewhere. This tool (accessed at the top right of the dashboard) will identify all not-found URLs on your site. Be aware that this data can sometimes be slightly outdated, and Google states:
If you don’t recognize these URLs and/or don’t think they should be accessible, you can safely ignore these errors. If, however, you see URLs listed in the ‘Not found’ section that you recognize and would like crawled, we hope you find the ‘Details’ column helpful in identifying and fixing the errors.
So don’t dwell too much on getting this down to 0 errors in GWT, just use the information to improve site usability.
As well as links from within your site that are leading to a 404, this will also show you links from outside sites that are leading to a 404. This aspect is particularly valuable for SEO. Use this feature in GWT to do is identify the linked to pages within your site that no longer exist and redirect those pages to a real page within your site. This tactic will lead to increased link juice and increased visitors.
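For example, on an Apache server a single 301 rule handles this; the paths below are hypothetical, and a WordPress redirect plugin would achieve the same result:

```apache
# .htaccess sketch (hypothetical URLs): point a deleted page that still
# attracts inbound links at the most relevant live page with a 301 redirect.
Redirect 301 /old-product-page/ http://www.example.com/new-product-page/
```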

3. Meta Descriptions and Title Tags

Google Webmaster Tools will provide you with a list of URLs that have problems in their title tags or meta descriptions. This list will include duplicates as well as titles or meta descriptions that are too long or too short. Go into Diagnostics and then HTML Suggestions to find this information. Duplicate meta titles especially can affect your rankings within Google, and meta descriptions should be snappy and targeted to each specific page to help that page's CTR.
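For instance, a unique title and description pair for one page might look like the following (the page, store name, and wording are hypothetical, with lengths kept roughly within typical display limits):

```html
<!-- Hypothetical product page: one unique title tag and meta description -->
<title>Red Leather Hiking Boots | Example Outdoor Store</title>
<meta name="description" content="Waterproof red leather hiking boots with free delivery. Compare sizes, read reviews and see current offers at Example Outdoor Store.">
```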

4. Top Search Queries

Whilst you can get your top search queries out of Google Analytics or whatever analytics tool you use, I particularly like the Webmaster Tools version for the simple reason that it shows your average position within Google as part of the data. This enables you to look at your top search terms by position. This is helpful because, when deciding which keywords to push, I like to focus on the keywords currently in positions 2-4, as improving positions at that level will have the biggest impact on traffic.

5. Site Links

If your site has a list of links below its Google listing, you can use the Sitelinks section within Site Configuration to control the links that are shown. You can't actually tell Google which links to show, but you can block links that you don't want shown.
These are just a few of the many tools available in Google Webmaster Tools, and Google often adds new features to this great tool. If you're not a regular user of GWT, try these features out for size and look around to get used to the other features on offer. If you are a regular user of GWT, let us know your favourite features and why.

WordPress Plugins to Improve SEO & Usability




1. WordPress SEO by Yoast

Download this plugin now and use it on every WordPress site you own. This plugin, in my opinion, is as important as the WordPress installation itself. It's also extremely easy to use.
Also, if you haven't already, make sure to use Yoast's Google Analytics plugin.

2. Simple URLs

This plugin is great. You can track outbound links and control them completely right within the WordPress backend. If you add Disallow: /go/ to your robots.txt file, it will also stop any authority from passing through the link itself.
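For reference, that robots.txt addition is just the following (assuming the plugin's /go/ prefix mentioned above):

```
# robots.txt: keep crawlers from following the /go/ redirect URLs
User-agent: *
Disallow: /go/
```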
You can use this plugin to keep track of these outbound links. For example, if you have affiliate links on your site, you can calculate a conversion rate from knowing the number of clicks to the number of people who purchase something via the affiliate link.

3. RB Internal Links

Although this plugin hasn't been updated in more than two years, it still should be included in every WordPress installation. This plugin helps with internal linking.
RB Internal Links is great because it uses the post ID to link internally rather than the URL itself. This means that if you change the URL of a page or post, the internal links pointing to it are updated automatically.
This cuts the risk of internal 404 pages that can harm SEO for internal pages, as well as ensuring that no visitor reaches a page that does not exist.

5. Widget Logic

This plugin works specifically with your widgets. When installed, an extra option is added into each widget where you can define exactly where that widget should/shouldn't appear. This is great when you want to control what content appears in which sections of the site.
Here are a few examples of widget logic you can use:
  • is_category(X) || (is_single() && in_category(X)) - if viewing Category X or a post within Category X.
  • is_archive() - if viewing any archive page.
  • is_page() - if viewing a page.
  • !is_page() - if viewing anything other than a page. Note that the ! negates the condition ("if not a page").
When you use this plugin, make sure you're comfortable with basic PHP, as incorrect conditions can lead to problems.
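These conditional tags are standard WordPress functions, so for context, here is roughly how the same logic reads in plain PHP inside a theme template. The 'seo' category slug and the markup are purely illustrative; Widget Logic simply evaluates expressions like these for you.

```php
<?php
// Illustrative sketch only (not Widget Logic's code): the same WordPress
// conditional tags, used directly in a theme template.
// 'seo' is a hypothetical category slug.
if ( is_category( 'seo' ) || ( is_single() && in_category( 'seo' ) ) ) {
    // Shown on the 'seo' category archive or on single posts in that category.
    echo '<div class="widget">SEO-related promo</div>';
} elseif ( ! is_page() ) {
    // Shown anywhere that is not a static page (posts, archives, home, etc.).
    echo '<div class="widget">General sidebar content</div>';
}
?>
```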

6. Members

You may find that the default user roles provided by WordPress aren't enough for you to control the access that you want. This plugin adds flexibility to edit existing user roles as well as adding additional user roles. The plugin also comes with easy to use widgets and shortcodes so you can limit content based on the user's role.

7. Use Google Libraries

This is a very simple plugin that substitutes JavaScript libraries loaded locally from your own server with Google's CDN versions. This saves bandwidth, keeps you on compressed versions of the scripts, and increases the chance that a user already has these files cached, which improves the general performance of your site.
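To illustrate the general approach the plugin automates (a hand-rolled sketch, not the plugin's actual code; the jQuery version and CDN URL are just examples), you could do something like this in a theme's functions.php:

```php
<?php
// Sketch: swap the locally bundled jQuery for Google's hosted copy.
function my_use_google_jquery() {
    if ( ! is_admin() ) {
        wp_deregister_script( 'jquery' );
        wp_register_script(
            'jquery',
            'https://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js',
            array(),   // no dependencies
            '1.10.2'
        );
        wp_enqueue_script( 'jquery' );
    }
}
add_action( 'wp_enqueue_scripts', 'my_use_google_jquery' );
```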

8. W3 Total Cache

While we're on the subject of performance, W3 Total Cache is the most powerful and comprehensive caching plugin available. This plugin handles everything from combining and minifying both CSS and JS to HTML line-break and comment removal, disk caching, browser caching, and more.
It's useful to test any settings to ensure there are no issues once they're enabled and deployed, but most of the time W3 plays ball with your WordPress installation.

9. Gravity Forms

This is the best plugin available. Although paid (from $39), this plugin is a must-have on any WordPress installation and pays for itself.
This plugin handles all kinds of form generation and management, from basic contact forms to complete content management. There is so much you can do within Gravity Forms that I can't cover it all in this post. Some examples of how you can use Gravity Forms:
  • Basic contact forms: Also includes seamless integration with CAPTCHA
  • Contact forms with email routing: This is great for larger companies. Based on the options filled out in the form, email is routed to a different address, saving time spent sifting through a generic inbox
  • MailChimp integration: You can use any form's email input and send that information to MailChimp directly through an add-on available (only for people who purchased the developer license) using MailChimp's API.
  • Creating content: Forms can actually generate posts or pages within your site. You can create a form that populates all the data needed to publish a new post or page, including the title, excerpt, body, featured image and more.
Another great thing about Gravity Forms is that all entries are stored and viewable within the backend of the site, meaning some forms don't even need to send email notifications on submission. Not enough for you? You can also export all entries as a CSV file, and import and export all forms to back up or transfer to other sites.

10. Twitter Feed Pro

(Full disclosure: This is my own plugin.)
This paid plugin ($19.99) outputs a Twitter feed based on a number of settings using the shortcode. You can output your latest tweets or someone else's (or a combination), view replies and public mentions, view favorites of any username or search for any term or hashtag. There are also many options for customizing the look and feel of how the tweets are output.
There are two reasons I mention this plugin.
First, the plugin outputs tweets as plain HTML rather than via jQuery (which is how the only other option, Twitter's official embedded timeline widget, works).
Additionally, this plugin is fully compatible with Twitter API v1.1. Some other plugins or Twitter feed options within WordPress themes use v1.0 of the API, which will retire on June 11. If you want to know more about this issue, I have written about Twitter API v1.1 and its implications.

Wednesday, December 18, 2013

A Little Duplicate Content Won't Hurt Your Rankings


Duplicate content is always a concern for webmasters. Whether it's a website stealing content from another site, or a website that hasn't taken an active role in ensuring it has great, unique, quality content, being filtered out of the Google index because of duplication is a problem.
In the latest webmaster help video from Google's Matt Cutts, he addresses how Google handles duplicate content, and when it can negatively impact your search rankings.
Cutts started by explaining what duplicate content is and why duplicate content isn't always a problem, especially when it comes to quoting parts of other web pages.
It's important to realize that if you look at content on the web, something like 25 or 30 percent of all of the web's content is duplicate content. … People will quote a paragraph of a blog and then link to the blog, that sort of thing. So it's not the case that every single time there's duplicate content it's spam, and if we made that assumption the changes that happened as a result would end up probably hurting our search quality rather than helping our search quality.
For several years, Google's stance has been that they try to find the originating source and give that result the top billing, so to speak. After all, Google doesn't want to serve up masses of identical pages to a searcher because it doesn't provide a very good user experience if they click on one page, didn't find what they're looking for, and then go back and click the next result only to discover the identical page, just merely on a different site.
Google looks for duplicate content and where we can find it, we often try to group it all together and treat it as if it's just one piece of content. So most of the time, suppose we're starting to return a set of search results and we've got two pages that are actually kind of identical. Typically we would say, "OK, rather than show both of those pages since they're duplicates, let's just show one of those pages and we'll crowd the other result out," and then if you get to the bottom of the search results and you really want to do an exhaustive search, you can change the filtering so that you can say, "OK, I want to see every single page" and then you'd see that other page. But for the most part, duplicate content isn't really treated as spam. It's just treated as something we need to cluster appropriately and we need to make sure that it ranks correctly, but duplicate content does happen.
Next, Cutts tackles the issue of when duplicate content is spam, such as websites that have scraped content from the original websites, or site owners who republish a lot of "free articles" that appear on masses of other websites. These types of sites have the biggest problem with duplicate content because they merely copy content created on other websites.
It's certainly the case that if you do nothing but duplicate content, and you are doing it in an abusive, deceptive, malicious, or a manipulative way, we do reserve the right to take action on spam. So someone on Twitter was asking a question about "how can I do an RSS auto blog to a blog site and not have that be viewed as spam," and the problem is that if you are automatically generating stuff that is coming from nothing but an RSS feed, you're not adding a lot of value, so that duplicate content might be a little bit more likely to be viewed as spam.
There are also cases where businesses might legitimately end up with duplicate content that won't necessarily be viewed as spam. In some cases, websites end up with duplicate content for usability reasons rather than SEO. For the most part, those websites shouldn't worry either.
But if you're just making a regular website and you're worried about whether you'd have something on the .com and the .co.uk, or you might have two versions of your terms and conditions, an older version and a newer version, that sort of duplicate content happens all the time on the web and I really wouldn't get stressed out about the notion that you might have a little bit of duplicate content.
Cutts does caution against local directory types of websites that list masses of cities but serve up empty listings with no true content about what the user might be looking for, as well as sites that create individual pages for every neighborhood they service even though the content is the same as what's on the main city page.
As long as you're not trying to massively copy for every city in every state in the entire United States, and show the same boilerplate text which is, "no dentists found in this city either," for the most part you should be in very good shape and not have anything to worry about.
Bottom line: as long as your duplicate content is there for legitimate reasons (e.g., you're quoting another website or you have two versions of your terms and conditions), you really shouldn't be concerned about it. However, Google certainly can and will take action against sites using duplicate content in a spammy fashion, because they aren't adding value to the search results.

Tuesday, December 17, 2013

6 Ways to Accelerate Your Local SEO Success in 2014



Forecasting SEO trends for local is quite difficult – primarily because I know in my heart of hearts that local SEO isn't a trend, phenomenon, or fad. Local – along with personalized search – is a necessary evolution to truly optimize the search experience for users across the globe. It is a mindset that SEO professionals can't afford to arrive at late.
Early adoption of best practices, voraciously reading case studies, and experimenting on your own are mission critical to surviving this new era of marketing. Join me for a digestible overview of what's happened already, and what's coming up next.
This article will cover Google Hummingbird; the overlap between mobile and local; a SoLoMo case study using Pinterest; turnkey local SEO strategies; tools; and a few important infographics.

The Hummingbird

Frank Chimero wrote an essay called "What Screens Want". Along with crystallizing the very nature of our relationship to screens – and hinting at the importance of cross-device consumption and consumerism – he talks about the language designers are using to, well, design.
In it, he rejects the current state of the web – one that is built around ideals such as privatization and power. Instead Chimero begs for an Internet that celebrates community and wildness. When it comes to local marketing, there seems no better starting place than here.
When the Hummingbird algorithm dropped, the majority of SEO professionals hardly noticed a difference. While Google itself said that the algorithm affected upwards of 90 percent of the queries, many rankings across keywords stayed the same.
Due to the interaction between Hummingbird and the Venice update – a tweak that led to more localized organic results for unbranded, non-geo-modified keywords – local SEO pros should celebrate this new algorithm. What this means is that there are even more opportunities to capture local traffic, for both queries such as [seo agency montreal] and [seo agency], as more keywords now trigger local results.
More than ever, it has become important to include regional-vertical pages for your local business. Hummingbird – likely the first of many algorithmic updates that will prefer context to content – forces local businesses to eliminate catch-all "Our Locations" pages, instead encouraging them to publish content specific to each place of business.

Mobility and Locality

One of my favorite "year in review" posts for digital marketing came from Karen McGrane, a brilliant content strategist. She compiled a list of mobile web statistics (sources found within) that are sure to knock the socks off of digital marketing managers across the globe. A sample:
  • 91 percent of American adults own a mobile phone.
  • 56 percent of American adults own a smartphone.
  • 63 percent of mobile phone owners use their phones to access the Internet.
  • Amazon, Wikipedia, and Facebook all see about 20 percent mobile traffic.
  • 77 percent of mobile searches take place at home or at work.
For the purposes of this article, one statistic that stood out in particular:
  • 46 percent of shoppers report using their phone to research local products and services.
Nearly one in two shoppers for local products and services are using their phone. If your mobile game isn't on lock, you are essentially neglecting or potentially insulting half of your target demographic.
Finding resources to make a mobile-friendly website in 2014 isn't simply a good idea, a high priority, or a mission-critical task. It is essential to the longevity and profitability of your business. Period. With sad statistics showing most B2B, Fortune 100, and consumer brands failing to stay up to snuff, mobile could be a fantastic opening for your business to slay your competition.
In November, I had the opportunity to listen to author and Google Digital Marketing Evangelist Avinash Kaushik speak at Think Quebec about the impact of mobile on search marketing. He pulled up example after example of terrible mobile search experiences. He drew attention to huge brands that were throwing away search traffic – often after creating demand for a particular product on a different marketing channel, and then not showing up in search.
"The web is so good at destroying things. If you suck – you die," he said.
Kaushik's words ring no less true for local SEO. If you aren't performing for 46 percent of your potential customers, you suck. And you will die – or at least your business might.

On-Site Quick Technical Fixes

"Share of voice" is becoming an increasingly interesting application to search engine marketing, and verges on being the most inclusionary digital marketing trend of 2013.
Share of voice addresses the entirety of the search engine results page, which as we know, is getting more complicated by the week. The crew at IGO Mobile Marketing has put together a digest of technical fixes to dominate search share of voice, including:
  • New advances in meta data for local marketing.
  • Local caps on sitelinks.
  • De-indexing and demoting useless pages.
  • Maximizing presence in IYP or local directories.
  • Optimizing review management processes.
  • Rich snippets – sentiments and testimonials.

Off-Site: Organic vs. Local Strategies

Examining the crossover between organic and local results can be a difficult mental exercise for many. Thankfully, Adam Steele at Lean Marketing put together an extensive, step-by-step guide to figuring out whether your organic efforts – on-site optimization and off-site outreach – make an impact on the particular local search results you are aiming to optimize for.
In brief, his findings are:
  • A "supermajority" of pack results relied on being picked up on in Google Maps.
  • A strong correlation between being the first position in organic and in the pack.
  • Local factors have a strong influence over the first position in organic.
This type of analysis is hugely beneficial because if the local results – or "pack results" – are being heavily influenced by strictly organic search signals, it may be enough for you to focus your SEO efforts on solidifying your placement in organic, using traditional SEO methodologies. In fact, Steele concludes, "your ability to crush it with organic SEO may just make or break your [local] campaign."

Local Citation Building Checklist

Citation building is a practice that is acutely separate from link building, but they share one definite similarity: if you abuse citation building, you will get burned. In this section we'll give some quick insights on what citations are exactly, and how to leverage them to influence your rankings. Before we get started, however, it's important to note that your listings are important not only on Google Maps, but also on Bing, Yahoo, and Apple.
There are five categories of directories that you'll be looking at in terms of citations:
  • Data-aggregators (LocalEze)
  • Horizontal directories (Yelp)
  • Industry-specific directories (Avvo)
  • Region-specific directories (Denver.com/places)
  • Unstructured citations (blogs)
Rather than focusing on the Moz or PageRank metrics of these directories, you will want to look for opportunities to place structured citations for your business online. Structured citations commonly consist of NAP (name, address, phone number) information. The quality of the website, the accuracy of these citations, and the relevancy of the directory are all essential to executing a successful citation-building campaign.
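As a reference point for keeping that NAP data consistent, here is a minimal, hypothetical example of schema.org LocalBusiness markup you might also publish on your own site (all business details are invented; this complements, rather than replaces, the directory citations discussed above):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Example SEO Agency",
  "telephone": "+1-555-555-0199",
  "url": "http://www.example.com/",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Denver",
    "addressRegion": "CO",
    "postalCode": "80202"
  }
}
</script>
```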

Sunday, December 15, 2013

New Facebook Like-Share Buttons Now Available to All



Facebook previously announced a redesign of its Like and Share buttons, unveiling the new Like-Share combo button to select brands. This week, Facebook announced the new buttons would be available to all, saying tests of the functionality have boosted sharing.
From Facebook:
In early tests over the past month after launching the new Like and Share buttons, we've seen more than a 5% lift in Likes and Share across the web. This is significant, given that both buttons are viewed over 22 billion times daily. Based on these results, we've rolled out the new design to everyone and extended it to the Follow and Like Box plugins as well.
Shareaholic released its data for Facebook referrals over the past month, which showed referring traffic from Facebook was up 47 percent.
Although Shareaholic does not offer the new FB buttons among our own share buttons, we still track inbound traffic on sites that use any of our other offerings (analytics and related content). Some of our publishers have adopted the new Like and Share buttons from Facebook and are obviously reaping the benefits. Others that use our share buttons still see gains in Facebook referral traffic.
"I think this is a very positive change for users and brands," said Danny Wong, who oversees growth and marketing at Shareaholic. "The buttons perform well, increasing engagement, which means more sharing across the web. Users now have an easier time acknowledging they 'like' a story and can easily 'share' things they think their friends would love, too. Brands benefit because their earned, owned and paid media will continue to attract more eyeballs." 


Sunday, December 8, 2013

Matt Cutts Discusses Duplicate Meta Descriptions


“The way I would think of it is, you can either have a unique metatag description, or you can choose to have no metatag description, but I wouldn’t have duplicate metatag description[s],” Cutts says. “In fact, if you register and verify your site in our free Google Webmaster Tools console, we will tell you if we see duplicate metatag descriptions, so that is something that I would avoid.”
“In general, it’s probably not worth your time to come up with a unique meta description for every single page on your site,” he adds. “Like when I blog, I don’t bother to do that. Don’t tell anybody. Ooh. I told everybody. But if there are some pages that really matter, like your homepage or pages that have really important return on investment – you know, your most featured products or something like that – or maybe you’ve looked at the search results and there’s a few pages on your site that just have really bad automatically generated snippets. We try to do our best, but we wouldn’t claim that we have perfect snippets all the time.”
No, believe it or not Google is not perfect (as Executive Chairman Eric Schmidt also reminded us).
Cutts concludes, “You know, in those kinds of situations, then it might make sense to go in, and make sure you have a unique handcrafted, lovingly-made metatag description, but in general, rather than have one metatag description repeated over and over and over again for every page on your site, I would either go ahead and make sure that there is a unique one for the pages that really matter or just leave it off, and Google will generate the snippet for you. But I wouldn’t have the duplicate ones if you can help it.”
Some will probably take Matt’s advice, and start spending a lot less time bothering with meta descriptions. Just remember that part about looking at the search results and making sure that Google isn’t displaying something too weird, particularly if it’s an important page.