Thursday, December 6, 2012

8 Changes to Google AdWords in 2012 You Shouldn’t Miss


Every year, Google introduces new ad formats, changes AdWords settings, and releases a few high-profile tests into the wild.
This year saw a lot more aggressive monetization by Google, with several high profile changes that increased the real estate for ads at the expense of unpaid listings.
This roundup covers eight key changes introduced to AdWords in 2012:
  1. Google Shopping and Product Listing Ads
  2. Overhauled Location Targeting
  3. Dynamic Search Ads
  4. Enhanced Sitelinks
  5. Offer Extensions
  6. Dynamic Display Ads
  7. Mobile App Extensions
  8. AdWords for Video

1. Google Shopping and Product Listing Ads

Google Product Search, formerly Froogle, was once a free tool to allow anyone with a Merchant Center account to include their products in the visual product listings that sometimes appeared in search results.
This year, Google shifted those results to be entirely commercial and rebranded the service as Google Shopping. These results are now powered entirely by Product Listing Ads and take two forms.
First, they appear as a band of sponsored image ads, with product pictures, price and vendor, underneath the top search results listing:
[Image: Product Listing Ads band in Google search results for telescopes]
Second, certain specific product searches will replace the right column of search results with a product description and links to retailers:
[Image: Google Shopping ads in the sidebar for a Celestron product search]
This is a dramatic change to the search results, to how Google monetizes them, and to paid search in general. It furthers the two-year march toward paid search without keywords. Every retailer should incorporate these ads into their AdWords strategy.
For more education, check out these resources. Google has videos on how to create Product Listing Ads and how to optimize them. Watch Rimm-Kauffman Group's webinar with Google about Product Listing Ads and read Google's best practices for Google Shopping (PDF). Google's blog post, Google Shopping: momentum and merchant success, details some of the other changes and highlights merchants who have had success.

2. Overhauled Location Targeting

Location Targeting in AdWords got an overhaul with a new location targeting tool and some more sophisticated options for local targeting. This change is particularly relevant for brick-and-mortar retailers with a limited service area and companies that need or want to target specific areas.
The most prominent change is the introduction of ZIP code targeting.
[Image: AdWords ZIP code targeting]
This targeting pairs well with location insertion for ads with location extensions, which automatically creates custom ads based on the user's location.
[Image: AdWords location insertion with location extensions for a retailer]
Other features, like airport targeting, can be a huge boon for car rental companies or local hotels:
[Image: AdWords airport targeting]
Politicians got a break with Congressional district targeting:
[Image: AdWords congressional district targeting]
Watch this video to learn the basics:

3. Dynamic Search Ads

If you really want to catch a glimpse of the future of paid search, pay close attention to Dynamic Search Ads. This new technology will automatically crawl your site, according to logic you define, and create dynamic ads that combine information from that crawl with your ad template:
[Image: Dynamic Search Ad example]
The ads only trigger for search queries that aren’t eligible to match existing keywords in your account. Theoretically, this allows you to address gaps in your account and more quickly adapt to changing inventory.
[Image: How Dynamic Search Ads work]
There are many technical nuances to setting up and tracking these new ads.

4. Enhanced Sitelinks

In many ways, paid search has become a competitor to organic listings. This has become especially true in the top listing, which appears above organic results and pushes them lower on the page.
Most recently, Google expanded their sitelinks with enhanced sitelinks:
[Image: Enhanced sitelinks for a pizza store ad]
This creates a block of essentially five ads in the premier results location. Google automatically looks for text ads elsewhere in your account that match the sitelinks for your campaign and pulls in lines 1 and 2.

5. Offer Extensions

Google Offer Extensions add an offer below your ad text, similar to sitelinks. Offers can be redeemed online (trackable) or offline (not measurable in AdWords).
Offers only appear when your ad is in the top position. They're primarily meant for brick-and-mortar retailers, but offers can be used online as well.
SEER Interactive has a nice write-up on the ins and outs of Offer Extensions.
[Image: AdWords offer ad example]

6. Dynamic Display Ads

Three years ago (!) Google acquired a company called Teracent whose technology:
…creates display ads entirely customized to the specific consumer and site. The startup's proprietary algorithms automatically pick the creative parts of a display ad (images, colors, text) in real time, determined by factors like geographic location, language, the content of the website, the time of day, or the past performance of different ads.
This year, that acquisition finally came to fruition with the introduction of Dynamic Display Ads. The ad is one template whose featured product varies based on where the ad is shown.
Like Dynamic Search Ads, this is a step toward fully automating the advertising process. In this case, it makes scaling ecommerce display campaigns much more efficient.
Watch Google’s brief overview:

7. Mobile App Extensions

Apple's App Store isn’t very marketer friendly. Google answered the call of advertisers looking to promote their apps for download with new Mobile App Extensions.
This adds a sitelinks-like option underneath your main text ad:
[Image: Mobile app ad for a food delivery app in San Francisco]
These are an optional extension to your existing ads:
[Image: AdWords mobile app extension and app picker]
You can even track downloads in iOS if you integrate a snippet of code into your app.
Watch Google’s video for a high level overview:

8. AdWords for Video

YouTube is the second most popular search engine after Google and in the top 5 sites on the entire Internet. Google simplified the buying of video ads on YouTube and the Google Display Network with AdWords for video.
Targeting, measurement, and reporting for video ads on YouTube and the Google Display Network are integrated into AdWords.
To get a general overview read about Google video ads or watch Google’s overview video:
For detailed tactical advice, start with their step-by-step guide to YouTube (pdf).

SES New York
The SES New York Agenda has been posted.

Matt Cutts Talks Google Penguin, Negative SEO, Disavowing Links, Bounce Rate & More


What is Google looking for in a high quality website, worthy of top rankings? Well according to Matt Cutts, head of Google’s web spam team, you must first use as many keywords as possible; the optimal keyword density is actually 77 percent. Definitely link to porn sites. Annoying users: that’s a plus. You do get a boost for running AdSense, he revealed. Oh, and all those other search engines - Bing, Blekko, DuckDuckGo - they’re a bunch of hackers doing illegal things. You heard it here, folks.
If you bought any of that, I have some icebergs I’d like to sell you. That video was actually a mash-up spoof Google put out, one that was played at the beginning of You & A with Matt Cutts at SMX Advanced 2012.
All jokes aside, Cutts got into some great topics and dispelled some modern-day SEO myths in his session. Here are the highlights.

Is Penguin a Penalty?

No, neither Penguin nor Panda is a manual penalty, Cutts said. He explained that Penguin was designed to tackle “the stuff in the middle,” between fantastic, high-quality content and spam. Panda was all about spam, but the need for Penguin arose from this middle ground.
“It does demote web results, but it’s an algorithmic change, not a penalty. It’s yet another signal among over 200 signals we look at,” he said.
A penalty is a manual action taken against a site and you will “pretty much always” be notified in Webmaster Tools if it’s a penalty affecting your site.

Will a Reconsideration Request Help You Recover From Penguin?

No. “People who think it should rank higher after Penguin can let us know and we can look at it, and in a couple of instances, it actually helped us make a couple of tweaks to the algorithm.” You should submit a reconsideration request if you receive a warning.

Negative SEO - Will Google Add an Option to Disavow Links?

They sure seem to be thinking about it. People have been asking about negative SEO for a long time, Cutts said. He noted that Google has changed their documentation over time to reflect that negative SEO is not impossible, but it is difficult. Google is “talking about” enabling webmasters to disavow links, possibly within a few months.

Did Google Send WMT Notifications About Penguin?

Google is trying to be more transparent by sending out more warnings, he said. Only a single-digit percent of those 700,000 unnatural link warnings that went out around the time of Penguin were actually Penguin-related. The majority were for obvious black-hat tactics.

Is Google Trying to Make a Point About Buying Links?

Yes, they are. According to Cutts, “People don’t realize, when you buy links, you might think you’re very careful, that you have no footprints, but you may be getting into business with someone who’s not as careful. People need to realize as we build new tools, it becomes a higher-risk endeavor.”

Is SEO Going to Get More Difficult?

Yes. He notes that it’s become more challenging over the past five to seven years and SEOs should expect that trend to continue and even increase.

Does That Mean Google Hates SEOs?

Of course not. Though Cutts did hand out a spanking for SEOs who buy or sell links: “There are people who continue to sell links, although they don’t do any good, and that’s part of how SEO has a bad reputation.”
Later, he said he would consider giving link building for non-profits a try to better understand what SEOs are facing. When asked about the war on SEOs, he said, “There’s no war on SEOs!” and that it’s just a war on spam.

Should You NoFollow Affiliate Links?

Yes. While Google does understand the vast majority of affiliate network links, you should nofollow them if you're making money from them and are worried about it.

Are Links a Dying Signal?

No. “There’s a perception that everything will go social and links will be obsolete but I wouldn’t write the epitaph for links just yet,” he said.

Is Bounce Rate a Signal in Determining What Content May be Spam?

No. Cutts said the Google web spam team doesn’t use Google Analytics data. It’s not a bad thing when someone finds their answer right away and bounces, he said.

Is Google Ever Going to go Back to the Days Before (Not Provided)?

Nope. Not even if you write about it and stamp your feet up and down. He realizes it’s not good for marketers, but secured search is better for users, Cutts said. As more browsers move towards securing traffic, expect not provided to increase.

Why Isn’t AdWords Blocked From Referrer Data?

According to Cutts, this is because Google would then have to deal with exact matches for every search ever done and the ad database would grow exponentially. He did say that he would like to see that decision revisited, though.
Other interesting tidbits:
  • Hacked sites may abuse rich snippets, in which case Google may demote sites abusing them or take them away.
  • When asked why a site might stay penalized after attempting to remove bad links, Cutts said they look at a subsample and want to see an earnest effort to have those links removed, so if nothing in that sample changed, you might be SOL.
  • +1s are not the best quality signal yet because it’s still “early days.”
  • Google does not consider any of their sponsored ads paid inclusion because they are clearly marked.
  • In some cases, sites hit by Panda and Penguin might be further ahead just to scrap the site and start over.

Official Google Panda #22 Update: November 21


I am seeing another spike in SEO/Webmaster chatter at the ongoing WebmasterWorld thread of a possible Google Panda update.
It makes sense: Google told us about ten days ago that the update we thought we saw was not a Panda update, but that we should expect one in about 7-10 days. Well, it has been about ten days, and the forums are buzzing about it.
I emailed Google and they told me the Panda update happened around November 21st, well short of the 7-10 day estimate; in fact, it was less than two days after I asked.
So there was a Panda refresh on November 21, 2012 - version number 22.
Google did not tweet anything about this update, at least not yet.
Update: Google told us 0.8% of queries in English were impacted by this.


After Panda & Penguin, is Google Living Up to Its Great Expectations?


[Image: The evolution of Penguin]
It all starts with Google, doesn’t it? Not really – it’s all about Google today because Google is the most used search engine.
Google, like any other software, evolves and corrects its own bugs and conceptual failures. The goal of the engineers working at Google is to constantly improve its search algorithm, but that’s no easy job.
Google is a great search engine, but Google is still a teenager.
This article was inspired by my high expectations of the Google algorithm, expectations that have been blown away over the last year as I watched how Google's search results "evolved." If we look at some of the most competitive terms in Google, we see a search engine filled with spam and hacked domains ranking in the top 10.
[Image: Spam comparison of Google vs. Blekko results]

Why Can Google Still Not Catch Up With the Spammers?

Panda, Penguin, and the EMD update did clear some of the clutter. All of these highly competitive terms have been repeatedly abused for years. I don’t think there was ever a time when these results were clean, in terms of what Google might expect from its own search engine.
Even weirder is that the techniques used to rank this spam are as old as (if not older than) Google itself. And this brings me back to the question in the heading: why can Google still not catch up with the spammers?
The only difference between now and then is how long a spam result will "survive" in the SERPs. That window has decreased from weeks to days, or even hours in some cases.
One of the side effects of Google's various updates is a new business model: ranking high on high revenue-generating keywords for a short amount of time. For those people involved in this practice, it scales very well.

How Google Ranks Sites Today: A Quick Overview

These are two of the main ranking signal categories:
  • On-page factors.
  • Off-page factors.
On-page and off-page have existed since the beginning of the search engine era. Now let’s take a deeper look at the most important factors that Google might use.
Regarding on-page factors, Google will try to understand and rate the following:
  • How often a site is updated. A site that isn't updated often isn't necessarily low quality; update frequency mainly tells Google how often it should crawl the site, and Google compares it to other sites' update frequency in the same niche to determine a trend and pattern.
  • Whether the content is unique (duplicate content attracts a negative score).
  • Whether the content interests users (bounce rate and traffic data, mixing on-page with off-page signals).
  • Whether the site links out to a bad neighborhood.
  • Whether the site is interlinked with high-quality sites in the same niche.
  • Whether the site is over-optimized from an SEO point of view.
  • Various other, smaller on-page factors.
The off-page factors are mainly links. Social signals are still in their infancy, and no study has yet clearly shown a true correlation for social signals in isolation from the link signal; for now, it is all speculation.
As for links, they can easily be classified into two big categories:
  • Natural. Link appeared as a result of:
    • The organic development of a page (meritocracy).
    • A result of a “pure” advertising campaign with no intent of directly changing the SERPs.
  • Unnatural. Link appeared:
    • With the purpose of influencing a search engine ranking.
Unfortunately, the unnatural links represent a very large percentage of what the web is today. This is mainly due to Google’s (and the other search engines’) ranking models. The entire web got polluted because of this concept.
When your unnatural link ratio is way higher than your natural (organic) link ratio, it raises a red flag and Google starts watching your site more carefully.
Google tries to fight the unnatural link patterns with various algorithm updates. Some of the most popular updates, that targeted unnatural link patterns and low quality links, are the Penguin and EMD updates.
Google’s major focus today is on improving the way it handles link profiles. This is another difficult task, which is why Google is having a hard time making its way through the various techniques used by SEO pros (black hat or white hat) to influence positively or negatively the natural ranking of a site.

Google's Stunted Growth

Google is like a young teenager stuck on some difficult math problem. Google's learning process apparently involves trying to solve the problem of web spam by applying the same process in a different pattern – why can’t Google just break the pattern and evolve?
Is Google only struggling to maintain an acceptable ranking formula? Will Google evolve or stick with what it’s doing, just in a largely similar format?
Other search engines like Blekko have taken a different route and tried to crowdsource content curation. While this works well in a variety of niches, the big problem with Blekko is that this curation is not very "mainstream": it puts the burden of the algorithm on the shoulders of its own users. But the pro users appreciate it and make the Blekko results quite good.
In a perfect, non-biased scenario, Google’s ranked results should be:
  • Ranked by non-biased ranking signals.
  • Impossible to be affected by third parties (i.e., negative SEO or positive SEO).
  • Able to tell the difference between bad and good (remember the JCPenney scandal).
  • More diverse and impossible to manipulate.
  • Giving new quality sites a chance to rank near the “giant” old sites.
  • Maintaining transparency.
There is still a long way to go until Google’s technology evolves from the infancy we know today. Will we have to wait until Google is 18 or 21 years old – or even longer – before Google reaches this level of maturity that it dreams of?
Until then, the SEO community is left with testing and benchmarking the way Google evolves – and maybe try to create a book of best practices about search engine optimization.
Google created an entire ecosystem that started backfiring a long time ago. They basically opened the door to all the spam concepts that they are now fighting today.
Is this illegal or immoral, white or black? Who are we to decide? We are no regulatory entity!

Conclusion

Google is a complicated “piece” of software that is being updated constantly, with each update theoretically bringing new fixes and improvements.
None of us were born smart, but we have learned how to become smart as we’ve grown. We never stop learning.
The same applies to Google. We as human beings are imperfect. How could we create a perfect search engine? Are we able to?
I would love to talk with you more. Share your thoughts or ask questions in the comments below.

Google Updates OneBox Results Design To Match Mobile Interface


Google announced they have updated their OneBox results, also known as quick answer results, to match the mobile and tablet designs released back in August. The new results are cleaner, take up more space, and are more interactive.
For example, here is the stock quote for GOOG and the chart on the search results page is interactive:
All the OneBox result designs were updated today. They include definitions, currencies, flight information and more:
Other results include unit conversion, holidays, sunrise times, weather results, time conversions and much more.
Google says they are rolling this out to everyone using Google.com on their desktop over the next few weeks.

Tuesday, September 25, 2012

Google Rolls Out Panda 3.9.2 Refresh


Google has announced on Twitter that they pushed out a Panda refresh yesterday morning that impacted "fewer than 0.7% of queries."
Most refreshes impact less than 1% of the search queries. So this is just a refresh, despite how much I'd love to name it 4.0.
The previous Panda refresh was on August 20, 2012 and labeled 3.9.1.
Danny Sullivan goes really deep into the past updates on Search Engine Land.
So the Panda update we thought we saw last week was not a Panda update; it was a diversification update. Although, in that update, many Panda victims did claim recovery. So who knows what Google was or was not testing.

Past Google Panda Update:

  • Panda 3.9.2 on September 18th
  • Panda 3.9.1 on August 20th
  • Panda 3.9 on July 24th
  • Panda 3.8 on June 25th
  • Panda 3.7 on June 9th
  • Panda 3.6 on April 27th
  • Panda 3.5 on April 19th
  • Panda 3.4 on March 23rd
  • Panda 3.3 on about February 26th
  • Panda 3.2 on about January 15th
  • Panda 3.1 on November 18th
  • Panda 2.5.3 on October 19/20th
  • Panda 2.5.2 on October 13th
  • Panda 2.5.1 on October 9th
  • Panda 2.5 on September 28th
  • Panda 2.4 in August
  • Panda 2.3 on around July 22nd.
  • Panda 2.2 on June 18th or so.
  • Panda 2.1 on May 9th or so.
  • Panda 2.0 on April 11th or so.
  • Panda 1.0 on February 24th

Another Panda Update, 3.9.2 - 0.7% of Queries Affected

Google Panda - a word most SEOs are afraid of. It's always good to know that "Panda and Penguin updates are making the web better". Websites hit by these algorithm changes are typically affected due to low-quality content as well as spammy backlinks.

Here is the update on Twitter regarding the Google Panda 3.9.2 update by a Googler: http://goo.gl/ByvVf



As per the above tweet, only 0.7% of Google search results will be affected. We (SEOs and website owners) need not worry much about this update if our websites have good-quality content. Google has made about 20 Panda updates so far, all targeting poor content.

Do you love Google Panda & Penguin Updates?

Google Now Cards – Get Just the Right Information at Just the Right Time


Google has announced Google Now, which gets you just the right information at just the right time.
Google Now tells you today’s weather before you start your day, how much traffic to expect before you leave for work, when the next train will arrive as you’re standing on the platform, or your favorite team’s score while they’re playing. And the best part? All of this happens automatically. Cards appear throughout the day at the moment you need them.
  • Get just the right information, at just the right time - Just swipe up, and you’ve got the latest information you want to see, when you want to see it.
  • No digging required - Cards appear when they’re needed most, organizing the things you need to know and freeing you up to focus on what’s important to you.
  • You are in control - Choose exactly which cards you see. You control whether you get personalized results from your calendars, locations and searches after opting in.
Google Now launches with 10 different cards, with more to come:
  1. Traffic
  2. Public transit
  3. Next appointment
  4. Flights
  5. Sports
  6. Places
  7. Weather
  8. Translation
  9. Currency
  10. Time at home
For more about Google Now, see Google's announcement.

Monday, June 18, 2012

New site not indexed on Google yet? Here’s what to do


Checklist for search engine indexing.
1. Check if your CMS has an option that blocks search engines

Technically this shouldn't happen by default, but some CMSs like WordPress have an option, sometimes already checked, that blocks all search engines from crawling the website. This exists to make sure your site is not indexed while you are still setting it up. But if you forget to uncheck it, the CMS adds a noindex robots meta tag to the pages, and search engines will not index the site.
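If you want to verify this quickly, one option is to scan a page's HTML for a robots meta tag. A minimal sketch in Python using only the standard library (the sample HTML and function names here are illustrative, not part of any CMS):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_directives.append(attrs.get("content", "").lower())

def is_blocked_from_indexing(html):
    """Return True if the page carries a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.robots_directives)

sample = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'
print(is_blocked_from_indexing(sample))  # True
```

In practice you would fetch your homepage's HTML and run it through the same check.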
2. Check if you have set the correct Robots.txt

Robots.txt files are used to control (though not fully) how search engines crawl the website. Make sure you have not blocked bots from crawling important parts of your website.
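You can sanity-check a robots.txt file against specific URLs with Python's built-in robots.txt parser. A small sketch (the rules and URLs are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that blocks /private/ but allows everything else.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/products/"))    # True
print(rp.can_fetch("Googlebot", "https://example.com/private/data"))  # False
```

If an important URL comes back False, your robots.txt is the likely indexing blocker.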
3. Check if your servers are returning OK status

Use a header checker tool to ensure that your server is returning a healthy 200 OK status. Anything else should be dealt with immediately.
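As a rough illustration, here is one way to fetch a status code and turn it into a verdict with Python's standard library (the categories and wording are my own, not an official checklist):

```python
import urllib.request

def fetch_status(url):
    """Return the HTTP status code for a URL (performs a live request)."""
    with urllib.request.urlopen(url) as resp:
        return resp.status

def diagnose_status(code):
    """Map an HTTP status code to a rough crawling-health verdict."""
    if code == 200:
        return "healthy"
    if 300 <= code < 400:
        return "redirect - make sure it points to the canonical URL"
    if 400 <= code < 500:
        return "client error - the page may be missing or blocked"
    if 500 <= code < 600:
        return "server error - fix before expecting any crawling"
    return "unexpected status"

print(diagnose_status(200))  # healthy
print(diagnose_status(503))
```

Run `diagnose_status(fetch_status("https://yoursite.example"))` against your own pages.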
4. Set up a sitemap

A sitemap does not guarantee that your site will show up in the SERPs soon, but it is a necessity: it gives the bots a route map to follow. So set up a sitemap before even trying to get indexed.
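To illustrate, a minimal XML sitemap can be generated in a few lines of Python (the URLs are placeholders; real sitemaps can also carry lastmod, changefreq, and priority fields):

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["https://example.com/", "https://example.com/about"])
print(sitemap)
```

Save the output as sitemap.xml at your site root and submit it in Webmaster Tools.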
5. If you run plugins check if they are set properly

Plugins such as SEO plugins are useful, but make sure they are configured properly. Some have an option to set noindex meta tags on pages by default to avoid duplicate content issues; if you do not turn this off where appropriate, it can interfere with the indexing process.

I assume that you have already done the basic steps to invite search engines to your site, which are:

1 – Submit the website URL to Google via the submission tool.

2 – If possible, get a relevant contextual link from a good website (one that has already been indexed and has a good reputation).

Summary

Generally, Google takes its own time to index new websites. The best way to make sure Google indexes your site is to get a very relevant text link from a reputable site, but that isn't always easy. However, remember to do the basic health checks, so that when the bots arrive, you have everything in place.

Monday, June 11, 2012

Google Local Listings with Zagat Rating in Google Plus


Google made some changes to Local Business Listings from May 30th 2012 onwards.

What are the changes?
Google Local Business Listings have a new look: local listings are now integrated into Google Plus.
The rating system for each local business listing has been improved using the “Zagat Rating” system.


What is Zagat Rating?
The Zagat rating system uses a 30-point scale. It is mostly used for restaurants and hotels, where customers rate food, décor, and service. It seems that Google has applied the same rating system to all of the local business listings in Google Places. However, Google displays only the total score instead of the individual Zagat score for each category. Refer to the following screenshot for a better idea.

Difference between Old & New Local Listings:


Thursday, May 17, 2012

How can I recover from Google Penguin?


Recovery from Google Penguin is difficult for some webmasters, but much easier for others. The recovery time depends on the extent of the authority lost to bad or poor-quality links. How long it takes depends on your website's link profile makeup; there is no fixed answer.

To get started why not consider:

Evaluate your website's linking profile. If you believe you were unfairly hit, use the following form to submit an appeal to Google. It is manually reviewed and will take time to be considered. Click here for the Webspam Form.
While waiting for your website's review, ensure your link profile is clean and does not contain links that could be perceived as bad, negative, or spammy.

Create a new strategy for creating good quality content, media and other useful items which can be shared through social media from your website. Consider everything from useful tools and guides to viral content/media.
Don't rely too heavily on links, and do not hit back by buying links or artificially influencing the regrowth of your website's link profile; doing so could cause major issues in the long run.

If you're not already video blogging or using YouTube, consider joining the revolution; video is powerful.
Ensure your social activities and presence remain active, poor or inconsistent efforts will hamper progression.
Ensure that your growing link profile has a diverse anchor text profile; excessive exact-match anchor text can and will cause problems.
From the point of ranking loss onwards, there is no foreseeable update likely to help recovery, since linking was one of the leading contributors to website positions. Instead, focus on addressing the situation and putting new strategies into place.
Gain a better understanding of your market place, join social media such as Twitter, Facebook, Youtube, Google +, Reddit and other networks and share your content, knowledge and information.
Any quick fixes are unlikely to work, and could cause problems for the website during future updates
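On the anchor text point above: a quick way to eyeball exact-match concentration is to compute each anchor's share of your backlink profile. A hypothetical sketch in Python (the anchors and the 60% figure are invented for illustration; Google publishes no threshold):

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor text's share of the total backlink anchors."""
    counts = Counter(anchors)
    total = len(anchors)
    return {text: count / total for text, count in counts.items()}

# Hypothetical backlink anchors for a site targeting "cheap flights".
anchors = ["cheap flights"] * 6 + ["example.com", "click here", "Example", "flights guide"]
dist = anchor_distribution(anchors)
print(round(dist["cheap flights"], 2))  # 0.6
```

A single commercial phrase dominating the distribution, as in this made-up example, is exactly the pattern the advice above warns against.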