Webmaster level: All

Redirects are often used by webmasters to help forward visitors from one page to another. They are a normal part of how the web operates, and are very valuable when well used. However, some redirects are designed to manipulate or deceive search engines or to display different content to human users than to search engines. Our quality guidelines strictly forbid these kinds of redirects.

For example, desktop users might receive a normal page, while hackers might redirect all mobile users to a completely different spam domain. To help webmasters better recognize problematic redirects, we have updated our quality guidelines for sneaky redirects with examples that illustrate redirect-related violations.
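To make the pattern easier to recognize, here is a purely hypothetical sketch of the kind of client-side, user-agent-based redirect described above; spam-example.com is a made-up domain, and this is shown only so that webmasters can spot similar logic injected into compromised pages:

```javascript
// Hacked pages often sniff the visitor's user agent client-side...
function isMobileUserAgent(userAgent) {
  return /Android|iPhone|iPad|Mobile/i.test(userAgent);
}

// ...and send only mobile visitors to an unrelated spam domain, while
// desktop users (and crawlers) see the normal page. This is exactly the
// kind of sneaky redirect our quality guidelines prohibit.
function sneakyRedirect(userAgent) {
  if (isMobileUserAgent(userAgent)) {
    return 'http://spam-example.com/'; // mobile users are hijacked
  }
  return null; // desktop users stay on the page
}
```

If you find logic like this in your templates or JavaScript files and you didn't put it there, treat it as a sign of compromise.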

We have also updated the hacked content guidelines to include redirects on compromised websites. If you believe your site has been compromised, follow these instructions to identify the issues on your site and fix them.

As with any violation of our quality guidelines, we may take manual action, including removal from our index, in order to maintain the quality of the search results. If you have any questions about our guidelines, feel free to ask in our Webmaster Help Forum.

Webmaster level: All

Our quality guidelines warn against running a site with thin or scraped content that offers no substantial added value to users. Recently, we’ve seen this behavior on many video sites, particularly in the adult industry, but also elsewhere. These sites display content provided by an affiliate program—the same content that is available across hundreds or even thousands of other sites.

If your site syndicates content that’s available elsewhere, a good question to ask is: “Does this site provide significant added benefits that would make a user want to visit this site in search results instead of the original source of the content?” If the answer is “No,” the site may frustrate searchers and violate our quality guidelines. As with any violation of our quality guidelines, we may take action, including removal from our index, in order to maintain the quality of our users’ search results. If you have any questions about our guidelines, you can ask them in our Webmaster Help Forum.

Webmaster level: All

We strive to keep spam out of our users’ search results. This includes both improving our webspam algorithms and taking manual action for violations of our quality guidelines. Many webmasters want to see if their sites are affected by a manual webspam action, so today we’re introducing a new feature that should help. The manual action viewer in Webmaster Tools shows information about actions taken by the manual webspam team that directly affect that site’s ranking in Google’s web search results. To try it out, go to Webmaster Tools and click on the “Manual Actions” link under “Search Traffic.”

You’ll probably see a message that says, “No manual webspam actions found.” A recent analysis of our index showed that well under 2% of domains we've seen are manually removed for webspam. If you see this message, then your site doesn't have a manual removal or direct demotion for webspam reasons.

If your site is in the very small fraction that does have a manual spam action, chances are we’ve already notified you in Webmaster Tools. We’ll keep sending those notifications, but now you can also do a live check against our internal webspam systems. Here’s what it would look like if Google had taken manual action on a specific section of a site for “User-generated spam”:

[Screenshot: a “Partial match” entry showing “User-generated spam” affecting a specific section of the site]

In this hypothetical example, there isn’t a site-wide match, but there is a “partial match.” A partial match means the action applies only to a specific section of a site. In this case, the webmaster has a problem with other people leaving spam on his forum. By fixing this common issue, the webmaster can not only help restore his forum’s rankings on Google, but also improve the experience for his users. Clicking the “Learn more” link will offer resources for troubleshooting.

Once you’ve corrected any violations of Google’s quality guidelines, the next step is to request reconsideration. With this new feature, you'll find a simpler and more streamlined reconsideration request process. Now, when you visit the reconsideration request page, you’ll be able to check your site for manual actions, and then request reconsideration only if there’s a manual action applied to your site. If you do have a webspam issue to address, you can do so directly from the Manual Actions page by clicking "Request a review."

The manual action viewer delivers on a popular feature request. We hope it reassures the vast majority of webmasters who have nothing to worry about. For the small number of people who have real webspam issues to address, we hope this new information helps speed up the troubleshooting. If you have questions, come find us in the Webmaster Help Forum or stop by our Office Hours.

Update (12:50pm PT, August 9th): Unfortunately we've hit a snag during our feature deployment, so it will be another couple days before the feature is available to everyone. We will post another update once the feature is fully rolled out.

Update (10:30am PT, August 12th): The feature is now fully rolled out.

Webmaster level: All

Our quality guidelines prohibit manipulative or deceptive behavior, and this stance has remained unchanged since the guidelines were first published over a decade ago. Recently, we’ve seen some user complaints about a deceptive technique which inserts new pages into users’ browsing histories. When users click the "back" button on their browser, they land on a new page that they've never visited before. Users coming from a search results page may think that they’re going back to their search results. Instead, they’re taken to a page that looks similar, but is actually entirely advertisements:

[Screenshot: a page consisting entirely of a list of advertisements]

To protect our users, we may take action, including removal from our index, against sites that violate our quality guidelines, including by inserting deceptive or manipulative pages into a user’s browser history. As always, if you believe your site has been impacted by a manual spam action and is no longer violating our guidelines, you can let us know by requesting reconsideration.

Webmaster level: Advanced

When talking to site owners on the Google Webmaster Forum, we come across questions about reconsideration requests and how to handle backlink-related issues. Here are some common questions, along with our recommendations.

When should I file a reconsideration request?

If your site violates our Google Quality Guidelines or did in the past, a manual spam action may be applied to your site to prevent spam in our search results. You may learn about this violation from a notification in Google Webmaster Tools, or perhaps from someone else such as a previous owner or SEO of the site. To get this manual action revoked, first make sure that your site no longer violates the quality guidelines. After you've done that, it's time to file a reconsideration request.

Should I file a reconsideration request if I think my site is affected by an algorithmic change?

Reconsideration requests are intended for sites with manual spam actions. If your site’s visibility has been solely affected by an algorithmic change, there's no manual action to be revoked, and therefore no need to file a reconsideration request. If you're unsure if it's an algorithmic change or a manual action, and have found issues that you have resolved, then submitting a reconsideration request is fine.

How can I assess the quality of a site’s backlinks?

The “Links to your site” section of Google Webmaster Tools is a great starting point for an investigation, as it shows a significant amount of your site’s inbound links. If you know that you ran an SEO campaign during a particular period of time, downloading the latest links can come in handy for isolating links created at that time. Using the links found in Google Webmaster Tools, we recommend looking for patterns that point to general issues that are worth resolving. For example, spammy blog comments, auto-generated forum posts or text advertisements with links that pass PageRank are likely to be seen as unnatural links and would violate Google’s quality guidelines. For individual examples and hands-on advice, we recommend getting the help of peers and expert webmasters on the Google Webmaster Forum.

How do I clean a bad backlink profile?

Make sure to identify poor links first, then make a strong effort to get them either removed or nofollowed. Then use the Disavow Links Tool to deal with remaining unnatural backlinks. We recommend using the domain-wide operator for sites with complicated URL structures, for very obvious spam sites such as gibberish-content sites, and for low-quality sites with content that shows no editorial value. See our video on common mistakes when using the disavow tool for more information.
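For reference, a disavow file is a plain-text list with one URL or one domain: entry per line, and lines starting with # are treated as comments; the domain: operator covers every link from that site. A hypothetical example (all domains here are made up):

```text
# Spammy directory that ignored our removal requests
domain:spam-directory-example.com

# Gibberish-content site linking to us sitewide
domain:gibberish-example.net

# A single bad page, rather than a whole domain
http://forum-example.org/thread/123-spammy-signature-links
```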

How much information do I need to provide?

Detailed documentation submitted along with a reconsideration request can contribute to its success, as it demonstrates the efforts made by the webmaster and helps Googlers with their investigation. If you are including a link to a shared document, make sure that it’s accessible to anyone with the link.

How long does it take to process reconsideration requests?

Reconsideration requests for sites affected by a manual spam action are investigated by a Googler. We strive to respond in a timely manner, normally within just a few days. However, the volume of incoming reconsideration requests can vary considerably, hence we don't provide a guaranteed turnaround time.

What are the possible outcomes of a reconsideration request?

Upon submitting a reconsideration request, you will first receive an automated confirmation in Google Webmaster Tools. After your request is processed, we'll send you another message to let you know the outcome of the request. In most cases, this message will either inform you that the manual action has been revoked or that your site still violates our quality guidelines.

Where can I get more guidance?

For more information on reconsideration requests, please visit our Help Center. And as always, the Google Webmaster Forum is a great place for further discussions as well as seeking more advice from experienced webmasters and Google guides.

Webmaster level: All

Google has said for years that selling links that pass PageRank violates our quality guidelines. We continue to reiterate that guidance periodically to help remind site owners and webmasters of that policy.

Please be wary if someone approaches you and wants to pay you for links or "advertorial" pages on your site that pass PageRank. Selling links (or entire advertorial pages with embedded links) that pass PageRank violates our quality guidelines, and Google does take action on such violations. The consequences for a link-selling site start with losing trust in Google's search results, as well as a reduction of the site's visible PageRank in the Google Toolbar. The consequences can also include lower rankings for that site in Google's search results.

If you receive a warning for selling links that pass PageRank in Google's Webmaster Tools, you'll see a notification message to look for "possibly artificial or unnatural links on your site pointing to other sites that could be intended to manipulate PageRank." That's an indication that your site has lost trust in Google's index.

To address the issue, make sure that any paid links on your site don't pass PageRank. You can remove any paid links or advertorial pages, or make sure that any paid hyperlinks have the rel="nofollow" attribute. After ensuring that no paid links on your site pass PageRank, you can submit a reconsideration request and if you had a manual webspam action on your site, someone at Google will review the request. After the request has been reviewed, you'll get a notification back about whether the reconsideration request was granted or not.
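As a minimal markup sketch (the advertiser URL is hypothetical), the difference between a paid link that passes PageRank and one that doesn't is just the rel="nofollow" attribute:

```html
<!-- A paid link that passes PageRank (violates the guidelines): -->
<a href="http://advertiser-example.com/">Cheap widgets</a>

<!-- The same paid link with rel="nofollow", which does not pass PageRank: -->
<a href="http://advertiser-example.com/" rel="nofollow">Cheap widgets</a>
```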

We do take this issue very seriously, so we recommend you avoid selling (and buying) links that pass PageRank in order to prevent loss of trust, lower PageRank in the Google Toolbar, lower rankings, or in an extreme case, removal from Google's search results.

Webmaster level: All

Today we’re happy to announce an updated version of our Webmaster Quality Guidelines. Both our basic quality guidelines and many of our more specific articles (like those on link schemes or hidden text) have been reorganized and expanded to provide you with more information about how to create quality websites for both users and Google.

The main message of our quality guidelines hasn’t changed: Focus on the user. However, we’ve added more guidance and examples of behavior that you should avoid in order to keep your site in good standing with Google’s search results. We’ve also added a set of quality and technical guidelines for rich snippets, as structured markup is becoming increasingly popular.

We hope these updated guidelines will give you a better understanding of how to create and maintain Google-friendly websites.

Webmaster level: Advanced

We’ve gotten several questions recently about whether website testing—such as A/B or multivariate testing—affects a site’s performance in search results. We’re glad you’re asking, because we’re glad you’re testing! A/B and multivariate testing are great ways of making sure that what you’re offering really appeals to your users.

Before we dig into the implications for search, a brief primer:
Website testing is when you try out different versions of your website (or a part of your website), and collect data about how users react to each version. You use software to track which version causes users to do-what-you-want-them-to-do most often: which one results in the most purchases, or the most email signups, or whatever you’re testing for. After the test is finished you can update your website to use the “winner” of the test—the most effective content.

A/B testing is when you run a test by creating multiple versions of a page, each with its own URL. When users try to access the original URL, you redirect some of them to each of the variation URLs and then compare users’ behavior to see which page is most effective.

Multivariate testing is when you use software to change different parts of your website on the fly. You can test changes to multiple parts of a page—say, the heading, a photo, and the ‘Add to Cart’ button—and the software will show variations of each of these sections to users in different combinations and then statistically analyze which variations are the most effective. Only one URL is involved; the variations are inserted dynamically on the page.

So how does this affect what Googlebot sees on your site? Will serving different content variants change how your site ranks? Below are some guidelines for running an effective test with minimal impact on your site’s search performance.
  • No cloaking.
    Cloaking—showing one set of content to humans, and a different set to Googlebot—is against our Webmaster Guidelines, whether you’re running a test or not. Make sure that you’re not deciding whether to serve the test, or which content variant to serve, based on user-agent. An example of this would be always serving the original content when you see the user-agent “Googlebot.” Remember that infringing our Guidelines can get your site demoted or removed from Google search results—probably not the desired outcome of your test.
  • Use rel=“canonical”.
    If you’re running an A/B test with multiple URLs, you can use the rel=“canonical” link attribute on all of your alternate URLs to indicate that the original URL is the preferred version. We recommend using rel=“canonical” rather than a noindex meta tag because it more closely matches your intent in this situation. Let’s say you were testing variations of your homepage; you don’t want search engines to not index your homepage, you just want them to understand that all the test URLs are close duplicates or variations on the original URL and should be grouped as such, with the original URL as the canonical. Using noindex rather than rel=“canonical” in such a situation can sometimes have unexpected effects (e.g., if for some reason we choose one of the variant URLs as the canonical, the “original” URL might also get dropped from the index since it would get treated as a duplicate).
  • Use 302s, not 301s.
    If you’re running an A/B test that redirects users from the original URL to a variation URL, use a 302 (temporary) redirect, not a 301 (permanent) redirect. This tells search engines that this redirect is temporary—it will only be in place as long as you’re running the experiment—and that they should keep the original URL in their index rather than replacing it with the target of the redirect (the test page). JavaScript-based redirects are also fine.
  • Only run the experiment as long as necessary.
    The amount of time required for a reliable test will vary depending on factors like your conversion rates, and how much traffic your website gets; a good testing tool should tell you when you’ve gathered enough data to draw a reliable conclusion. Once you’ve concluded the test, you should update your site with the desired content variation(s) and remove all elements of the test as soon as possible, such as alternate URLs or testing scripts and markup. If we discover a site running an experiment for an unnecessarily long time, we may interpret this as an attempt to deceive search engines and take action accordingly. This is especially true if you’re serving one content variant to a large percentage of your users.
The recommendations above should result in your tests having little or no impact on your site in search results. However, depending on what types of content you’re testing, it may not even matter much if Googlebot crawls or indexes some of your content variations while you’re testing. Small changes, such as the size, color, or placement of a button or image, or the text of your “call to action” (“Add to cart” vs. “Buy now!”), can have a surprising impact on users’ interactions with your webpage, but will often have little or no impact on that page’s search result snippet or ranking. In addition, if we crawl your site often enough to detect and index your experiment, we’ll probably index the eventual updates you make to your site fairly quickly after you’ve concluded the experiment.

To learn more about website testing, check out these articles on Content Experiments, our free testing tool in Google Analytics. You can also ask questions about website testing in the Analytics Help Forum, or about search impact in the Webmaster Help Forum.

Webmaster level: All

If your site isn't appearing in Google search results, or it's performing more poorly than it once did (and you believe that it does not violate our Webmaster Guidelines), you can ask Google to reconsider your site. Over time, we’ve worked to improve the reconsideration process for webmasters. A couple of years ago, in addition to confirming that we had received the request, we started sending a second message to webmasters confirming that we had processed their request. This was a huge step for webmasters who were anxiously awaiting results. Since then, we’ve received feedback that webmasters wanted to know the outcome of their requests. Earlier this year, we started experimenting with sending more detailed reconsideration request responses and the feedback we’ve gotten has been very positive!

Now, if your site is affected by a manual spam action, we may let you know if we were able to revoke that manual action based on your reconsideration request. Or, we could tell you if your site is still in violation of our guidelines. This might be a discouraging thing to hear, but once you know that there is still a problem, it will help you diagnose the issue.

If your site is not actually affected by any manual action (this is the most common scenario), we may let you know that as well. Perhaps your site isn’t being ranked highly by our algorithms, in which case our systems will respond to improvements on the site as changes are made, without your needing to submit a reconsideration request. Or maybe your site has access issues that are preventing Googlebot from crawling and indexing it. For more help debugging ranking issues, read our article about why a site may not be showing up in Google search results.

We’ve made a lot of progress on making the entire reconsideration request process more transparent. We aren’t able to reply to individual requests with specific feedback, but now many webmasters will be able to find out if their site has been affected by a manual action and they’ll know the outcome of the reconsideration review. In an ideal world, Google could be completely transparent about how every part of our rankings work. However, we have to maintain a delicate balance: trying to give as much information to webmasters as we can without letting spammers probe how to do more harm to users. We're happy that Google has set the standard on tools, transparency, and communication with site owners, but we'll keep looking for ways to do even better.

Webmaster level: Intermediate

So you’re going global, and you need your website to follow. Should be a simple case of getting the text translated and you’re good to go, right? Probably not. The Google Webmaster Team frequently builds sites that are localized into over 40 languages, so here are some things that we take into account when launching our pages for other languages and regions.

(Even if you think you might be immune to these issues because you only offer content in English, it could be that non-English language visitors are using tools like Google Translate to view your content in their language. This traffic should show up in your analytics dashboard, so you can get an idea of how many visitors are not viewing your site in the way it’s intended.)
More languages != more HTML templates
We can’t recommend this enough: reuse the same template for all language versions, and always try to keep the HTML of your template simple.

Keeping the HTML code the same for all languages has its advantages when it comes to maintenance. Hacking around with the HTML code for each language to fix bugs doesn’t scale–keep your page code as clean as possible and deal with any styling issues in the CSS. To name just one benefit of clean code: most translation tools will parse out the translatable content strings from the HTML document and that job is made much easier when the HTML is well-structured and valid.
How long is a piece of string?
If your design relies on text playing nicely with fixed-size elements, then translating your text might wreak havoc. For example, your left-hand side navigation text is likely to translate into much longer strings of text in several languages–check out the difference in string lengths between some English and Dutch language navigation for the same content. Be prepared for navigation titles that might wrap onto more than one line by figuring out your line height to accommodate this (also worth considering when you create your navigation text in English in the first place).

Variable word lengths cause particular issues in form labels and controls. If your form layout displays labels on the left and fields on the right, for example, longer text strings can flow over into two lines, whereas shorter text strings do not seem associated with their form input fields–both scenarios ruin the design and impede the readability of the form. Also consider the extra styling you’ll need for right-to-left (RTL) layouts (more on that later). For these reasons we design forms with labels above fields, for easy readability and styling that will translate well across languages.
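The labels-above-fields pattern can be sketched in a few lines of CSS (the class names are made up); because each label sits directly above its input, a longer translated label simply wraps without losing its visual association with the field:

```css
/* Labels above fields: long translated labels wrap harmlessly. */
.form-row {
  margin-bottom: 12px;
}
.form-row label {
  display: block;      /* pushes the input onto its own line */
  margin-bottom: 4px;
}
.form-row input,
.form-row select {
  display: block;
  width: 100%;
  max-width: 320px;
}
```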

[Screenshots: Chinese and German versions of web forms]

Also avoid fixed-height columns–if you’re attempting to neaten up your layout with box backgrounds that match in height, chances are when your text is translated, the text will overrun areas that were only tall enough to contain your English content. Think about whether the UI elements you’re planning to use in your design will work when there is more or less text–for instance, horizontal vs. vertical tabs.
On the flip side
Source editing for bidirectional HTML can be problematic because many editors have not been built to support the Unicode bidirectional algorithm (more research on the problems and solutions). In short, the way your markup is displayed might get garbled:

<p>ابةتث <img src="foo.jpg" alt=" جحخد"< ذرزسش!</p>

Our own day-to-day usage has shown that the following editors currently provide decent support for bidirectional editing: particularly Coda, and also Dreamweaver, IntelliJ IDEA and JEditX.

When designing for RTL languages you can build most of the support you need into the core CSS and use the dir attribute of the html element (for backwards compatibility) in combination with a class on the body element. As always, keeping all styles in one core stylesheet makes for better maintainability.

Some key styling issues to watch out for: any elements floated right will need to be floated left and vice versa; extra padding or margin widths applied to one side of an element will need to be overridden and switched, and any text-align attributes should be reversed.

We generally use the following approach, including using a class on the body tag rather than an html[dir=rtl] CSS selector because this is compatible with older browsers:


<body class="rtl">
<h1><a href=""><img alt="Google" src="" /></a> Heading</h1>

Left-to-right (default) styling:

h1 {
  height: 55px;
  line-height: 2.05;
  margin: 0 0 25px;
  overflow: hidden;
}
h1 img {
  float: left;
  margin: 0 43px 0 0;
  position: relative;
}

Right-to-left styling:

body.rtl {
  direction: rtl;
}
body.rtl h1 img {
  float: right;
  margin: 0 0 0 43px;
}

(See this in action in English and Arabic.)

One final note on this subject: most of the time your content destined for right-to-left language pages will be bidirectional rather than purely RTL, because some strings will probably need to retain their LTR direction–for example, company names in Latin script or telephone numbers. The way to make sure the browser handles this correctly in a primarily RTL document is to wrap the embedded text strings with an inline element using an attribute to set direction, like this:

<h2>‫עוד ב- <span dir="ltr">Google</span>‬</h2>

In cases where you don’t have an HTML container to hook the dir attribute into, such as title elements or JavaScript-generated source code for message prompts, you can use this equivalent to set direction, where &#x202B; and &#x202C; are the Unicode control characters for right-to-left embedding:

<title>&#x202B;הפוך את Google לדף הבית שלך&#x202C;</title>

Example usage in JavaScript code:
var ffError = '\u202B' +'כדי להגדיר את Google כדף הבית שלך ב\x2DFirefox, לחץ על הקישור \x22הפוך את Google לדף הבית שלי\x22, וגרור אותו אל סמל ה\x22בית\x22 בדפדפן שלך.'+ '\u202C';

(For more detail, see the W3C’s articles on creating HTML for Arabic, Hebrew and other right-to-left scripts and authoring right-to-left scripts.)
It’s all Greek to me…
If you’ve never worked with non-Latin character sets before (Cyrillic, Greek, and a myriad of Asian and Indic scripts), you might find that both your editor and browser do not display content as intended.

Check that your editor and browser encodings are set to UTF-8 (recommended) and consider adding a meta element declaring the character encoding, plus a lang attribute on the html element, to your HTML template so browsers know what to expect when rendering your page–this has the added benefit of ensuring that all Unicode characters are displayed correctly, so using HTML entities such as &eacute; (é) will not be necessary, saving valuable bytes! Check the W3C’s tutorial on character encoding if you’re having trouble–it contains in-depth explanations of the issues.
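Concretely, the two declarations might look like this at the top of a template (the Greek lang value and page title are just example content):

```html
<!DOCTYPE html>
<html lang="el">
<head>
  <!-- Declares UTF-8 so the browser decodes every script correctly -->
  <meta charset="utf-8">
  <title>Παράδειγμα</title>
</head>
<body>
  <p>Characters like é can now be typed directly, with no
     &amp;eacute; entity needed.</p>
</body>
</html>
```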
A word on naming
Lastly, a practical tip on naming conventions when creating several language versions. Using a standard such as the ISO 639-1 language codes for naming helps when you start to deal with several language versions of the same document.

Using a conventional standard will help users understand your site’s structure as well as making it more maintainable for all webmasters who might develop the site, and using the language codes for other site assets (logo images, PDF documents) makes it easy to quickly identify files.
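For example, a hypothetical set of file names using ISO 639-1 codes consistently across pages and assets might look like this:

```text
index.en.html    logo-en.png    user-guide-en.pdf
index.de.html    logo-de.png    user-guide-de.pdf
index.ja.html    logo-ja.png    user-guide-ja.pdf
```

Any webmaster picking up the project can tell at a glance which language version a file belongs to.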

See previous Webmaster Central posts for advice about URL structures and other issues surrounding working with multi-regional websites and working with multilingual websites.

That’s a summary of the main challenges we wrestle with on a daily basis; but we can vouch for the fact that putting in the planning and work up front towards well-structured HTML and robust CSS pays dividends during localization!

Webmaster level: All

Everyone on the web knows how frustrating it is to perform a search and find websites gaming the search results. These websites can be considered webspam - sites that violate Google’s Webmaster Guidelines and try to trick Google into ranking them highly. Here at Google, we work hard to keep these sites out of your search results, but if you still see them, you can notify us by using our webspam report form. We’ve just rolled out a new, improved webspam report form, so it’s now easier than ever to help us maintain the quality of our search results. Let’s take a look at some of our new form’s features:

Option to report various search issues
There are many search results, such as sites with malware and phishing, that are not necessarily webspam but still degrade the search experience. We’ve noticed that our users sometimes report these other issues using our webspam report form, causing a delay between when a user reports the issue and when the appropriate team at Google handles it. The new form’s interstitial page allows you to report these other search issues directly to the correct teams so that they can address your concerns in a timely manner.

Simplified form with informative links
To improve the readability of the form, we’ve made the text more concise, and we’ve integrated helpful links into the form’s instructions. Now, the ability to look up our Webmaster Guidelines, get advice on writing actionable form comments, and block sites from your personalized search results is just one click away.

Thank you page with personalization options
Some of our most valuable information comes from our users, and we appreciate the webspam reports you submit to us. The thank you page explains what happens once we’ve received your webspam report. If you want to report more webspam, there’s a link back to the form page and instructions on how to report webspam more efficiently with the Chrome Webspam Report Extension. We also provide information on how you can immediately block the site you’ve reported from your personalized search results, for example, by managing blocked sites in your Google Account.

At Google, we strive to provide the highest quality, most relevant search results, so we take your webspam reports very seriously. We hope our new form makes the experience of reporting webspam as painless as possible (and if it doesn’t, feel free to let us know in the comments).

A popular question on our Webmaster Help Forum is in regard to best practices for organic link building. There seems to be some confusion, especially among less experienced webmasters, on how to approach the topic. Different perspectives have been shared, and we would also like to explain our viewpoint on earning quality links.

If your site is rather new and still unknown, a good marketing technique is to get involved in the community around your topic. Interact and contribute on forums and blogs. Just keep in mind to contribute in a positive way, rather than spamming or soliciting for your site. Simply building a reputation can drive people to your site, and they will keep on visiting it and linking to it. If you offer long-lasting, unique and compelling content -- something that lets your expertise shine -- people will want to recommend it to others. Great content can serve this purpose as much as providing useful tools.

A promising way to create value for your target group and earn great links is to think of issues or problems your users might encounter. Visitors are likely to appreciate your site and link to it if you publish a short tutorial or a video providing a solution, or a practical tool. Survey or original research results can serve the same purpose, if they turn out to be useful for the target audience. Both methods grow your credibility in the community and increase visibility. This can help you gain lasting, merit-based links and loyal followers who generate direct traffic and "spread the word." Offering a number of solutions for different problems could evolve into a blog which can continuously affect the site's reputation in a positive way.

Humor can be another way to gain great links and get people talking about your site. With Google Buzz and other social media services constantly growing, entertaining content is being shared now more than ever. We've seen all kinds of amusing content, from ASCII art embedded in a site's source code to funny downtime messages used as a viral marketing technique to increase the visibility of a site. However, we do not recommend counting only on short-lived link-bait tactics. Their appeal wears off quickly, and as powerful as marketing stunts can be, you shouldn't rely on them as a long-term strategy or as your only marketing effort.

It's important to clarify that any legitimate link building strategy is a long-term effort. There are those who advocate for short-lived, often spammy methods, but these are not advisable if you care about your site's reputation. Buying PageRank-passing links or randomly exchanging links are the worst ways of attempting to gather links, and they're likely to have no positive impact on your site's performance over time. If your site's visibility in the Google index is important to you, it's best to avoid them.

Directory entries are often mentioned as another way to promote young sites in the Google index. There are great, topical directories that add value to the Internet, but they are few in proportion to those of lower quality. If you decide to submit your site to a directory, make sure it's on topic, moderated, and well structured. Mass submissions, which are sometimes offered as a quick SEO workaround, are mostly useless and unlikely to serve your purposes.

It can be a good idea to take a look at similar sites in other markets and identify the elements of those sites that might work well for yours, too. However, it's important not to just copy success stories but to adapt them, so that they provide unique value for your visitors.

Social bookmarks on YouTube enable users to share content easily

Finally, consider making it easier for less tech-savvy users to link to your site. Similar to what we do on YouTube, offering social bookmarking buttons for sites like Twitter or Facebook can help spread the word about the great content on your site and draw users' attention.

As usual, we'd like to hear your opinion. You're welcome to comment here in the blog, or join our Webmaster Help Forum community.

Webmaster Level: All

We thought it might be fun and educational to create a quiz for webmasters about issues we commonly see in the Webmaster Help Forum. Together with our awesome Bionic Posters, we've tried to come up with questions and answers that reflect recurring concerns in the forum and some information that may not be well known. Some things to keep in mind when taking this quiz:
  • The quiz will be available to take from today until Wednesday, January 27 at 5PM PST.
  • It doesn't cover all facets of webmaster problems that arise, and—as with any test—it is at best only a fun way to test your webmaster prowess ;). We leave discussion of specific cases to the forum.
  • We've set up the quiz using our very own Google Docs. This means you won't see results right away, but we plan to write a follow-up blog post explaining answers and listing top scorers. Be sure to save your answers or print out your completed quiz before submitting! This way you can check your answers against the correct ones when we publish them.
  • It's just for fun!

Webmaster level: Intermediate/Advanced

Webmasters who check their incoming links in Webmaster Tools often ask us what they can do when they see low-quality links. Understandably, many site owners are trying to build a good reputation for their sites, and some believe that having poor-quality incoming links can be perceived as "being part of a bad neighbourhood," which over time might harm their site's ranking.

example of low-quality links
If your site receives links that look similarly dodgy, don't be alarmed... read on!

While it's true that linking is a significant factor in Google's ranking algorithms, it's just one of many. I know we say it a lot, but having something that people want to look at or use—unique, engaging content, or useful tools and services—is also a huge factor. Other factors can include how a site is structured, whether the words of a user's query appear in the title, how close the words are on the page, and so on. The point is, if you happen to see some low quality sites linking to you, it's important to keep in mind that linking is just one aspect among many of how Google judges your site. If you have a well-structured and regularly maintained site with original, high-quality content, those are the sorts of things that users will see and appreciate.

That said, in an ideal world you could have your cake and eat it too (or rather, you could have a high-quality site and high-quality backlinks). You may also be concerned about users' perception of your site if they come across it via a batch of spammy links. If the number of poor-quality links is manageable, and/or if it looks easy to opt out or get those links removed from the site that's linking to you, it may be worth it to try to contact the site(s) and ask them to remove their links. Remember that this isn't something that Google can do for you; we index content that we find online, but we don't control that content or who's linking to you.

If you run into some uncooperative site owners, however, don't fret for too long. Instead, focus on things that are under your control. Generally, you as a webmaster don't have much control over things like who links to your site. You do, however, have control over many other factors that influence indexing and ranking. Organize your content; do a mini-usability study with family or friends. Ask for a site review in your favorite webmaster forums. Use a website testing tool to figure out what gets you the most readers, or the biggest sales. Take inspiration from your favorite sites, or your competitors—what do they do well? What makes you want to keep coming back to their sites, or share them with your friends? What can you learn from them? Time spent on any of these activities is likely to have a larger impact on your site's overall performance than time spent trying to hunt down and remove every last questionable backlink.

Finally, keep in mind that low-quality links rarely stand the test of time, and may disappear from our link graph relatively quickly. They may even already be discounted by our algorithms. If you want to make sure Google knows about these links and is valuing them appropriately, feel free to bring them to our attention using either our spam report or our paid links report.

Do you think your site might be penalized because of something that happened on it? As two leaders of the reconsideration team, we recently made a video to help you discover how to create a good reconsideration request, including tips on what we look for on our side. Watch the video and then let us know if you have questions in the comments!

In a recent blog post highlighting our Webmaster Help Group, I asked for your webmaster-related questions. In our second installment of Popular Picks, we hoped to discover which issues webmasters wanted to learn more about, and then respond with some better documentation on those topics. It looks like it was a success, so please get clicking:
Thanks again for your questions! See you around the group.

As a follow-up to my recent post about how user reports of webspam and paid links help improve Google's search results for millions of users, I wanted to highlight one of the most essential parts of Google Webmaster Central: our Webmaster Help Group. With over 37,000 members in our English group and support in 15 other languages, the group is the place to get your questions answered regarding crawling and indexing or Webmaster Tools. We're thankful for a fabulous group of Bionic Posters who have dedicated their time and energy to making the Webmaster Help Group a great place to be. When appropriate, Googlers, including myself, jump in to clarify issues or participate in the dialogue. One thing to note: we try hard to read most posts in the group, and although we may not respond to each one, your feedback and concerns help drive the features we work on. Here are a few examples:

Sitemap details
Submitting a Sitemap through Webmaster Tools is one way to let Google know what pages exist on your site. Users were quick to note that even though they submitted a Sitemap of all the pages on their site, they only found a sampling of URLs indexed through a site: search. In response, the Webmaster Tools team created a Sitemaps details page to better tell you how your Sitemap was processed. You can read a refresher about the Sitemaps details page in Jonathan's blog post.
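For reference, a minimal Sitemap file listing a site's pages looks something like the sketch below. The URLs and date are placeholders, and note that listing a page in a Sitemap doesn't guarantee it will be indexed:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want Google to know about -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```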

Contextual help
One request we received early on with Webmaster Tools was for better documentation on the data displayed. We saw several questions about meta description and title tag issues using our Content Analysis tool, which led us to beef up our documentation on that page and link to that Help Center article directly from that page. Similarly, we discovered that users needed clarification on the distinction between "top search queries" and "top clicked queries" and how the data can be used. We added an expandable section entitled "How do I use this data?" and placed contextual help information across Webmaster Tools to explain what each feature is and where to get more information about it.

Blog posts
The Webmaster Help Group is also a way for us to keep a pulse on what overarching questions are on the minds of webmasters so we can address some of those concerns through this blog. Whether it's how to submit a reconsideration request using Webmaster Tools, deal with duplicate content, move a site, or design for accessibility, we're always open to hearing more about your concerns in the Group. Which reminds me...

It's time for more Popular Picks!
Last year, we devoted two weeks to soliciting and answering five of your most pressing webmaster-related questions. These Popular Picks covered the following topics:
Seeing as this was a well-received initiative, I'm happy to announce that we're going to do it again. Head on over to this thread to ask your webmaster-related questions. See you there!


If your site does not appear in Google Search results, you might be understandably worried. Here, we've put together some information to help you determine when and how to submit a reconsideration request for your site.

You can follow along as Bergy (the webmaster in our video) tries to find out whether he needs to submit a reconsideration request for his Ancient Roman Politics blog. Of course, not all webmasters' problems can be traced back to Wysz (-:, but the simple steps outlined below can help you determine the right solution for your particular case.

Check for access issues

You may want to check if there are any access issues with your site - you can do this by logging in to your Webmaster Tools account. On the Overview page you'll be able to see when Googlebot last successfully crawled the home page of your site. Another way to do this is to check the cache date for your site's homepage. For more detailed information about how Googlebot crawls your site, you might want to check the crawl rate graphs (find them in Tools > Set crawl rate).

On the Overview page you can also check whether there are any crawling errors. For example, if your server was busy or unavailable when we tried to access your site, you would get a "URL unreachable" error message. Alternatively, there might be URLs in your site blocked by your robots.txt file. You can see this in "URLs restricted by robots.txt". If there are URLs listed there which you did not expect, you can go to Tools and select "Analyze robots.txt" - there you can make sure that your robots.txt file is properly formatted and only blocking the parts of your site which you don't want Google to crawl.
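As a quick illustration, a robots.txt file along these lines blocks crawling of a single directory while leaving the rest of the site open (the directory name here is just a placeholder):

```
User-agent: *
Disallow: /private/
```

If "URLs restricted by robots.txt" lists pages you want crawled, check that a rule like the Disallow line above isn't matching more of your site than you intended.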

Other than the examples mentioned above, there are several more types of crawl errors - HTTP errors and timed-out URLs, just to name a few. Even though we haven't highlighted them here, you will still see alerts for all of them on the Overview page in your Webmaster Tools account.

Check for messages

If Google has no problems accessing your site, check to see if there is a message waiting for you in the Message Center of your Webmaster Tools account. This is the place Google uses to communicate important information to you regarding your Webmaster Tools account and the sites you manage. If we have noticed there is something wrong with your site, we may send you a message there, detailing some issues which you need to fix to bring your site into compliance with the Webmaster Guidelines.

Read the Webmaster Guidelines

If you don't see a message in the Message Center, check to see if your site is or has at some point been in violation of the Webmaster Guidelines. You can find them, and much more, in our Help Center.

Fix your site

If your site is in violation of the Webmaster Guidelines and you think that this might have affected how your site is viewed by Google, this would be a good time to submit a reconsideration request. But before you do that, make changes to your site so that it falls within our guidelines.

Submit a reconsideration request

Now you can go ahead and submit a request for reconsideration. Log in to your Webmaster Tools account. Under Tools, click on "Request reconsideration" and follow the steps. Make sure to explain what you think was wrong with your site and what steps you have taken to fix it.

Once you've submitted your request, you'll see a message from us in the Message Center confirming that we've received it. We'll then review your site for compliance with the Webmaster Guidelines.

We hope this post has helped give you an idea when and how to submit a reconsideration request. If you're not sure why Google isn't including your site, a great place to look for help is our Webmaster Help Group. There you will find many knowledgeable and friendly webmasters and Googlers, who would be happy to look at your site and give suggestions on how you could fix things. You can find links to both the Help Center and the Webmaster Group at


About a year ago, in response to user feedback, we created a paid links reporting form within Webmaster Tools. User feedback, through reporting paid links, webspam, or suggestions in our Webmaster Help Group, has been invaluable in ensuring that the quality of our index and our tools is as high as possible. Today, I'd like to highlight the impact that reporting paid links and webspam has had on our index. In a future post, I'll showcase how user feedback and concerns in the Webmaster Help Group have helped us improve our Help Center documentation and Webmaster Tools.

Reporting Paid Links

As mentioned in the post Information about buying and selling links that pass PageRank, Google reserves the right to take action on sites that buy or sell links that pass PageRank for the purpose of manipulating search engine rankings. Even though we work hard to discount these links through algorithmic detection, if you see a site that is buying or selling links that pass PageRank, please let us know. Over the last year, users have submitted thousands and thousands of paid link reports to Google, and each report can contain multiple websites that are suspected of selling links. These reports are actively reviewed, and the feedback is invaluable in improving our search algorithms. We are also willing to take manual action on a significant fraction of paid link reports as we continue to improve our algorithms. More importantly, the hard work of users who have already reported paid links has helped improve the quality of our index for millions. For more information on reporting paid links, check out this Help Center article.

Reporting Webspam

Google has also provided a form to report general webspam since November 2001. We appreciate users who alert us to potential abuses for the sake of the whole Internet community. Spam reports come in two flavors: an authenticated form that requires registration in Webmaster Tools, and an unauthenticated form. We receive hundreds of reports each day. Spam reports to the authenticated form are given more weight and are individually investigated more often. Spam reports to the unauthenticated form are assessed in terms of impact, and a large fraction of those are reviewed as well. As Udi Manber, VP of Engineering & Search Quality, mentioned in his recent blog post on our Official Google Blog, in 2007 more than 450 new improvements were made to our search algorithms. A number of those improvements were related to webspam. It's no exaggeration to say that users who have taken the time to report spam were essential to many of those algorithmic enhancements.

Going forward

As users' expectations of search increase daily, we know it's important to provide a high quality index with relevant results. We're always happy to hear stories in our Webmaster Help Group from users who have reported spam and seen noticeable results. Now that you know how Google uses feedback to improve our search quality, you may want to tell us about webspam you've seen in our results. Please use our authenticated form to report paid links or other types of webspam. Thanks again for taking the time to help us improve.


Many of you have asked for more information regarding webserving techniques (especially related to Googlebot), so we made a short glossary of some of the more unusual methods.
  • Geolocation: Serving targeted/different content to users based on their location. As a webmaster, you may be able to determine a user's location from preferences you've stored in their cookie, information pertaining to their login, or their IP address. For example, if your site is about baseball, you may use geolocation techniques to highlight the Yankees to your users in New York.

    The key is to treat Googlebot as you would a typical user from a similar location, IP range, etc. (i.e. don't treat Googlebot as if it came from its own separate country—that's cloaking).

  • IP delivery: Serving targeted/different content to users based on their IP address, often because the IP address provides geographic information. Because IP delivery can be viewed as a specific type of geolocation, similar rules apply. Googlebot should see the same content a typical user from the same IP address would see.

    (Author's warning: This 7.5-minute video may cause drowsiness. Even if you're really interested in IP delivery or multi-language sites, it's a bit uneventful.)

  • Cloaking: Serving different content to users than to Googlebot. This is a violation of our webmaster guidelines. If the file that Googlebot sees is not identical to the file that a typical user sees, then you're in a high-risk category. A program such as md5sum can compute a checksum to verify that two files are identical, and diff can show you exactly where they differ.

  • First click free: Implementing Google News' First click free policy for your content allows you to include your premium or subscription-based content in Google's websearch index without violating our quality guidelines. You allow all users who find your page using Google search to see the full text of the document, even if they have not registered or subscribed. The user's first click to your content area is free. However, you can block the user with a login or payment request when he clicks away from that page to another section of your site.

    If you're using First click free, the page displayed to users who visit from Google must be identical to the content that is shown to Googlebot.
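The cloaking self-check mentioned above can be sketched as a simple file comparison. This is only an illustration: in practice you would fetch the same URL twice (for example with curl and different User-Agent headers) and save the two responses; here we fake those two saved responses, and all file names are made up.

```shell
# Hypothetical cloaking self-check. In practice, you might capture the
# two responses like this (URL is a placeholder):
#   curl -s -A "Mozilla/5.0" http://www.example.com/ > as_user.html
#   curl -s -A "Googlebot/2.1" http://www.example.com/ > as_googlebot.html
# To keep this sketch self-contained, we fake the two responses here:
printf '<html>same page</html>\n' > as_user.html
printf '<html>same page</html>\n' > as_googlebot.html

# diff exits with status 0 when the files are byte-for-byte identical
if diff -q as_user.html as_googlebot.html > /dev/null; then
  echo "identical - no cloaking detected"
else
  echo "DIFFERENT - investigate possible cloaking"
fi

# md5sum prints one checksum per file; matching checksums confirm
# that the two responses are identical
md5sum as_user.html as_googlebot.html
```

Keep in mind that some legitimate differences (session IDs, timestamps) can make byte-for-byte comparison noisy; the point is that the substance of the page must be the same for Googlebot and for users.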
Still have questions?  We'll see you at the related thread in our Webmaster Help Group.