Showing posts with label google webmaster tools. Show all posts

Thursday, 23 May 2013

Site Health | Webmaster Guidelines tools

See why we sometimes put an alert icon next to your site

To help you identify and prioritize the most important issues on your site, Webmaster Tools will display the Site Health alert next to your site on the Webmaster Tools home page whenever we detect certain problems or events that prevent us from crawling or indexing your site. The following events can trigger a Site Health alert:

* Google detects malware on your site.
* An important page is removed from your site. If you've intentionally removed a page that was previously generating traffic to your site, you can ignore this alert.
* An important page is blocked by robots.txt. If you've intentionally blocked a page (because you don't want it to appear in search results), you can ignore this alert.
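For instance, if the alert refers to a robots.txt block you added yourself, the rule in your robots.txt file might look like this (the path is illustrative):

```text
User-agent: *
Disallow: /private-page/
```

If you recognize the rule as one you created intentionally, you can safely ignore the alert.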

Guidelines on user-generated spam | Webmaster Guidelines tools

Google’s Webmaster Guidelines outline best practices for website owners, and the use of techniques that violate our guidelines may cause us to take action on a site. However, not all violations of our Webmaster Guidelines are related to content created intentionally by a site’s owner. Sometimes, spam can be generated on a good site by malicious visitors or users. This spam is usually generated on sites that allow users to create new pages or otherwise add content to the site.

If you receive a warning from Google about this type of spam, the good news is that we generally believe your site is of sufficient quality that we didn’t see a need to take manual action on the whole site. However, if your site has too much user-generated spam on it, that can affect our assessment of the site, which may eventually result in us taking manual action on the whole site.

Some examples of spammy user-generated content include:
- Spammy accounts on free hosts
- Spammy posts on forum threads
- Comment spam on blogs


Since spammy user-generated content can pollute Google search results, we recommend you actively monitor and remove this type of spam from your site. Here are several tips on how to prevent abuse of your site’s public areas.

User-generated spam:
Comments are a great way for webmasters to build community and readership. Unfortunately, they're often abused by spammers and nogoodniks, many of whom use scripts or other software to generate and post spam. If you've ever received a comment that looked like an advertisement or a random link to an unrelated site, then you've encountered comment spam. Here are some ideas for reducing or preventing comment spam on your website.
Use anti-spam tools

Most website development tools, especially blog tools, can require users to prove they're a real live human, not a nasty spamming engine. You'll have seen these: Generally the user is presented with a distorted image (often called a CAPTCHA, for "completely automated public Turing test to tell computers and humans apart") and asked to type the letters or numbers she sees in the image. Some CAPTCHA systems also support audio CAPTCHAs. This is a pretty effective way of preventing user-generated spam. The process may reduce the number of casual readers who leave comments on your pages or create a user profile, but it will definitely improve the quality of the comments and profiles.

Google's free reCAPTCHA service is easy to implement on your site. In addition, data collected from the service is used to improve the process of scanning text, such as from books or newspapers. By using reCAPTCHA, you're not only protecting your site from spammers; you're helping to digitize the world's books. If you’d like to implement reCAPTCHA for free on your own site, you can sign up here. Plugins are available for easy installation on popular applications and programming environments such as WordPress and PHP.
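To see why a challenge/response test stops scripted spam, here is a deliberately simplified sketch (a toy example only; it bears no resemblance to how reCAPTCHA works internally, and the whole point of a real CAPTCHA is that the challenge is a distorted image a script cannot read):

```python
import random
import string

def make_challenge(length=6):
    """Generate a random code for the visitor to retype (toy example only)."""
    return "".join(random.choice(string.ascii_uppercase) for _ in range(length))

def verify(challenge, response):
    """Accept the submission only if the visitor typed the code correctly."""
    return response.strip().upper() == challenge

code = make_challenge()
print(verify(code, code.lower()))      # → True
print(verify(code, "not the code"))    # → False
```

A comment form would only store the comment when `verify` succeeds; everything else is rejected before it ever reaches your pages.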
Turn on comment moderation

Comment moderation means that no comments will appear on your site until you manually review and approve them. This means you'll spend more time monitoring your comments, but it can really help to improve the user experience for your visitors. It's particularly worthwhile if you regularly post about controversial subjects, where emotions can become heated. It's generally available as a setting in your blogging software, such as Blogger.
Use "nofollow" tags

Together with Yahoo! and MSN, Google introduced the rel="nofollow" attribute a few years ago, and it has been widely adopted. Any link with the rel="nofollow" attribute will not be used to calculate PageRank or determine the relevancy of your pages for a user query. For example, if a spammer includes a link in your comments like this:
<a href="http://www.example.com/">This is a nice site!</a>
it will get converted to:
<a href="http://www.example.com/" rel="nofollow">This is a nice site!</a>
The modified link will not be taken into account when calculating PageRank. This won't prevent spam, but it will avoid problems with passing PageRank.



By default, many blogging sites (such as Blogger) automatically add this attribute to any posted comments.
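The automatic conversion that platforms like Blogger perform can be sketched in a few lines (a minimal illustration using Python's standard library; the function name is made up):

```python
import re

def add_nofollow(html):
    """Add rel="nofollow" to every <a> tag that doesn't already have a rel attribute."""
    def fix(match):
        tag = match.group(0)
        if "rel=" in tag:
            return tag  # leave existing rel attributes alone
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", fix, html)

comment = '<a href="http://www.example.com/">This is a nice site!</a>'
print(add_nofollow(comment))
# → <a href="http://www.example.com/" rel="nofollow">This is a nice site!</a>
```

Links that already carry a rel attribute are left untouched in this sketch; a production filter would also need to handle single quotes and malformed markup.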
Disallow hyperlinks in comments

If you have access to the server, you may want to change its configuration to remove HTML tags from comment links inside your guestbook. Spammers will still be able to leave comments, but they won't be able to publish active hyperlinks.
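A simple server-side filter along those lines might strip all markup from submitted comments before saving them (an illustrative sketch; a naive regex like this is fine for plain guestbook text, but dedicated HTML-sanitizing libraries are far more robust):

```python
import re

def strip_html_tags(comment):
    """Remove HTML tags so any posted links render as plain, unclickable text."""
    return re.sub(r"</?[^>]+>", "", comment)

spam = 'Nice post! <a href="http://spam.example/">cheap pills</a>'
print(strip_html_tags(spam))  # → Nice post! cheap pills
```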
Block comment pages using robots.txt or META tags

You can use your robots.txt file to block Google's access to certain pages. This won't stop spammers from leaving comments or creating user accounts, but it will mean that links in these comments won't negatively impact your site. For example, if comments are stored in the subdirectory guestbook, you could add the following to your robots.txt file:
User-agent: *
Disallow: /guestbook/

This will block Google from indexing the contents of guestbook and any subdirectories.
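You can confirm such a rule behaves as intended with Python's standard urllib.robotparser module (a quick, illustrative check):

```python
from urllib import robotparser

# Parse the same rules shown above and test them against two URLs.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /guestbook/",
])

print(rp.can_fetch("Googlebot", "http://www.example.com/guestbook/page1.html"))  # → False
print(rp.can_fetch("Googlebot", "http://www.example.com/article.html"))          # → True
```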

You can also use the META tag to block access to a single selected page, for example http://www.example.com/article/comments. Like this:
<html>
<head>
<META NAME="googlebot" CONTENT="noindex">
</head>
...
</html>


You may wish to use these methods to block profile pages for new and not yet trustworthy users. Once you gain trust in the user, you can remove the crawling or indexing restrictions.
Think twice about enabling a guestbook or comments

A lot of spam doesn't give users a good impression of your site. If this feature isn't adding much value to your users, or if you won't have time to regularly monitor your guestbook or comments, consider turning them off. Most blogging software, such as Blogger, will let you turn comments off for individual posts.

Use a blacklist to prevent repetitive spamming attempts.

Google often sees large numbers of fake profiles on one innocent site all linking to the same domain. Once you find a single spammy profile, make it simple to remove any others.
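A minimal version of such a blacklist check might look like this (the domain and function names are invented for illustration):

```python
from urllib.parse import urlparse

# Illustrative blacklist; in practice this grows as you discover spam.
BLACKLISTED_DOMAINS = {"spam.example"}

def is_spam_profile(profile_links):
    """Flag a profile whose links point at a domain already caught spamming."""
    return any(urlparse(link).hostname in BLACKLISTED_DOMAINS
               for link in profile_links)

print(is_spam_profile(["http://spam.example/free-stuff"]))  # → True
print(is_spam_profile(["http://example.com/my-homepage"]))  # → False
```

Once one spammy profile reveals a domain, every other profile linking to it can be removed in one pass.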

Add a "report spam" feature to user profiles and friend invitations.

Your users care about your community and are annoyed by spam too. Let them help you solve the problem.

Hacked content | Webmaster Guidelines tools

Hacked content is any content that is placed on your site without your permission due to vulnerabilities in your site’s security. In order to protect our users and to maintain the integrity of our search results, Google tries its best to keep hacked content out of our search results. Hacked content gives our users results that are not useful and can potentially install malicious content on their machines. We recommend that you keep your site secure, and clean up hacked content when you find it.

Some examples of hacking include:
Injected content
When hackers gain access to your website, they may try to inject malicious content into existing pages on your site. This often takes the form of malicious JavaScript injected directly into the site, or into iframes.

Added content
Sometimes, due to security flaws, hackers are able to add new pages to your site that contain spammy or malicious content. These pages are often meant to manipulate search engines. Your existing pages may not show signs of hacking, but these newly-created pages could harm your site’s visitors or your performance in search results.

Hidden content
Hackers may also try to subtly manipulate existing pages on your site. Their goal is to add content to your site that search engines can see but which may be harder for you and your users to spot. This can involve adding hidden links or hidden text to a page by using CSS or HTML, or it can involve more complex changes like cloaking.


Automated queries | Webmaster Guidelines tools

Google's Terms of Service do not allow the sending of automated queries of any sort to our system without express permission in advance from Google. Sending automated queries consumes resources, and includes using any software (such as WebPosition Gold) to send queries to Google to determine how a website or webpage ranks in Google search results for various queries. In addition to rank checking, other types of automated access to Google without permission also violate our Webmaster Guidelines and Terms of Service.

Rich snippets guidelines | Webmaster Guidelines tools

Rich snippets are designed to summarize the content of a page in a way that makes it even easier for users to understand what the page is about in our search results. If you’re considering taking advantage of rich snippets, think about whether a user would find the information helpful when choosing between search results. The following guidelines may give you a better idea of what we mean.

To get started with rich snippets, review the in-depth information provided in the Webmaster Help Center as well as the technical, design, and quality guidelines below.
Technical guidelines
In order to be eligible for rich snippets, you should mark up your site’s pages using one of three supported formats:
Microdata
Microformats
RDFa
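For example, a product review marked up with microdata might look like this (an illustrative sketch using the schema.org Review and Rating vocabularies; the content is invented):

```html
<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="name">A well-made humidor</span>
  by <span itemprop="author">Jane</span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    Rating: <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span>
  </div>
</div>
```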

Once your content is marked up, test it using the structured data testing tool. If the tool correctly renders a rich snippet for your pages, they’re eligible to be shown with rich snippets! If rich snippets aren’t appearing in the structured data testing tool, refer to our troubleshooting guide.

Once you’ve correctly implemented and tested your markup, it may take some time for rich snippets to appear in search results as we crawl and process the pages. If rich snippets are not appearing in Google’s search results after a few weeks, refer to our troubleshooting guide as well as the design and quality guidelines below.
Design guidelines

Implementing rich snippets allows us to provide an even more useful summary of the content on a page. Markup intended to be used in a rich snippet should:

- Describe and summarize the page’s main content as a user would see it. (There are some exceptions.)
- Contain up-to-date information. We won’t show a rich snippet for time-sensitive content that is no longer relevant.
- Be of original content that you and your users have generated and is fully contained on your page. We won’t show a rich snippet for content that is linked or alluded to but not directly available on a page.
Quality guidelines

While rich snippets are generated algorithmically, we do reserve the right to take manual action (e.g., disable rich snippets for a specific site) in cases where we see abuse, deception, or other actions that hurt the search experience for our users. In particular, you should avoid:
- Marking up content that is in no way visible to users.
- Marking up irrelevant or misleading content, such as fake reviews or content unrelated to the focus of a page.
These quality guidelines cover the most common forms of deceptive or manipulative rich snippet behavior, but Google may respond negatively to other misleading practices not listed here. It's not safe to assume that Google approves of a specific deceptive technique just because it isn't included on this page. We strongly advise that webmasters focus on providing a great user experience rather than on looking for loopholes.
Troubleshooting

If rich snippets are not appearing for your pages in Google’s search results, check that you’ve done each of the following things:
- Implemented markup in accordance with the above guidelines
- Successfully tested using the structured data testing tool
- Reviewed our troubleshooting guide


Creating pages with malicious behavior | Webmaster Guidelines tools

Distributing content or software on your website that behaves in a way other than what a user expected is a violation of Google’s Webmaster Guidelines. This includes anything that manipulates content on the page in an unexpected way, or downloads or executes files on a user’s computer without their consent. Google not only aims to give its users the most relevant search results for their queries, but also to keep them safe on the web.

Some examples of malicious behavior include:
* Changing or manipulating the location of content on a page, so that when a user thinks they’re clicking on a particular link or button the click is actually registered by a different part of the page
* Injecting new ads or pop-ups on pages, or swapping out existing ads on a webpage with different ads; or promoting or installing software that does so
* Including unwanted files in a download that a user requested
* Installing malware, trojans, spyware, ads or viruses on a user’s computer
* Changing a user’s browser homepage or search preferences without the user’s informed consent

Keyword stuffing | Webmaster Guidelines tools

"Keyword stuffing" refers to the practice of loading a webpage with keywords or numbers in an attempt to manipulate a site's ranking in Google search results. Often these keywords appear in a list or group, or out of context (not as natural prose). Filling pages with keywords or numbers results in a negative user experience, and can harm your site's ranking. Focus on creating useful, information-rich content that uses keywords appropriately and in context.

Examples of keyword stuffing include:
* Lists of phone numbers without substantial added value
* Blocks of text listing cities and states a webpage is trying to rank for
* Repeating the same words or phrases so often that it sounds unnatural, for example:
* We sell custom cigar humidors. Our custom cigar humidors are handmade. If you’re thinking of buying a custom cigar humidor, please contact our custom cigar humidor specialists at custom.cigar.humidors@example.com.
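If you want a rough, do-it-yourself signal of this kind of repetition on your own pages, you can measure what share of a page's words a single phrase accounts for (a heuristic sketch only; Google's analysis is far more sophisticated than any density count):

```python
def phrase_share(text, phrase):
    """Fraction of the words in `text` consumed by repetitions of `phrase`."""
    words = text.lower().split()
    target = phrase.lower().split()
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return hits * n / len(words)

sample = ("custom cigar humidors custom cigar humidors "
          "buy a custom cigar humidor today")
print(phrase_share(sample, "custom cigar humidors"))  # → 0.5
```

If one phrase accounts for a large fraction of the words on a page, the copy almost certainly reads unnaturally to visitors too.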

Affiliate programs | Webmaster Guidelines tools

Our Webmaster Guidelines advise you to create websites with original content that adds value for users. This is particularly important for sites that participate in affiliate programs. Typically, affiliate websites feature product descriptions that appear on sites across that affiliate network. As a result, sites featuring mainly content from affiliate networks can suffer in Google's search rankings, because they do not have enough unique content that differentiates them from other sites on the web.

Google believes that pure, or "thin," affiliate websites do not provide additional value for web users, especially if they are part of a program that distributes its content to several hundred affiliates. These sites generally appear to be cookie-cutter sites or templates with no original content. Because a search results page could return several of these sites, all with the same content, thin affiliates create a frustrating user experience.

Some examples of thin affiliates include:
* Pages with product affiliate links on which the product descriptions and reviews are copied directly from the original merchant without any original content or added value.

Not every site that participates in an affiliate program is a thin affiliate. Good affiliates add value, for example by offering original product reviews, ratings, and product comparisons. If you participate in an affiliate program, there are a number of steps you can take to help your site stand out and to help improve your rankings:
* Affiliate program content should form only a small part of the content of your site.
* Ask yourself why a user would want to visit your site first rather than visiting the original merchant directly. Make sure your site adds substantial value beyond simply republishing content available from the original merchant.
* When selecting an affiliate program, choose a product category appropriate for your intended audience. The more targeted the affiliate program is to your site's content, the more value it will add and the more likely you will be to rank better in Google search results and make money from the program. For example, a well-maintained site about hiking in the Alps could consider an affiliate partnership with a supplier who sells hiking books rather than office supplies.
* Use your website to build community among your users. This will help build a loyal readership, and can also create a source of information on the subject you are writing about. For example, discussion forums, user reviews, and blogs all offer unique content and provide value to users.
* Keep your content updated and relevant. Fresh, on-topic information increases the likelihood that your content will be crawled by Googlebot and clicked on by users.

Pure affiliate sites consisting of content that appears in many other places on the web are unlikely to perform well in Google search results and may be negatively perceived by search engines. Unique, relevant content provides value to users and distinguishes your site from other affiliates, making it more likely to rank well in Google search results.

Scraped content | Webmaster Guidelines tools

Some webmasters use content taken (“scraped”) from other, more reputable sites on the assumption that increasing the volume of pages on their site is a good long-term strategy regardless of the relevance or uniqueness of that content. Purely scraped content, even from high-quality sources, may not provide any added value to your users without additional useful services or content provided by your site; it may also constitute copyright infringement in some cases. It's worthwhile to take the time to create original content that sets your site apart. This will keep your visitors coming back and will provide more useful results for users searching on Google.

Some examples of scraping include:
* Sites that copy and republish content from other sites without adding any original content or value
* Sites that copy content from other sites, modify it slightly (for example, by substituting synonyms or using automated techniques), and republish it
* Sites that reproduce content feeds from other sites without providing some type of unique organization or benefit to the user


Doorway pages | Webmaster Guidelines tools

Doorway pages are typically large sets of poor-quality pages where each page is optimized for a specific keyword or phrase. In many cases, doorway pages are written to rank for a particular phrase and then funnel users to a single destination. Whether deployed across many domains or established within one domain, doorway pages tend to frustrate users.

Therefore, Google frowns on practices that are designed to manipulate search engines and deceive users by directing them to sites other than the one they selected, and that provide content solely for the benefit of search engines. Google may take action on doorway sites and other sites making use of these deceptive practices, including removing these sites from Google’s index.

Some examples of doorways include:
* Having multiple domain names targeted at specific regions or cities that funnel users to one page
* Templated pages made solely for affiliate linking
* Multiple pages on your site with similar content designed to rank for specific queries like city or state names


Hidden text and links | Webmaster Guidelines tools

Hiding text or links in your content to manipulate Google’s search rankings can be seen as deceptive and is a violation of Google’s Webmaster Guidelines. Text (such as excessive keywords) can be hidden in several ways, including:
- Using white text on a white background
- Locating text behind an image
- Using CSS to position text off-screen
- Setting the font size to 0
- Hiding a link by only linking one small character—for example, a hyphen in the middle of a paragraph

When evaluating your site to see if it includes hidden text or links, look for anything that's not easily viewable by visitors to your site. Is any text or link there solely for search engines rather than for visitors?

However, not all hidden text is considered deceptive. For example, if your site includes technologies that search engines have difficulty accessing, like JavaScript, images, or Flash files, using descriptive text for these items can improve the accessibility of your site. Remember that many human visitors using screen readers, mobile browsers, browsers without plug-ins, and slow connections will not be able to view that content either and will benefit from the descriptive text as well. You can test your site’s accessibility by turning off JavaScript, Flash, and images in your browser, or by using a text-only browser such as Lynx. Some tips on making your site accessible include:
* Images: Use the alt attribute to provide descriptive text. In addition, we recommend using a human-readable caption and descriptive text around the image. See this article for more advice on publishing images.
* JavaScript: Place the same content from the JavaScript in a <noscript> tag. If you use this method, ensure the contents are exactly the same as what’s contained in the JavaScript, and that this content is shown to visitors who do not have JavaScript enabled in their browser.
* Videos: Include descriptive text about the video in HTML. You might also consider providing transcripts. See this article for more advice on publishing videos.
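As an illustration of the noscript technique above (the content is invented; note that the script-generated text and the noscript text must be identical):

```html
<script>
  document.write("Today's special: hand-rolled bagels, $2 each");
</script>
<noscript>
  Today's special: hand-rolled bagels, $2 each
</noscript>
```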

Sneaky redirects | Webmaster Guidelines tools

Redirecting is the act of sending a visitor to a different URL than the one they initially requested. There are many good reasons to redirect one URL to another, for example when moving your site to a new address, or consolidating several pages into one.

However, some redirects are designed to deceive search engines or to display different content to human users than to search engines. It’s a violation of Google’s Webmaster Guidelines to use JavaScript, a meta refresh, or other technologies to redirect a user to a different page with the intent to show the user a different page than a search engine crawler sees. When a redirect is implemented in this way, a search engine may index the original page rather than following the redirect, whereas users are taken to the redirect target. Like cloaking, this practice is deceptive because it attempts to display different content to users and to Googlebot, and can take a visitor somewhere other than where they expected to go.

Using JavaScript to redirect users can be a legitimate practice. When examining JavaScript or other redirects to ensure your site adheres to our guidelines, consider the intent. For example, if you redirect users to an internal page once they’re logged in, you can use JavaScript to do so. Keep in mind that 301 redirects are best when moving your site, but you could use a JavaScript redirect if you don’t have access to your website’s server.
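The legitimate logged-in redirect described above might be as simple as this (the cookie name and target path are invented for illustration):

```html
<script>
  // Send visitors who are already logged in straight to their account page.
  // Crawlers, which are never logged in, simply see this page as-is.
  if (document.cookie.indexOf("logged_in=1") !== -1) {
    window.location.replace("/account");
  }
</script>
```

The intent here is convenience for signed-in users, not showing Googlebot something different from what visitors see, which is why it does not violate the guidelines.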

Cloaking | Webmaster Guidelines Tools

Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected.

Some examples of cloaking include:
* Serving a page of HTML text to search engines, while showing a page of images or Flash to users
* Inserting text or keywords into a page only when the user-agent requesting the page is a search engine, not a human visitor


If your site uses technologies that search engines have difficulty accessing, like JavaScript, images, or Flash, see our recommendations for making that content accessible to search engines and users without cloaking.


Link schemes | Webmaster Guidelines tools


Your site's ranking in Google search results is partly based on analysis of those sites that link to you. The quantity, quality, and relevance of links influences your ranking. The sites that link to you can provide context about the subject matter of your site, and can indicate its quality and popularity.

Any links intended to manipulate a site's ranking in Google search results may be considered part of a link scheme. This includes any behavior that manipulates links to your site, or outgoing links from your site. Manipulating these links may affect the quality of our search results, and as such is a violation of Google’s Webmaster Guidelines.

The following are examples of link schemes which can negatively impact a site's ranking in search results:
- Buying or selling links that pass PageRank. This includes exchanging money for links, or posts that contain links; exchanging goods or services for links; or sending someone a “free” product in exchange for them writing about it and including a link
- Excessive link exchanging ("Link to me and I'll link to you")
- Linking to web spammers or unrelated sites with the intent to manipulate PageRank
- Building partner pages exclusively for the sake of cross-linking
- Using automated programs or services to create links to your site

Here are a few common examples of unnatural links that violate our guidelines:
- Text advertisements that pass PageRank
- Links that are inserted into articles with little coherence, for example:
  "most people sleep at night. you can buy cheap blankets at shops. a blanket keeps you warm at night. you can also buy a wholesale heater. It produces more warmth and you can just turn it off in summer when you are going on france vacation."
- Low-quality directory or bookmark site links
- Links embedded in widgets that are distributed across various sites, for example:
  "Visitors to this page: 1,472
  car insurance"
- Widely distributed links in the footers of various sites
- Forum comments with optimized links in the post or signature, for example:
  "Thanks, that’s great info!
  - Paul
  paul’s pizza san diego pizza best pizza san diego"

Note that PPC (pay-per-click) advertising links that don’t pass PageRank to the buyer of the ad do not violate our guidelines. You can prevent PageRank from passing in several ways, such as:
- Adding a rel="nofollow" attribute to the <a> tag
- Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file

The best way to get other sites to create relevant links to yours is to create unique, relevant content that can quickly gain popularity in the Internet community. The more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it. Before making any single decision, you should ask yourself: Is this going to be beneficial for my page's visitors?

It is not only the number of links you have pointing to your site that matters, but also the quality and relevance of those links. Creating good content pays off: Links are usually editorial votes given by choice, and the buzzing blogger community can be an excellent place to generate interest.

If you see a site that is participating in link schemes intended to manipulate PageRank, let us know. We'll use your information to improve our algorithmic detection of such links.


Automatically generated content | Webmaster Guidelines tools

Automatically generated—or “auto-generated”—content is content that’s been generated programmatically. Often this will consist of paragraphs of random text that makes no sense to the reader but which may contain search keywords.

Some examples of auto-generated content include:
* Text translated by an automated tool without human review or curation before publishing
* Text generated through automated processes, such as Markov chains
* Text generated using automated synonymizing or obfuscation techniques
* Text generated from scraping Atom/RSS feeds or search results
* Stitching or combining content from different web pages without adding sufficient value

Wednesday, 22 May 2013

Webmaster Guidelines | Help Google find, crawl, and index your site

Best practices to help Google find, crawl, and index your site

Following these guidelines will help Google find, index, and rank your site. Even if you choose not to implement any of these suggestions, we strongly encourage you to pay very close attention to the "Quality Guidelines," which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise impacted by an algorithmic or manual spam action. If a site has been affected by a spam action, it may no longer show up in results on Google.com or on any of Google's partner sites.

- Design and content guidelines
- Technical guidelines
- Quality guidelines


When your site is ready:
* Submit it to Google at http://www.google.com/submityourcontent/.
* Submit a Sitemap using Google Webmaster Tools. Google uses your Sitemap to learn about the structure of your site and to increase our coverage of your webpages.
* Make sure all the sites that should know about your pages are aware your site is online.

Design and content guidelines
* Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

* Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages.

* Keep the links on a given page to a reasonable number.

* Create a useful, information-rich site, and write pages that clearly and accurately describe your content.

* Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

* Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images. If you must use images for textual content, consider using the "ALT" attribute to include a few words of descriptive text.

* Make sure that your <title> elements and ALT attributes are descriptive and accurate.

* Check for broken links and correct HTML.

* If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

* Review our recommended best practices for images, video and rich snippets.

Technical guidelines
- Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.

- Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.

- Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead.

- Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it's current for your site so that you don't accidentally block the Googlebot crawler. Visit http://code.google.com/web/controlcrawlindex/docs/faq.html to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you're using it correctly with the robots.txt analysis tool available in Google Webmaster Tools.

- Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.

- If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.

- Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.
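A minimal robots.txt along these lines might block auto-generated search result pages while leaving everything else crawlable (the path is a placeholder, not a recommendation for any particular site):

```
# Applies to all crawlers
User-agent: *
# Block internal search result pages, which add little value for
# users arriving from search engines
Disallow: /search
```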

- Test your site to make sure that it appears correctly in different browsers.

- Monitor your site's performance and optimize load times. Google's goal is to provide users with the most relevant results and a great user experience. Fast sites increase user satisfaction and improve the overall quality of the web (especially for those users with slow Internet connections), and we hope that as webmasters improve their sites, the overall speed of the web will improve.

- Google strongly recommends that all webmasters regularly monitor site performance using Page Speed, YSlow, WebPagetest, or other tools. For more information, tools, and resources, see Let's Make The Web Faster. In addition, the Site Performance tool in Webmaster Tools shows the speed of your website as experienced by users around the world.

Quality guidelines

- These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other misleading practices not listed here. It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.

- If you believe that another site is abusing Google's quality guidelines, please let us know by filing a spam report. Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. While we may not take manual action in response to every report, spam reports are prioritized based on user impact, and in some cases may lead to complete removal of a spammy site from Google's search results. Not all manual actions result in removal, however. Even in cases where we take action on a reported site, the effects of these actions may not be obvious.

Quality guidelines - basic principles
- Make pages primarily for users, not for search engines.

- Don't deceive your users.

- Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you, or to a Google employee. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?"

- Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.

Quality guidelines - specific guidelines

Avoid the following techniques:
Automatically generated content
Participating in link schemes
Cloaking
Sneaky redirects
Hidden text or links
Doorway pages
Scraped content
Participating in affiliate programs without adding sufficient value
Loading pages with irrelevant keywords
Creating pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware
Abusing rich snippets markup
Sending automated queries to Google


Engage in good practices like the following:
- Monitoring your site for hacking and removing hacked content as soon as it appears
- Preventing and removing user-generated spam on your site


Verify your site with webmaster tools | Google Webmaster Analytics Tools

Why verify your site?
See reasons to verify your ownership of a site

Once you've verified your site with Google, you get easy access to a wealth of tools and data from these Google products:
Webmaster Tools: Improve your site's performance in Google's organic search results.
Google Accounts: Unified sign-in for Google products.
AdPlanner: Get the data to make better-informed advertising decisions.
Profiles: Control how you appear in Google.
Blogger: Publish yourself.
AdSense: Monetize your site by displaying targeted Google ads.
Apps: Get reliable, secure collaboration tools.
Merchant Center: Upload product listings to Google.


Adding a site Webmaster Tools Help | Google Webmaster Analytics Tools

You can add up to 1,000 sites, including news and mobile sites, to your account. We'll also ask you to verify each site, because we need to confirm that you own a site before we show you certain information about it or let you use our tools. Verification doesn't affect PageRank or your site's performance in Google's search results.


Add and verify a site:
1. Sign in to Google Webmaster Tools with your Google Account.
2. Click the Add a site button, and type the URL of the site you want to add. Make sure you type the entire URL, such as http://www.example.com/
3. Click Continue. The Site verification page opens.
4. (Optional) In the Name box, type a name for your site (for example, My Blog).
5. Select the verification method you want, and follow the instructions.
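One common verification method is adding a Google-supplied meta tag to your home page's <head>; the content token below is a placeholder, since Webmaster Tools generates the real value for your account:

```html
<head>
  <!-- PLACEHOLDER_TOKEN stands in for the value Webmaster Tools gives you -->
  <meta name="google-site-verification" content="PLACEHOLDER_TOKEN">
</head>
```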

What kind of sites can I add?
Here is a list of the types of URLs you can add as a site:
example.com
www.example.com
http://example.com
https://example.com
bar.example.com
foo.bar.example.com
www.example.com/foo
www.example.com/foo/bar
foo.bar.example.com/catalog/dresses

Webmaster Tools data and reporting work best at the site level. If your site www.example.com has separate sections for different countries, we recommend adding each of those subsites or subfolders as a separate site. For example, a travel site with subfolders covering Ireland, France, and Spain could add the following sites to its Webmaster Tools account:
http://www.example.com
http://www.example.com/france
http://www.example.com/ireland
http://www.example.com/spain

Similarly, if your site has http:// and https:// versions, you should add each as a separate site.

Webmaster Tools supports Internationalized Domain Names in Applications (IDNA) when adding a site to your account. Just type your domain name as usual (for example, http://bücher.example.com in the Add Site box), and it will appear correctly in Webmaster Tools. This applies only to the host (in this example, bücher.example.com), so we recommend building your URL paths from ASCII characters that don't need escaping.
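Under the hood, IDNA maps each non-ASCII label of a host name to an ASCII ("punycode") form. A quick sketch using Python's built-in idna codec:

```python
# The built-in "idna" codec converts each label of a host name
# to its ASCII-compatible ("punycode") form, and back.
host = "bücher.example.com"
ascii_host = host.encode("idna")
print(ascii_host)                 # b'xn--bcher-kva.example.com'
print(ascii_host.decode("idna"))  # bücher.example.com
```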
