Breaking News

Thursday, 23 May 2013

Site Health | Webmaster Guidelines tools

See why we sometimes put an alert icon next to your site

To help you identify and prioritize the most important issues on your site, Webmaster Tools will display the Site Health alert next to your site on the Webmaster Tools home page whenever we detect certain problems or events that prevent us from crawling or indexing your site. The following events can trigger a Site Health alert:

* Google detects malware on your site.
* An important page is removed from your site. If you've intentionally removed a page that was previously generating traffic to your site, you can ignore this alert.
* An important page is blocked by robots.txt. If you've intentionally blocked a page (because you don't want it to appear in search results), you can ignore this alert.
Read more ...

Guidelines on user-generated spam | Webmaster Guidelines tools

Google’s Webmaster Guidelines outline best practices for website owners, and the use of techniques that violate our guidelines may cause us to take action on a site. However, not all violations of our Webmaster Guidelines are related to content created intentionally by a site’s owner. Sometimes, spam can be generated on a good site by malicious visitors or users. This spam is usually generated on sites that allow users to create new pages or otherwise add content to the site.

If you receive a warning from Google about this type of spam, the good news is that we generally believe your site is of sufficient quality that we didn’t see a need to take manual action on the whole site. However, if your site has too much user-generated spam on it, that can affect our assessment of the site, which may eventually result in us taking manual action on the whole site.

Some examples of spammy user-generated content include:
- Spammy accounts on free hosts
- Spammy posts on forum threads
- Comment spam on blogs


Since spammy user-generated content can pollute Google search results, we recommend you actively monitor and remove this type of spam from your site. Here are several tips on how to prevent abuse of your site’s public areas.

User-generated spam:
Comments are a great way for webmasters to build community and readership. Unfortunately, they're often abused by spammers and nogoodniks, many of whom use scripts or other software to generate and post spam. If you've ever received a comment that looked like an advertisement or a random link to an unrelated site, then you've encountered comment spam. Here are some ideas for reducing or preventing comment spam on your website.
Use anti-spam tools

Most website development tools, especially blog tools, can require users to prove they're a real live human, not a nasty spamming engine. You'll have seen these: Generally the user is presented with a distorted image (often called a CAPTCHA, for "completely automated public Turing test to tell computers and humans apart") and asked to type the letters or numbers she sees in the image. Some CAPTCHA systems also support audio CAPTCHAs. This is a pretty effective way of preventing user-generated spam. The process may reduce the number of casual readers who leave comments on your pages or create a user profile, but it will definitely improve the quality of the comments and profiles.

Google's free reCAPTCHA service is easy to implement on your site. In addition, data collected from the service is used to improve the process of digitizing text, such as from books or newspapers. By using reCAPTCHA, you're not only protecting your site from spammers; you're helping to digitize the world's books. If you'd like to implement reCAPTCHA for free on your own site, you can sign up here. Plugins are available for easy installation on popular applications and programming environments such as WordPress and PHP.
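
As a rough sketch, the classic reCAPTCHA embed looks like the following (YOUR_PUBLIC_KEY is a placeholder for the site key you receive at signup, and verify.php stands in for whatever server-side script checks the response):

<form action="verify.php" method="post">
  <script type="text/javascript"
    src="http://www.google.com/recaptcha/api/challenge?k=YOUR_PUBLIC_KEY"></script>
  <noscript>
    <iframe src="http://www.google.com/recaptcha/api/noscript?k=YOUR_PUBLIC_KEY"
      height="300" width="500" frameborder="0"></iframe><br>
    <textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
    <input type="hidden" name="recaptcha_response_field" value="manual_challenge">
  </noscript>
</form>

The <noscript> block lets visitors without JavaScript complete the challenge through a plain iframe and text area.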
Turn on comment moderation

Comment moderation means that no comments will appear on your site until you manually review and approve them. This means you'll spend more time monitoring your comments, but it can really help to improve the user experience for your visitors. It's particularly worthwhile if you regularly post about controversial subjects, where emotions can become heated. It's generally available as a setting in your blogging software, such as Blogger.
Use "nofollow" tags

Together with Yahoo! and MSN, Google introduced the "nofollow" HTML microformat a few years ago, and the attribute has been widely adopted. Any link with the rel="nofollow" attribute will not be used to calculate PageRank or determine the relevancy of your pages for a user query. For example, if a spammer includes a link in your comments like this:
<a href="http://www.example.com/">This is a nice site!</a>
it will get converted to:
<a href="http://www.example.com/" rel="nofollow">This is a nice site!</a>
This new link will not be taken into account when calculating PageRank. This won't prevent spam, but it will avoid problems with passing PageRank.

By default, many blogging sites (such as Blogger) automatically add this attribute to any posted comments.
Disallow hyperlinks in comments

If you have access to the server, you may want to change its configuration to strip HTML tags from comments in your guestbook. Spammers will still be able to leave comments, but they won't be able to publish active hyperlinks.
Block comment pages using robots.txt or META tags

You can use your robots.txt file to block Google's access to certain pages. This won't stop spammers from leaving comments or creating user accounts, but it will mean that links in these comments won't negatively impact your site. For example, if comments are stored in the subdirectory guestbook, you could add the following to your robots.txt file:
User-agent: *
Disallow: /guestbook/

This will block Googlebot from crawling the contents of the guestbook directory and any subdirectories.

You can also use the META tag to block access to a single selected page, for example http://www.example.com/article/comments. Like this:
<html>
<head>
<META NAME="googlebot" CONTENT="noindex">
...
</head>
</html>

You may wish to use these methods to block profile pages for new and not yet trustworthy users. Once you gain trust in the user, you can remove the crawling or indexing restrictions.
Think twice about enabling a guestbook or comments

A lot of spam doesn't give users a good impression of your site. If this feature isn't adding much value to your users, or if you won't have time to regularly monitor your guestbook or comments, consider turning them off. Most blogging software, such as Blogger, will let you turn comments off for individual posts.

Use a blacklist to prevent repetitive spamming attempts

Google often sees large numbers of fake profiles on one innocent site, all linking to the same domain. Once you find a single spammy profile, make it easy to find and remove similar ones, for example by blacklisting the domain they all link to.

Add a "report spam" feature to user profiles and friend invitations

Your users care about your community and are annoyed by spam too. Let them help you solve the problem.
Read more ...

Hacked content | Webmaster Guidelines tools

Hacked content is any content that is placed on your site without your permission due to vulnerabilities in your site’s security. In order to protect our users and to maintain the integrity of our search results, Google tries its best to keep hacked content out of our search results. Hacked content gives our users results that are not useful and can potentially install malicious content on their machines. We recommend that you keep your site secure, and clean up hacked content when you find it.

Some examples of hacking include:
Injected content
When hackers gain access to your website, they may try to inject malicious content into existing pages on your site. This often takes the form of malicious JavaScript injected directly into the site, or into iframes.

Added content
Sometimes, due to security flaws, hackers are able to add new pages to your site that contain spammy or malicious content. These pages are often meant to manipulate search engines. Your existing pages may not show signs of hacking, but these newly-created pages could harm your site’s visitors or your performance in search results.

Hidden content
Hackers may also try to subtly manipulate existing pages on your site. Their goal is to add content to your site that search engines can see but which may be harder for you and your users to spot. This can involve adding hidden links or hidden text to a page by using CSS or HTML, or it can involve more complex changes like cloaking.

Read more ...

Automated queries | Webmaster Guidelines tools

Google's Terms of Service do not allow the sending of automated queries of any sort to our system without express permission in advance from Google. Sending automated queries consumes resources; automated querying includes using any software (such as WebPosition Gold) to send queries to Google to determine how a website or webpage ranks in Google search results. In addition to rank checking, other types of automated access to Google without permission are also a violation of our Webmaster Guidelines and Terms of Service.
Read more ...

Rich snippets guidelines | Webmaster Guidelines tools

Rich snippets are designed to summarize the content of a page in a way that makes it even easier for users to understand what the page is about in our search results. If you’re considering taking advantage of rich snippets, think about whether a user would find the information helpful when choosing between search results. The following guidelines may give you a better idea of what we mean.

To get started with rich snippets, review the in-depth information provided in the Webmaster Help Center as well as the technical, design, and quality guidelines below.
Technical guidelines
In order to be eligible for rich snippets, you should mark up your site's pages using one of three supported formats:
- Microdata
- Microformats
- RDFa
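
For example, a short product review marked up with microdata might look like the following sketch (the schema.org item types and property names are real; the item, rating, and author values are purely illustrative):

<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="name">Handmade walnut humidor</span>
  Rated <span itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    <span itemprop="ratingValue">4</span> out of <span itemprop="bestRating">5</span>
  </span>
  by <span itemprop="author">Jane</span>
</div>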

Once your content is marked up, test it using the structured data testing tool. If the tool correctly renders a rich snippet for your pages, they're eligible to be shown with rich snippets! If rich snippets aren't appearing in the structured data testing tool, refer to our troubleshooting guide.

Once you’ve correctly implemented and tested your markup, it may take some time for rich snippets to appear in search results as we crawl and process the pages. If rich snippets are not appearing in Google’s search results after a few weeks, refer to our troubleshooting guide as well as the design and quality guidelines below.
Design guidelines

Implementing rich snippets allows us to provide an even more useful summary of the content on a page. Markup intended to be used in a rich snippet should:

- Describe and summarize the page's main content as a user would see it. (There are some exceptions.)
- Contain up-to-date information. We won't show a rich snippet for time-sensitive content that is no longer relevant.
- Be original content that you and your users have generated and that is fully contained on your page. We won't show a rich snippet for content that is linked or alluded to but not directly available on a page.
Quality guidelines

While rich snippets are generated algorithmically, we do reserve the right to take manual action (e.g., disable rich snippets for a specific site) in cases where we see abuse, deception, or other actions that hurt the search experience for our users. In particular, you should avoid:

- Marking up content that is in no way visible to users.
- Marking up irrelevant or misleading content, such as fake reviews or content unrelated to the focus of a page.

These quality guidelines cover the most common forms of deceptive or manipulative rich snippet behavior, but Google may respond negatively to other misleading practices not listed here. It's not safe to assume that Google approves of a specific deceptive technique just because it isn't included on this page. We strongly advise that webmasters focus on providing a great user experience rather than on looking for loopholes.
Troubleshooting

If rich snippets are not appearing for your pages in Google's search results, check that you've done each of the following things:
- Implemented markup in accordance with the above guidelines
- Successfully tested it using the structured data testing tool
- Reviewed our troubleshooting guide

Read more ...

Creating pages with malicious behavior | Webmaster Guidelines tools

Distributing content or software on your website that behaves in a way other than what a user expected is a violation of Google’s Webmaster Guidelines. This includes anything that manipulates content on the page in an unexpected way, or downloads or executes files on a user’s computer without their consent. Google not only aims to give its users the most relevant search results for their queries, but also to keep them safe on the web.

Some examples of malicious behavior include:
* Changing or manipulating the location of content on a page, so that when a user thinks they’re clicking on a particular link or button the click is actually registered by a different part of the page
* Injecting new ads or pop-ups on pages, or swapping out existing ads on a webpage with different ads; or promoting or installing software that does so
* Including unwanted files in a download that a user requested
* Installing malware, trojans, spyware, ads or viruses on a user’s computer
* Changing a user’s browser homepage or search preferences without the user’s informed consent
Read more ...

Keyword stuffing | Webmaster Guidelines tools

"Keyword stuffing" refers to the practice of loading a webpage with keywords or numbers in an attempt to manipulate a site's ranking in Google search results. Often these keywords appear in a list or group, or out of context (not as natural prose). Filling pages with keywords or numbers results in a negative user experience, and can harm your site's ranking. Focus on creating useful, information-rich content that uses keywords appropriately and in context.

Examples of keyword stuffing include:
* Lists of phone numbers without substantial added value
* Blocks of text listing cities and states a webpage is trying to rank for
* Repeating the same words or phrases so often that it sounds unnatural, for example:
* We sell custom cigar humidors. Our custom cigar humidors are handmade. If you’re thinking of buying a custom cigar humidor, please contact our custom cigar humidor specialists at custom.cigar.humidors@example.com.
Read more ...

Affiliate programs | Webmaster Guidelines tools

Our Webmaster Guidelines advise you to create websites with original content that adds value for users. This is particularly important for sites that participate in affiliate programs. Typically, affiliate websites feature product descriptions that appear on sites across that affiliate network. As a result, sites featuring mainly content from affiliate networks can suffer in Google's search rankings, because they do not have enough unique content that differentiates them from other sites on the web.

Google believes that pure, or "thin," affiliate websites do not provide additional value for web users, especially if they are part of a program that distributes its content to several hundred affiliates. These sites generally appear to be cookie-cutter sites or templates with no original content. Because a search results page could return several of these sites, all with the same content, thin affiliates create a frustrating user experience.

Some examples of thin affiliates include:
* Pages with product affiliate links on which the product descriptions and reviews are copied directly from the original merchant without any original content or added value

Not every site that participates in an affiliate program is a thin affiliate. Good affiliates add value, for example by offering original product reviews, ratings, and product comparisons. If you participate in an affiliate program, there are a number of steps you can take to help your site stand out and to help improve your rankings:

* Affiliate program content should form only a small part of the content of your site.
* Ask yourself why a user would want to visit your site first rather than visiting the original merchant directly. Make sure your site adds substantial value beyond simply republishing content available from the original merchant.
* When selecting an affiliate program, choose a product category appropriate for your intended audience. The more targeted the affiliate program is to your site's content, the more value it will add and the more likely you will be to rank better in Google search results and make money from the program. For example, a well-maintained site about hiking in the Alps could consider an affiliate partnership with a supplier who sells hiking books rather than office supplies.
* Use your website to build community among your users. This will help build a loyal readership, and can also create a source of information on the subject you are writing about. For example, discussion forums, user reviews, and blogs all offer unique content and provide value to users.
* Keep your content updated and relevant. Fresh, on-topic information increases the likelihood that your content will be crawled by Googlebot and clicked on by users.

Pure affiliate sites consisting of content that appears in many other places on the web are unlikely to perform well in Google search results and may be negatively perceived by search engines. Unique, relevant content provides value to users and distinguishes your site from other affiliates, making it more likely to rank well in Google search results.
Read more ...

Scraped content | Webmaster Guidelines tools

Some webmasters use content taken (“scraped”) from other, more reputable sites on the assumption that increasing the volume of pages on their site is a good long-term strategy regardless of the relevance or uniqueness of that content. Purely scraped content, even from high-quality sources, may not provide any added value to your users without additional useful services or content provided by your site; it may also constitute copyright infringement in some cases. It's worthwhile to take the time to create original content that sets your site apart. This will keep your visitors coming back and will provide more useful results for users searching on Google.

Some examples of scraping include:
* Sites that copy and republish content from other sites without adding any original content or value
* Sites that copy content from other sites, modify it slightly (for example, by substituting synonyms or using automated techniques), and republish it
* Sites that reproduce content feeds from other sites without providing some type of unique organization or benefit to the user

Read more ...

Doorway pages | Webmaster Guidelines tools

Doorway pages are typically large sets of poor-quality pages where each page is optimized for a specific keyword or phrase. In many cases, doorway pages are written to rank for a particular phrase and then funnel users to a single destination. Whether deployed across many domains or established within one domain, doorway pages tend to frustrate users.

Therefore, Google frowns on practices that are designed to manipulate search engines and deceive users by directing them to sites other than the one they selected, and that provide content solely for the benefit of search engines. Google may take action on doorway sites and other sites making use of these deceptive practices, including removing these sites from Google’s index.

Some examples of doorways include:
* Having multiple domain names targeted at specific regions or cities that funnel users to one page
* Templated pages made solely for affiliate linking
* Multiple pages on your site with similar content designed to rank for specific queries like city or state names

Read more ...

Hidden text and links | Webmaster Guidelines tools

Hiding text or links in your content to manipulate Google’s search rankings can be seen as deceptive and is a violation of Google’s Webmaster Guidelines. Text (such as excessive keywords) can be hidden in several ways, including:
- Using white text on a white background
- Locating text behind an image
- Using CSS to position text off-screen
- Setting the font size to 0
- Hiding a link by only linking one small character—for example, a hyphen in the middle of a paragraph

When evaluating your site to see if it includes hidden text or links, look for anything that's not easily viewable by visitors to your site. Is any text or link there solely for search engines rather than for visitors?

However, not all hidden text is considered deceptive. For example, if your site includes technologies that search engines have difficulty accessing, like JavaScript, images, or Flash files, using descriptive text for these items can improve the accessibility of your site. Remember that many human visitors using screen readers, mobile browsers, browsers without plug-ins, and slow connections will not be able to view that content either and will benefit from the descriptive text as well. You can test your site’s accessibility by turning off JavaScript, Flash, and images in your browser, or by using a text-only browser such as Lynx. Some tips on making your site accessible include:
* Images: Use the alt attribute to provide descriptive text. In addition, we recommend using a human-readable caption and descriptive text around the image. See this article for more advice on publishing images.
* JavaScript: Place the same content from the JavaScript in a <noscript> tag. If you use this method, ensure the contents are exactly the same as what's contained in the JavaScript, and that this content is shown to visitors who do not have JavaScript enabled in their browser. (See the sketch after this list.)
* Videos: Include descriptive text about the video in HTML. You might also consider providing transcripts. See this article for more advice on publishing videos.
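
As a quick sketch of the JavaScript tip above (the text is illustrative), the <noscript> content mirrors exactly what the script writes to the page:

<script type="text/javascript">
  document.write("Our shop is open 9am to 5pm, Monday to Friday.");
</script>
<noscript>
  Our shop is open 9am to 5pm, Monday to Friday.
</noscript>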
Read more ...

Sneaky redirects | Webmaster Guidelines tools

Redirecting is the act of sending a visitor to a different URL than the one they initially requested. There are many good reasons to redirect one URL to another, for example when moving your site to a new address, or consolidating several pages into one.

However, some redirects are designed to deceive search engines or to display different content to human users than to search engines. It’s a violation of Google’s Webmaster Guidelines to use JavaScript, a meta refresh, or other technologies to redirect a user to a different page with the intent to show the user a different page than a search engine crawler sees. When a redirect is implemented in this way, a search engine may index the original page rather than following the redirect, whereas users are taken to the redirect target. Like cloaking, this practice is deceptive because it attempts to display different content to users and to Googlebot, and can take a visitor somewhere other than where they expected to go.

Using JavaScript to redirect users can be a legitimate practice. When examining JavaScript or other redirects to ensure your site adheres to our guidelines, consider the intent. For example, if you redirect users to an internal page once they’re logged in, you can use JavaScript to do so. Keep in mind that 301 redirects are best when moving your site, but you could use a JavaScript redirect if you don’t have access to your website’s server.
Read more ...

Cloaking | Webmaster Guidelines Tools

Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected.

Some examples of cloaking include:
* Serving a page of HTML text to search engines, while showing a page of images or Flash to users
* Inserting text or keywords into a page only when the user-agent requesting the page is a search engine, not a human visitor

If your site uses technologies that search engines have difficulty accessing, like JavaScript, images, or Flash, see our recommendations for making that content accessible to search engines and users without cloaking.

Read more ...

Link schemes | Webmaster Guidelines tools


Your site's ranking in Google search results is partly based on analysis of those sites that link to you. The quantity, quality, and relevance of links influence your ranking. The sites that link to you can provide context about the subject matter of your site, and can indicate its quality and popularity.

Any links intended to manipulate a site's ranking in Google search results may be considered part of a link scheme. This includes any behavior that manipulates links to your site, or outgoing links from your site. Manipulating these links may affect the quality of our search results, and as such is a violation of Google’s Webmaster Guidelines.

The following are examples of link schemes which can negatively impact a site's ranking in search results:
- Buying or selling links that pass PageRank. This includes exchanging money for links, or posts that contain links; exchanging goods or services for links; or sending someone a "free" product in exchange for them writing about it and including a link
- Excessive link exchanging ("Link to me and I'll link to you")
- Linking to web spammers or unrelated sites with the intent to manipulate PageRank
- Building partner pages exclusively for the sake of cross-linking
- Using automated programs or services to create links to your site

Here are a few common examples of unnatural links that violate our guidelines:

- Text advertisements that pass PageRank
- Links that are inserted into articles with little coherence, for example:
  "most people sleep at night. you can buy cheap blankets at shops. a blanket keeps you warm at night. you can also buy a wholesale heater. It produces more warmth and you can just turn it off in summer when you are going on france vacation."
- Low-quality directory or bookmark site links
- Links embedded in widgets that are distributed across various sites, for example:
  "Visitors to this page: 1,472
  car insurance"
- Widely distributed links in the footers of various sites
- Forum comments with optimized links in the post or signature, for example:
  "Thanks, that's great info!
  - Paul
  paul's pizza san diego pizza best pizza san diego"

Note that PPC (pay-per-click) advertising links that don't pass PageRank to the buyer of the ad do not violate our guidelines. You can prevent PageRank from passing in several ways, such as:
- Adding a rel="nofollow" attribute to the <a> tag
- Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file

The best way to get other sites to create relevant links to yours is to create unique, relevant content that can quickly gain popularity in the Internet community. The more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it. Before making any decision, ask yourself: Is this going to be beneficial for my page's visitors?

It is not only the number of links you have pointing to your site that matters, but also the quality and relevance of those links. Creating good content pays off: Links are usually editorial votes given by choice, and the buzzing blogger community can be an excellent place to generate interest.

If you see a site that is participating in link schemes intended to manipulate PageRank, let us know. We'll use your information to improve our algorithmic detection of such links.

Read more ...

Automatically generated content | Webmaster Guidelines tools

Automatically generated—or “auto-generated”—content is content that’s been generated programmatically. Often this will consist of paragraphs of random text that makes no sense to the reader but which may contain search keywords.

Some examples of auto-generated content include:
* Text translated by an automated tool without human review or curation before publishing
* Text generated through automated processes, such as Markov chains
* Text generated using automated synonymizing or obfuscation techniques
* Text generated from scraping Atom/RSS feeds or search results
* Stitching or combining content from different web pages without adding sufficient value
Read more ...

Wednesday, 22 May 2013

Webmaster Guidelines | Help Google find, crawl, and index your site

Best practices to help Google find, crawl, and index your site

Following these guidelines will help Google find, index, and rank your site. Even if you choose not to implement any of these suggestions, we strongly encourage you to pay very close attention to the "Quality Guidelines," which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise impacted by an algorithmic or manual spam action. If a site has been affected by a spam action, it may no longer show up in results on Google.com or on any of Google's partner sites.

- Design and content guidelines
- Technical guidelines
- Quality guidelines


When your site is ready:
* Submit it to Google at http://www.google.com/submityourcontent/.
* Submit a Sitemap using Google Webmaster Tools. Google uses your Sitemap to learn about the structure of your site and to increase our coverage of your webpages.
* Make sure all the sites that should know about your pages are aware your site is online.

Design and content guidelines
* Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

* Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages.

* Keep the links on a given page to a reasonable number.

* Create a useful, information-rich site, and write pages that clearly and accurately describe your content.

* Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

* Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images. If you must use images for textual content, consider using the "ALT" attribute to include a few words of descriptive text.

* Make sure that your <title> elements and ALT attributes are descriptive and accurate.

* Check for broken links and correct HTML.

* If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

* Review our recommended best practices for images, video, and rich snippets.

Technical guidelines
- Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.

- Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.

- Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead. (See the example exchange after this list.)

- Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it's current for your site so that you don't accidentally block the Googlebot crawler. Visit http://code.google.com/web/controlcrawlindex/docs/faq.html to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you're using it correctly with the robots.txt analysis tool available in Google Webmaster Tools.

- Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.

- If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.

- Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.

- Test your site to make sure that it appears correctly in different browsers.

- Monitor your site's performance and optimize load times. Google's goal is to provide users with the most relevant results and a great user experience. Fast sites increase user satisfaction and improve the overall quality of the web (especially for those users with slow Internet connections), and we hope that as webmasters improve their sites, the overall speed of the web will improve.

- Google strongly recommends that all webmasters regularly monitor site performance using Page Speed, YSlow, WebPagetest, or other tools. For more information, tools, and resources, see Let's Make The Web Faster. In addition, the Site Performance tool in Webmaster Tools shows the speed of your website as experienced by users around the world.
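
To illustrate the If-Modified-Since item above, here is a sketch of a conditional request and response (the host and date are illustrative). The crawler sends the date it last fetched the page; if nothing has changed since then, the server can reply with a short 304 status instead of resending the whole page:

GET /index.html HTTP/1.1
Host: www.example.com
If-Modified-Since: Sat, 18 May 2013 10:00:00 GMT

HTTP/1.1 304 Not Modified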

Quality guidelines

These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other misleading practices not listed here. It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better rankings than those who spend their time looking for loopholes they can exploit.

If you believe that another site is abusing Google's quality guidelines, please let us know by filing a spam report. Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. While we may not take manual action in response to every report, spam reports are prioritized based on user impact, and in some cases may lead to complete removal of a spammy site from Google's search results. Not all manual actions result in removal, however. Even in cases where we take action on a reported site, the effects of these actions may not be obvious.

Quality guidelines - basic principles

- Make pages primarily for users, not for search engines.
- Don't deceive your users.
- Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you, or to a Google employee. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?"
- Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.

Quality guidelines - specific guidelines

Avoid the following techniques:
- Automatically generated content
- Participating in link schemes
- Cloaking
- Sneaky redirects
- Hidden text or links
- Doorway pages
- Scraped content
- Participating in affiliate programs without adding sufficient value
- Loading pages with irrelevant keywords
- Creating pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware
- Abusing rich snippets markup
- Sending automated queries to Google


Engage in good practices like the following:
- Monitoring your site for hacking and removing hacked content as soon as it appears
- Preventing and removing user-generated spam on your site

Read more ...

Verify your site with webmaster tools | Google Webmaster Analytics Tools

Why verify your site?

Once you've verified your site to Google, you get easy access to a wealth of tools and data from all these Google products:
Webmaster Tools: Improve your site's performance in Google's organic search results.
Google Accounts: Unified sign-in for Google products.
AdPlanner: Get the data to make better-informed advertising decisions.
Profiles: Control how you appear in Google.
Blogger: Publish yourself.
AdSense: Monetize your site by displaying targeted Google ads.
Apps: Get reliable, secure collaboration tools.
Merchant Center: Upload product listings to Google.

Read more ...

Adding a site (Webmaster Tools Help) | Google Webmaster Analytics Tools

You can add up to 1,000 sites, including news and mobile sites, to your account. In addition, we'll ask you to verify your site. This is because we need to know you own a site before we'll show you certain information about it or enable you to use our tools. Verification doesn't affect PageRank or your site's performance in Google's search results.


Add and verify a site:
1. Sign in to Google Webmaster Tools with your Google Account.
2. Click the Add a site button, and type the URL of the site you want to add. Make sure you type the entire URL, such as http://www.example.com/
3. Click Continue. The Site verification page opens.
4. (Optional) In the Name box, type a name for your site (for example, My Blog).
5. Select the verification method you want, and follow the instructions.

What kind of sites can I add?
Here is a list of the types of URLs you can add as a site:
example.com
www.example.com
http://example.com
https://example.com
bar.example.com
foo.bar.example.com
www.example.com/foo
www.example.com/foo/bar
foo.bar.example.com/catalog/dresses

Webmaster Tools data and reporting work best on a site level. For example, if your site www.example.com has separate sections for different countries, we recommend adding each of those subsites or subfolders as a separate site. For example, if you have a travel site with specific subfolders covering Ireland, France, and Spain, you could add the following sites to your Webmaster Tools account:
http://www.example.com
http://www.example.com/france
http://www.example.com/ireland
http://www.example.com/spain

Similarly, if your site has http:// and https:// versions, you should add each as a separate site.

Webmaster Tools supports Internationalizing Domain Names in Applications (IDNA) when adding a site to your account. Just type your domain name as usual, and it will appear correctly in Webmaster Tools. For example, if you type http://bücher.example.com in the Add Site box, it will appear correctly. This applies only to the host (in this example, bücher.example.com). Therefore, we recommend you create your URL path using only ASCII characters that do not need escaping.

Read more ...

Connect with Google+ | Google Webmaster Analytics Tools


With Google+, you can create an identity and presence on Google. Use Google+ and other social networks to connect with friends and family members, and to start a conversation around your site.
Create a Google profile

Your profile is the way you represent yourself on Google products and across the web. With your profile, you can manage the information that people see—such as your bio, contact details, and links to other sites about you or created by you. You can also link your profile to other social networks such as Twitter or Facebook. Create a Google profile.

Add the Google +1 button to your site


Adding the +1 button to pages on your own site lets users recommend your content, knowing that their friends and contacts will see their recommendation when it's most relevant—in the context of Google search results. If a user wants to share your content right away, they can also use the +1 button to add a comment, choose what friends (circles) to share it with, and post to Google+—all without leaving your site. All it takes is a snippet of code.

Link your Google profile to your content
Google may display authorship information in search results to help users discover great content by writers they enjoy.

If you want your authorship information to appear in search results for your content, make sure your Google+ profile has a good, recognizable headshot as your profile photo. Then, verify authorship of your content by associating it with your profile.
Add friends and colleagues to your circles

The Internet is a big place, but you can make it a bit smaller by adding connections to customizable social circles. Circles are a great way of organizing the people in your life, and they make it easy to choose who you'd like to share posts and updates with. You can get started by importing your contacts from Gmail, or by entering your contacts manually.
Reference BY: Support.google.com/webmasters/?hl=en
Read more ...

Images and video | Google Webmaster Analytics Tools

Help Google understand your content

People love images and video, but search engines are designed for text. The more information you give us about your images and video, the better. Check out our detailed guidelines for publishing images and video; in general, follow these tips:

  * Use descriptive file names. The file name black-fender-guitar.jpg tells us a lot more than image1.jpg.

  * Create great alt text. Alt text is used to describe the contents of an image file. It’s great for human readers, but it also provides search engines with useful information about the target image or video.

  * Give your images and video context. Google can infer a great deal about your image or video from the content surrounding it. For example, a picture of a guitar on a page about the history of guitars sends a strong signal to search engines that black-fender-guitar.jpg is about guitars.

  * Provide a great user experience. Try not to make users scroll to see your images and video, and use high-quality source files.
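
Putting the first two tips together, a minimal sketch (the file name and alt text are illustrative):

<img src="black-fender-guitar.jpg"
  alt="Black Fender electric guitar">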

Upload your video content to YouTube to reach a wider audience.

Essentially, a Sitemap is a list of the pages, images, or videos on your website, but it can also include additional information. For example, you can include the title, description, play page URL, thumbnail URL, and video URL for each video on your site, or provide the file name and location, caption, title, geographic location, and any licensing information for your images. More information about Sitemaps for video and for images.
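
As a rough sketch, a single video entry in a video Sitemap might look like this (the URLs, title, and description are illustrative; the tag names follow Google's video Sitemap namespace):

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/videos/guitar-lesson.html</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/lesson1.jpg</video:thumbnail_loc>
      <video:title>Beginner guitar lesson</video:title>
      <video:description>Learn basic chords in ten minutes.</video:description>
      <video:content_loc>http://www.example.com/video/lesson1.flv</video:content_loc>
    </video:video>
  </url>
</urlset>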

Reference BY: Support.google.com/webmasters/?hl=en 
Read more ...

Create great content | Google Webmaster Analytics Tools

One key element of creating a successful site is not to worry about Google's ranking algorithms or signals, but to concentrate on delivering the best possible experience for your user by creating content that other sites will link to naturally—just because it's great.
When you're writing a post or article, think about:
  • Would you trust the information in this article?
  • Is the article useful and informative, with content beyond the merely obvious? Does it provide original information, reporting, research, or analysis?
  • Does it provide more substantial value than other pages in search results?
  • Would you expect to see this in a printed magazine, encyclopedia or book?
  • Is your site a recognized authority on the subject?

Keep an eye out for the following problems:

  • Does this article have spelling, stylistic, or factual errors?
  • Does the site generate content by attempting to guess what might rank well in search engines?
  • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites?
  • Does this article have an excessive number of ads that interfere with the main content?
  • Are the articles short or lacking in helpful specifics?
Google’s Webmaster Guidelines outline practices that could negatively impact your performance in the search results, or remove you from the search results entirely. If we detect problematic content on your site, we’ll notify you using Webmaster Tools. We strongly recommend becoming familiar with our guidelines, as well as our tips for creating Google-friendly sites.
If your site contains user-generated content, make sure to have a firm spam policy in place early. Check out this video for more tips on keeping sites with user-generated content spam-free.
Reference BY: Support.google.com/webmasters/?hl=en
Read more ...

Influence your site's listing in search | Google Webmaster Analytics Tools

A user’s experience with your site begins with its listing in the search results. While our search is algorithmic and automated, you can have a lot of influence over how your site is listed. Here are some ways you can help create compelling listings that users are more likely to click:

Create useful page titles. Make sure that each title is descriptive and relevant to the actual page itself. More information.

Use informative URLs. The URL (web address) of a page appears below the title, with words from the user’s query in bold. Your URLs should be simple and human readable. Which do you find more informative: http://example.com/products/shoes/high_heels/pumps.html or http://example.com/product_id=123458?

Provide relevant page descriptions. The descriptive text under the URL is usually taken from the description meta tag on the page. Descriptions should be different and unique to each area of your site. More information.
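
A minimal sketch of the title and description tags described above (the wording is illustrative):

<head>
  <title>Hiking in the Alps: trail guides and maps | Example Site</title>
  <meta name="description" content="Detailed trail guides, maps, and packing lists for hiking routes across the Alps.">
</head>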

Add your business to Google Places, to help Google display location information in results.

Manage your sitelinks. Sitelinks (sub-links to individual pages on your site) are meant to help users navigate your site. Sitelinks are automatically generated. This means that you can't specify a sitelink, but you can use Webmaster Tools to ask Google to demote sitelinks you don't like.
Reference BY: Support.google.com/webmasters/?hl=en
Read more ...

Make sure Google knows about your site | Google Webmaster Analytics Tools

Check that your site is indexed
* To see if Google already knows about your site, do a "site:" search, like this: [ site:example.com ].
If pages from your site show up, your site (or a part of it) is already in Google’s index.

* If it doesn't show up, and it's very new, it's possible that Google hasn't discovered it yet. Use the Submit Your Content page to expedite our discovery of it. Be sure to check out specific Google products and services for businesses, publishers, and public agencies. More information.

* If your site doesn't show up, and it used to, it may be in violation of Google's Webmaster Guidelines. Our guidelines are designed to help webmasters create useful, Google-friendly sites that are good for users and the web.

* If your site is showing up, but pages appear lower in the results than they used to, review our Webmaster Guidelines and look for ways to improve your content.


Reference BY: Support.google.com/webmasters/?hl=en
Read more ...

How Google Works | Google Webmaster Analytics

Algorithmic search

Search is about giving people the answer they’re looking for—whether it’s a news article, sports score, stock quote, a video or a map. Google’s search engineers design algorithms that analyze millions of pages and return timely, high-quality, relevant answers to people’s questions.

What about ads?

Search results (sometimes called "organic" results) appear in the middle of the Google.com results page, and are never paid for. Paid ads appear on the right-hand side and sometimes at the top, and are always clearly labeled.

Google maintains a strict separation between our search business and our advertising business, and doesn't give special treatment to our advertisers. Our view is that if we provide the best search results, people will continue to choose to use Google over other search engines.

Openness

Google's committed to transparency. That's why we created Webmaster Tools: to give webmasters as much information as we can about how we crawl and index sites. More information about transparency.
Reference BY: Support.google.com/webmasters/?hl=en
Read more ...

Thursday, 16 May 2013

Collecting mobile web analytics data (Mobile web analytics)

Collection of mobile web analytics data requires a different approach from collecting traditional web analytics data. A number of solutions are available and the best results are obtained through the use of more than one technology.

Packet sniffing:
Also known as tagless data capture or passive network capture, this technique uses a tap between the mobile users and the web server to capture the full content of the client-server exchange. Tagless data capture techniques are increasing in popularity for mobile web analytics because they capture all users, work with all devices and do not require JavaScript, cookies, server logs, or plugins.

Image tags or beacons:
Images can be made to work for mobile web analytics, provided that the transmitted image is always unique. The level of information recorded from these transmissions depends on the architecture provided by the supplier, and not all image beacon solutions are the same.

Link redirection:
Link redirection is an important method of tracking mobile visitor activities. It is the only reliable way to record clicks from advertising, search, and other marketing activities. It also records visitors clicking on links to leave a site. This method helps address the lack of HTTP referrer information on mobile.

HTTP header analysis:
This tells you a number of basic facts about the mobile phone and the browser. It can be used in conjunction with a device database such as WURFL.

IP address analysis:
An operator database is used to identify operators and their countries based on the IP addresses of their internet gateway devices. IP addresses alone do not identify all operators and countries, as some operators share their mobile networks with mobile virtual network operators (MVNOs). Boost Mobile, for example, uses the Sprint network. Because these two operators have very different customer demographics, clear differentiation between operators is critical for good mobile marketing campaigns. Carriers may also share their mobile internet gateways, sometimes across multiple countries, and many change or add gateways on a regular basis.

WAP Gateway Traffic logs:
WAP gateway logs are a rich source of information that can be analysed, since all mobile traffic passes through these servers. Companies such as Openwave provide tools that can analyse these logs and extract the required information.

Reference By: en.wikipedia.org/wiki/Mobile_Web_Analytics#Drawbacks_of_applying_traditional_web_analytics
Read more ...

Problems with tracking (Mobile web analytics)

Visitor identification:
Visitor identification is the most important aspect of usable mobile web analytics and one of the hardest technical aspects to accomplish, primarily because JavaScript and HTTP cookies are so unreliable on mobile browsers. As a result, some mobile web analytics solutions only detect or count user visits per day. The best solutions provide reliable, persistent, and unique user identities, allowing accurate measurement of repeat visits and long-term customer loyalty.

JavaScript page tagging:
JavaScript-based page tagging notifies a third-party server when a page is rendered by a web browser. This method assumes that the end user's browser has JavaScript capabilities and that JavaScript is enabled, though neither may be true. At this time, most mobile web browsers do not support JavaScript sufficiently for this to work.

HTTP cookies:
HTTP cookies are commonly used to mark and identify visitors. Cookies are a standard capability for all desktop web browsers. With the prevalence of iPhones and Androids, HTTP cookies are now supported by most smartphones, because by default, iPhones and Android phones will accept browser cookies from web sites. As with desktop browsers, the mobile device user may choose to disable cookies.

HTTP referrer:
HTTP referrer information showing where a visitor navigated from is generally not provided for mobile web browsing. This is either because the device manufacturer has disabled sending such information in the HTTP request to save bandwidth during network usage, or because the mobile network operator's internet gateway removes or alters the original HTTP header due to the gateway software or use of mobile web transcoding software.

Image tags:
Handset caching mechanisms impact the use of images for page tagging. In some cases, image caching on handsets is performed regardless of any anti-caching headers output by the remote server.

IP address:
For desktop web browsing, the network address of the client machine usually gives some form of user identification and location. For mobile web browsing, the client IP address refers to the internet gateway machine owned by the network operator. For devices such as the BlackBerry or for phones using Opera Mini browser software, the IP address refers to an operator-owned internet gateway machine in Canada or Norway.

Reference By: en.wikipedia.org/wiki/Mobile_Web_Analytics#Drawbacks_of_applying_traditional_web_analytics
Read more ...

Drawbacks of applying traditional web analytics (Mobile web analytics)

Applying traditional web analytics to the mobile web has proven difficult, for a number of well-defined reasons.

Traditional analytics software depends on the information that browsers provide in their HTTP requests. The most advanced mobile browsers, such as those on the iPhone and other smartphones and PDAs, supply this information, but many other mobile devices browsing a site do not.

Common web analytics software that relies on server log analysis associates "unique visits" with distinct IP addresses, and so fails to recognize truly unique visitors: requests from many different mobile devices reach the site from the IP address of the network provider's gateway rather than from each individual device.

Many mobile sites are built on dynamic server-side platforms. For such sites, server-side analytics tracking code is recommended for more accurate reporting.
Read more ...

Mobile web analytics

Mobile web analytics studies the behaviour of mobile website visitors, in a manner similar to conventional web analytics. In a commercial context, mobile web analytics refers to the use of data collected as visitors access a website from a mobile phone. It helps to determine which aspects of a website work best for mobile traffic, and which mobile marketing campaigns work best for the business, including mobile advertising, mobile search marketing, text campaigns, and desktop promotion of mobile sites and services.
Read more ...

Tuesday, 14 May 2013

On-site web analytics - definitions

Hit - A request for a file from the web server. Available only in log analysis. The number of hits received by a website is frequently cited to assert its popularity, but this number is extremely misleading and dramatically overestimates popularity. A single web-page typically consists of multiple (often dozens) of discrete files, each of which is counted as a hit as the page is downloaded, so the number of hits is really an arbitrary number more reflective of the complexity of individual pages on the website than the website's actual popularity. The total number of visits or page views provides a more realistic and accurate assessment of popularity.

Page view - A request for a file, or sometimes an event such as a mouse click, that is defined as a page in the setup of the web analytics tool. An occurrence of the script being run in page tagging. In log analysis, a single page view may generate multiple hits as all the resources required to view the page (images, .js and .css files) are also requested from the web server.

Visit / Session - A visit or session is defined as a series of page requests or, in the case of tags, image requests from the same uniquely identified client. A visit is considered ended when no requests have been recorded in some number of elapsed minutes. A 30-minute limit ("time out") is used by many analytics tools but can, in some tools, be changed to another number of minutes. Analytics data collectors and analysis tools have no reliable way of knowing if a visitor has looked at other sites between page views; a visit is considered one visit as long as the events (page views, clicks, whatever is being recorded) occur within 30 minutes of one another. Note that a visit can consist of one page view, or thousands.
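
For example, if a visitor views pages at 10:00, 10:20, and 10:45, all three page views belong to one visit; if the next page view arrives at 11:30, more than 30 minutes after the last recorded event, it begins a second visit.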

First Visit / First Session - (also called 'Absolute Unique Visitor' in some tools) A visit from a uniquely identified client that has theoretically not made any previous visits. Since the only way of knowing whether the uniquely identified client has been to the site before is the presence of a persistent cookie that had been received on a previous visit, the First Visit label is not reliable if the site's cookies have been deleted since their previous visit.

Visitor / Unique Visitor / Unique User - The uniquely identified client that is generating page views or hits within a defined time period (e.g. day, week or month). A uniquely identified client is usually a combination of a machine (one's desktop computer at work for example) and a browser (Firefox on that machine). The identification is usually via a persistent cookie that has been placed on the computer by the site page code. An older method, used in log file analysis, is the unique combination of the computer's IP address and the User Agent (browser) information provided to the web server by the browser. It is important to understand that the "Visitor" is not the same as the human being sitting at the computer at the time of the visit, since an individual human can use different computers or, on the same computer, can use different browsers, and will be seen as a different visitor in each circumstance. Increasingly, but still somewhat rarely, visitors are uniquely identified by Flash LSOs (Local Shared Objects), which are less susceptible to privacy enforcement.
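
For the older logfile method mentioned above, a visitor key can be approximated by combining IP address and User-Agent. A minimal sketch with made-up records:

    # Hypothetical sketch: approximate unique visitors from log records by
    # combining IP address and User-Agent, as in older logfile analysis.
    records = [
        ("203.0.113.5", "Mozilla/5.0 (Windows NT 6.1) Firefox/20.0"),
        ("203.0.113.5", "Mozilla/5.0 (Windows NT 6.1) Firefox/20.0"),
        ("203.0.113.5", "Mozilla/5.0 (Macintosh) Safari/536.26"),
    ]

    unique_visitors = {(ip, agent) for ip, agent in records}
    # The Safari request counts as a second "visitor" even if it is the
    # same human on the same machine using a different browser.
    print(len(unique_visitors))  # -> 2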

Repeat Visitor - A visitor that has made at least one previous visit. The period between the last and current visit is called visitor recency and is measured in days.

New Visitor - A visitor that has not made any previous visits. This definition creates a certain amount of confusion (see common confusions below), and is sometimes substituted with analysis of first visits.

Impression - The most common definition of "Impression" is an instance of an advertisement appearing on a viewed page. Note that an advertisement can be displayed on a viewed page below the area actually displayed on the screen, so most measures of impressions do not necessarily mean an advertisement has been viewable.

Single Page Visit / Singleton - A visit in which only a single page is viewed (a 'bounce').

Bounce Rate - The percentage of visits that are single page visits.

Exit Rate / % Exit - A statistic applied to an individual page, not a web site. The percentage of visits seeing a page where that page is the final page viewed in the visit.
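
Both rates fall out directly from per-visit page lists. A minimal sketch, with invented visits:

    # Hypothetical sketch: bounce rate and a page's exit rate, computed
    # from visits represented as ordered lists of pages viewed.
    visits = [
        ["/home"],                        # single page visit (a bounce)
        ["/home", "/products", "/cart"],
        ["/products", "/home"],
    ]

    bounce_rate = sum(1 for v in visits if len(v) == 1) / len(visits)

    def exit_rate(page):
        views = sum(v.count(page) for v in visits)
        exits = sum(1 for v in visits if v[-1] == page)
        return exits / views if views else 0.0

    print(round(bounce_rate, 2))         # -> 0.33
    print(round(exit_rate("/home"), 2))  # 3 views, final page twice -> 0.67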

Page Time Viewed / Page Visibility Time / Page View Duration - The time a single page (or a blog, Ad Banner...) is on the screen, measured as the calculated difference between the time of the request for that page and the time of the next recorded request. If there is no next recorded request, then the viewing time of that instance of that page is not included in reports.

Session Duration / Visit Duration - Average amount of time that visitors spend on the site each time they visit. This metric can be complicated by the fact that analytics programs cannot measure the length of the final page view.

Average Page View Duration - Average amount of time that visitors spend on an average page of the site.

Active Time / Engagement Time - Average amount of time that visitors spend actually interacting with content on a web page, based on mouse moves, clicks, hovers and scrolls. Unlike Session Duration and Page View Duration / Time on Page, this metric can accurately measure the length of engagement in the final page view, but it is not available in many analytics tools or data collection methods.

Average Page Depth / Page Views per Average Session - Page Depth is the approximate "size" of an average visit, calculated by dividing total number of page views by total number of visits.

Frequency / Session per Unique - Frequency measures how often visitors come to a website in a given time period. It is calculated by dividing the total number of sessions (or visits) by the total number of unique visitors during a specified time period, such as a month or year. Sometimes it is used interchangeably with the term "loyalty."
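
Both this ratio and Page Depth above are simple divisions; a sketch with made-up monthly totals:

    # Hypothetical sketch of the two ratios defined above.
    total_page_views = 12000
    total_visits = 4000
    total_unique_visitors = 2500

    page_depth = total_page_views / total_visits      # page views per visit
    frequency = total_visits / total_unique_visitors  # visits per visitor

    print(page_depth)  # -> 3.0
    print(frequency)   # -> 1.6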

Click path - the chronological sequence of page views within a visit or session.

Click - "refers to a single instance of a user following a hyperlink from one page in a site to another".
Site Overlay is a report technique in which statistics (clicks) or hot spots are superimposed, by physical location, on a visual snapshot of the web page.
Read more ...

Customer lifecycle analytics

Customer lifecycle analytics is a visitor-centric approach to measuring that falls under the umbrella of lifecycle marketing. Page views, clicks and other events (such as API calls, access to third-party services, etc.) are all tied to an individual visitor instead of being stored as separate data points. Customer lifecycle analytics attempts to connect all the data points into a marketing funnel that can offer insights into visitor behavior and website optimization.

Reference By: en.wikipedia.org/wiki/Web_analytics#On-site_web_analytics_technologies
Read more ...

Click analytics

Click analytics is a special type of web analytics that gives special attention to clicks.

Commonly, click analytics focuses on on-site analytics. An editor of a web site uses click analytics to determine the performance of his or her particular site, with regard to where the users of the site are clicking.

Also, click analytics may happen in real time or "unreal" time, depending on the type of information sought. Typically, front-page editors on high-traffic news media sites will want to monitor their pages in real time, to optimize the content. Editors, designers, or other stakeholders may analyze clicks over a wider time frame to help them assess the performance of writers, design elements, advertisements, etc.

Data about clicks may be gathered in at least two ways. Ideally, a click is "logged" when it occurs, and this method requires some functionality that picks up relevant information when the event occurs. Alternatively, one may institute the assumption that a page view is a result of a click, and therefore log a simulated click that led to that page view.
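
The second approach can be sketched in a few lines of Python; the function and sample visit are invented for illustration:

    # Hypothetical sketch of the second method: infer ("simulate") clicks
    # from the ordered page views of one visit, assuming each view after
    # the first resulted from a click on the previous page.
    def simulated_clicks(page_views):
        return list(zip(page_views, page_views[1:]))

    visit = ["/home", "/products", "/products/widget"]
    print(simulated_clicks(visit))
    # -> [('/home', '/products'), ('/products', '/products/widget')]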

Reference By: en.wikipedia.org/wiki/Web_analytics#On-site_web_analytics_technologies
Read more ...

Geolocation of visitors analytics

With IP geolocation, it is possible to track visitors' locations. Using an IP geolocation database or API, visitors can be geolocated to the city, region, or country level.

IP Intelligence, or Internet Protocol (IP) Intelligence, is a technology that maps the Internet and catalogues IP addresses by parameters such as geographic location (country, region, state, city and postcode), connection type, Internet Service Provider (ISP), proxy information, and more. The first generation of IP Intelligence was referred to as geotargeting or geolocation technology. This information is used by businesses for online audience segmentation in applications such as online advertising, behavioral targeting, content localization (or website localization), digital rights management, personalization, online fraud detection, geographic rights management, localized search, enhanced analytics, global traffic management, and content distribution.
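
As one concrete (though not the only) option, the MaxMind geoip2 Python library can resolve an address against a local GeoLite2 database; the database path and sample IP below are placeholders:

    # Sketch using the MaxMind geoip2 library (pip install geoip2) against
    # a local GeoLite2 City database. The file path is an assumption, and
    # 203.0.113.5 is a documentation-range placeholder; a real lookup
    # needs a routable address present in the database.
    import geoip2.database

    reader = geoip2.database.Reader("GeoLite2-City.mmdb")
    response = reader.city("203.0.113.5")

    print(response.country.iso_code)                 # e.g. "US"
    print(response.subdivisions.most_specific.name)  # region / state
    print(response.city.name)
    reader.close()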

Reference By: en.wikipedia.org/wiki/Web_analytics#On-site_web_analytics_technologies
Read more ...

Hybrid methods: logfile and page tagging analysis

Some companies produce solutions that collect data through both logfiles and page tagging and can analyze both kinds. By using a hybrid method, they aim to produce more accurate statistics than either method on its own. 

Reference By: en.wikipedia.org/wiki/Web_analytics#On-site_web_analytics_technologies
Read more ...

Economic factors: logfile analysis vs page tagging

Logfile analysis is almost always performed in-house. Page tagging can be performed in-house, but it is more often provided as a third-party service. The economic difference between these two models can also be a consideration for a company deciding which to purchase.

Logfile analysis typically involves a one-off software purchase; however, some vendors are introducing a maximum number of annual page views, with additional costs to process additional information. In addition to commercial offerings, several open-source logfile analysis tools are available free of charge. Other cost considerations:

* For logfile analysis you have to store and archive your own data, which often grows very large quickly. Although the cost of hardware to do this is minimal, the overhead for an IT department can be considerable.
* For logfile analysis you need to maintain the software, including updates and security patches.
* Page tagging vendors, by contrast, typically charge a monthly fee based on volume, i.e. the number of pageviews collected per month.

Which solution is cheaper to implement depends on the amount of technical expertise within the company, the vendor chosen, the amount of activity seen on the web sites, the depth and type of information sought, and the number of distinct web sites needing statistics.

Reference By: en.wikipedia.org/wiki/Web_analytics#On-site_web_analytics_technologies
Read more ...

Advantages of page tagging

The main advantages of page tagging over logfile analysis are as follows:
Counting is activated by opening the page (given that the web client runs the tag scripts), not by requesting it from the server. If a page is cached, it will not be counted by the server. Cached pages can account for up to one-third of all pageviews. Not counting cached pages seriously skews many site metrics. It is for this reason that server-based log analysis is not considered suitable for analysis of human activity on websites.

Data is gathered via a component ("tag") in the page, usually written in JavaScript, though Java or Flash can also be used. Ajax can also be used in conjunction with a server-side scripting language (such as PHP) to manipulate the collected data and (usually) store it in a database, basically enabling complete control over how the data is represented (a collection-endpoint sketch follows this list).

The script may have access to additional information on the web client or on the user, not sent in the query, such as visitors' screen sizes and the price of the goods they purchased.

Page tagging can report on events which do not involve a request to the web server, such as interactions within Flash movies, partial form completion, mouse events such as onClick, onMouseOver, onFocus, onBlur etc.

The page tagging service manages the process of assigning cookies to visitors; with logfile analysis, the server has to be configured to do this.

Page tagging is available to companies who do not have access to their own web servers.

Lately, page tagging has become a standard in web analytics.
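
To make the mechanics concrete, here is a minimal, hypothetical collection endpoint in Python (standard library only): the page tag would request /tag.gif with query parameters, and the server answers with a 1x1 GIF while recording the hit. The endpoint and parameter handling are invented for illustration; a real service would write to a database rather than stdout.

    import base64
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs

    # 1x1 transparent GIF returned to the browser so the tag request
    # looks like an ordinary image fetch.
    PIXEL = base64.b64decode(
        "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7")

    class TagCollector(BaseHTTPRequestHandler):
        def do_GET(self):
            # Record whatever the page tag sent in the query string.
            params = parse_qs(urlparse(self.path).query)
            print({k: v[0] for k, v in params.items()})
            self.send_response(200)
            self.send_header("Content-Type", "image/gif")
            self.end_headers()
            self.wfile.write(PIXEL)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), TagCollector).serve_forever()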

Reference By: en.wikipedia.org/wiki/Web_analytics#On-site_web_analytics_technologies
Read more ...

Advantages of logfile analysis

The main advantages of logfile analysis over page tagging are as follows:

The web server normally already produces logfiles, so the raw data is already available. No changes to the website are required.

The data is on the company's own servers, and is in a standard, rather than a proprietary, format. This makes it easy for a company to switch programs later, use several different programs, and analyze historical data with a new program.

Logfiles contain information on visits from search engine spiders, which generally do not execute JavaScript on a page and are therefore not recorded by page tagging. Although these should not be reported as part of the human activity, it is useful information for search engine optimization.

Logfiles require no additional DNS lookups or TCP slow starts. Thus there are no external server calls which can slow page load speeds, or result in uncounted page views.

The web server reliably records every transaction it makes, e.g. serving PDF documents and content generated by scripts, and does not rely on the visitors' browsers cooperating.
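
To illustrate what that raw data looks like, here is a sketch that parses one line in the common Apache "combined" logfile format; the sample line is made up:

    import re

    # Sketch: parse one line in the Apache "combined" logfile format.
    LOG_PATTERN = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
        r'"(?P<method>\S+) (?P<path>\S+) \S+" '
        r'(?P<status>\d{3}) (?P<size>\S+) '
        r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"')

    line = ('203.0.113.5 - - [14/May/2013:10:15:32 +0000] '
            '"GET /index.html HTTP/1.1" 200 5120 '
            '"http://example.com/" "Mozilla/5.0 (Windows NT 6.1)"')

    match = LOG_PATTERN.match(line)
    if match:
        print(match.group("ip"), match.group("path"), match.group("status"))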

Reference By: en.wikipedia.org/wiki/Web_analytics#On-site_web_analytics_technologies
Read more ...

Logfile analysis vs page tagging

Both logfile analysis programs and page tagging solutions are readily available to companies that wish to perform web analytics. In some cases, the same web analytics company will offer both approaches. The question then arises of which method a company should choose. Each approach has advantages and disadvantages.
Read more ...

Web Page tagging

Concerns about the accuracy of logfile analysis in the presence of caching, and the desire to be able to perform web analytics as an outsourced service, led to the second data collection method: page tagging, or "web bugs".

The web analytics service also manages the process of assigning a cookie to the user, which can uniquely identify them during their visit and in subsequent visits.

With third-party data collection servers (or even in-house data collection servers), a DNS lookup is required on the user's computer to determine the IP address of the collection server before data can be stored. On occasion, delays in completing successful or failed DNS lookups can result in data not being collected.
Read more ...

Web server logfile analysis

Web servers record some of their transactions in a logfile. It was soon realized that these logfiles could be read by a program to provide data on the popularity of the website. Thus web log analysis software emerged.

This gave rise to two units of measure: page views and visits (or sessions). A page view was defined as a request made to the web server for a page, as opposed to a graphic, while a visit was defined as a sequence of requests from a uniquely identified client that expired after a certain amount of inactivity, usually 30 minutes. Page views and visits are still commonly displayed metrics, but are now considered rather basic.
Read more ...

On-site web analytics technologies

Many different vendors provide on-site web analytics software and services.

The data is collected in two main technical ways.

The first and older method, server logfile analysis, reads the logfiles in which the web server records file requests made by browsers.

The second method, page tagging, uses JavaScript embedded in the site's page code to make image requests to a third-party, analytics-dedicated server whenever a page is rendered by a web browser (or, if desired, when a mouse click occurs). Both methods collect data that can be processed to produce web traffic reports.
Read more ...

Web Analytics

What is web analytics?

Web analytics is the measurement, collection, analysis, and reporting of Internet data for the purposes of understanding and optimizing web usage.

Web analytics programs can also help measure the results of traditional print or broadcast advertising campaigns. For example, they help estimate how traffic to a website changes after the launch of a new ad campaign. Web analytics provides information about the number of visitors to a website and the number of page views, which helps gauge traffic and popularity trends and is useful for market research.

There are two categories of web analytics: off-site and on-site.

Off-site web analytics refers to web measurement and analysis regardless of whether you own or maintain a website. It includes the measurement of a website's potential audience (opportunity), share of voice (visibility), and buzz (comments) happening on the Internet as a whole.

On-site web analytics measures visitors' behavior once they are on your website. This includes its drivers and conversions; for example, the degree to which different landing pages are associated with online purchases. On-site web analytics measures the performance of your website in a commercial context. This data is typically compared against key performance indicators and used to improve a website's or marketing campaign's audience response. Google Analytics is the most widely used on-site web analytics service, although new tools are emerging that provide additional layers of information, including heat maps and session replay.
Read more ...

Thursday, 9 May 2013

Google Webmaster


What is Google webmaster?

Google Webmaster Tools is a no-charge web service by Google for webmasters. It allows webmasters to check the indexing status of their websites and optimize their visibility.

The tools let webmasters:

* Submit and check a sitemap
* Check and set the crawl rate, and view statistics about how Googlebot accesses a particular site
* Write and check a robots.txt file, and discover pages that are accidentally blocked by robots.txt (a checking sketch follows this list)
* List internal and external pages that link to the site
* Get a list of broken links for the site
* See which keyword searches on Google led to the site being listed in the SERPs, and the click-through rates of such listings
* View statistics about how Google indexes the site, and whether any errors were found while doing so
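
A check like the robots.txt item above can be approximated with Python's standard library; the site URL below is a placeholder:

    from urllib.robotparser import RobotFileParser

    # Sketch: test whether Googlebot may fetch given paths, according to
    # the site's robots.txt. The domain is a placeholder.
    rp = RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")
    rp.read()  # fetches and parses the live robots.txt

    for path in ("/", "/private/page.html"):
        url = "http://www.example.com" + path
        print(path, "allowed:", rp.can_fetch("Googlebot", url))
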
Read more ...