Sunday, February 26, 2012

Using Google Webmaster Tools To Diagnose a Site

Diagnosing a website is one of the skills every SEO must master. Diagnosing a site quickly and accurately can be tricky, so planning a step-by-step routine keeps the work from becoming a muddle. In this SEO blog post I will share how to diagnose a site and what to pay attention to during the process.

First, we need Google Webmaster Tools. This tool is very powerful and free, one of the essential tools for SEO work, and it can fairly be called the tool of choice for diagnosing a site. Below is a systematic, section-by-section walkthrough.

First, the robots.txt file

When an entire site, a whole directory, or all of a site's pages fail to be indexed, the cause is often a mistake in the robots.txt file. The crawler access section of Webmaster Tools shows the robots.txt content as Google fetched it, so a webmaster can tell at a glance whether there is a clerical error and correct it. The tool can also generate a robots.txt file automatically, listing the file's directives one by one.
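A quick local way to catch the same kind of robots.txt mistake is Python's standard urllib.robotparser. A minimal sketch; the directives, domain, and URLs below are made-up examples:

```python
from urllib.robotparser import RobotFileParser

# Sample robots.txt content (hypothetical rules for illustration).
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check which URLs the rules actually block before going live.
print(rp.can_fetch("Googlebot", "https://example.com/admin/secret.html"))  # → False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post.html"))     # → True
```

Testing a few representative URLs this way makes it obvious when a Disallow line blocks more than intended.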

Second, set the preferred domain

A site is generally reachable at two URLs, one with www and one without. For SEO you should set up a 301 redirect so that one URL automatically jumps to the other. Google Webmaster Tools also offers a preferred domain setting: if you have not set up a 301 redirect, setting the preferred domain tells Google which URL to index as the main one. Of course, this only affects Google; if you want every search engine to index just one URL, you still need a 301 redirect.
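The 301 logic described above can be sketched as a tiny function; example.com and www.example.com are placeholder domains, and a real site would implement this in the web server configuration rather than in application code:

```python
def canonical_redirect(host, path, preferred="www.example.com"):
    """Return (301, location) if the request host is not the preferred
    domain, else None. The domains here are placeholders."""
    if host != preferred:
        return 301, f"https://{preferred}{path}"
    return None

print(canonical_redirect("example.com", "/about.html"))
# → (301, 'https://www.example.com/about.html')
print(canonical_redirect("www.example.com", "/about.html"))  # → None
```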

Third, keywords

Diagnosing a website is not just about its surface; analyzing keyword rankings yields a great harvest. Keyword rankings indicate how far the site's optimization has come, and they also show what the recent effort has achieved, so keywords deserve a detailed analysis. The keyword query data in Google Webmaster Tools lays out each keyword so a webmaster can take it in at a glance, and the data are not limited to rankings: impressions, clicks, and click-through rate are all there to analyze.
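Click-through rate is simply clicks divided by impressions. A sketch of the calculation, with made-up keyword numbers:

```python
# Hypothetical keyword data of the kind the keyword report shows.
keywords = [
    {"query": "seo diagnosis", "impressions": 1200, "clicks": 90},
    {"query": "webmaster tools", "impressions": 800, "clicks": 20},
]

for kw in keywords:
    ctr = kw["clicks"] / kw["impressions"] * 100  # CTR as a percentage
    print(f"{kw['query']}: CTR {ctr:.1f}%")
# → seo diagnosis: CTR 7.5%
# → webmaster tools: CTR 2.5%
```

A keyword that ranks well but has a low CTR often signals a weak title or description rather than a ranking problem.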

Fourth, external links

External links are a test of the quality of your work, and good external links can significantly improve your site's rankings, provided of course that you have built plenty of them. The external links listed in Webmaster Tools let an SEO see at a glance which pages of the site are the most popular and attract the most external links.

Fifth, site content

The keywords section of Webmaster Tools lists the most common keywords Google found while crawling the site. These most common keywords clearly reflect the theme of the site's content, and they matter when writing and revising page keywords, titles, and home page copy.
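A rough sketch of how such a "most common keywords" tally can be produced from page text, using Python's standard collections.Counter and a made-up sample sentence:

```python
from collections import Counter
import re

# Made-up page text; a real tally would run over the whole site's content.
text = "seo tools help seo work; webmaster tools show seo data"
words = re.findall(r"[a-z]+", text.lower())

print(Counter(words).most_common(2))  # → [('seo', 3), ('tools', 2)]
```

If the top of such a list does not match the topics you want to rank for, the page copy needs revising.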

Sixth, internal links

From the internal link data a webmaster can largely tell whether the site's internal link structure has significant deficiencies. If a main navigation category page has a very low number of internal links, there is most likely a problem with the navigation system.

The internal link count also reflects how many of the site's pages are indexed. Google's site: command has never been very accurate, is becoming less accurate, and often does not reflect the real number of indexed pages. The total number of internal links to the home page listed in Webmaster Tools is generally close to the total number of pages Google has indexed, because every page of the site should link to the home page.
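The reasoning above can be sketched with a small edge list: if every page links to the home page, the count of links pointing at "/" approximates the number of crawled pages. The (source, target) pairs are made-up sample data:

```python
from collections import Counter

# Hypothetical crawl data: (source page, link target) pairs.
links = [
    ("/a.html", "/"), ("/a.html", "/b.html"),
    ("/b.html", "/"), ("/c.html", "/"),
]

counts = Counter(target for _, target in links)
print(counts["/"])  # → 3, matching the three pages that link home
```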

Seventh, crawl errors and statistics

The crawl errors section lists 404 pages and pages the robots.txt file blocks from being indexed. It is useful for checking whether the site has 404 error links. For each 404, Webmaster Tools lists the pages that link to it, which makes the broken links easy for a webmaster to fix.

If a link to a nonexistent page is issued from within the site, the page carrying the link has an error. If the link comes from another website, the webmaster can try to contact the other site and ask that the broken link be pointed to the correct location.
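That triage, splitting 404s by whether the referrer is internal or external, can be sketched like this; the (url, referrer, status) tuples are made-up sample data:

```python
# Hypothetical crawl-error rows: (broken url, referring page, HTTP status).
crawl_errors = [
    ("/old-post.html", "/archive.html", 404),            # internal referrer
    ("/moved.html", "http://other-site.example/", 404),  # external referrer
    ("/ok.html", "/", 200),
]

# Relative referrers are this site's own pages; absolute ones are external.
internal = [e for e in crawl_errors if e[2] == 404 and e[1].startswith("/")]
external = [e for e in crawl_errors if e[2] == 404 and not e[1].startswith("/")]

print(len(internal), len(external))  # → 1 1
```

Internal ones you fix yourself; external ones are the cases where contacting the other webmaster helps.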

Webmaster Tools also lists crawl stats. The crawl rate and crawl statistics show how the site is being crawled, which makes it easier to adjust the site's structure and makes optimization more convenient.

Eighth, HTML suggestions

The easiest way to find possibly duplicated content on a site is to look at Google's HTML suggestions. If duplicate title tags are reported, the repeated titles in practice often mean the pages themselves are duplicates, which is usually caused by the site's structure. Note that the data listed in Webmaster Tools are sometimes incomplete; on a blog, the pages sharing a duplicated title tag generally number far more than two.
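Spotting duplicate titles amounts to grouping pages by their title tag. A minimal sketch with made-up (url, title) pairs:

```python
from collections import defaultdict

# Hypothetical pages and their <title> values.
pages = [
    ("/post-1.html", "My Blog"),
    ("/post-2.html", "My Blog"),
    ("/about.html", "About"),
]

by_title = defaultdict(list)
for url, title in pages:
    by_title[title].append(url)

# Any title shared by more than one URL is a duplicate worth investigating.
duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}
print(duplicates)  # → {'My Blog': ['/post-1.html', '/post-2.html']}
```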

Ninth, simulating the spider

This is a powerful tool: a webmaster can enter any URL on the site, and Webmaster Tools will send the Google spider to crawl that page in real time and display the fetched HTML, including the server header information and the page code. It helps a webmaster confirm that redirects are set up correctly and that the server returns the right content.
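Reading the server header information such a fetch returns is mostly a matter of splitting the status line from the header fields. A sketch with a made-up raw response:

```python
# Hypothetical raw response text of the kind the fetch tool displays.
raw = """\
HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/
Content-Type: text/html; charset=utf-8
"""

status_line, *header_lines = raw.strip().splitlines()
headers = dict(line.split(": ", 1) for line in header_lines)

print(status_line)          # → HTTP/1.1 301 Moved Permanently
print(headers["Location"])  # → https://www.example.com/
```

Checking the status line and Location header this way is exactly how you confirm a redirect is a true 301 and points where you intended.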

In addition, this tool can be used to check whether a page has been hacked. Hacked code sometimes checks the visitor's browser type: an ordinary user's browser gets the normal page back, while a search engine spider gets the spam text and spam links the hacker added. The webmaster visiting the site therefore sees nothing strange, while the Google spider fetches content the webmaster cannot see. This tool helps a webmaster check whether a page has this kind of security problem.

Tenth, website performance

Site speed is getting more and more attention; it may affect rankings, and it has a significant impact on user experience. The site performance section of Webmaster Tools shows the pages' average load time.

The load time shown is not the time the Google spider spends fetching a document, but the time recorded by the Google Toolbar when ordinary users open the page. Server location therefore does not distort the data in the site performance section: Google's spiders are sent from the United States, but a web server located in China does not show a longer load time for that reason, because Google records the speed at which ordinary users access the site.
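The average-load-time figure is just the mean of per-visit timings. A sketch with made-up timings in seconds:

```python
# Hypothetical per-visit page load times (seconds) recorded from users.
load_times = [1.2, 0.8, 2.5, 1.5]

average = sum(load_times) / len(load_times)
print(f"average load time: {average:.2f}s")  # → average load time: 1.50s
```

One slow outlier like the 2.5 s visit above can noticeably pull up the average, which is why the report is worth checking over time rather than once.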

Checking your site with Google Webmaster Tools regularly will make your SEO road much easier to travel.
