2013-09-07

SEO terminology

.htaccess
  • The .htaccess file is a configuration file for your web server, used to tell the server how to route HTTP traffic. In the world of SEO, the .htaccess file is most commonly discussed in the context of URL aliases, which are often used to create search engine friendly URLs.
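  • A minimal sketch of such an alias (illustrative paths and parameter names, assuming an Apache server with mod_rewrite enabled):
      # Map the friendly URL /products/123 to the internal query-string URL
      RewriteEngine On
      RewriteRule ^products/([0-9]+)/?$ index.php?page=product&id=$1 [L]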
301 redirect (also known as Permanent Redirect)
  • A 301 redirect is an instruction given to the web server, informing it that a page that was previously located at one URL has been moved permanently to a new URL. The 301 redirect is most commonly used in situations where a site has been rebuilt and the URLs have changed. By adding 301 redirects to the site, you can avoid missed connections caused by traffic going to the old URL. When a 301 redirect is used, the search engines will also update their indexes to remove the old URL for the page and substitute the new one, thereby preserving the page's indexing.
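  • A hedged .htaccess sketch (illustrative paths and domain, assuming Apache):
      # Permanently send the old URL to the new one
      Redirect 301 /old-page.html http://www.example.com/new-page.html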
302 redirect (also known as Temporary Redirect or Found)
  • A 302 redirect, like a 301 redirect, informs the web server that a page has moved. Unlike a 301 redirect, a 302 redirect indicates that the move is temporary. This option is disfavored, as some search engines will penalize sites for the use of this sort of redirect.
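  • The temporary variant uses the 302 status instead (again with illustrative paths):
      # Temporarily send visitors elsewhere; the old URL stays indexed
      Redirect 302 /summer-sale.html http://www.example.com/promotions.html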
404 error (also known as Page Not Found)
  • When a person visits the URL of a page that no longer exists (or has been moved), or types in an incorrect URL, the visitor is automatically shown a 404 error message. The default message informs the visitor that the page cannot be found. Many sites build custom pages specifically designed to be displayed when a 404 error occurs.
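  • A custom 404 page can be wired up in .htaccess (assuming Apache; the path is illustrative):
      # Show /errors/404.html whenever a requested page cannot be found
      ErrorDocument 404 /errors/404.html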
AdSense
  • AdSense is a Google advertising program aimed at website owners. Site owners can sign up for the AdSense program and then display ads on their sites. (The ad inventory is provided by Google, often from the AdWords program, discussed next.) The website owner is paid a percentage of the revenues generated when someone clicks on one of the ads displayed on his or her site.
AdWords
  • AdWords is a Google commercial advertising program aimed at advertisers. If you want to advertise on the Google network, you can sign up for the AdWords program, build an ad and set a daily budget for the display of that ad. The ad will then appear in the Google network and you will be charged when someone clicks on one of the ads (or, alternatively, you can elect to be charged according to the number of views of the ad).

Alexa Rank
  • Alexa.com provides a website ranking service that attempts to rate all the sites on the Web in order of their popularity. Like a golf score, the lower the score, the better. The most popular site on the Web (typically Google.com) has an Alexa Rank of 1. The service, though not 100 percent accurate and the subject of some criticism, is yet another way of tracking the success of your efforts to raise your site's profile. To learn more visit http://alexa.com.
Alt attribute
  • The HTML image tag (img) is used to place images on the page. The tag includes an option to specify a value for the attribute alt. This attribute is intended to allow webmasters to specify an alternative description for the image, typically for the benefit of users who are using screen readers or browsers with the image display disabled.
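  • For example (file name and description are illustrative):
      <img src="blue-widget.jpg" alt="Hand-painted blue widget on a wooden table">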
Anchor
  • Anchors are hyperlinks that allow a user to jump from one place to another within the same page.
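  • A minimal example (names are illustrative):
      <!-- clicking the link scrolls to the element with the matching id -->
      <a href="#pricing">Jump to pricing</a>
      ...
      <h2 id="pricing">Pricing</h2>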

Back link (also known as an "inbound link")
  • A back link is a link on an external site that points to your site.
Bing Webmaster
  • The Bing Webmaster service is provided by Microsoft to give site owners access to some basic tools that help them diagnose and track their sites. Registration is free of charge.
Black hat
  • Black hat is a label used to describe the use of SEO techniques that are illegal, unethical, or of questionable propriety.

Bot (also known as Robot, Spider, or Crawler)
  • A robot, or "bot" for short, is a software agent that indexes web pages. It is also called a "spider" or a "crawler".
Canonical URLs
  • Canonical URLs are URLs that have been standardized into a consistent form. For the search engines, this typically implies making sure all your pages use consistent URL structures, for example, making sure all your URLs start with "www".
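  • A common way to enforce the "www" form is a rewrite in .htaccess (assuming Apache with mod_rewrite; example.com is a placeholder domain):
      RewriteEngine On
      # Send any request for example.com to the canonical www.example.com
      RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
      RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]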
Cloaking
  • Cloaking is a black hat SEO technique that involves presenting the search engine spider with different content than you show a normal site visitor.
Crawl depth
  • Crawl depth is a measure of how deeply the search engine spider has indexed a website. This is typically an issue relevant for sites with a complex hierarchy of pages. The deeper the spider indexes the site, the better.
Deep link
  • A deep link is a hyperlink that points to something other than the front page of a website.
Doorway page (also known as a "gateway page")
  • A doorway page is a page built specifically to point users to another page. This technique is used legitimately when a site owner holds multiple domain names and wishes to channel all the traffic into a primary domain. The technique is often used inappropriately by some black hat SEO practitioners as a way to create highly optimized pages targeting a specific term or terms, then push the users to another site—an online variation of the old bait and switch routine.
Duplicate content penalty
  • Duplicate content penalty is a theory that the search engines penalize sites that repeat content, or use content that is duplicated from another source. The theory is controversial, with many believing that the penalty may not exist, or may only be enforced in situations where there are other factors that indicate bad intent.
Google Webmaster
  • The Google Webmaster service is provided by Google to give site owners access to some basic tools that help them diagnose and track their sites. Registration is free of charge.
Internal link density
  • Internal link density is the number of self-referential links on a site; that is, the number of links on a site pointing to other pages on the same site.
KEI
  • KEI is an acronym standing for Keyphrase Effectiveness Index. KEI is normally used during keyphrase research in an attempt to find the optimal keyphrases for a site. It is a simple ratio, most often defined as "frequency of search engine queries for the term / number of pages competing for the term". The more searches there are for a term, the more potential traffic it represents. The lower the competition, the easier it is to rank highly in the SERP. The ideal term has low competition and a high number of searches.
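  • A quick illustration with made-up figures: a term with 10,000 monthly searches and 2,000,000 competing pages gives a KEI of 10,000 / 2,000,000 = 0.005, while a term with 4,000 searches and 100,000 competing pages gives 4,000 / 100,000 = 0.04, so the second term is the more attractive target despite its smaller search volume.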
Keyphrase density (also known as "keyword density")
  • Keyphrase density is a calculation done by looking at all the text on a page, then calculating the ratio of the number of times a particular keyphrase or keyword appears on the page to the total number of words on that page.
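  • For example (made-up figures): if the keyphrase appears 10 times on a 500-word page, the keyphrase density is 10 / 500 = 2 percent.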
Keyword (or Keyphrase)
  • A keyword is a word being targeted by a site's SEO efforts. A keyphrase is simply the targeting of a phrase instead of a single word.
Keyphrase stuffing
  • Keyphrase stuffing is the over-optimizing of a page for a particular keyphrase. This is a disfavored practice that can have a negative impact on your site's ranking, as it is viewed by the search engines as an attempt to exert inappropriate influence on the rankings for the page.
Landing page
  • A landing page is a web page that has been optimized to capture a customer, and is typically used as the target for an ad or other promotional campaign, or simply for capturing leads.
Link building
  • Link building is the process of seeking out or creating links to a site for the purpose of increasing the site's search engine relevance or inbound traffic.
Link farm
  • A link farm is a site that includes an excessive number of links. These sites are typically built purely to generate links for SEO purposes. Sites of this nature are disfavored by the search engines, which view them as inappropriate attempts to exert influence over rankings.

Link text (also known as "anchor text")
  • When you create a hyperlink on a page by wrapping a text string with an <a> tag, the text wrapped by the tag is referred to as the link or anchor text. There is a search engine optimization benefit to using text for hyperlinks, as the text can then be indexed in conjunction with the hyperlink.
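  • For example (URL and wording are illustrative), the anchor text here is "blue widgets":
      <a href="http://www.example.com/blue-widgets.html">blue widgets</a>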
Long tail
  • In general terms, the long tail of a distribution is the trailing end of the distribution. In the context of SEO, the term is used to refer to targeting longer and more specific search queries, where there is usually less competition.
Meta tags
  • Metadata is, quite literally, data about data. On the Web, meta tags are the most common implementation of metadata and in the past were a key part of search engine indexing. Today, meta tags are still in use on the Web and can be found in the head section of web pages.
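  • A typical example in the head of a page (the content values are illustrative):
      <head>
        <meta name="description" content="Hand-painted widgets, shipped worldwide.">
        <meta name="keywords" content="widgets, hand-painted widgets">
      </head>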
MozRank
  • MozRank is a site ranking algorithm formulated by SeoMoz. It is often used in SEO circles as an alternative to Google's PageRank.
nofollow
  • nofollow is a possible value for the rel attribute inside the <a> tag. If the value of the rel attribute for a link is set to nofollow, the search engines' spiders will not follow or index the link.
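  • For example (URL is illustrative):
      <!-- the spider will not follow this link or index it -->
      <a href="http://www.example.com/untrusted-page.html" rel="nofollow">example link</a>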
Organic rank
  • Organic rank refers to natural search engine ranking, as opposed to paid ranking.
Outbound link
  • An outbound link is a hyperlink on one site pointing to an external site.
PageRank
  • PageRank is a ranking algorithm created by and named for Larry Page at Google. The ranking criteria are not public, but the scale ranges from zero at the low end to ten at the high end. The higher the score, the more persuasive a website is deemed to be. Some argue, however, that the rank is no longer in use at Google and may not continue to evolve.

PPC
  • PPC is an acronym for Pay Per Click advertising. If you use a PPC advertising scheme, you pay every time someone clicks on one of your ads. The most popular PPC system is the Google AdWords program. It is also sometimes called "pay for performance advertising".
Reciprocal link
  • A reciprocal link is a link from one site to another, given in exchange for a link back. It is a link exchange between webmasters, done in hopes of boosting both sites' rankings.
Redirect
  • A redirect is an instruction given to the web server to redirect traffic seeking one URL to a different URL. There are different types of redirects, such as the 301 redirect and the 302 redirect, as we have seen earlier in this chapter.
Robots.txt
  • Robots.txt is a file containing instructions for search engine robots. This file is located on the server but is not used by the human visitors to the website.
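  • A minimal example (the path is illustrative):
      # Applies to all robots; keep them out of the admin area
      User-agent: *
      Disallow: /administrator/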
SEF URLs
  • SEF URLs is an acronym for Search Engine Friendly URLs. The term refers to the creation of URLs that use natural words and phrases, rather than query strings and other abstract values (such as numbers) not associated with the page content.
SEM
  • SEM is an acronym for Search Engine Marketing. The term is broad and applies to not only search engine optimization, but also to other techniques, such as social media, pay per click advertising, and other marketing techniques focused on search engines.
SEOMoz
  • SEOMoz is a popular commercial SEO consultancy service. Learn more at http://www.seomoz.org.
SERP
  • SERP is an acronym for Search Engine Results Page.
SMO
  • SMO is an acronym for Social Media Optimization. It is the process of using social media to drive traffic to your site, along with the related process of making your site suitable for social media, for example, by including social bookmarking tools and other social sharing devices on the site's pages.
Splash page
  • A splash page is an entry page, typically decorative, used to greet visitors to a website.
Stop word
  • Stop words are words included in search queries that are not actively indexed, unless included in quotations (phrase search). Typical examples include articles and conjunctions such as "the", "a", "and", and "or".
Title attribute
  • The title attribute is available on a number of HTML elements. It is used to provide a description for a link, a table, a frame, an image, or other elements. Some search engines index the title attribute and it therefore provides another option for on-page optimization. Some browsers will also display the content of the title attribute as a tooltip when you move your mouse over the object.
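  • For example (the wording is illustrative):
      <a href="contact.html" title="Contact the sales team">Contact us</a>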
White hat
  • White hat is a label used to describe the use of SEO techniques that are legal, ethical, and consistent with best practices.
XML sitemap
  • An XML sitemap lists the pages on a website in a format that is easily digestible by search engine agents. The sitemaps follow a standard convention agreed upon by all the major search engines. The XML sitemap is typically not visible to site visitors, and should not be confused with the normal sitemaps often used on the frontend of websites.
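  • A minimal sitemap following the sitemaps.org convention (the URL and values are illustrative):
      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url>
          <loc>http://www.example.com/</loc>
          <lastmod>2013-09-01</lastmod>
          <changefreq>weekly</changefreq>
          <priority>1.0</priority>
        </url>
      </urlset>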

Google, Bing?

Google provides the following guidance:
• Make pages primarily for users, not for search engines.
• Avoid tricks intended to improve search engine rankings.

Bing also emphasizes the importance of content and advises as follows:
• Ensure content is built based on keyword research to match content to what users are searching for
• Produce deep, content-rich pages; be an authority for users by producing excellent content
• Set a schedule and produce new content frequently
• Be sure content is unique—don't reuse content from other sources

What is SEO?

SEO is a process—a series of planning and execution steps that lead to a website being optimized to perform its best on the search engines.
Factors that influence SEO include:
• Keywords in the domain name
• Keywords in a page's URL
• Keywords in the content title
• Keyword placement on page
• Keyword repetition on page
• Uniqueness of content
• Freshness of content
• Facebook activity
• Twitter activity, including influence of account tweeting
• Google+ activity
• Social media up votes and comments
• Click through rate for the site
• Bounce rate for the site
• Number, quality, and content of links to this site
• Number of internal links
• Number of errors on site
• Speed of the site

2013-09-05

Webmaster Tools API

What is the Webmaster Tools API?
   Google Webmaster Tools is Google's free service for webmasters. See your site as Google sees it, find out any problems we had crawling your site, and share info with us that will help improve your site's visibility in Google's search results. Your client application can use the Webmaster Tools Data API to view sites in your Webmaster Tools account, add and remove sites, verify site ownership, and submit and delete Sitemaps.

How do I start using the Webmaster Tools Data API?
   If you're new to the Webmaster Tools Data API, here's how we recommend you get started:
   1. Read the Developer's Guide to learn how the API works and how to use best coding and testing practices.
   2. Read the Reference Guide to learn about the web service operations and data objects that are available through the Webmaster Tools Data API.

Webmaster Guidelines

  • Design and content guidelines
  • Technical guidelines
  • Quality guidelines
When your site is ready:
  • Submit it to Google at http://www.google.com/submityourcontent/.
  • Submit a Sitemap using Google Webmaster Tools. Google uses your Sitemap to learn about the structure of your site and to increase our coverage of your webpages.
  • Make sure all the sites that should know about your pages are aware your site is online.

Design and content guidelines

  • Make a site with a clear hierarchy and text links. 
  • Offer a site map to your users with links that point to the important parts of your site. 
  • Keep the links on a given page to a reasonable number.
  • Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
  • Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
  • Try to use text instead of images to display important names, content, or links. 
  • Make sure that your <title> elements and ALT attributes are descriptive and accurate.
  • Check for broken links and correct HTML.
  • If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages.
  • Review our recommended best practices for images, video and rich snippets.
Quality guidelines - basic principles
  • Make pages primarily for users, not for search engines.
  • Don't deceive your users.
  • Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you, or to a Google employee. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?"
  • Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.
Quality guidelines - specific guidelines
  • Automatically generated content
  • Participating in link schemes
  • Cloaking
  • Sneaky redirects
  • Hidden text or links
  • Doorway pages
  • Scraped content
  • Participating in affiliate programs without adding sufficient value
  • Loading pages with irrelevant keywords
  • Creating pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware
  • Abusing rich snippets markup
  • Sending automated queries to Google

Google-friendly sites

Things to do

  • Design.
  • Technical.
  • Quality.
  1. Provide high-quality content on your pages, especially your homepage.
  2. Make sure that other sites link to yours.
  3. Make your site easily accessible.

Things to avoid

  • Don't fill your page with lists of keywords, attempt to "cloak" pages, or put up "crawler only" pages.
  • Don't feel obligated to purchase a search engine optimization service.
  • Don't use images to display important names, content, or links.
  • Don't create multiple copies of a page under different URLs.

Search Engine Optimization

  • Review of your site content or structure
  • Technical advice on website development: for example, hosting, redirects, error pages, use of JavaScript
  • Content development
  • Management of online business development campaigns
  • Keyword research
  • SEO training
  • Expertise in specific markets and geographies.