Friday 20 December 2013

A Suggested Approach to Link Building

Obtaining a portfolio of good quality links can be a time-consuming task. There are
proprietary software packages to help you in your task of tracking down web sites with relevant content. Many organisations work on a reciprocal basis. Sites with high Page
Rank may even charge for links. There are also link marketing and exchanging
specialists. However, according to Google, any attempt to exchange or buy links with the
explicit intent of influencing the ranking of your web site is considered link spamming.
My suggestion is to start close to home, exchanging links with businesses you may have a
trading relationship with. Do not rush into the link building process and do not trade links
with just anyone. Poor quality links may have a negative impact on your site.

Originating Site has been Crawled and Indexed

It may sound obvious, but for search engine purposes a link is not a link if the search
engines are not aware of it. The link will only exist in the records of the search engine if
the page on which it is situated has been crawled and indexed whilst the link was there.

Link Age

A long-established link is deemed by Google to have more value than a recent link. A
rapid build-up of links may also be deemed spam. However, Google apparently makes an
allowance for a rapid build-up of links generated by news stories.

Anchor Text

Anchor text is the clickable text of a link, together with the descriptive text that sits
alongside it. This text provides an additional signal of the relevance and quality of a
link. Anchor text is written in HTML. On screen, part of the text shows up as highlighted
(usually coloured) or underlined type and part in normal type. The anchor text for your
site could be written in HTML code as follows:

<a href="http://www.yoursite.com"> Your Site Title </a> - A short description of what
you do. <BR>

Link Density

Links from pages with fewer outbound links have more influence than from pages where
there are huge numbers of links – see FFAs. Additional outbound links dilute the value of
existing links on a page. My suggestion is to accept links from pages with no more than
10 to 12 links. Avoid pages with 20+ external links.

Site and Page Relevance

A link from a site and page carrying similar content carries more influence than one
from a site without that content.

Google Page Rank

For Google ranking purposes a link from a high Page Rank site has even greater
influence. A link from a PR 6+ site is extremely valuable. At the other extreme, I suggest
you are prudent when exchanging links with sites of a PR of zero. The PR0 category
contains a number of banned sites.

Key Factors Affecting Link Quality

According to SEO convention and the information gleaned from the Google patents, there
are a number of factors affecting the quality of your inbound links.

SEO 3 – The Off-Site Phase

The off-site phase deals primarily with inbound link building. Amongst the major engines
Google places the greatest emphasis on links. The relevance and quality of these links has a significant influence on the ranking of your site in all of the major engines. The search
engine algorithm interprets each inbound link as a "vote" for a site. Not all links are
equal, and the quality of the link therefore determines the value of the vote.

CSS Format

CSS stands for Cascading Style Sheets. By moving presentational markup out of the
page, CSS slims down the HTML code, allowing a site to load faster and, in the SEO
context, improving the density and prominence of keywords.
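As a minimal sketch of the idea (file and class names hypothetical), compare a paragraph
styled with old-fashioned font tags:

<p><font face="Arial" size="2" color="#333333"><b>Online education courses for working adults.</b></font></p>

with the same paragraph once the presentation has been moved into a stylesheet:

<link rel="stylesheet" type="text/css" href="style.css">
<p class="intro">Online education courses for working adults.</p>

where style.css contains:

.intro { font-family: Arial, sans-serif; font-size: 0.9em; color: #333333; font-weight: bold; }

The keyword-bearing text now sits in much leaner code for the spider to read.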

Frames & File Size

Frames
Frames cannot be read by the major engines, so in search terms they are a definite no-no.
To find out whether your site uses frames, carry out the cache test on Google.
File Size
Do not make your opening page too large. Even if an engine can read your page, many
internet users are still on dial-up connections.

Other File Formats

As at May 2005, Google claims that it is able to read 13 different file types apart from
HTML. The most common non-HTML formats are PDF and MS Office files. From my
experience documents in these two formats can all rank highly. I do, however, have
reservations about some of the other formats and, as mentioned above, particularly Flash.

HTML Code

As has been explained previously, search engines were originally designed to read
HTML code or code related to it, such as XHTML and PHP.

Technical Issues – Site Design and Construction

This section is about avoiding the technical mistakes or pitfalls that may hamper search engine visibility.

Business Address and Telephone Number

It is believed that engines give an additional weighting to sites that carry an address and
telephone number. In many categories there are a large number of searches made using a
national discriminator in the search term, so include your country in the address.

Content Change

Engines apparently respond positively to a degree of content change – this is why some
blogs appear high in the rankings. Apparently, Google responds positively towards
“fresh” web sites and negatively towards “stale” web sites. If content has changed
between crawling cycles it signals to the spider to return again at more frequent intervals.

Site Map

Engines respond positively to site maps especially on larger sites with several levels. The
site map is also a useful way of aiding the navigation of a spider for deep crawl purposes.
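A site map can be as simple as a plain HTML page of text links. A minimal sketch (page
names hypothetical):

<h1>Site Map</h1>
<a href="products.html">Products</a><br>
<a href="services.html">Services</a><br>
<a href="contact.html">Contact Us</a><br>

Every page listed is then one link away from the site map, which makes a deep crawl
straightforward.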

Outbound Links (Forward)

These are apparently growing in influence as the engines realise that inbound links (IBLs
– see below) are being widely spammed. Rather like internal links, external links provide
the opportunity to include keywords in the hyperlink text.

Internal Links

Internal links are important for two reasons. Firstly, their anchor text is highlighted
within a hyperlink and this is given special emphasis by the search engines. Secondly,
they ensure the engines can navigate and deep crawl into a site.

Bold and Cursive Script

Both bold and italic (cursive) script are given extra emphasis by the search engines. A
subtle use of bold or italics on a keyword will enhance its presence.

Alt Tags

Search engines can read the alt tags that accompany JPEG and GIF images. Every
relevant image should have an alt tag and this tag should be written to comply with your
keyword objectives. The text in an alt tag is believed to be given additional weight.
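A minimal sketch (file name and wording hypothetical) of an alt tag written against a
keyword objective:

<img src="online-education-degrees.jpg" alt="Online education degree courses" width="250" height="180">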

Heading Tags

Text within heading tags has a greater weighting than ordinary copy. Within HTML code,
headings are marked up <h1>, <h2>, <h3> etc. It is therefore wise to use a keyword or
keyword phrase whenever a paragraph title is used.
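For example (keyword phrase hypothetical), a paragraph title marked up as a heading
rather than as plain bold text:

<h2>Online Education Courses</h2>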

Content Density

There is much debate about density. Too little and the keyword or phrase won’t be picked
up. Too much and your site may fail the spamming test. Some SEOs suggest repeating
keywords no more than 7 times on any one page. Density is always measured in relative
terms. A page with a lot of copy will have more word repetition than one with few words.

Content Relevance

Keywords and their surrounding copy should be relevant to one another. Certain words
and combinations of words go together and the search engine algorithms know this. So
advertising goes with marketing, food with drink, photographs with film. Derivatives of
a word also sit well together: market and marketed with marketing, and so on.

Content Location

Where should content with keywords be located? High up on the first page is the general
rule. Certainly get keywords into the opening sentence or paragraph. The latest MSN
engine picks out and uses a selection of text from the opening paragraph in its site
description, which implies that the MSN algorithm places additional emphasis on this
text. Keywords should then be spread throughout the first page and the rest of the site.

Writing Content

Content is deemed to be increasingly important by many in the SEO field. This is
apparently because, with spamming and other optimisation techniques becoming
increasingly sophisticated, only body copy can give the search engines a true indication
of a site's subject matter. There are some general rules. First, the more copy the better –
aim for at least 250 words. Secondly, look to use your keywords in two or three word
phrases.

Keywords Meta Tag

You will have read in the previous section on search history that the keywords meta tag
is, today, ignored by most of the search engines. So, if the spiders do not take it that
seriously, why do we still have it? In my opinion the main role it performs is one of
internal guidance and discipline for the webmaster. If you know what keywords you are
looking for it is easy to test your content to ensure there's a match.
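For reference, the tag itself looks like this (keywords hypothetical):

<meta name="keywords" content="online education, distance learning, degree courses">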

The Site Description Meta Tag

The site description is the second most important meta tag. It is read by the engines
Yahoo and MSN and still plays a significant role in their searches. The site description
should tell the engine about the nature of the web site. It is recommended that this is done
in no more than 200 characters including spaces. It should be presented using good
grammar and avoiding repetition. The site description should include relevant keywords.
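A sketch of a description tag well inside the 200 character limit (wording hypothetical):

<meta name="description" content="Accredited online education and distance learning degree courses for working adults.">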

Meta Tag Priorities: The Site Title Meta Tag

The site title tag is the most important meta tag. The site title meta tag is still read and
indexed by all the major engines. How do we know this? Because it appears at the top of
each organic search entry in the search engine results pages. However, some SEOs
dispute whether it is really a meta tag at all, because the information the tag contains is
clearly visible at the top of the browser window, in the blue title bar. The
recommendation of RFC 1866, an early international standard for HTML, is that the tag
should contain no more than 64 characters, including spaces. There is nothing physically
stopping you exceeding this limit; I have seen some major sites with 150 characters in
this tag. However, the typical browser can only show 70 or so characters, and with more
characters the impact of keywords within the tag is progressively diluted.
From my experience the keywords in the early part of the tag carry more weight. I
personally prefer a limit of 50 to 55 characters. Checking the quality of the title meta tag
is the quickest way of assessing whether a site has been optimised.
A key debate, given the character limitations, is whether you should include the
organisation's name in the title meta tag. Much depends on the name's length and whether
it includes desired keywords. My view is that, with limited space, you are wasting a
valuable resource if you use your organisation name here.
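A sketch of a title tag inside my preferred 50 to 55 character limit (wording
hypothetical):

<title>Online Education and Distance Learning Courses</title>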

SEO 2 - The On-Site Phase (Writing Meta Tags)

There is much debate about the current value of meta tags. I still find them very effective
– both as an end in themselves and also as a guide to producing better and more search
friendly content. Although Google apparently ignores their contents, MSN and Yahoo
both still utilise the site title and description meta tags in their search algorithms. MSN's
newly launched web site still makes reference to the value of the keywords meta tag.
Meta tags are so called because they sit above the site – in the "Head" section – and are
not visible to the casual site visitor. The meta tags can be found between the <head> and
</head> lines of HTML code, as the description suggests, at the top of the page.
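A minimal sketch of where the tags sit (contents hypothetical):

<html>
<head>
<title>Online Education and Distance Learning Courses</title>
<meta name="description" content="Accredited online education and distance learning degree courses.">
<meta name="keywords" content="online education, distance learning, degree courses">
</head>
<body>
The visible page content starts here.
</body>
</html>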

Wednesday 18 December 2013

Making your keyword choice

In essence, you must synthesise all of the above five factors in selecting and refining your
keywords. Ignoring any one of the factors could create problems. Do not rush into this
process. Test out your keywords by making trial searches on the major engines and see
what company you would keep in the results. Getting it wrong may involve a large
amount of reworking.

Relevance

The keyword terms you select must be relevant, salient and part of the vocabulary used
by the audience you are seeking to attract. If that audience is a consumer one it is unlikely
to use jargon. The opposite may be true if you are seeking B2B prospects. My experience
suggests that consumers will often use entirely different vocabulary from marketing,
advertising and IT people. To avoid confusion use simpler but more specific terms.

Competition

You may have decided on your own keyword priorities but you must also check out the
competition for those keywords. Selecting a word or phrase already prioritised by a
multitude of competitive sites will see you struggle for visibility. Try to find words or
phrases that appear ignored or underutilised by your competitors. An alternative but
higher risk approach is to see what keywords are used by competitor sites and then
attempt to outmanoeuvre them by better use of links, content and meta tags.

Competitive Advantage

A place to look for keywords is where you enjoy some competitive advantage. How are
your products or services differentiated? What are the real strengths of your business
compared to your closest competitors? What proprietary advantages do you enjoy? What
is it you do better that may persuade prospective purchasers to visit your site?

Search Volumes

You should use words or phrases that have sufficient search volumes for your needs.
You can find out about search volumes by checking with Word Tracker software or
Yahoo's Overture keyword suggestion tool. Read more about these tools below.

Category Priorities

The first thing to remember is that the number of keywords you can use on any one site
or page has a finite limit. A general recommendation is that there is an overall limit of 20 individual words. In my opinion – due to other factors – the limit should be drawn much
tighter than this. Rather than a limit of words, I prefer a limit of characters – including
spaces – of no more than 64. In essence, you must be sufficiently focused to sum up the
key priorities of your business within this limit – typically no more than 6 to 8 words.
The only way around this limit is to have an endless number of pages on an endless
number of sites – all optimised, monitored and updated on a regular basis.

Keyword Selection - Factors

Keyword selection is the first search specific discipline. Having explained that spiders
read and index text, we find that some text is more important than others. That text is
keywords. Valuable keywords are the words or phrases that prospective customers use
when searching in your market category. Keyword selection is therefore crucial and has
implications for so much else within search. I have drawn up a list of factors that should
be taken into account when selecting keywords.

Researching your Market Category, Customers and Competitors

Good SEO also requires a thorough understanding of the market category within which
the search project and web site will compete. What is the category size and how is it
developing? What other channels to market are there? What information is available
regarding the behaviour and attitudes of customers? What role in the buying process is
played by search marketing? Who are the current and likely competitors? Once the above
is fully grasped you can proceed to the first real activity of SEO: keyword selection.

SEO 1 - The Pre-Site Phase

Search engine optimisation is a marketing discipline. It is not a stand-alone function.
Before any specific optimisation activity is undertaken, it is essential that two non-search
areas are appraised:

Understanding your Organisation’s Online Business Strategy
Good SEO requires a thorough understanding of your organisation's overall business
strategy. How does search fit in with activities such as advertising, e-mail and direct
marketing? Is there a marketing plan? What does it say about objectives, strategy and
budgets? What is the overall direction of the business and what can search contribute?

How Search Engines Gather Information

Search engines gather information by crawling web sites. They crawl from page to page
visiting sites already known and by following the links that they find. Whilst crawling,
the robots, or spiders, gather information from the source code of each site and then send
back that information for indexing. The spiders were designed to read HTML code or
code related to it, such as XHTML or PHP. They find it difficult to read pages written in
Flash and some other popular web programs. Spiders cannot directly read JavaScript or
images. They can, however, read the alt tags that may be provided with GIF, JPEG or
PNG images.

The Four Phases of an SEO Project

In addition to definitive information about the workings of search engines, there is much
speculation, myth and rumour. There are many spurious ideas in circulation and applying
them may do more harm than good. In this section, I will try to stick to tried and trusted
conventions.

How to Optimise Your Site

Introduction
This section describes the key processes undertaken to obtain a higher organic ranking
with the major search engines.
How search engines work is part of their proprietary knowledge. The exact workings of
their algorithms are closely guarded commercial secrets. However, guidance on how these
algorithms (or algos) work can be found or deduced from various sources. Some general
guidance is available free, directly from the search engines’ own web sites. Some
guidance can be found from examining the various Google and related patents. Some
general guidance can be found from authoritative articles on SEO forum sites. However,
real world applications of this knowledge can only be found by experimentation and trial
and error.
There are some general rules. Applying them will provide a route to improved search
engine visibility. The guidance in this section could be broadly applied to the three main
engines – Google, Yahoo and MSN. However, given its dominance, much of the advice
is derived from my interpretation of the Google “Hilltop” patent of 2001. The patent is
believed by SEOs to have been the basis of the so-called Google “Florida” update of
November 2003.

Comment Spam

Related to link spamming is comment spam. Comment spam is where a spammer visits a
publicly accessible site and deposits a comment with an anchor text link back to a
designated site. Forums and blogs are typical targets. This activity became identified as a
major problem in January 2005 when Google took steps to prevent it on the blogs of
Blogger.com. The reason was that spammers working for so-called PPC (Pills, Porn and
Casino) web sites were trawling legitimate blogs and posting uninvited comment
advertisements with their web site’s anchor text. Blogs were vulnerable because they
typically possess a comment section that can be accessed without the need for passwords
or even registration.

Link Spamming

In many respects, due to the increasing influence of links, it was inevitable that link
spamming would become an issue. Spamming of links has been a growing problem as
many people have realised the importance that Google, in particular, places on links. As a
significant issue it raised its head in April 2005 when Google’s new release appeared to
ban one of the leading SEO firms from its rankings. Few people outside of Google and
the SEO firm concerned are entirely sure why this is the case. But the industry consensus
is that Google are cracking down on web sites and organisations that accumulate vast
numbers of irrelevant links with the sole intention of climbing the rankings.

Tiny Text

Tiny text is a technique of using very small text that is barely visible to the human eye.
This text can be read by the engines. However, the engines are also likely to treat this
text as spam.

Hidden Text

The technique here is to fill or "stuff" a page with keywords invisible to the naked eye.
This is done by using the same colour for text as for the background page. This technique
is sometimes referred to as WOW, short for white on white.
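For illustration only, white-on-white stuffing looks something like this in the code
(keywords hypothetical):

<body bgcolor="#FFFFFF">
<p><font color="#FFFFFF">online education online education online education</font></p>
</body>

The visitor sees a blank area; the spider sees the repeated keywords and is increasingly
likely to flag them as spam.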

Mirror Sites

Mirror sites use an alternative URL to the target site but contain identical content. With
automated page production, there may be hundreds of different URLs all with the same
content. This technique is sometimes referred to as domain duplication.

Throwaway Sites

Throwaway sites are almost always doorway sites. They are web sites built by spammers
to provide a short-term and artificial boost to traffic. Once their traffic objectives are
achieved they are often switched off or left to decay – hence throwaway. Throwaway
sites are stuffed with links and keywords to attract and then re-direct traffic to a target
web site. Typically, the spammers retain ownership of the throwaway domain. The
spammers’ clients initially receive large amounts of traffic. But once the throwaway site
is switched off – or thrown away – the traffic comes to an abrupt halt and the client's
business suffers. The clients are then effectively blackmailed into spending vast sums to
retain traffic. The target web site receives no long term ranking benefits.

Doorway Sites

A doorway site is a site that acts as a referring page for another site. The doorway page is
highly optimised – containing hidden links and keywords that the ordinary web user
never sees. The doorway site then climbs the search engine rankings but re-directs all of
its traffic to the target – and perhaps poorly optimised – site.

Cloaking

Cloaking is the technique whereby the web site visible to a site visitor is entirely different
from that seen by a search engine spider. The ordinary user may see one set of text and
images but underneath that image, or “cloak”, the site is “stuffed” with keywords. By
examining the cache of a cloaked site on the Google search results, we can see that the
site shows entirely different information to the spider, from that shown to the human eye.

Keyword Stuffing

Keyword stuffing is the technique of excessively repeating keywords with the express
intention of influencing the search engines. Quite often this use appears in an
incomprehensible or ungrammatical manner. Keyword stuffing is often used in
conjunction with other spamming techniques such as cloaking, doorway sites, hidden text
and tiny text.

What to avoid in SEO

In recent years a number of illicit techniques have grown up to artificially manipulate a
web site’s ranking. These techniques are referred to as spamming or sometimes “Black
hat” techniques. The “black hat” description refers to the fact that in the old western
movies the bad guys always wore black hats. The core of any spamming technique is the
attempt to deceive the search engine, and ultimately the site visitor, about the true nature
of a web site’s content.
The question is whether spamming techniques actually deliver any long term benefit. In
addition, it is known that using proscribed spamming techniques can get the spammer,
their client sites and organisations delisted by the major search engines. It has happened
publicly in the past and the search engines particularly Google place great emphasis on
their warnings. Google even has a page for reporting spamming offenders.
I have identified a list of nine types of illicit SEO or spamming techniques:
- Keyword Stuffing
- Cloaking
- Doorway Sites
- Throwaway Sites
- Mirror Sites
- Hidden Text
- Tiny Text
- Link Spamming
- Comment Spam

Changing Industry – History of SEO

The 10-year history of search engine optimisation is closely tied to the underlying growth
of the internet and the development of its attendant search technologies. The three driving
forces have been:
1. The growth and commercial value of the internet consumer base.
2. The rising intensity of competition in online market categories.
3. The increasing sophistication of search technologies.
With the growth in value, the rewards for success in the marketing battle have risen
significantly. With an increasingly crowded internet, search has become a more important
component of commercial success. Without search, how is your site found? As a result,
attempts either to legitimately manage or to illegitimately manipulate search results have
become motivated by the greater rewards on offer.
The early days of search engine optimisation go back to the mid-1990s when the internet first
began to attract significant numbers of web sites and users. In those early days, emphasis
was on the submission stage – getting your site placed into as many search engines as
possible. The most important aspect of a search engine algorithm appeared to be entirely
“on-page” based and was focused almost exclusively around meta tags and their related
text.
Search algorithms could be decoded simply by analysing the results pages. During the
late 1990s, ethical SEOs and spammers alike realised that search engine results could be
manipulated by the simple process of adjusting a site’s meta tags to match the desired
keywords. During this period there were many crude attempts by spammers to stuff meta
tags with irrelevant but popular search terms. Famous spamming keyword meta tags have
included “Britney Spears” on sites with nothing to do with Britney Spears. It just
happened to be that Britney was one of the most searched for terms.
Google’s arrival in 1998 and the introduction of its “off-page”, link based, approach
signalled the beginning of the end for the exclusively meta tag driven approach. Google
was really the first engine to establish that sites carrying similar content had a propensity
to be linked. Google's strength was that the relevance of its results appeared less
vulnerable to the orthodox spamming techniques of its day. Search users were attracted
by its relevance to their search needs. In essence the key to success under the Google
algorithm was not what your site said about itself but what the links from other sites said.
The Google spider apparently ignores keyword meta tags entirely and only the MSN
spider apparently places any emphasis on them at all. Abuse of the keyword meta tag by
spammers led to its downfall. Google’s subsequent rise to dominance eventually
transformed the SEO industry. Google’s rise in popularity forced many competitor search
engines to fall by the wayside or to be consolidated with larger parents such as Yahoo.
Due to Google’s success, both Yahoo and Microsoft, through its newly independent and
revised MSN search engine, have had to take on board many of the features of Google’s
approach. The influence of inbound links continues to increase.

What is SEO?

The successful execution of a search engine optimisation project requires skills in the
areas of analysis, research, planning, copy writing and communication. A comprehensive
search engine optimisation project is divided into four interrelated phases.
1. Pre-site activities – The research and planning activities undertaken before an
existing or new site or page is actually touched or built.
- Understanding your organisation's online business strategy
- Researching your market category, customers and competitors
- Keyword research and selection
2. On-site activities – The activities directly involved in the content and design of
web pages.
- Writing the title, description and keyword meta tags
- Writing content – body copy, titles, image tags and outbound links that reflect and
enhance keywords
- Building internal links – helping the search engines navigate the site
- Site design and construction – ensuring the web page utilises design and code that
can be properly crawled and indexed by the search engines
3. Off-site activities – Building a portfolio of quality inbound links to your web site.
4. Post-site activities – Analysing and responding to site traffic and user feedback
once a web site has been optimised. Effective SEO is a continuous activity.

SEO Book (What is SEO)

The Basics of Search Engine Optimisation
Introduction to SEO - What is SEO
Search engine optimisation – commonly abbreviated to SEO – is the process whereby a
web site, or more specifically a web page or document, is constructed or amended in such
a way as to improve its placement in the search engine results pages or SERPs. Search
engine optimisation should not be seen as an end in itself. It is a function that should be
undertaken to improve the overall commercial performance of a web site.
Good search engine optimisation will ensure that a page appears higher in the search
engine results for a range of relevant, specific and valuable search terms or queries. The
simple objective of SEO is to generate more valuable web site traffic. The achievement of
a higher ranking against relevant search terms has commercial value for a web site
because it will attract more traffic than a lower ranking. In an increasingly crowded
online environment, search engine optimisation is therefore a crucial online marketing
discipline.
The role of SEO is to legitimately influence the process of improving rankings. There are
few genuine guarantees of a top placement, particularly for highly competitive search
terms. Good SEO will improve a web site’s ranking across a range of selected terms.
However, any process whereby a search engine is illicitly manipulated in order to
guarantee a high placement is referred to as spamming.

ODesk Cover Letter Samples

25 oDesk Cover Letter Samples
SAMPLE #1 DATA ENTRY
Hello Sir,
I am writing in response to your advertisement for a "Data Entry Assistant & SEO Good English". After carefully reviewing the experience requirements of the job description, I feel that I am a suitable match for the job. I have held several data entry positions that entail inputting customer requests, inquiries and product tracking codes. I have also performed administrative duties including copying and faxing documents, answering telephones, transferring data, carrying out web research and reporting to my immediate supervisor. I feel that I can add professionalism and accuracy to your current team of professionals. With extensive experience supporting all levels of a department and working directly with external vendors, I take direction well and can complete a heavy workload and complete projects under minimal supervision. If you feel there is a mutual interest, I would welcome the opportunity to meet with you to learn more about your company, the requirements of the position, and how my qualifications would be a good fit. Thank you in advance for your time and consideration. I look forward to hearing from you soon.
 Your Name Here

SAMPLE #2 GRAPHIC DESIGNS
Dear Sir,
This is Your Name Here from India. I am a professional graphic designer with 4 years of experience in graphic design, and I run a website where I publish various graphic design tutorials. I have excellent experience in logo and banner design and can create high quality designs in a short time. I am confident that I will do your work very well. I hope you will like my work and give me good feedback, which is very important for my future career on oDesk. I have attached some of my logo and banner design projects; I hope you like them. If you want to see more samples of my work you can visit my site: www.photoshoprain.com. I am available on Skype, Google Talk, YM or MSN for chat. Thanks and regards, Your Name Here
 

Sunday 15 December 2013

300 Directory Submission Site

6    http://www.holtinternational.org/forums/index.php
6    http://www.dogsindistress.org/forum/index.php
6    http://www.drunkard.com/bbs/index.php
6    http://www.engardelinux.net/forums/index.php
6    http://www.familypride.org/phpBB2/index.php
6    http://www.fh-giessen.de/fachschaft/mni/phpBB2/index.php
6    http://www.fija.org/forums/
6    http://www.filmspotting.net/boards/index.php
6    http://www.finisinc.com/forum/category-view.asp
6    http://www.floraweb.de/foren/index.php
6    http://www.aquahobby.com/board/index.php
6    http://www.foex.gob.sv/phpBB3/ucp.php?mode=register&sid=44da02cf7a2da04579d549cd17601a7b
6    http://www.forum.aktuell.ru/index.php
6    http://www.forum.gildia.pl/index.php
6    http://www.forum.rhein-kreis-neuss.de/index.php
6    http://www.forum.stuve-wien.at/index.php
6    http://www.forum.umlub.pl/index.php
6    http://www.forums.kolobok.us/index.php
6    http://www.forums.meteobelgium.be/index.php
6    http://www.foundation300.com/Forum13-1.aspx
6    http://www.freerangeinc.com/support/forum/index.php
6    http://www.free-scores.com/forum/index.php
6    http://www.freespamfilter.org/forum/index.php
6    http://qemu-forum.ipi.fi/index.php
6    http://www.fsma.org/bbfsma/index.php
6    http://www.fstph.at/smf/


500 Backlinking Site List

http://1.bjnel1a.webtaskr.com/0325023408123/
http://1.bjnel1a.webtaskr.com/032502340822/
http://1.bjnel1a.webtaskr.com/032502340853/
http://1.bjnel1a.webtaskr.com/032502340879/
http://1.bjnel1a.webtaskr.com/032502340881/
http://1.bjnel1a.webtaskr.com/032502340882/
http://1.bjnel1a.webtaskr.com/0325023409147/
http://1.bjnel1a.webtaskr.com/0325023409159/
http://1.bjnel1a.webtaskr.com/0325023409186/
http://1.bjnel1a.webtaskr.com/0325023409199/
http://1.bjnel1c.webtaskr.com/0325023453103/
http://1.bjnel1c.webtaskr.com/032502345314/
http://1.bjnel1c.webtaskr.com/0325023453181/
http://1.bjnel1c.webtaskr.com/0325023453183/
http://1.bjnel1c.webtaskr.com/0325023453207/
http://1.bjnel1c.webtaskr.com/0325023453210/
http://1.bjnel1c.webtaskr.com/032502345325/
http://1.bjnel1c.webtaskr.com/032502345344/
http://1.bjnel1c.webtaskr.com/032502345378/
http://1.bjnel1d.dnsdojo.com/0325023512204/
http://1.bjnel1d.webtaskr.com/0325023510104/
http://1.bjnel1d.webtaskr.com/0325023510134/
http://1.bjnel1d.webtaskr.com/0325023510139/
http://1.bjnel1d.webtaskr.com/0325023510160/
http://1.bjnel1d.webtaskr.com/0325023510168/
http://1.bjnel1d.webtaskr.com/032502351019/
http://1.bjnel1d.webtaskr.com/032502351026/
http://1.bjnel1d.webtaskr.com/032502351042/
http://1.bjnel1d.webtaskr.com/032502351065/
http://1.bjnel1d.webtaskr.com/032502351086/
http://1.bkliu1a.dnsalias.org/0326034900164/
http://1.blhir1a.doesntexist.com/0327094222197/
http://1.blhir1b.doesntexist.org/032709435667/
http://1.blhir1c.dnsdojo.org/0327095128112/
http://1.bmirl1d.dontexist.com/032814261793/
http://1.bnorl1a.dnsdojo.com/0329073956103/
http://1.bnorl1a.doesntexist.com/0329073959201/
http://1.bsiln2a.no-ip.co.uk/0404134539154/
http://1.bthdu2a.no-ip.co.uk/0405212714124/
http://1.bvnij2a.dnsalias.net/04091729376/
http://1.bvnij2b.dnsdojo.net/040917341014/
http://1.bvnij2d.dontexist.com/04091741401/
http://10000songs.wordpress.com/2008/08/25/rb-singers-out-about-27/
http://10000songs.wordpress.com/2008/10/09/rb-singers-out-about-mariah-carey-jennifer-hudson-christina-milian-alicia-keys-michelle-williams-aubrey-donnie-jlo-toni-braxton/
http://1000awesomethings.com/2009/01/26/844-celebrities-on-sesame-street/
http://1000monkeys.com/2007/08/cheating
http://1000petals.wordpress.com/2009/02/05/finally-a-proper-photoblog/
http://1000petals.wordpress.com/2009/02/16/love-is-a-collective-energy/
http://1001filmer.bloggie.se/2009/03/22/den-engelske-patienten/

100 Social Bookmarking List

1.http://delicious.com
2.http://slashdot.org
3.http://www.bebo.com
4.http://www.ask.com
5.http://technorati.com
6.http://reddit.com
7.http://www.propeller.com
8.http://current.com
9.http://www.digg.com
10.http://www.connotea.org
11.http://newsvine.com
12.http://linkarena.com
13.http://famous.livejournal.com/profile
14.http://www.blinklist.com
15.http://www.fsdaily.com
16.http://www.furl.net
17.http://www.mybloglog.com
18.http://public.sitejot.com
19.http://www.wikio.com
20.http://multiply.com
21.http://www.xanga.com
22.http://www.blogcatalog.com
23.http://www.citeulike.org
24.http://www.corank.com
25.http://www.folkd.com

Tuesday 3 December 2013

What Should You Do Now?

It is worth cataloging the basic principles to be applied to increase website traffic and search engine rankings.
1. Create a site with valuable content, products or services.
2. Place primary and secondary keywords within the first 25 words of your page content and spread them evenly throughout the document.
3. Research and use the right keywords/phrases to attract your target customers.
4. Use your keywords in the right fields and references within your web page: title, META tags, headers, etc.
5. Keep your site design simple so that your customers can navigate easily between web pages, find what they want and buy products and services.
6. Submit your web pages (every web page, not just the home page) to the most popular search engines and directory services. Hire someone to do so, if required. Be sure this is a manual submission; do not engage an automated submission service.
7. Keep track of changes in search engine algorithms and processes and modify your web pages accordingly so your search engine ranking remains high. Use online tools and utilities to keep track of how your website is doing.
8. Monitor your competitors and the top ranked websites to see what they are doing right in the way of design, navigation, content, keywords, etc.
9. Use reports and logs from your web hosting company to see where your traffic is coming from. Analyze your visitors' locations, their incoming sources (whether search engines or links from other sites) and the keywords they used to find you.
10. Make your customers' visits easy and give them plenty of ways to remember you, in the form of newsletters, free reports, discount coupons, etc.
11. Demonstrate your industry and product or service expertise by writing and submitting articles for your website or for article banks, so you are perceived as an expert in your field.
12. When selling products online, use simple payment and shipment methods to make your customer's experience fast and easy.
13. When not sure, hire professionals. Though it may seem costly, it is a lot less expensive than spending money on a website which no one visits.
14. Don't look at your website as a static brochure. Treat it as a dynamic, ever-changing sales tool, just like your real store, and treat your customers with the same seriousness.

Link Spamming

Realizing the importance of links and link analysis in search engine results, several link farms and Free For All sites have appeared that offer to provide links to your site. This is also referred to as link spamming. Most search engines are wise to this obvious tactic and know how to spot it. Such FFA sites, as they are known, do not provide link quality or link context, two factors that are important in link analysis. Thus the correct strategy is to avoid link spamming and not get carried away by what seems to be too simple a solution.

Tables

When you use tables on key pages and some columns have descriptions while others have numbers, it is possible that this may push your keywords down the page. Search engines break up a table and read the columns in order: the first column is read first, then the next, and so on. Thus if the first column holds numbers and the next holds useful descriptions, the positioning of those descriptions will suffer. The strategy is to avoid using such tables near the top of key pages. Large sections of JavaScript have the same effect: the HTML part gets pushed down. So again, place long JavaScript lower down on key pages.

Frames

There are some engines whose spiders won't work with frames on your site. A web page that is built using frames is actually a combination of content from separate "pages" that have been blended into a single page through a "frameset" instruction page. The frameset page does not have any content or links that would promote spidering, so it can block the spider's movement. The workaround is to place a summary of the page content and a relevant description in the frameset page, along with a link to the home page.
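A minimal sketch of a frameset page carrying its own summary and home page link (file names and wording hypothetical):

<frameset cols="25%,75%">
<frame src="menu.html">
<frame src="content.html">
<noframes>
<body>
<p>Online education courses and distance learning degrees. Visit our <a href="index.html">home page</a> for the full site.</p>
</body>
</noframes>
</frameset>

The summary and link inside the noframes block give a spider something to read and somewhere to go.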

Image Maps Without ALT Text

Avoid image maps without alt text on their links. Image maps should have alt text (as is also required under the Americans with Disabilities Act for public websites) and the home page should not use images as links; HTML text links should be used instead. This is because search engines will not read image links, and the linked pages may not get crawled.
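A sketch of the difference (file and page names hypothetical): the first link depends on its alt text being read, while the second is plain HTML that any engine can follow:

<a href="products.html"><img src="btn-products.gif" alt="Our products"></a>
<a href="products.html">Our products</a>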

Re-Direct Pages

Sometimes pages have a meta refresh tag that redirects any visitor automatically to another page. Some search engines refuse to index a page that has a high refresh rate. The meta refresh tag does not, however, affect Google.
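The tag in question looks like this (URL hypothetical); a zero-second delay makes it behave as an instant redirect:

<meta http-equiv="refresh" content="0; url=http://www.example.com/newpage.html">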

Dynamic URLs

Several pages in e-commerce and other functional sites are generated dynamically and have a "?" or "&" sign in their dynamic URLs. These signs separate the CGI variables. While Google will crawl these pages, many other engines will not. One inconvenient solution is to develop static equivalents of the dynamic pages and have them on your site. Another way to avoid such dynamic URLs is to rewrite them using a syntax that is accepted by the crawler and is also understood as equivalent to the dynamic URL by the application server. The Amazon site shows dynamic URLs in such syntax. If you are using the Apache web server, you can use Apache rewrite rules to enable this conversion.
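A minimal sketch, assuming Apache with mod_rewrite enabled (the paths and the id parameter are hypothetical), which lets a static-looking URL such as /products/123 serve the dynamic page /products.php?id=123:

RewriteEngine On
RewriteRule ^products/([0-9]+)$ /products.php?id=$1 [L]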

Keyword Stuffing & Spamming

Important keywords and descriptions should be used in your content and in visible meta tags. Choose the words carefully, position them near the top and maintain a proper frequency for such words. However, it is very important to exercise moderation in this. Keyword stuffing or spamming is a no-no today. Most search engine algorithms can spot it and bypass the spam, and some may even penalize it.

Invisible & Tiny Text

Invisible text is content on a web site that is coded in a manner that makes it invisible to human visitors, but readable by search engine spiders. This is done in order to artificially inflate the keyword density of a web site without affecting the visual appearance of it. Hidden text is a recognized spam tactic and nearly all of the major search engines recognize and penalize sites that use this tactic.
This is the technique of placing text on a page in a small font size. Pages that are predominantly heavy in tiny text may be dismissed as spam. Or, the tiny text may not be indexed. As a general guideline, try to avoid pages where the font size is predominantly smaller than normal. Make sure that you're not spamming the engine by using keyword after keyword in a very small font size. Your tiny text may be a copyright notice at the very bottom of the page, or even your contact information. If so, that's fine.

Image Alt Tag Descriptions

Search engines are unable to view graphics or distinguish text that might be contained within them. For this reason, most engines will read the content of the image ALT tags to determine the purpose of a graphic. By taking the time to craft relevant, yet keyword rich ALT tags for the images on your web site, you increase the keyword density of your site.
Although many search engines read and index the text contained within ALT tags, it's important NOT to go overboard in using these tags as part of your SEO campaign. Most engines will not give this text any more weight than the text within the body of your site.

STOP Words

Stop words are common words that are ignored by search engines at the time of searching a key phrase. This is done in order to save space on their server, and also to accelerate the search process.
When a search is conducted in a search engine, it will exclude the stop words from the search query, and will use the query by replacing all the stop words with a marker. A marker is a symbol that is substituted with the stop words. The intention is to save space. This way, the search engines are able to save more web pages in that extra space, as well as retain the relevancy of the search query.
Besides, omitting a few words also speeds up the search process. For instance, if a query consists of three words, the search engine would generally make three runs, one for each of the words, and display the listings. However, if one of the words is such that omitting it does not make a difference to search results, it can be excluded from the query, and consequently the search process becomes faster. Some commonly excluded "stop words" are: after, also, an, and, as, at, be, because, before, between, but, for, however, from, if, in, into, of, or, other, out, since, such, than, that, the, these, there, this, those, to, under, upon, when, where, whether, which, with, within, without

Making Frames Visible To Search Engines

We discussed earlier the prevalence of frames-based websites. Many amateur web designers do not understand the drastic effects frames can have on search engine visibility. Such ignorance is compounded by the fact that some search engines, such as Google and Ask.com, are actually frames capable. Ask.com spiders can crawl through frames and index all the web pages of a website. However, this is only true for a few search engines.
The best solution, as stated above, is to avoid frames altogether. If you still decide to use frames, another remedy is to use JavaScript. JavaScript can be added anywhere and is visible to search engines. This would enable spiders to crawl to other web pages, even if they do not recognize frames.
With a little trial and error, you can make your framed site accessible to both types of search engines.

Should You Use Frames?

Many websites make use of frames on their web pages. In some cases, more than two frames are used on a single web page. The reason most websites use frames is that each frame's content comes from a different source. A master page known as a "frameset" controls the process of combining content from different sources into a single web page, making it easy for webmasters to pull multiple sources into one page. This, however, has a huge disadvantage when it comes to search engines.
Some of the older search engines do not have the capability to read content from frames. These only crawl through the frameset instead of all the web pages, so web pages with multiple frames are ignored by the spider. There are certain tags known as "NOFRAMES" tags (whose contents are ignored by frames-capable browsers) that can be inserted in the HTML of these web pages; spiders can read the information within the NOFRAMES tags, whereas otherwise such engines see only the frameset. Moreover, if there are no links to other web pages in the NOFRAMES block, the search engines won't crawl past the frameset, thus ignoring all the content rich web pages that are controlled by the frameset. Hence, it is always advisable to have web pages without frames, as frames could easily make your website invisible to search engines.

How Many Pages To Submit?

You do not have to submit all the pages of your site. As stated earlier, many sites have restrictions on the number of pages you can submit. A key page, or a page that has links to many inner pages, is ideal, but you should submit some inner pages too. This ensures that even if the first page is missed, the crawler still gets to access other pages, and all the important pages through them. Submit at least your key 3 to 4 pages. Choose the ones that have the most relevant content and keywords to suit your target search string, and verify that they link properly to other pages.

Page Size Can Be A Factor

We have written above that the spiders may bypass long and “difficult” pages. They would have their own time-out characteristics or other controls that help them come unstuck from such pages. So you do not want to have such a page become your “gateway” page. One tip is to keep the page size below 100 kb.

Affiliate Sites & Dynamic URLs

In affiliate programs, sites that send you traffic and visitors, have to be paid on the basis of per click or other parameters (such as number of pages visited on your site, duration spent, transactions etc). Most common contractual understanding revolves around payment per click or click throughs. Affiliates use tracking software that monitors such clicks using a redirection measurement system. The validity of affiliate programs in boosting your link analysis is doubtful. Nevertheless, it is felt that it does not actually do any harm. It does provide you visitors, and that is important. In the case of some search engines re-directs
may even count in favor of your link analysis. Use affiliate programs, but this is not a major strategy for optimization. Several pages in e-commerce and other functional sites are generated dynamically and have “?” or “&” sign in their dynamic URLs. These signs separate the CGI variables. While Google will crawl these pages, many other engines will not. One inconvenient solution is to develop static equivalent of the dynamic pages and have them on your site.
Another way to avoid such dynamic URLs is to rewrite these URLs using a syntax that is accepted by the crawler and also understood as equivalent to the dynamic URL by the application server. The Amazon site shows dynamic URLs in such syntax. If you are using Apache web server, you can use Apache rewrite rules to enable this conversion.
One good tip is that you should prepare a crawler page (or pages) and submit this to the search engines. This page should have no text or content except for links to all the important pages that you wished to be crawled. When the spider reaches this page it would crawl to all the links and would suck all the desired pages into its index. You can also break up the main crawler page into several smaller pages if the size becomes too large. The crawler shall not reject smaller pages, whereas larger pages may get bypassed if the crawler finds them too slow to be spidered.
You do not have to be concerned that the result may throw up this “site-map” page and would disappoint the visitor. This will not happen, as the “site-map” has no searchable content and will not get included in the results, rather all other pages would. We found the site wired.com had published hierarchical sets of crawler pages. The first crawler page lists all the category headlines, these links lead to a set of links with all story headlines, which in turn lead to the news stories.

One Site – One Theme

It's important to note that you shouldn't try to optimize your home page for more than one theme; multiple themes just end up weakening each other's strength. By using simple links to your alternative content, a link to your humor page can get folks where they want to go, and then you can write your humor page as a secondary index optimized toward a humor theme. In the end, each page should be optimized for search engines for the main topic of that page or site section.
Search engine optimization is made up of many simple techniques that work together to create a comprehensive overall strategy. This combination of techniques is greater as a whole than the sum of the parts. While you can skip any small technique that is a part of the overall strategy, it will subtract from the edge you'd gain by employing all the tactics.

Step By Step Page Optimization

Starting at the top of your index/home page something like this: (After your logo or header graphic)
1) A heading tag that includes a keyword(s) or keyword phrases. A heading tag is bigger and bolder text than normal body text, so a search engine places more importance on it because you emphasize it.
2) Heading sizes range from h1 - h6 with h1 being the largest text. If you learn to use just a little Cascading Style Sheet code you can control the size of your headings. You could set an h1 sized heading to be only slightly larger than your normal text if you choose, and the search engine will still see it as an important heading (see the sketch after this list).
3) Next would be an introduction that describes your main theme. This would include several of your top keywords and keyword phrases. Repeat your top 1 or 2 keywords several times, include other keyword search terms too, but make it read in sentences that makes sense to your visitors.
4) A second paragraph could be added that got more specific using other words related to online education.
5) Next you could put a smaller heading.
6) Then you'd list the links to your pages, and ideally have a brief description of each link using keywords and keyword phrases in the text. You also want to have several pages of quality content to link to. Repeat that procedure for all your links that relate to your theme.
7) Next you might include a closing, keyword laden paragraph. More is not necessarily better when it comes to keywords, at least after a certain point. Writing "online education" fifty times across your page would probably result in you being caught for trying to cheat. Ideally, somewhere from 3% - 20% of your page text would be keywords. The percentage changes often and is different at each search engine. The 3-20 rule is a general guideline, and you can go higher if it makes sense and isn't redundant.
8) Finally, you can list your secondary content of book reviews, humor, and links. Skip the descriptions if they aren't necessary, or they may water down your theme too much. If you must include descriptions for these non-theme related links, keep them short and sweet. You also might include all the other site sections as simply a link to another index that lists them all. You could call it Entertainment, Miscellaneous, or whatever. These can be sub-indexes that can be optimized toward their own theme, which is the ideal way to go.
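As a sketch of the CSS approach mentioned in point 2 (sizes and wording hypothetical), an h1 reined in to sit only slightly larger than body text:

<style type="text/css">
h1 { font-size: 110%; }
</style>
<h1>Online Education Degrees</h1>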
Now you've set the all-important top of your page up with a strong theme. So far so good, but this isn't the only way you can create a strong theme, so don't feel compelled to follow this exact formula. This was just an example to show you one way to set up a strong site theme. Use your imagination; you may come up with an even better way.

The Acid Test

Here is the acid test that will prove what we just said is right: Carefully examine the body text of your existing homepage. Then, attempt to insert three to five different keywords and key phrases three to four times each, somewhere within the actual body of your existing page. In doing that, chances are you will end up with a homepage that is next to impossible to understand and read.
One mistake some people make is to force their prospects to wade through endless key phrase lists or paragraphs in an attempt to describe their features and benefits, often while trying to please the search engines at the same time. Writing a powerful and effective homepage around carefully defined keywords and key phrases is a sure way to drive targeted traffic to your web site and keep it there once you do.
If some people still say re-writing a homepage takes too much time and costs too much money, think of the cost of losing prospective clients and the real cost of lost sales and lost opportunities. In the end, writing a strong homepage that will achieve all your desired goals will largely justify your time invested and the efforts you will have placed in the re-writing of your homepage.
This section presents a recommended layout for your homepage in order to make it as search engine friendly as possible. This is where you set the theme of your site. Let's suppose the primary focus of your site is online education. You also have secondary content that is there as alternative content for those not interested in online education. There is also other content that you would like to share with your visitors. For example, this might include book reviews, humor, and links.
The top of your homepage, as discussed earlier is the most important. This is where you set the keywords and theme for the most important part of your site, the thing you really want to be found for.

The Home Page

Your homepage is the most important page on your web site. If you concentrate your most important keywords and key phrases in your homepage many times, the search engines will surely notice and index it accordingly. But will it still read easily, and will the sentences flow freely, to your real human visitors? There is a good chance that it might not.
As a primer, having just 40 or 50 words on your homepage will not deliver the message effectively. To be powerful and effective, a homepage needs at least 300 to 400 words for maximum search engine throughput and effectiveness. One way to do that is to increase your word count with more value-added content. This often means rewriting your whole homepage all over again, the main reason being that you will probably never have enough room otherwise to skillfully work your important keywords and key phrases into the body text. This may not please your boss or marketing department, but a full re-write is often necessary and highly advisable to achieve high rankings in the engines, while at the same time having a homepage that will please your site visitors and convert a good proportion of them into real buyers.

Does Your Website Give Enough Contact Information?
When you sell from a website, your customers can buy your products 24 hours a day, and they may be located thousands of miles away in other states or countries. Always provide contact information, preferably on every page of your website, complete with mailing address, telephone number and an email address that reaches you. People may need to contact you about sales, general information or technical problems on your site. Also have your email forwarded to another address if you do not check your website mailbox often. When a customer wants to buy online, provide enough payment options, such as credit card, PayPal or another online payment service.
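As a sketch, a simple contact block along these lines can sit at the foot of every page (the company name, address, telephone number and email address below are placeholders):

<p>Your Company Ltd, 1 Example Street, Anytown<br>
Tel: 555-0123 | <a href="mailto:sales@yoursite.com">sales@yoursite.com</a></p>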
In the field of search engine optimization (SEO), writing a strong homepage that will rank high in the engines and read well for your site visitors can sometimes present a challenge, even to seasoned SEO professionals. Once you have clearly identified your exact keywords and key phrases, the exact locations on your homepage where you place those carefully researched keywords will have a drastic impact on the end results of your homepage optimization. One thing we keep hearing is that people don't want to change the look, or more especially the wording, of their homepage. Understandably, some of them went to great lengths and invested a lot of time and/or money to make it the best it can be. Being the best it can be for your site visitors is one thing. But is it the best it can be for the search engines, in terms of how your site will rank?
If you need powerful rankings in the major search engines and at the same time you want to successfully convert your visitors and prospects into real buyers, it's important to effectively write your homepage the proper way the first time! You should always remember that a powerfully optimized homepage pleases both the search engines and your prospects.
By randomly inserting keywords and key phrases into your old homepage, you might achieve good rankings, but at the same time you might jeopardize your marketing flow. That is a mistake nobody would ever want to make with their homepage.
Even today, there are still some people who will say you can edit your homepage for key phrases without re-writing the whole page. There are important reasons why that strategy might not work.

Understanding Your Target Customer
If you design a website you think will attract clients, but you don't really know who your customers are and what they want to buy, it is unlikely you will make much money. A website business is an extension of, or replacement for, a standard storefront. You can send email to your existing clients and ask them to complete a survey, or survey them while they are browsing your website. Ask them about their choices. Why do they like your products? Do you discount prices or offer coupons? Are your prices consistently lower than others'? Is your shipping cheaper? Do you respond faster to client questions? Are your product descriptions better? Are your return policies and guarantees better than your competitors'? To know your customers, you can check credit card records or ask customers to complete a simple contact form with name, address, age, gender, etc. when they purchase a product.

What Your Website Absolutely Needs
This section will go over some of the most important elements that a page needs in order to get high search engine rankings. Make sure that you go through this whole section very carefully, as each of these elements can have a dramatic impact on the rankings your website will ultimately achieve. Don't focus solely on the home page, keywords and titles. The first step towards a sale is when customers visit your site and see the products they were looking for. Of course, search engine optimization and better rankings can't keep customers on your site or make them buy. Once the customer has visited your site, ensure that he gets interested in your products or services and stays around. Motivate him to buy by providing clear and unambiguous information. If you happen to sell more than one product or service, provide all the necessary information about each, perhaps by keeping the details on separate pages. By providing suitable and easily visible links, you let the customer navigate to those pages and get the details.

Blanket Policy On Doorway Pages And Cloaking
Many search engines are opposed to doorway pages and cloaking. They consider doorway and cloaked pages to be spam and encourage people to use other avenues to increase the relevancy of their pages. We'll talk about doorway pages and cloaking a bit later.
Meta Tags (Ask.Com As An Example)
Meta tags are indexed and considered to be regular text, though Ask.com claims it doesn't give them priority over HTML titles and other text. You should use meta tags in all your pages, although some webmasters claim their doorway pages for Ask.com rank better when they don't use them. If you do use meta tags, make your description tag no more than 150 characters and your keywords tag no more than 1,024 characters long.
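As a rough sketch, a head section respecting those limits might look like this (the title, description and keywords are placeholders):

<head>
<title>Your Site Title</title>
<meta name="description" content="A short description of what you do, kept under 150 characters.">
<meta name="keywords" content="keyword one, keyword two, keyword three">
</head>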
Keywords In The URL And File Names
It's generally believed that Ask.com gives some weight to keywords in filenames and URL names. If you're creating a file, try to name it with keywords.
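For example, for a hypothetical page about organic green tea, a keyword-rich file name gives the engine something to work with where a generic one does not:

http://www.yoursite.com/organic-green-tea.html (keywords in both the URL and the file name)
http://www.yoursite.com/page1.html (no keyword signal)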
Keywords In The ALT Tags
Ask.com indexes ALT tags, so if you use images on your site, make sure to add ALT text to them. ALT tags should contain more than just a description of the image; they should include keywords, especially if the image is at the top of the page. ALT tags are explained later.
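As a minimal sketch (the file name and wording are placeholders), an ALT tag that works in a description plus keywords might read:

<img src="organic-green-tea.jpg" alt="Organic green tea - buy loose leaf tea online">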
Page Length
There's been some debate about how long doorway pages for AltaVista should be. Some webmasters say short pages rank higher, while others argue that long pages are the way to go. According to AltaVista's help section, it prefers long and informative pages. We've found that pages with 600-900 words are most likely to rank well.
Frame Support
AltaVista has the ability to index frames, but it sometimes indexes and links to pages intended only as navigation. To keep this from happening to you, submit a frame-free site map containing the pages that you want indexed. You may also want to include a "robots.txt" file to prohibit AltaVista from indexing certain pages.
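For sites that must use frames, a sketch like the following (the file names are placeholders) keeps some indexable content available via the <noframes> element:

<frameset cols="20%,80%">
<frame src="nav.html">
<frame src="content.html">
<noframes>
<body>
<p>A plain-text summary of the site, with a link to the
<a href="content.html">main content</a>, so crawlers and visitors
without frame support can still reach it.</p>
</body>
</noframes>
</frameset>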

Query-Dependent Factors
1. The HTML title.
2. The first lines of text.
3. Query words and phrases appearing early in a page rather than late.
4. Meta tags, which are treated as ordinary words in the text, but like words that appear early in the text (unless the meta tags are patently unrelated to the content on the page itself, in which case the page will be penalized).
5. Words mentioned in the "anchor" text associated with hyperlinks to your pages. (E.g., if lots of good sites link to your site with anchor text "breast cancer" and the query is "breast cancer," chances are good that you will appear high in the list of matches.)

Values
• Long pages, which are rich in meaningful text (not randomly generated letters and words).
• Pages that serve as good hubs, with lots of links to pages that have related content (topic similarity, rather than random meaningless links, such as those generated by link exchange programs or intended to generate a false impression of "popularity").
• The connectivity of pages, including not just how many links there are to a page but where the links come from: the number of distinct domains and the "quality" ranking of those particular sites. This is calculated for the site and also for individual pages. A site or a page is "good" if many pages at many different sites point to it, and especially if many "good" sites point to it.
• The level of the directory in which the page is found. Higher is considered more important. If a page is buried too deep, the crawler simply won't go that far and will never find it.
These static factors are recomputed about once a week, and new good pages slowly percolate upward in the rankings. Note that there are advantages to having a simple address and sticking to it, so others can build links to it, and so you know that it is in the index.

Ranking Rules Of Thumb
The simple rule of thumb is that content counts, and that content near the top of a page counts for more than content at the end. In particular, the HTML title and the first couple of lines of text are the most important parts of your pages. If the words and phrases that match a query happen to appear in the HTML title or first couple of lines of text of one of your pages, chances are very good that that page will appear high in the list of search results. A crawler/spider search engine can base its ranking both on static factors (a computation of the value of a page independent of any particular query) and on query-dependent factors.
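As a sketch, using the online education example from earlier (the wording is a placeholder), the title and opening lines would carry the key phrase:

<head>
<title>Online Education Courses - Accredited Distance Learning</title>
</head>
<body>
<h1>Online Education Courses</h1>
<p>Our accredited online education courses let you study for a recognised
qualification from home...</p>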

Crawler/Spider Considerations
Also, consider technical factors. If a site has a slow connection, it might time out before the crawler can harvest the text; very complex pages may time out too. If you have a hierarchy of directories at your site, put the most important information high, not deep. Some search engines will presume that the higher you place the information, the more important it is, and crawlers may not venture deeper than three, four or five directory levels. Above all, remember the obvious: full-text search engines index text. You may well be tempted to use fancy and expensive design techniques that either block search engine crawlers or leave your pages with very little plain text that can be indexed. Don't fall prey to that temptation.
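For example (the paths are hypothetical), a crawler is far more likely to reach

http://www.yoursite.com/courses.html

than

http://www.yoursite.com/archive/2013/misc/extra/courses.html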

Don’ts
When making a site, do not cheat your users, i.e. those people who will surf your website. Do not provide them with irrelevant content or present them with any fraudulent schemes. In particular:
1. Avoid tricks or link schemes designed to increase your site's ranking.
2. Do not employ hidden text or hidden links (see the example after this list).
3. Google frowns upon websites using cloaking techniques, so it is advisable to avoid them.
4. Automated queries should not be sent to Google.
5. Avoid stuffing pages with irrelevant words and content. Also don't create multiple pages, sub-domains, or domains with significantly duplicate content.
6. Avoid "doorway" pages created just for search engines or other "cookie cutter" approaches such as affiliate programs with hardly any original content.
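For illustration only, the sort of hidden text that point 2 warns against looks like this; do not use it:

<!-- Both of these hide keywords from visitors while showing them to engines -->
<div style="display:none">online education online education online education</div>
<font color="#FFFFFF">keywords typed in the same colour as the page background</font>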

Google Guidelines
Here are some of the important tips and tricks that can be employed while dealing with Google.
Do’s
• A website should have a crystal clear hierarchy and links, and should preferably be easy to navigate.
• A site map is required to help users find their way around your site; if the site map has more than 100 links, it is advisable to break it into several pages to avoid clutter.
• Come up with essential and precise keywords and make sure that your website features relevant and informative content.
• The Google crawler will not recognize text hidden in images, so when describing important names, keywords or links, stick with plain text.
• The TITLE and ALT tags should be descriptive and accurate, and the website should have no broken links or incorrect HTML.
• Dynamic pages (URLs containing a '?' character) should be kept to a minimum, as not every search engine spider is able to crawl them.
• The robots.txt file on your web server should be current and should not block the Googlebot crawler. This file tells crawlers which directories can or cannot be crawled; a sketch follows this list.
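A minimal sketch of such a file (the directory names are hypothetical) blocks all crawlers from two directories while leaving Googlebot free to crawl everything:

User-agent: *
Disallow: /cgi-bin/
Disallow: /private/

User-agent: Googlebot
Disallow: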

Cloaking
Sometimes a webmaster might program the server in such a way that it returns different content to Google than it returns to regular users, often in an attempt to manipulate search engine rankings. This practice is referred to as cloaking because it conceals the actual website and returns distorted web pages to the search engines crawling the site. It can mislead users about what they'll find when they click on a search result. Google highly disapproves of any such practice and might ban a website found guilty of cloaking.

Submitting your URL to Google
Google is primarily a fully automatic search engine with no human intervention involved in the search process. It utilizes robots known as 'spiders' to crawl the web on a regular basis, looking for updates and new websites to be included in the Google index. This robot software follows hyperlinks from site to site. Google does not require you to submit your URL for inclusion in the index, as the spiders find new sites automatically. However, you can submit your URL manually by going to the Google website and clicking the related link. One important thing to note is that Google does not accept payment of any sort for site submission or for improving the page rank of your website. Also, submitting your site through the Google website does not guarantee listing in the index.

GOOGLE Dance Tool
The Google Dance Tool allows you to check your rankings on all three hosts (www, www2 and www3) and on all nine datacenters simultaneously.
The Google Web Directory combines Google's search technology with the Netscape Open Directory Project, making it possible to search the Internet organized by topic. Google displays the pages in order of the rank given to them by its Page Rank technology. It not only searches the titles and descriptions of the websites, but searches the entire content of the sites within a related category, which ultimately delivers a more comprehensive search to the users. Google thus has a fully functional web directory which categorizes all the searches in order.

The Algorithm Shuffle
Because of the nature of Page Rank, the calculations need to be performed about 40 times and, because the index is so large, they take several days to complete. During this period the search results fluctuate, sometimes minute by minute. It is because of these fluctuations that the term Google Dance was coined. The dance usually takes place sometime during the last third of each month.
Google has two other searchable servers apart from www.google.com: www2.google.com and www3.google.com. Most of the time the results on all three servers are the same, but during the dance they are different. For most of the dance, the rankings that can be seen on www2 and www3 are the new rankings that will transfer to www when the dance is over. Even though the calculations are done about 40 times, the final rankings can be seen from very early on, because during the first few iterations the calculated figures converge towards their final values. You can see this with the Page Rank Calculator by checking the Data box and performing some calculations. After the first few iterations, the search results on www2 and www3 may still change, but only slightly.
During the dance, the results from www2 and www3 will sometimes show on the www server, but only briefly. Also, new results on www2 and www3 can disappear for short periods. At the end of the dance, the results on www will match those on www2 and www3. For the rest of the month, fluctuations sometimes occur in the search results, but they should not be confused with the actual dance; they are due to Google's fresh crawl and to what is known as "Everflux".

Hypertext-Matching Analysis
Unlike its conventional counterparts, Google is a hypertext-based search engine. This means that it analyzes all the content on each web page and factors in fonts, subdivisions, and the exact positions of all terms on the page. Not only that, Google also evaluates the content of neighboring web pages. This policy of not disregarding any subject matter pays off in the end and enables Google to return results that are closest to user queries. Google has a very simple 3-step procedure for handling a query submitted in its search box:
1. When the query is submitted and the enter key is pressed, the web server sends the query to the index servers. An index server is exactly what its name suggests: it consists of an index, much like the index of a book, which shows where the pages containing the queried term are located.
2. After this, the query proceeds to the doc servers, which actually retrieve the stored documents. Page descriptions or "snippets" are then generated to suitably describe each search result.
3. These results are then returned to the user, normally in less than a second!
Approximately once a month, Google updates its index by recalculating the Page Ranks of each of the web pages it has crawled. The period during the update is known as the Google dance.

Back Links Are Considered Popularity Votes
Quintessentially, Google calculates the importance of a page by the number of such 'votes' it receives. Not only that, Google also assesses the importance of the pages casting the votes. Consequently, pages that are themselves highly ranked, and important in that way, also help to make other pages important. One thing to note here is that Google's technology does not involve human intervention in any way; it uses the inherent intelligence of the internet and its resources to determine the ranking and importance of any page.

Page Rank Based On Popularity
The web search technology offered by Google is often the technology of choice of the world's leading portals and websites. It has also benefited advertisers with a unique advertising program that does not hamper the web surfing experience of users but still brings in revenue.
When you search for a particular keyword or phrase, most search engines return a list of pages ordered by the number of times the keyword or phrase appears on each website. Google's web search technology instead uses its indigenously designed Page Rank technology and hypertext-matching analysis, performing many instantaneous calculations without any human intervention. Google's architecture also scales as the internet expands.
Page Rank technology delivers an objective measurement of the significance of web pages, calculated by solving an equation of more than 500 million variables and 3 billion terms. Unlike some other search engines, Google does not simply count links, but utilizes the extensive link structure of the web as an organizational tool. When Page A links to Page B, that link is counted as a vote for Page B on behalf of Page A.
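The published form of the Page Rank equation, from Brin and Page's original paper, shows how this voting works (d is a damping factor, usually set to 0.85):

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

Here T1...Tn are the pages linking to page A, and C(T) is the number of outbound links on page T. Because every page's rank depends on the ranks of the pages linking to it, the equation has to be solved iteratively, which is why the calculations described in the Google Dance section are repeated until the figures converge.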

What On Earth Is An Algorithm?
Each search engine has something called an algorithm: the formula it uses to evaluate web pages and determine their relevance and value when crawling them for possible inclusion in its index. A crawler is the robot that browses all of these pages for the search engine.
GOOGLE Algorithm Is Key
Google has a comprehensive and highly developed technology, a straightforward interface and a wide-ranging array of search tools which enable users to easily access a variety of information online. Google users can browse the web and find information in various languages, retrieve maps, stock quotes and news, search for a long-lost friend using the phonebook listings available on Google for all US cities, and basically surf the three billion-odd web pages on the internet!
Google boasts the world's largest archive of Usenet messages, dating all the way back to 1981. Google's technology can be accessed from any conventional desktop PC as well as from various wireless platforms such as WAP and i-mode phones, handheld devices and other Internet-equipped gadgets.

Advanced SEO Techniques


Introduction

This ebook is a hard-hitting guide that gives you the information you need to make adjustments to your site right away, improve your search rankings and benefit from the increase in organic search traffic. Search Engine Optimization, or SEO, is simply the act of shaping the pages of your website so that they are easily accessible to search engine spiders and can be spidered and indexed. A spider is a robot that search engines use to check millions of web pages very quickly and sort them by relevance. A page is indexed when it has been spidered and deemed appropriate content to be placed in the search engines' results for people to click on.
The art and science of understanding how search engines identify pages that are relevant to a query made by a visitor, and of designing marketing strategies based on this, is called search engine optimization. Search engines offer the most cost-effective mechanism for acquiring "real" and "live" business leads. In most cases, search engine optimization delivers a better ROI than other forms of marketing such as online advertisements, e-mail marketing and newsletters, affiliate and pay-per-click advertising, and digital campaigns and promotions.