Algorithm

Algorithms are the sets of rules according to which search engines rank web pages. Figuring out these algorithms is a major part of SEO: if you understand how a search engine calculates relevance, you can make specific pages on your site highly relevant for specific search terms.
Hidden text

Text on a web page designed to be visible to spiders but not to human visitors. The aim is to load the page with keywords without detracting from the visitor’s experience. Of the various techniques for hiding text, the most common is to set the text color to exactly, or nearly, the background color. Most search engines can now detect hidden text and consider it a form of spamdexing; pages that contain hidden text are penalized or even de-listed.
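The color-matching technique can be sketched as a toy detector. The helper names and the exact-match comparison below are illustrative assumptions, not how any real search engine’s filter works (real filters also consider rendered styles, contrast, element positioning, and more):

```python
# Toy sketch: flag text whose inline color exactly matches the page
# background -- the most common hidden-text technique described above.
# The function names and the simplistic comparison are illustrative
# assumptions, not an actual search-engine filter.
import re

def normalize_hex(color):
    """Expand shorthand hex colors like #fff to #ffffff (lower-case)."""
    color = color.strip().lower()
    if re.fullmatch(r"#[0-9a-f]{3}", color):
        color = "#" + "".join(c * 2 for c in color[1:])
    return color

def is_hidden_text(text_color, background_color):
    """True when the text color exactly matches the background color."""
    return normalize_hex(text_color) == normalize_hex(background_color)

print(is_hidden_text("#FFF", "#ffffff"))   # True: shorthand white on white
print(is_hidden_text("#000000", "#ffffff"))  # False: black on white
```

A real detector would also have to catch “nearly” matching colors, e.g. by thresholding the distance between the two RGB values.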
Index / indexed
Refers to the searchable database of documents stored by a search engine – often simply called the search engine’s database. Used as a verb, it describes the process of converting a collection of documents into a searchable database. The term is sometimes also used to refer to directories such as the ODP.
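The verb sense – turning a collection of documents into a searchable database – can be sketched as a minimal inverted index. Real search-engine indexes are far more elaborate; this is only an illustration of the idea:

```python
# Minimal sketch of indexing: build an inverted index mapping each word
# to the set of documents that contain it. Real indexes also store
# positions, weights, etc.; this only illustrates the verb "to index".
from collections import defaultdict

def build_index(documents):
    """documents: dict of doc_id -> text. Returns word -> set of doc_ids."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

docs = {
    "a.html": "search engines rank web pages",
    "b.html": "spiders fetch web pages for the index",
}
index = build_index(docs)
print(sorted(index["pages"]))  # ['a.html', 'b.html']
print(sorted(index["index"]))  # ['b.html']
```

Answering a query then reduces to looking up the query words in the index rather than scanning every document.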
Keyword

A word used in a query. In SEO, pages are typically optimised for specific keywords, targeted on the basis of what users looking for particular information or a particular product are most likely to type as part of a query. Accurate keyword targeting is widely considered essential to effective SEO.
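Two crude measures often mentioned in this context are keyword frequency (how often the keyword appears on a page) and keyword density (its share of all words). The function below is an illustrative sketch, not a formula any particular engine is known to use:

```python
# Toy sketch: keyword frequency and keyword density for a page.
# The whitespace tokenisation and the density formula are illustrative
# assumptions, not any search engine's actual scoring.
def keyword_stats(text, keyword):
    words = text.lower().split()
    count = words.count(keyword.lower())
    density = count / len(words) if words else 0.0
    return count, density

count, density = keyword_stats("cheap flights to paris cheap flights", "flights")
print(count)              # 2
print(round(density, 2))  # 0.33 (2 of 6 words)
```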
Meta tag

An HTML tag placed in the head section of a web page. It provides additional information that is not displayed on the page itself; meta tags help search engines index the page correctly by supplying an accurate description of the page content and a list of keywords associated with the page.
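A spider reading these tags might extract them roughly as follows, here using Python’s standard `html.parser` module (the sample page and class name are illustrative):

```python
# Sketch: extract the description and keywords meta tags from a page's
# head section, roughly as a spider might when indexing the page.
from html.parser import HTMLParser

class MetaTagParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            name = attrs.get("name", "").lower()
            if name in ("description", "keywords"):
                self.meta[name] = attrs.get("content", "")

page = """<html><head>
<meta name="description" content="A glossary of SEO terms.">
<meta name="keywords" content="seo, search engine, spider">
</head><body></body></html>"""

parser = MetaTagParser()
parser.feed(page)
print(parser.meta["description"])  # A glossary of SEO terms.
print(parser.meta["keywords"])     # seo, search engine, spider
```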
Mirror site

Refers to sites that offer authorized duplicates of content also found on other sites. In the context of SEO, the term is mostly used for sites that attempt to deceive search engines into indexing more than one instance of a site by duplicating it on another server and domain. Most search engines now have filters in place to detect mirror sites, and many penalize them by de-listing both the original site and the mirror.
Optimisation

A page is said to be optimised when it has been structured in such a way that it ranks well (on the SERPs) for the keywords it targets. It is a fairly subjective concept: in the strictest sense, optimisation simply means making a page spider-friendly, for example by using text links rather than image links. In the SEO industry, the term is more often used as a collective name for all the "tricks" webmasters use to improve a page’s ranking.
Search engine

A system, such as Google or AllTheWeb, that enables users to search for documents; the term is often used specifically for systems like AltaVista and Excite that search documents on the World Wide Web and USENET newsgroups. Typically, a search engine works by sending out a spider to fetch as many documents as possible. Its components include:

- Search software
- Web interface
Search Engine Results Page(s). The term refers to the Web page of search results a search engine returns in response to a query.
Spam

A collective name for marketing techniques that are intrusive, offensive and/or unethical in some way: electronic junk mail or junk newsgroup postings, or, in the more general definition some people use, any unsolicited e-mail. In the search engine world, regular mass submission of web pages to search engines is also referred to as spam, or spamdexing.
Spider

A program that automatically fetches Web pages, also known as a webcrawler. "Spidering" is the process by which a search engine sends out these crawlers (think of spiders in a web) to travel the Internet, collecting data from websites to return and store in the search engine’s database.
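The follow-every-link behaviour can be sketched as a breadth-first crawl. To keep the example self-contained, the "web" here is an in-memory dict of pages; a real spider would issue HTTP requests, handle errors, and respect robots.txt:

```python
# Sketch of spidering: start from one page and follow links until every
# reachable page has been fetched. WEB is a stand-in for the real Web so
# the example runs offline; the naive href regex is also an assumption.
import re

WEB = {
    "/index.html": '<a href="/about.html">about</a> <a href="/faq.html">faq</a>',
    "/about.html": '<a href="/index.html">home</a>',
    "/faq.html": "",
}

def extract_links(html):
    """Naive link extraction: pull every href attribute value."""
    return re.findall(r'href="([^"]+)"', html)

def spider(start):
    """Breadth-first crawl; returns the set of pages fetched."""
    seen, queue = set(), [start]
    while queue:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)  # a real spider would store the page in the index here
        for link in extract_links(WEB.get(url, "")):
            if link not in seen:
                queue.append(link)
    return seen

print(sorted(spider("/index.html")))  # ['/about.html', '/faq.html', '/index.html']
```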
Title

The title of a page is displayed in the title bar at the top of the browser window. Almost all search engines consider the title when determining a document’s relevance to a query, and most consider it the most important element.
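The extra weight given to the title can be illustrated with a toy relevance score; the weight of 3 and the word-counting scheme are arbitrary illustrative choices, not any engine’s actual formula:

```python
# Toy sketch: a relevance score that weights a query match in the title
# more heavily than a match in the body, as described above. The weight
# of 3 is an arbitrary illustrative assumption.
def relevance(title, body, query):
    q = query.lower()
    score = 3 * title.lower().split().count(q)  # title matches count triple
    score += body.lower().split().count(q)      # body matches count once
    return score

print(relevance("Paris travel guide",
                "flights to paris hotels in paris",
                "paris"))  # 5: one title match (3) + two body matches (2)
```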