Articles in This Category
Search Engine Optimization (SEO) is the process of affecting the visibility of a website or a web page in a search engine's "natural" or un-paid ("organic") search results. In general, the earlier (or higher ranked on the search results page), and more frequently a site appears in the search results list, the more visitors it will receive from the search engine's users.
A Uniform Resource Locator (URL), colloquially termed a web address, is a reference to a web resource that specifies its location on a computer network and a mechanism for retrieving it. A URL is a specific type of Uniform Resource Identifier (URI), although many people use the two terms interchangeably.
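The components a URL bundles together (retrieval mechanism, network location, path to the resource) can be seen by decomposing one. A minimal sketch using Python's standard `urllib.parse` module, with a made-up example URL:

```python
# Decompose a URL into the parts that identify the retrieval mechanism
# and the resource's location on the network. The URL is a placeholder.
from urllib.parse import urlparse

url = "https://www.example.com:8080/docs/page.html?lang=en#section2"
parts = urlparse(url)

print(parts.scheme)    # "https"            -- the retrieval mechanism
print(parts.hostname)  # "www.example.com"  -- location on the network
print(parts.port)      # 8080
print(parts.path)      # "/docs/page.html"  -- path to the resource
print(parts.query)     # "lang=en"
print(parts.fragment)  # "section2"
```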
A Site Map or Sitemap is a file that lists the pages of a website to inform search engines about how the site's content is organized, together with some additional information, such as how often each page's content changes, when it was last updated, and how important it is relative to the other pages of the site.
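A minimal sitemap in the XML format defined by the sitemaps.org protocol might look like this (the URL and values are placeholders); `lastmod`, `changefreq`, and `priority` carry exactly the extra information described above:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```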
KWIC is an acronym for Key Word In Context, the most common format for concordance lines. The term KWIC was first coined by Hans Peter Luhn. The system was based on a concept called keyword in titles which was first proposed for Manchester libraries in 1864 by Andrea Crestadoro.
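A minimal sketch of producing KWIC concordance lines: each occurrence of a keyword is shown centered in a fixed-width window of surrounding text. The function and variable names here are our own, for illustration only:

```python
# Build KWIC (Key Word In Context) concordance lines: every occurrence
# of a keyword, centered, with a window of context on each side.
def kwic(text, keyword, width=20):
    """Return a list of (left context, keyword, right context) tuples."""
    lines = []
    words = text.split()
    for i, word in enumerate(words):
        if word.lower().strip(".,") == keyword.lower():
            left = " ".join(words[:i])[-width:]
            right = " ".join(words[i + 1:])[:width]
            lines.append((left, word, right))
    return lines

text = "The index lists every title. Each title appears with its context."
for left, kw, right in kwic(text, "title"):
    print(f"{left:>20} | {kw} | {right}")
```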
Spamdexing is one of several methods of manipulating the relevance or prominence of search engine indexed resources, usually in a way inconsistent with the purpose of the indexing system. It could be considered to be a part of search engine optimization, though there are many search engine optimization methods that improve the quality and appearance of the content of web sites and serve content useful to many users.
Page Hijacking involves compromising legitimate web pages in order to redirect users to a malicious web site or an exploit kit, for example via XSS. An attacker may use a tool such as sqlmap to search for SQL injection vulnerabilities in the database and insert an exploit kit such as MPack in order to compromise legitimate users who visit the now-compromised web server.
Doorway Pages are web pages placed (in some cases hidden) in trusted or well-ranked domains, whose function is to act as a gateway to ad pages. Where doorway pages actually lead to sites with useful content, search engines rate them well; but if they only lead to pages designed to make money from ads, for example as part of a Black Hat SEO strategy, search engines will not give them relevance and may even remove them from the index.
To understand what Cloaking is, you first need to know how search engines recognize and index websites. They use web spiders (robots that find and index sites) to carry out this task. For example, Google uses one called Googlebot, which has two different versions: deepbot and freshbot. Deepbot is in charge of visiting all the sites by following the links they contain, while freshbot looks for new content, visiting well-known sites whose content changes frequently.
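Cloaking exploits exactly this crawling process: the server inspects who is asking and returns one page to spiders and another to human visitors. A minimal sketch of that server-side decision (shown only to illustrate the technique search engines penalize; the names and page contents are our own):

```python
# Illustrative sketch of cloaking logic: the server checks the User-Agent
# header and serves a keyword-stuffed page to known crawlers while serving
# different content to ordinary browsers. The signatures and pages below
# are placeholders for illustration.
CRAWLER_SIGNATURES = ("googlebot", "bingbot", "duckduckbot")

def serve_page(user_agent: str) -> str:
    ua = user_agent.lower()
    if any(sig in ua for sig in CRAWLER_SIGNATURES):
        # Page shown only to search engine spiders.
        return "<html>keyword-rich content for the index</html>"
    # Different page shown to human visitors.
    return "<html>ads and unrelated content</html>"

print(serve_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(serve_page("Mozilla/5.0 (Windows NT 10.0) Chrome/120"))
```

Because the crawler and the visitor never see the same document, search engines treat this mismatch as deceptive and penalize or de-index sites that do it.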