Defining “Organic SEO”
Organic SEO is optimizing your website so it qualifies as the most relevant result for the search(es) you intend it to show up for.
~Brandon Na, 2012
Yes, this definition was created by the founder of Seattle Organic SEO, but it encompasses most, if not all, of the definitions shared by the world’s top search engine optimization experts. How so? Let’s break it down.
First, for the purposes of SEO marketers, “organic” means “developing naturally.” That’s why many marketing professionals refer to the “organic results” that show up in search engines as the “natural results.” SEO (search engine optimization), as defined by this Wikipedia article, is:
“…the process of improving the visibility of a website or a web page in search engines via the “natural,” or un-paid (“organic” or “algorithmic”), search results.”
While this definition is circular (i.e., “SEO” equals “organic SEO” and “organic SEO” equals SEO), it explains why the Wikipedia article about “SEO” shows up in position #4 for “organic seo” searches (as of 3/16/2012), above the more precise article about “Organic Search.” However, if relevance were defined simply as the most qualified “reference” website for each search conducted in the search engines, Wikipedia or other reference results would ALWAYS show up above other websites. Instead, Google and other organic search engines order the results organically, with the most relevant result showing up at the top.
Despite SEO’s intrinsic relation to the “natural” results (versus the paid results), it is often called “Organic SEO” to emphasize its distinct difference from the artificial results that can be manipulated by the highest bidder. Also, in an ideal world, the organic results would match the intent of each person entering a search into the many organic search engines.
While there is no “perfectly organic” search engine in the world, Google is considered the most “organic” by most observers and holds the largest market share in the world among people who search on the Internet. And I would argue it might hold that market share because, at least for now, it delivers what people “want” when they search, i.e., “relevant” results. So how did Google do this?
Well, two Stanford Computer Science graduate students, Sergey Brin and Larry Page, wrote “The Anatomy of a Large-Scale Hypertextual Web Search Engine,” and from there, the landscape for search changed dramatically. Their #1 goal was “Improved Search Quality.” In the same paragraph, they explained that quality would be improved by returning “more relevant” results. They went on to clarify that the anchor text of links pointing to a website is a strong indicator that the website is “relevant” (the sketch just below illustrates the idea).
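To make the anchor-text idea concrete, here is a minimal Python sketch, assuming a toy list of crawled links; the page names and data are invented for illustration and are not Google’s actual implementation:

    # Minimal sketch of the anchor-text idea from the Brin/Page paper:
    # the words inside a link describe the page the link points TO,
    # not the page the link appears on. All names below are hypothetical.
    from collections import defaultdict

    # (source_page, target_page, anchor_text) triples found while crawling
    links = [
        ("blog.example.com", "plumber.example.com", "best Seattle plumber"),
        ("news.example.com", "plumber.example.com", "Seattle plumber reviews"),
        ("forum.example.com", "bakery.example.com", "downtown bakery"),
    ]

    # Index each anchor word under the page the link points to
    anchor_index = defaultdict(set)
    for source, target, anchor in links:
        for word in anchor.lower().split():
            anchor_index[word].add(target)

    # A search for "plumber" now surfaces the plumbing site, even if that
    # word never appears in the site's own copy
    print(anchor_index["plumber"])  # {'plumber.example.com'}
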
Many additions have been made to the algorithm since then, including credit for “fresh content,” more “local results” for searches that appear to have “local intent,” and penalties for websites with too much advertising. However, Na has argued that the underlying principles behind the algorithm remain the same today. After sharing it orally for years, Na published the “2 C’s of SEO” in 2011 in an entrepreneur network he was briefly involved in.
2 C’s of SEO
Na argues that “relevance” is constantly determined by the “2 big C’s” of SEO: the Content and Credibility of a website. He says Google makes its constant tweaks to its algorithm to fit within these two primary criteria. In 2012 (and before), a website’s “content” is identified by the search engines through spidering, or scanning, important tags on the website like the <title> tag or the <h1> and <h2> tags (a rough sketch of this follows below). While the actual copy on the pages is important, it is not necessarily paramount: duplicate content abounds on the web, and for now, if one site’s “on-page SEO” is more focused on a particular search phrase than another website or page with similar, if not identical, content, the on-page elements can signal to the search engines what the “intended content” is.
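Here is a rough sketch of how a spider might pull those “intended content” signals out of a page, using only Python’s standard library; the sample HTML and the class name are made up for illustration:

    # Pull the "intended content" signals (title, h1, h2) out of a page.
    from html.parser import HTMLParser

    class TagExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.current = None              # tag we are currently inside
            self.signals = {"title": [], "h1": [], "h2": []}

        def handle_starttag(self, tag, attrs):
            if tag in self.signals:
                self.current = tag

        def handle_endtag(self, tag):
            if tag == self.current:
                self.current = None

        def handle_data(self, data):
            if self.current and data.strip():
                self.signals[self.current].append(data.strip())

    page = ("<html><head><title>Seattle Organic SEO</title></head>"
            "<body><h1>Organic SEO Services</h1><h2>The 2 C's</h2></body></html>")

    parser = TagExtractor()
    parser.feed(page)
    print(parser.signals)
    # {'title': ['Seattle Organic SEO'], 'h1': ['Organic SEO Services'], 'h2': ["The 2 C's"]}
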
The “credibility” is made up of a number of factors here in 2012, but the two primary ones we believe determine a site’s “authority” are backlinks and social signals. Backlinks, or links from other websites to the website in question, are the foundation of the original organic SEO algorithm. As Brin and Page wrote:
“There is quite a bit of recent optimism that the use of more hypertextual information can help improve search and other applications [Marchiori 97] [Spertus 97] [Weiss 96] [Kleinberg 98]. In particular, link structure [Page 98] and link text provide a lot of information for making relevance judgments and quality filtering. Google makes use of both link structure and anchor text (see Sections 2.1 and 2.2).”
One of the main reasons there are so many “link building companies” is that links are still the foundation of the algorithm. That said, with the livelihood of these companies at stake, search engines are trying their best to find better ways to determine credibility.
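To make the link-structure idea concrete, here is a minimal PageRank-style power iteration over a toy link graph, in the spirit of the Brin/Page paper (using the common normalized variant of their formula); the damping factor 0.85 is the value the paper suggests, while the graph and page names are invented for illustration:

    # PageRank sketch: a page's rank is (1-d)/N plus d times the rank
    # flowing in from every page that links to it.
    damping = 0.85
    graph = {                       # page -> pages it links to
        "a": ["b", "c"],
        "b": ["c"],
        "c": ["a"],
    }
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}  # uniform starting rank

    for _ in range(50):             # iterate until the ranks settle
        new_rank = {}
        for p in pages:
            # rank flowing in from every page q that links to p,
            # split evenly among q's outgoing links
            incoming = sum(rank[q] / len(graph[q])
                           for q in pages if p in graph[q])
            new_rank[p] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank

    # "c" ranks highest: it is linked to by both "a" and "b"
    print(sorted(rank.items(), key=lambda kv: -kv[1]))
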
As social media becomes a bigger part of the activity on the web, it gains credibility itself as a signal, and the search engines try to read these cues as “credibility cues.” A website’s Facebook likes or its +1’s on Google’s social network can enhance its online credibility: a site that is demonstrably “popular” is more likely to be deemed “credible.” Here’s a list by Search Engine Watch which identifies the Top 13 Social Signals that can affect your SEO(’s credibility).
Google is THE Organic Search Engine
Despite all the growth Bing has had in the past year or two (primarily from its Facebook investment and relationship), Google remains the dominant “Organic Search Engine” today. And even with all the hoopla about Baidu and other Asian search engines growing larger because of the massive population in Asia, Google still dominates “organically” because in Asia there is a stronger bias toward the “paid model” of search.