Algo Changes

RankBrain: An investigation of a Search AI

RankBrain. It sounds like some snake oil concoction an SEO thought up to help sell the local mom and pop shops on SEO. But it's not. It is actually one of the top ranking factors for Google. Its existence was announced in October of last year, although the patent was submitted in 2012. So, what exactly is "RankBrain" and what does it do?

RankBrain is officially considered an artificial intelligence system because it "naturally" evolves, growing in "intelligence" as time passes and it ingests search data. What search data, you say? Your search data. Online search is considered natural and organic when users perform searches without other input, as a way to find answers to their questions (queries). RankBrain's intelligence is focused on evaluating the way organic search is being used, semantic web activity, events, and much more. So there are no worries about this AI developing sentience and enslaving us all. RankBrain is an artificial intelligence program, not a 'true' AI in the traditional Terminator or I, Robot sense.

According to the actual patent, RankBrain focuses on contextual matching and substitution data to help improve search results. To the layman, this means Google's search results will "seem" more intelligent, surfacing organic (and some paid) results that may not align directly with the query. Now, that doesn't mean RankBrain replaces search results; it is one of the more than 200 ranking signals used by the search engine. Even more, it is one of the top three. The other two most important Google ranking factors? Content (duh) and links (double duh). Search Engine Watch pointed out that RankBrain fills a need that exists around live updates. Content is, has always been, and will likely always be king. Links provide signals of the importance, authority, and credibility of that content.
But what about live actions and search activity in connection to that content and those links? Hello, RankBrain! Fifteen percent of all queries completed each day are new. Furthermore, Search Engine Land outlined that Google processes 3 billion searches each day. Put those two figures together and 450 million entirely unique searches are created daily!

Forbes contributor and SEO Jayson DeMers offered a great outline of what RankBrain is NOT:

It is not the 'new' Knowledge Graph. (KG is an AI program too, but one focused on offering definitions and details on the query.)
It isn't an algorithm change or update.
RankBrain isn't a new type of search; it impacts what shows in the results.
RankBrain isn't a robot. It is really just mathematical equations working together.

I think of it like compound interest: as time goes on and calculations are completed, the completed calculations impact all future computations.

Will there be more announced algo changes? In the past, Google would internally test updates, develop the change, then push the change live. This is where we got some of our fun names like Penguin, Panda, and Hummingbird, to name the latest. Those days are over (or deprecated, for the techies in the crowd). Now, any changes to the algorithm will likely come in the form of live updates or "bolt-ons" to the current algorithm. That isn't to say Google won't update the core, as when they switched to "Caffeine" in 2010 and "Hummingbird" in 2013.

The Wrapper

You've probably done a search or two since RankBrain was released on Google. Any changes to the SERP aren't being completed by some robot locked in a closet. Design isn't RankBrain's forte; search is. Verve Developments offered some great examples of what RankBrain "physically" does to...
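The search-volume arithmetic cited above (Search Engine Land's 3 billion daily searches, 15% of them never seen before) is easy to sanity-check:

```python
# Sanity check of the figures quoted in the piece: roughly 3 billion
# searches per day, about 15% of which are brand new queries.
daily_searches = 3_000_000_000
new_query_share = 0.15

unique_new_daily = int(daily_searches * new_query_share)
print(unique_new_daily)  # 450000000 — the 450 million unique searches per day
```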

Read More

Penguin 3.0 Is Here!

The last Penguin update occurred in October of 2013. We saw some signals that a new update to Google's link-quality component of the algorithm was coming. Before most Google updates are implemented, there are changes in the SERPs, fluctuations in the number of search impressions, link changes in Google Webmaster Tools (GWT), and a few other signals. As of October 21st, Penguin 3.0 is being released worldwide. It should impact 1% of all US English queries, according to Pierre Far of Google UK. The rollout will last several weeks as it is implemented across all of Google's search platform.

Although this Penguin update is given a full version number, it is being referred to as a refresh; only it is a refresh on a global scale. Most SEOs believe this refresh is focused on determining the value of links in guest posts and blog networks to a further extent than the first iterations of the update. However, the focus was also to give sites doing well in the link-building category a slight boost in search visibility.

The Wrapper

In Penguin's last update, several websites were hit with penalties, and Google offered a link disavow tool to help webmasters tell Google about links the website owner does not endorse. Penguin's latest update shouldn't impact most businesses or websites that are not pursuing paid links, spam links, advertorials, link networks, or other links that violate Google's quality...

Read More

Panda 4.1

Content is built to help inform and educate users. To ensure the search results are as relevant as possible, the Google algorithm has several components that each focus on separate quality values. Panda is aimed at identifying low-quality content. The latest update to Panda, released on September 25 and rolled out over the span of 2-3 weeks, is designed to help SMBs and their sites. This update, like others, only affects certain queries: "Depending on the locale, around 3-5% of queries are affected," according to Pierre Far. Furthermore, Pierre said the update helps "Panda identify low-quality content more precisely. This results in a greater diversity of high-quality small- and medium-sized sites ranking higher, which is nice."

The previous version of Panda (4.0) was released in May of 2014. Websites with low-quality or heavily duplicated content were impacted by that release. Popular websites PRWeb and PRNewswire were hit, losing up to 70% of their search engine traffic according to Forbes. That original release (Panda 4.0) affected about 7.5% of all English queries by comparison.

The Wrapper

What can be taken from this latest Panda version and Pierre's comments is that the algorithm has been further refined to improve the way content is evaluated. Websites that offer little unique or informative content may find themselves losing visibility, while small businesses with good localized content may see a slight increase in search...

Read More

Penguin 3.0 to hit before the end of the Year

We have seen signals for some time now that Google was preparing to update the way its algorithm evaluates links. Tremors in search surrounding links come in the form of changes in GWMT's link reporting, changes to authorship, and concerns with directory and forum linking, along with the ever-controversial advertorial links. Penguin focuses on links; Panda focuses on low-quality content. Search Engine Land offers a great history of Penguin updates, and Barry Schwartz shares his thoughts on this news as well. According to their records, the order and dates for releases are:

Penguin 1 on April 24, 2012 (initial release, affected about 3.1% of search queries)
Penguin 2 on May 26, 2012 (less than 0.1% affected)
Penguin 3 on October 5, 2012 (less than 0.5% of queries were impacted)
Penguin 4 (Penguin 2.0) on May 22, 2013 (2.3% of queries affected)
Penguin 5 (Penguin 2.1) on Oct. 4, 2013 (1% of queries affected)

The next update is planned to be a complete overhaul of this backlink-evaluating part of the algorithm, with a focus on allowing Penguin to run the way Panda does now, which is monthly. When the algorithm runs, it refreshes the Google index to align better with the signals and values of web pages, meaning improvements become visible more quickly. Currently, it can take several weeks (6-10) for changes made to links to be fully realized.

The Wrapper

Although Google is not giving out many details or officially announcing every update, John Mueller of Google announced their anticipation of releasing a Penguin update within 2014, "in the reasonable future." This really means it will be easier to see recovery after a penalty and to see search improvements faster with white-hat and organic link...

Read More

5 Facts about PageRank in 2014 (and Beyond)

Why Links Matter/ed – PageRank

Introduction to PageRank

When you have the job of organizing the World Wide Web, you need some method of assigning value to each page the web contains. For Google and its creators, Larry Page and Sergey Brin, this value came to be known as PageRank. This value is a combination of several parameters associated with any and every page saved to their index, and it is meant to offer an 'absolute quality' score for any given page. It was understood in the early years (circa 2008-2010) that the top elements used in PageRank were links pointing to a page (+), the number of links on a page (-), and page-level qualities (e.g. technical coding, content, terms, etc.). As far back as 1998, when Page and Brin initially presented Google to Stanford University, the idea of their indexing system was to evaluate content on the web and present the findings to users for easy accessibility. During the process of evaluation, each page would be indexed, quantified, and stored based on over 200 factors. In 2014, what matters the most in that evaluation, not only to businesses but also to Google?

What Matters in 2014 for PageRank

Over the years we have all become a little jaded when we hear the term 'algorithm' in conversations about Google. "Hey, did you hear about the latest algo change?" "The local algo is really messing with my business." We may have forgotten the mathematics involved in Google's process of ranking and valuing; there is a lot of science behind the Google algorithm(s). Even with all the updates, the fact that search platforms use mathematical equations has stayed the same. The updates the Google team makes are centered on the quality of the indexed results, incorporation of user feedback (this is where 'organic' comes from), and the inclusion of their business plan (sadly). Money is now a major motivator of the search algorithm and the changes being made.
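To make the "mathematics involved" concrete, here is a minimal sketch of the original PageRank iteration as Page and Brin described it in 1998: a page's rank is the damped sum of the ranks of the pages linking to it, each divided by that page's outlink count. The graph, damping factor, and iteration count below are illustrative assumptions, not Google's production values, and real PageRank is only one of the 200+ factors mentioned above.

```python
# A minimal sketch of the classic PageRank power iteration.
# The four-page "web", damping=0.85, and 50 iterations are illustrative
# assumptions, not Google's actual parameters.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform distribution
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each outlink passes an equal share of this page's rank
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Tiny four-page site: "home" is linked to by every other page,
# so it should end up with the highest rank.
web = {
    "home":    ["about"],
    "about":   ["home"],
    "blog":    ["home", "about"],
    "contact": ["home"],
}
ranks = pagerank(web)
```

Notice that a page's rank depends on the rank of its linkers, not just their count, which is exactly why link quality matters more than link volume in the "Five Facts" below.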
So, in 2014, what factors are within our power to influence search and position our web pages in prime positions?

Five Facts of PageRank

Links – As with the first iterations of the Google algorithm, links are a primary factor in the mathematics used to determine the value of a web page. The algorithm has evolved past just counting links and focuses more on their quality. The links pointing at a web page still pass value, but that value is evaluated for relevancy, along with some concern for 'why' the link exists. Why is this page linking to that page (and vice versa)? Why would a user click on the link? Ensuring the pages linking to your web page(s) are appropriate is vitally important.

Content – Building a great site is one thing, and having tons of great links is another, but all of that only makes sense when the content of the site is worthy of existing. Content is king, but only when the kingdom is worthy of subjects! The best kind of links come from users who enjoy your content, whether it is a product, a service, or something educational, informational, or just entertaining. The content is the meat of the experience; without it, the page is just an empty plate.

Context – The theme here is that links bring visitors, and those visitors are looking for content relevant to them and their interests. Context gives them a reason to read, digest the content, and finally act. Pages on the Internet exist to have users take action in some form. By definition, context helps them understand why the page...

Read More

Local Search Update, Codename: Pigeon

Google Gets Pigeon-Focused In Local Search

Yeah, we know, another goofy animal name for something that really impacts the way we live our lives and conduct business every day; but it's better than Codename: Fluffy Bunny. So, what do we know about this update after a few days with it? According to Search Engine Land, the focus of the update is to "provide a more useful, relevant and accurate local search results that are tied more closely to traditional web search ranking signals." This resulted in a few noticeable alterations to the organic and Map results (and some less-than-noticeable changes).

User searches pull in more local results for both organic and Maps. Map searches show local results with the cumulative review score and number of reviews first, then the business category, a shortened description, and the address. (It's interesting to note that clicking on the Red Robin listing shows that the Map result is tagged with a UTM code.) Organic searches see a different mix of local results in the local banner; some contain reviews, owner-submitted pictures, and details, and others do not. Along with this, the option to sort by price, rating, and hours is available. Adding local qualifiers alters the results significantly. (Below, Burgers vs. Nashville Restaurants.) Behind the scenes, the algorithm update improves the distance and location evaluation for searches, according to Google.

The Wrapper

As with any algorithm update, the SERPs will be in flux for the next few weeks as the system re-indexes and re-evaluates businesses against the new parameters. This may result in duplicates or multiple listings for the same business in the local pack and organic results, or listings disappearing for a period only to return later....

Read More