
Does Google+ Actually Impact Upon Search Results?



September 20th, 2013

The relationship between Google+ and high ranking search results is a pretty hot topic at present. Matt Cutts, Google’s head of search spam, has actively denied that Google directly uses +1s as an input for its algorithm. What can’t be denied is that webmasters using Google+ have much higher ranking websites than those that do not.

Google has gone to great lengths to assure users that this is a case of correlation, not causation. They have also drawn parallels between the situation and that of Facebook ‘likes’, with Matt Cutts claiming:

If you make good content, people will link to it, like it, share it on Facebook, +1 it, etc. But that doesn’t mean that Google is using those signals in our ranking. Rather than chasing +1s of content, your time is much better spent making great content.

Yet, unsurprisingly, Google still isn’t racing to enlighten anyone on the precise details of their ranking algorithm that ‘doesn’t’ consider +1s as a factor. Naturally, there have been a number of studies undertaken within the SEO community to settle this dispute and determine how Google+ interacts with PageRank. What is surprising is how controversial the results of these studies have been.

In SEO, correlation is often mistaken for causation. As the precise details of Google’s search algorithm are unknown, optimisation has often incorporated educated guesswork in efforts to improve a website’s ranking. A good example is the many studies published on search engine ranking factors, which outline the common elements of highly ranking pages. While many high ranking web pages share similarities such as anchor text word count and URL length, not all high ranking web pages do. This is because these shared characteristics are a result of correlation, not causation.

In regard to Google+, the current subject of debate is whether +1s cause a website to rank highly, or whether high ranking websites have +1s for other reasons. As mentioned previously, Matt Cutts has claimed it is the latter, and that this is simply a case of people misinterpreting the relationship between +1s and PageRank. But, because we don’t have access to Google’s algorithm, it is impossible for SEO practitioners to know all the ‘other reasons’ high ranking websites may have +1s. Until this information becomes available, people are likely to continue correlation/causation debates, because without it, Google’s claims must be taken at face value- and the search engine is a component of their business, after all.

Good old, trusty Moz have stepped up to the plate. Writing for Moz, Cyrus Shepard proposed that “the relationship between +1s and higher rankings goes beyond correlation into the territory of actual causation.” Now this is a pretty big call, especially considering that it directly contradicts Google’s statement on the matter. Why does Shepard consider Google+ to have this power? He outlines three main reasons:

  1. Google+ posts are crawled and indexed almost immediately. Google+ could be used by Google as a means of identifying the appearance of new content because of its complete accessibility. This is unlike websites such as Facebook and Twitter that have privacy settings which reduce GoogleBot’s ability to crawl them.
  2. Google+ posts pass link equity. While pages and posts on Google+ accumulate PageRank, they are also followed, allowing them to pass on their link equity.
  3. Google+ is optimised for semantic relevance. Google+ posts each have a unique URL and can provide extensive, in depth information. Sharing a post also allows the accumulation of internal links whilst retaining relevant anchor text. This allows each individual post to rank on a SERP and could also send relevancy signals to an associated URL.

Eric Enge, writing for Stone Temple Consulting, conducted an incredibly detailed study to determine the relationship between Google+ and search results. His interpretation of the findings supports Matt Cutts’ assertion that the tendency of Google+ pages to receive high rankings is a case of correlation. While he agrees with Moz that Google+ leads to quick crawling of content, he does not believe this same speed is applied to indexing. In creating test pages, providing them with +1s and monitoring the outcome, Enge found no “material evidence of Google Plus Shares driving rankings movement.”

In his analysis, Enge finds that Google+ still has value for SEO. He actually agrees with Moz and Shepard that the correct use of Google+ can be a fantastic strategic tool for optimisation practices. Enge finds that use of Google+ encourages both the discovery and indexing of content.

After the first Google+ share of Enge’s test web pages, there is an almost immediate crawl of the content by GoogleBot. GoogleBot also completes additional crawls of the website every time that it receives another share. In fact, Enge highlights a quote from Google’s developers page, which states:

“By using a Google+ button, Publishers give Google permission to utilize an automated software program, often called a “web crawler,” to retrieve and analyze websites associated with a Google+ button.”

Now if this isn’t a case of Google admitting they give additional crawler attention to websites using Google+, I don’t know what is. This would imply that the use of Google+ drives website discovery by the search engine. As a result, websites using Google+ are likely to be found faster than those that don’t.

Slightly more hesitantly, Enge goes on to propose that use of Google+ probably drives search engine indexing. Each of his test pages appeared on Google’s SERP for their targeted search terms exactly ten days after their creation. While Enge admits it is impossible to be certain that the test pages received no links, he believes they did not. He proposes that this may mean Google+ pages are indexed according to a schedule, and indexing may occur without external linking. Yet the time taken for this to occur implies that Google+ is not used to determine the freshness of associated content.

Clearly the influence of Google+ upon PageRank is open to debate. Personally, I found Enge’s study particularly convincing. Furthermore, practicality begs SEO practitioners to conform to the desires of Google. This would suggest optimisation is best performed under the presumption that Google+ does not have an impact on PageRank. If it did, chasing +1s would undoubtedly become a black hat practice, with abusers penalised as a result.

What is known is that Google uses high quality websites as a template to identify what other high quality websites look like. Many of these high quality websites are using Google+, and there aren’t really any risks webmasters take as a result of its use. Google+ provides an opportunity to build authoritative networks, gain natural links, position yourself as a publisher and share content across a variety of platforms. While its use doesn’t necessarily give your website an immediate edge, when used appropriately, Google+ is certainly of great value to SEO.

All You Need to Know About Blogs…



September 13th, 2013

Blogs are a vital part of a strategic internet presence. They should connect with an audience by posting current content to be shared and engaged with by others. Unfortunately, many blogs are used as a means of self advertisement. The value of a blog is the signals it sends to search engines through associated shares, links and comments. For businesses using their blog to advertise, this tends to be a waste of time and effort, as people rarely share others’ promotional material- let alone link to it.

Starting a blog offers a number of opportunities to webmasters, including:

  • The chance to identify your brand as a ‘thought-leader.’ Blogs can increase the trust a search engine places in the author of the blog as an information source.
  • An avenue through which to drive traffic to specific pages of a website.
  • A means of creating a trusted information source that is not tied to your website.
  • An avenue through which people can question and interact with your main website.

When establishing a blog, there are four platforms from which to choose:

  1. Subdirectory
  2. Subdomain
  3. Separate domain
  4. A blog hosting website

How do you choose which platform is most appropriate for your blog? This depends on the contribution you want it to make to your web presence. Consider these advantages/disadvantages of each platform when making your choice.

Subdirectory

+

Using a subdirectory for your blog is a great way of adding current, updated content to the domain of your main website. This ties the links and social signals received by the blog directly to the main presence you are trying to promote.

-

The main disadvantage of using a subdirectory is that it doesn’t send links from another domain. Using another platform means that there is an additional domain linking to your website, which increases the diversity of your linking profile.

Subdomain

+

A subdomain can be a means of getting greater brand presence on the search engine results page. Your primary domain can be offered as the first result, and the subdomain of your blog as an additional result for the same search. This also improves the quality of your linking profile as links to the primary domain from the blog are interpreted as links from a separate website. This subdomain can also gain a high ranking more easily due to its association with the primary domain. Furthermore, they can be hosted anywhere.

-

Subdomains just don’t add as much value to the root website as the use of a subdirectory does. The work you put into new and current content doesn’t attribute this freshness to your main website; it gives it to the subdomain. The shares and links to great content posted in this way do not add much value to the primary domain.

 

Separate Domain

+

Really, the only advantage of this strategy is that it may offer an opportunity to construct an apparently unbiased information source. If the search engine values this source and interprets its information as unbiased, the links it provides to the primary domain can be considered exceptionally valuable.

-

In reality, this is very difficult to achieve. Becoming a credible information source takes a lot of time and effort, and even then, should it be discovered at any point to be a link scheme, the penalties received render all the hard work pointless- it will also reflect very badly on your primary domain. If you are putting in the effort required to establish yourself as a valued information source, it is probably smarter to do so for your primary website, so it can directly reap the benefits.

Blog Hosting Website

+

This is the easiest way of starting a blog. You can also experiment with link building tactics without posing great risk to your primary domain. Using sites such as WordPress or Blogger can send links to your website from another domain, which increases the breadth of your linking profile.

-

Naturally, these sites limit the flexibility you have in terms of template and design. Links provided in the blog also won’t contribute directly to your primary domain, and plugins are not supported. While this can be a great way of familiarising yourself with blogging, for a corporate web page with a clear plan, this is likely to add minimal value and can appear unprofessional.

 

What is the value of blogging in terms of SEO?

 

A blog is valued by a search engine when it is seen to relate to the search term, has been recently/frequently updated, comes from a trusted information source and actively engages with its audience. These are the signals a good blog can send to a search engine. The effort put into creating and maintaining a blog says good things about its associated domain, and the links included within the blog itself. Blogs also provide an opportunity to diversify the search terms for your primary domain, should these new terms be included in the content of the blog.

Blogging is also at the core of a website’s content strategy. Content strategy is one of the most crucial aspects of SEO, as this determines what you are actually giving to your audience. In the past, SEO has emphasised techniques of directing users to a page through technical strategies such as link building and keyword stuffing. Many of these approaches are now considered “blackhat SEO” and are heavily penalised by Google. Unlike good content, these technical strategies tend to focus on search engines- not their human audience.

New content can take a number of forms, from photos to press releases and information documents to video. Generally speaking, a blog is the most frequently updated content of a website. The content of a good blog should be the foundation of your site’s content strategy: the way in which content is planned and produced to maximise the potential engagement of your website’s current audience. An excellent content strategy will take this even further: its content will engage an existing user base whilst considering ways in which this base can be grown.

How is this done?

The answer to this question will be different for each individual webmaster. It involves considering what users want from your website, and determining if they are getting it. If they are getting it, is there a better way you can give it to them? If they want something else, how can you incorporate this into your current content strategy?

Imagine that you run a website offering pet care advice and the page on which users are most active is a forum where users share pictures and stories of their cats. Your content strategy may have been focused on improving your audience’s understanding of pet health care, but clearly this is not their key interest. In this instance, I would recommend altering blog content to include more pictures/stories about cats. Paying attention to how users react to this new content is also important- it is entirely possible they want the blog’s content to remain similar. However, attempting to adapt your content to the wants and needs of your audience is a key aspect of SEO. If your audience responds negatively to change, change it back. Content strategy is about the relationship between your users and the content you provide them. Any response to a change you make can be used to clarify the desires of your audience.

Content strategy is also about attracting new users. Considering issues closely but not specifically related to your website can be a great way of attracting new traffic. If you can find an information niche associated with your website, blogs can be used to position yourself (and your associated website) as an information source. Blogs on topics that relate to your primary content, but are not necessarily the interest of your user base can be a means of directing new users to your website. For the pet care website example, consider creating a blog on organic food for pets.

The takeaway here is that content strategies must be flexible, and should not remain stagnant. Continually attempting to satisfy your existing user base whilst concurrently looking to expand not only generates a dedicated audience for your website, but signals to Google that you are a high quality information source that responds to its audience. You’re not the boss here- Google wants you to give your audience what they want.

Blogs are also valuable as link bait. Each blog can be shared by readers through social media, and the blog author can also share their post this same way. Examining your most linked blogs is another way of determining your most popular content, which can lead you to change content strategy accordingly.

Blogs also increase the number of pages of your website to be indexed by a search engine. The greater the number of pages your website has, the greater the number of keywords you can optimise for. Even if this content is poorly optimised, as long as it is of a high quality, it increases your chance of appearing on the SERP for obscure and long tail keywords that you might not have considered optimising for. As we approach an age of semantic search, keyword optimisation becomes increasingly less important.

But traditional SEO isn’t dead yet. Keywords are still an important tool allowing search engines to index a web page. So how do you perform SEO in relation to your blog? Above all tailor your content to your audience. I’m sure I have mentioned this at least five times in this article, but it is in this approach that the power of blogs resides. In addition to this, optimisation similar to that required for a traditional web page should be undertaken. Steps include:

  • Incorporating keywords in the URL
  • Optimising content correctly
  • Having an eye catching title that includes the targeted keywords
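
To illustrate the first and third of these steps, here is a minimal sketch in Python of turning a keyword-rich post title into a clean URL slug. The title and the slug rules are purely hypothetical- every CMS applies its own conventions.

    import re

    def slugify(title, max_words=8):
        """Turn a post title into a short, keyword-bearing URL slug."""
        # Lowercase, strip punctuation, then join the remaining words with hyphens.
        cleaned = re.sub(r"[^a-z0-9\s-]", "", title.lower())
        words = cleaned.split()
        # Keep the slug short so the important keywords stay prominent.
        return "-".join(words[:max_words])

    # Hypothetical post title, purely for illustration.
    print(slugify("Five Easy Ways to Improve Pet Dental Health"))
    # five-easy-ways-to-improve-pet-dental-health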

Blogs should be an important part of any internet marketing strategy. Using them correctly has the potential to generate a large, engaged audience and establish the author as an information source. These benefits can then be passed along to any presence the blog chooses to be associated with. Essentially, a good blog provides a stream of traffic that can be directed at the will of the blogger.

From Keyword Search to Contextual Search: The New Query Model



August 29th, 2013

Google search is changing. This is obvious to webmasters and SEO practitioners who continue to experience unexpected fluctuations in their traffic as a result of algorithmic changes. However, for the average user this change is not so obvious. So how is it changing? A shift from indexing to understanding is under way. The traditional search query was entirely concerned with the explicit meaning of search terms. Nowadays, Google assesses the relevancy of web pages to individual users when determining their search results; in doing so, Google analyses implicit meaning.

As searches begin to place an emphasis on understanding, the implicit meaning of a search query becomes increasingly important. Implicit factors of the searcher’s context, such as their IP address, search history, location and connection speed, have a greater influence on search results. In fact, fifty-seven different implicit factors are used by Google to personalise search, and that is for people who are not users of Google+.

These days, when a user searches, Google will consider the type of information the user is looking for and provide them with the pages it believes are most likely to satisfy that specific user’s information need. As “understanding” becomes increasingly important, implicit factors will lead to more diverse results for different users. A Labrador breeder living in Ohio might receive information on dog food when searching ‘pet store CBD’, while a dog owner walking in Hyde Park searching ‘pet store CBD’ may instead receive directions to a local pet store.

This implicit component of search regards the context that surrounds query terms, and it is this component that is having the biggest impact on search engines and SEO practices. For some time this process has been termed “personalisation” which, despite popular belief, does not only regard a user’s search history and social profile. Personalisation refers to all implicit factors impacting upon search results. Clearly the term ‘personalise’ is a little misleading- let’s call the use of implicit factors in determining search results a means of “contextualising.”

Just like other aspects of the Google algorithm, these implicit factors are difficult to determine on an individual basis. Google are also likely to change and update these factors regularly. As Google continues to contextualise search, these implicit factors are likely to change and will certainly grow in number.

Take for example Google’s recent acquisition of the Behavio team. Behavio aims to utilise a social and behaviour sensing framework to predict how people will act on their phone. As Google incorporates this technology the context in which a user performs searches could be identified, and searches could even be completed by Google on a user’s behalf. While this might sound like a big step, changes of this magnitude should be expected to be increasingly common for Google.

Context is important because it provides small websites the opportunity to rank highly in search results. Often these smaller websites contain context specific information desired by a user. Determining these wants and needs is important to Google, and this is the motivation driving the shift from indexing to understanding. Understanding relies heavily on interpreting implicit aspects of a search, whereas indexing is a far simpler process of matching terms.

For SEO practitioners, this shift has one particularly important implication: keywords are dying. This is not to say they can’t be of use; people are still clarifying their information needs by typing terms into Google, and these terms are still used by Google to determine which information to provide. However, it is not the specific search terms that are important any more, both for Google and optimisers. It is understanding the meaning within and around these terms that is vital, and so far the best way of interpreting these meanings is through context.

How should this affect SEO in practice? When using analytics, remember that context has influenced your traffic. This can be in regards to specific keywords as well as across groups of keywords. If you can determine the prevalent contexts that segment your audience you will also have a better understanding of their information needs which, should you cater to them, is likely to increase traffic. If you do not wish to change the content strategy of your website to maximise contextual benefit, consider creating one or two pages for this purpose. Theoretically you could create landing pages by targeting specific user contexts which would then link to the main body of your site. Contextual optimisation could very well be the next phase of SEO.

To Follow or nofollow Widgets, Impact of Google Vince and much more



August 16th, 2013

Widgets: To follow or nofollow?

Well, according to a recent video featuring Matt Cutts, the leader of Google’s search spam team, the answer is a resounding “nofollow.”

Embedding links in widgets was a popular SEO technique circa 2011. These links were an effective way of sending traffic to a particular website, but the practice became frowned upon as spammy websites frequently abused it.

However, infographics and similar widgets have become popular once again. Many SEO practitioners have used keyword-rich anchor text in these instances. Unfortunately, there have been a number of websites penalised for linking in this manner. This is likely to be a result of Google interpreting its use as a ‘black hat’ technique.

If you are using and following a widget you have found online (especially a free one), it is important to check where its links send viewers. There is definitely a possibility that a click will send your hard earned traffic to a spammy site. If you nofollow the widget, this should be of no concern. If you don’t, the linked page is likely to reflect badly upon your own.
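
As a rough sketch of that check (assuming the widget arrives as an HTML snippet; the embed code below is hypothetical), the following Python uses the standard library’s HTML parser to list where a widget’s links point and whether they carry rel="nofollow":

    from html.parser import HTMLParser

    class WidgetLinkAudit(HTMLParser):
        """Collect every anchor in a widget snippet and note whether it is nofollowed."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                attrs = dict(attrs)
                rel = (attrs.get("rel") or "").lower()
                self.links.append((attrs.get("href"), "nofollow" in rel))

    # Hypothetical widget embed code, purely for illustration.
    widget_html = '<div><a href="http://example.com/free-widgets">Get this widget</a></div>'

    audit = WidgetLinkAudit()
    audit.feed(widget_html)
    for href, nofollowed in audit.links:
        print(href, "nofollow" if nofollowed else "followed - check this destination")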

Whether or not this is unfair for webmasters who have gone to the effort of designing their own widgets is beside the point. Google, through Matt Cutts, has now officially recommended avoiding gaining links from widgets and infographics; “especially widgets.” It just isn’t worth the risk.

 

Social influence bias and online popularity

A recent study published in Science, a renowned academic journal, investigated the role of group mentality in the popularity of content published on an online forum. While the forum used in the study is unknown, “Social Influence Bias: A Randomised Experiment” raises some interesting issues for marketers and SEO practitioners.

The key finding of the study was the influence of existing popularity on a viewer’s interpretation of content. In practice, this meant that content rated positively by a number of past viewers was more likely to receive a positive review from future viewers. Strangely, this effect did not work both ways. Content that was given an artificial, negative rating by the researchers was actually corrected by audiences, who would still give positive ratings despite the content appearing unpopular.

The platform on which this study took place is unspecified for the sake of the study, but has been described as similar to Reddit. Speculation about the influence of this tendency has drawn attention to reviews. Websites where the opinions of others assist users in making purchases, such as Amazon, Yelp, and even to an extent, Facebook, have a responsibility to ensure reviews are legitimate. Social influence bias clearly increases alongside the number of reviewers, and we have seen that these reviewers are not required to be ‘real’ to have this effect.

In a way, this re-establishes what we already know; popularity breeds popularity. The claim that “no news is bad news” still seems to be holding true. If web content can stimulate audiences into any kind of reaction, not only is this audience likely to grow, but the overall impression is probably going to be positive. Try to engage your audience and don’t be afraid of being controversial. Particularly for websites still growing their web presence, the issue is not the meaning behind the message you are sending, but the number of people you can get to interact with the message.

 

Google Vince and Brand Recognition

Whether you think it’s good or bad, established brands have an advantage on Google. The origins of this advantage can be traced back to early 2009, specifically, Google’s ‘Vince’ update. As is always the case, the exact functionality of this update is unknown. What is known is that after the update, top spots on Google’s SERPs were held far more frequently by big brands. While it is possible that offline presence has made some contribution to these rankings, it is far more likely that discussion of the brand on social media platforms, or even unlinked brand mentions, is contributing to their ranking.

This suggests that establishing a brand in 2013 is just as important as it was twenty years ago- there are just a few different ways that we can go about it. In the past, this was done using multiple channels; radio, television, billboards etc. Fortunately, for online marketing, these channels are all available on the one platform, the internet. Incorporating a range of content in different locations around the web, such as videos, blogs, press releases and images, signals to Google that your brand is growing. The extent of this coverage is directly related to your position on a SERP. As it grows, so will the ‘trust’ Google places in your brand as an information source.

While brand establishment remains important, many managers are not considering the ways in which competition has changed. While mimicking the marketing practices of an industry leader has been a successful way of growing a brand in the past, the Vince update meant that established  brands were allowed to skip phases in an establishment process that is now expected to take place online. Guaranteeing these industry giants a good position on the SERP means that a poorly constructed web presence and even some of their blackhat SEO practices can be overlooked.

Lots of businesses looking to grow their brand use these high ranking websites and their optimisation practices as a template for their own. For many pages, this is a mistake. While the giant may not be penalised, a small brand just starting out undoubtedly would be.

Instead, these differences should be capitalised on. While your brand might not be as big, providing content that is just as good or preferably, better than these websites whilst conforming to the whims of Google can give you an edge. Doing this is likely to result in the organic growth of your webpage, something Google values far more than replication of a competitor’s web presence.

 

Google’s Growing Definition of ‘Unnatural Links’

Google has recently updated its webmaster tools, changing its definition regarding what is considered a link scheme, and what is considered an unnatural link. They have made it clear that Google will penalise “Any links intended to manipulate PageRank or a site’s ranking in Google search results.” Of course, the pages associated with these links will also be devalued as a result.

This is a problem because there is no real way Google can know the intention of a webmaster. How can they differentiate between the organic and manipulative improvement of PageRank? However they want. Critics are guessing this is largely determined by a link profile history. SEO practitioners should work to build links only when their use correlates to the needs of their target audience and is relevant to their content.

“Buying or selling links that pass PageRank” has been specifically outlawed. This has been frowned upon for some time, but is of particular concern for webmasters that are not SEO savvy. While theoretically paying for links could be permitted by Google if you are solely interested in the traffic gained from it (which must be users intentionally searching for the content you provide), this is thin ice to stand on. Before getting out your wallet, think about how Google is likely to perceive your actions.

Google has also said “excessive link exchanges” will be penalised. Again, Google provides an ambiguous term that allows them some flexibility in determining what ‘excessive’ actually constitutes. To be safe, I would recommend not engaging in reciprocal link exchanges with websites hosting unrelated content. Even if there is some abstract connection, it is important to consider whether or not Google will be able to see it.

“Large-scale marketing or guest posting campaigns with keyword-rich anchor text links” are also at risk. This practice was already in the grey area, but should now be strictly avoided. As Google moves towards a more semantic style of search, the quality, not the quantity of content must be prioritised. This is also important for bloggers to whom online identity is important. Being a guest poster for a site engaged in these practices might generate a little business but will undoubtedly injure your reputation as a trustworthy information source.

Finally, Google has prohibited automated linking programs and services. Abiding by this rule is largely about common sense. Naturally, bigger sites may require some degree of automation to practically maintain their web presence. Clearly, it is manipulative practices that Google is aiming to penalise. Be sensible and practical, for most this shouldn’t require a change in web strategy.

What does this mean for guest posting?

This doesn’t mean guest posting should be abandoned. For established pages, this remains a great way to link to a quality website with content similar to your own. This is a sensible move in favour of the audience of both websites, and is the kind of organic collaboration Google loves.

What is important is that this isn’t used as a primary optimisation strategy, that the guest poster can provide relevant content and that the two websites share similarities. “Guest post” has become a misused term for thin, reproduced content purely in aid of link building strategies. It is important to remember that guest posts can also be a means of adding value to a site, and Google understands this. Don’t overuse guest posting, keep its content at a high quality and everyone keeps on smiling.

 

The death of synonymous search?

Google has recently filed yet another patent relating to the way in which it finds its search results; in the future it may substitute query terms for co-occurring terms. In theory, this should make optimising for a specific phrase less important for SEO practitioners. Synonyms of a search term can help expand search results for a user. However, this could also result in some additional search inaccuracies, as synonymous terms can have different meanings in different contexts.

The new patent provides a means of expanding search results for “substitute terms.” The patent uses the example of “cat” and “feline.” These can be considered as substitute terms for one another. Substitute terms could be determined by the frequency of their co-occurrence in content referring to the original search term, or in its metadata. While there is likely to be a complex analysis of context, it is very possible that future high ranking search results may not contain the actual search term used.
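
To make the co-occurrence idea concrete (and only the idea- this is a toy sketch with a made-up corpus, not Google’s actual method), a count of how often a candidate substitute appears in documents that also mention the original term might look like this in Python:

    from collections import Counter

    def co_occurrence_counts(documents, original, candidates):
        """Count how often each candidate term appears in documents
        that also mention the original search term."""
        counts = Counter()
        for doc in documents:
            words = set(doc.lower().split())
            if original in words:
                for term in candidates:
                    if term in words:
                        counts[term] += 1
        return counts

    # Toy corpus, purely for illustration.
    docs = [
        "my cat is a fussy feline",
        "feline nutrition for the older cat",
        "dog training tips",
    ]
    print(co_occurrence_counts(docs, "cat", ["feline", "kitten"]))
    # Counter({'feline': 2})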

For many, this is likely to mean some big changes in SEO strategy. Rather than waiting for Google to update its algorithm, smart practitioners will already be considering how this may impact upon their marketing practices. Consider terms analogous to those which you are already targeting and include them in your content when it is appropriate.

However, it is important to stress that Google is moving away from optimised text. While it could be of value to include these synonymous terms, Google is attempting to promote quality content. Websites ranking highly as a result of co-occurrence will do so because Google perceives their content as not only relevant but informed. Authors discussing a topic in a variety of terms signifies that they know what they are talking about. If you can produce informed content, these synonymous terms are likely to emerge in your content naturally.

 

Google’s New In-depth Search Snippets, Moz’s Local Search Factors and Much More



August 8th, 2013

Google’s New, In-Depth Search Snippets

In a recent blog post I discussed a number of ‘tests’ that Google appears to have recently applied to its algorithm. There were a number of reports of the sudden appearance, and just as sudden disappearance, of in-depth search snippets. Well, they’re back, and apparently they’re here to stay.

The intention behind this new type of search result, according to Google’s webmaster guidelines, is to “provide high-quality content to help you learn about or explore a subject.” This change is meant to cater for searchers wanting a deeper understanding of a topic, rather than the simple, quick answer. Supposedly this type of search accounts for 10% of total searches made through Google. If this is accurate, having your content appear as an in-depth rich snippet is likely to be incredibly valuable for a web presence trying to position itself as an information authority- which is something any SEO project thinking in the long term should be doing.

So how can you get your content to rank as an in-depth article? Google webmaster tools provides a few suggestions:

  • Utilise the schema.org article markup, notably the following attributes (a sketch of this markup follows this list):
        - headline
        - alternativeHeadline
        - image (note: the image must be crawlable and indexable)
        - description
        - datePublished
        - articleBody

  • Use an authorship markup to prove your authors are experts in or are relevant to a particular topic.
  • Use pagination and canonicalisation correctly. This way Google can understand the extent to which your content covers the topic users are wanting to understand.
  • Make your logo known. Use your logo as the default image on your Google+ page and give it an organisation markup.
  • Do you host restricted content? Utilising First Click Free allows Google to crawl your content, even if access to your pages is restricted. This way it can still be displayed on a SERP.
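
As a minimal sketch of the article attributes listed above, the snippet below assembles them in Python and prints JSON-LD. The values are hypothetical, and JSON-LD is only one way of serialising schema.org properties- Google’s documentation also describes microdata markup.

    import json

    # Hypothetical article data, purely for illustration.
    article = {
        "@context": "http://schema.org",
        "@type": "Article",
        "headline": "Does Google+ Actually Impact Upon Search Results?",
        "alternativeHeadline": "Google+ and rankings: correlation or causation?",
        "image": "http://www.example.com/images/google-plus-study.jpg",  # must be crawlable and indexable
        "description": "A look at recent studies on the relationship between +1s and rankings.",
        "datePublished": "2013-09-20",
        "articleBody": "The relationship between Google+ and high ranking search results...",
    }

    # This is the JSON-LD block that would sit inside a <script type="application/ld+json"> tag.
    print(json.dumps(article, indent=2))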

Webmasters discussing these new changes have highlighted potential for the favouring of established, large publishers over smaller, independent producers. There has been some criticism of the potential for in-depth snippets to unnecessarily enhance the position and opinions of established brands and identities. This remains to be seen, and undoubtedly depends on the nuances of Google’s algorithm.

Google’s Local Search Ranking Factors for 2013

Every year, Moz publishes the local search ranking factors it has uncovered via frequent, exhaustive research. Rather than forcing webmasters to stagger through this detailed and complex minefield, they classified these ranking factors according to eight themes, assigning a percentage of influence to each:

  1. Place page signals. Consisting of “Categories, Keyword in Business Title, Proximity, etc.” (19.6%)
  2. On-page signals. Includes “Presence of NAP, Keywords in Titles, Domain authority, etc.” (18.8%)
  3. External location signals. Composed of “IYP/aggregator NAP consistency, Citation Volume, etc.” (16%)
  4. Link signals. Primarily being “Inbound anchor text, Linking domain authority, Linking domain quantity, etc” (14.4%)
  5. Review signals. Including “Review quantity, Review velocity, Review diversity, etc.” (10.3%)
  6. Social signals. Drawn from “Google+ authority, Facebook likes, Twitter followers, etc” (6.3%)
  7. Personalisation (8.3%)
  8. Behavioural/Mobile signals. Composed of “Clickthrough rate, Mobile clicks to call, Check-ins, Offers, etc” (6.1%)

For 2013, these local search ranking factors were presented by Moz in three categories, with a total of 83 foundational ranking factors, 93 competitive difference makers and 30 negative ranking factors identified for local search results. Now this is a pretty long list; Moz itself notes that this is a lot of confusing information to expect busy small business owners to comprehend.

The most effective way to use this information whilst expending minimal time and effort is to prioritise marketing efforts according to these themes. Comparing the relationship between these themes, yourself and your competitors is undoubtedly valuable, but if you can be bothered to do this, you should also check out Moz’s comprehensive coverage of local search in more detail. You can read it here http://moz.com/local-search-ranking-factors

Recovering From Panda, Phantom and Penguin Updates

A recent article by Glenn Gabe outlined his experiences working with websites that were penalised by either Panda, Penguin and/or Phantom updates. Most interesting is the ways that some of these sites recovered, or did not recover, as a result of updates that occurred after they were originally penalised.

Websites that went to efforts to repair the shortfalls highlighted by either Penguin, Panda or Phantom were often rewarded in a future update. This update did not necessarily have to be of the same type, with sites penalised by both Phantom and Penguin recovering as a result of Panda updates.

Sites that did not recover did one of the following:

  • Made no changes as a result of the update
  • Made changes but changed their minds and rolled them back
  • Didn’t make enough changes to revitalise their ranking

Sites hit by an update don’t have a lot of choice. They need to use this as an opportunity to identify the key problems with their webpage and work to repair them as soon as possible. If these sites stick to these alterations and conform to a clean SEO strategy it should  only be a matter of time before they recover from the update.

SEO for Google+

Within Google+ you can optimise your profile, pages, local, communities and updates. It is relatively simple to optimise on each of these platforms, which can be done as follows:

Optimising Profiles

  • SEO title. This should simply be your name, followed by Google+.
  • Meta description. This should start with your name, followed by your tagline, occupation, employer, location and the first paragraph of your introduction. This paragraph should not be longer than 160 characters in length.
  • Dofollow links. These can be contained within your introduction and are links to other sites.

Optimising Pages

  • SEO title. Your page’s name followed by Google+.
  • Meta description. Your page’s name, its tagline and the first 160 characters of your introduction.
  • Dofollow links. These are contained within your introduction and the ‘other links’ section. The official website you provide will be nofollowed.

Optimising Local

  • SEO title. The name of your business followed by Google+.
  • Meta description. This is the description of your business. To be able to do this, you must first have a local business page on Google+.
  • Dofollow links. The link you provide to your business is nofollowed, but is still worth providing.

Optimising Communities

  • Sadly, links in the community description are all nofollowed. Aside from using your community name in the title, there is little optimisation to be performed here.

Optimising Updates

  • SEO title. Start with the name of your profile/page followed by Google+. Place your update after this.
  • Meta description. This is the main text of your update.
  • Dofollow links. In a link-specific update, the link will be followed. However a link will be nofollowed if it is contained within the main text of the update.

Google Social Experience Cards

Recently, Google filed another innovative patent, this time for a user card interface that would allow people to illustrate experiences they have had, or would like to have. The patent covers the practical operations of the social experience card and also outlines the social features it will use to segment these experiences.

These cards could be searched by a wide number of criteria such as cost, age, location, mood, time, date and weather, to name just a few. Users should also be able to choose the level of access they want their card to have, whether that be to the public or a specifically defined private group. The cards themselves will also be able to receive comments and host images, videos and audio material.

A ‘generate itinerary’ function would allow you to schedule a series of experiences, and incorporate others in this itinerary however you may please. Similar activities could also be grouped, and experiences suggested between friends.

How will this impact SEO? Who knows. As always, brace yourself for change.

 

 

Some Interesting SEO Insights from Around the Web



August 7th, 2013

SEO and Mobile Apps

Search is the most common activity performed using mobile phones. Due to slower internet connections and mobile-specific content, apps are an increasingly popular way of performing these searches. Because apps perform their searches differently to traditional search engines, catering for them is important for marketers wanting to compete for the attention of mobile users.

This is having two key impacts upon search in general;

  • The types of content uncovered by a search are diversifying.
  • Brands are diversifying their content to ensure visibility across a variety of platforms

In order to succeed in mobile SEO, there are three main elements of consideration:

  1. Know the apps relevant to your web site
  2. Know the proportional value of each of these applications
  3. Learn to build rank and accessibility on each of these platforms

Fortunately, mobile search practices largely mirror those of desktop computers. Apps also tend to source their results from their corresponding index, meaning that traffic gained from existing SEO practices is likely to translate to mobile searches. When optimising for mobile, key points of consideration should be URL structure, crawl-ability, user experience, loading speeds, and of course the optimisation of important content.

Pure Spam

While Google tends to penalise sites by demoting them on the SERP, pages it determines to be ‘pure spam’ can actually be kicked from their search index entirely. This will normally occur as a result of a manual review by a Google employee, rather than algorithmic analysis.

Google gives no exact definition regarding what constitutes ‘pure spam.’ However, they have constructed this web page http://www.google.com/insidesearch/howsearchworks/fighting-spam.html which provides a number of examples of sites on which they have bestowed this title. What is interesting is that rather than being penalised for excessive adverts and links, these sites tend to be punished for their incredibly poor content.

This isn’t to say sites aren’t penalised for their backlink profiles, but suggests it isn’t Google’s primary concern. It also seems likely that review of links would be completed by a computer, rather than an individual- which seems to be necessary for Google to remove a website from its index entirely. Websites identified as spam should probably prioritise the development of their content and user experience, should they wish to survive.

Can Google Authorship Cause a Drop in Rankings?

The consensus amongst SEO practitioners is that it cannot. While a number of individuals have disputed this, the most notable is Alex Yumashev, who has claimed his use of Google authorship reduced traffic on his web site by 90%.

This is certainly a little worrying. Both Google and independent SEO experts are claiming the use of Google authorship should improve the ranking of a web page, but there remain a number of cases in which it can be directly correlated with a decline in traffic. Well, we need to blame somebody, and it’s probably the website itself.

In a recent blog post, Chuck Price discussed his experience of the authorship phenomenon, hypothesising that “the (authorship) changes on the site triggered a site-wide crawl, which then, kicked in the Panda algorithm and fallout.” Now this sounds far more likely than Google penalising someone for incorporating their baby, Google+, into a web presence.

But what does this mean for web sites considering using authorship themselves? If nothing else, it emphasises the importance of reviewing your entire web presence before using Google+. While Google will give users extra attention, this is only going to have a positive impact when a web page’s associated content and strategy is of high quality. If Google gives your web page a thorough crawl upon your utilisation of authorship, it will pay more attention to content that may have managed to fly under the radar in the past.

This crawl could be great if your content and online neighbourhood is of a high quality, but could be potentially devastating if it is not. Should you wish to experience the maximum benefit of Google+ for your web site, you should perfect your website first, making all possible, positive changes prior to incorporating authorship.

Shortfalls SEO Practitioners Must Avoid

A recent article by Trond Lyngbo, an experienced SEO consultant, outlined some common, critical mistakes he has encountered in his SEO practices. Having worked on a wide range of websites across a variety of industries, these mistakes highlight some important concerns.

  1. SEO practices should not be repetitive. If your SEO work involves the same tasks it did last year (or in some cases, last month), it’s likely you are not taking advantage of the opportunities available to you. By using analytics and carefully watching which practices generate revenue, you can stay on top of algorithmic changes and maximise the profit generated by your site.
  2. If you have a consultant taking care of your SEO, leave them to it. Getting involved with the details of minor parts of an SEO strategy diverts focus from the overall plan. If you are not an expert, giving your consultant arbitrary directives, e.g. “build 100 more links this week”, is only going to make their job harder.
  3. Lack of cohesion in business structure. If your business is not collaborating in giving the greatest effort to its online presence i.e. relying solely on a marketing department to control your web page, you are not fully utilising your business as an information source. SEO should merge analysis, development and marketing in a unified strategy.
  4. Don’t only think in the short term. Rather than considering SEO as a cost, look at it as an investment. Match your SEO expenditure to the revenue it generates and consider a strategy that can build upon this. SEO isn’t going away, and training your employees in optimisation is likely to pay off for some time to come. What’s more, practices focusing on building web presence in the short term are more likely to be engaged in activities that will be penalised in the future.

Make your Page Social Media Friendly

For your business, social media is little more than a way for users to find your page and access your content. That being said, the channel it provides can result in drastic improvements to a web page’s traffic, and it’s all about social media integration. How? Jillyan Scott, a well known content strategist, has a couple of ideas.

  • Create fantastic content. Not only should this be your primary SEO strategy, this is also the way of becoming popular on social media. Creating content that people will share gives you access to their entire social network, and the entire social network of any person who interacts with the content once it has been posted.
  • Create social content. Social media is popular because it allows people to interact with one another. If your page contains a social element (especially if people have reached it via social media) it is likely people will interact with your content, administrators and other visitors. This is clearly a favoured element of the web, and it is foolish not to take advantage of it.
  • Consider titles and images. These are the window to your content, and if the curtains are drawn, no one will bother looking inside. Titles should be short and accurate, but if they can grab someone’s attention, even better. Being able to sum up content in a single image is an even more valuable skill, appealing to audience members without requiring in depth attention. Remember, the key is to stand out amongst the multitudes of competing content.
  • Social buttons. People enjoy sharing content, but they are also lazy and distracted. Incorporating social buttons is an easy way to encourage people to share your web page without needing them to work for it.
  • RSS feeds. Rich site summaries can be used to syndicate relevant content from a main website to microsites with a more specific audience. Using RSS means you are not required to publish the same content a number of times.
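
As a rough sketch of that last point (using only Python’s standard library; the feed details and URLs are hypothetical), a minimal RSS 2.0 feed for syndicating posts to a microsite could be assembled like this:

    import xml.etree.ElementTree as ET

    # Hypothetical posts to syndicate, purely for illustration.
    posts = [
        {"title": "All You Need to Know About Blogs", "link": "http://www.example.com/blog/all-about-blogs"},
        {"title": "From Keyword Search to Contextual Search", "link": "http://www.example.com/blog/contextual-search"},
    ]

    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Example SEO Blog"
    ET.SubElement(channel, "link").text = "http://www.example.com/blog"
    ET.SubElement(channel, "description").text = "Syndicated posts from the main site."

    for post in posts:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = post["title"]
        ET.SubElement(item, "link").text = post["link"]

    # One feed, published once, consumed by as many microsites as needed.
    print(ET.tostring(rss, encoding="unicode"))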

A Weekly Overview Of Google



August 2nd, 2013

How does website speed affect search ranking?

We know that Google says it does, but the real question is “how?”

The obvious assumption would be that Google prioritises pages with faster document completion or full rendering times. This is not the case. Neither of these factors appears to make a substantial contribution to page rank.

The ever trusty Moz launched a study to investigate how Google used website speeds in ranking. They found the primary influence was the “time to first byte” (TTFB) of websites. TTFB is the time a browser takes to receive the first byte of information from a URL. Websites with a lower TTFB were strongly correlated with higher ranking search results.
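
If you want a rough measurement of your own TTFB, here is a sketch using only Python’s standard library. It approximates TTFB as the time until the response headers arrive, includes DNS and connection time, and uses a hypothetical host- dedicated monitoring tools measure this more precisely.

    import time
    from http.client import HTTPConnection

    def time_to_first_byte(host, path="/"):
        """Approximate TTFB: time from sending the request until the
        status line and headers of the response have been received."""
        conn = HTTPConnection(host, timeout=10)
        start = time.time()
        conn.request("GET", path)
        conn.getresponse()  # returns once the response headers arrive
        elapsed = time.time() - start
        conn.close()
        return elapsed

    # Hypothetical host, purely for illustration.
    print("TTFB: %.3f seconds" % time_to_first_byte("www.example.com"))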

The same study also considered the relationship of page size to page rank, finding that larger websites were far more likely to hold a position at the top of a SERP.

Does this mean you should upgrade your hardware, find a new domain host and flesh out your website? Maybe, maybe not. The thing is, highly ranked search results are frequently held by large businesses that spare no expense on equipment, design and optimisation regimes. Often this is a necessity due to the size of their existing customer base.

Google remains insistent that content is by far the strongest factor in a website’s ranking. Unless you have big plans, the effort involved in improving your TTFB (aside from basic strategies such as the use of content distribution networks) is probably not worth the meagre contribution it will make to your page.

How does Googlebot crawl your site?

Three primary elements are considered by Googlebot when it crawls your page:

  • Which pages should be crawled. This depends on the quantity of backlinks pointing to a page, the site’s internal link structure, sitemaps and the strength of internal links pointing to each page.
  • How many pages will be crawled. This is related to domain authority, load time, site performance and potential crawl paths.
  • At what rate it will crawl. Factors include social media presence, the frequency with which the site is updated, the currentness of citations and domain authority.

 

In this blog post, Tim Resnik provides a tutorial on how to retrieve and parse your website’s log file: http://moz.com/blog/seo-finds-in-your-server-log. Resnik’s follow-up post, http://moz.com/blog/seo-log-file-analysis-part-2, goes on to explain how you can use this information to see how Googlebot crawls your website.

Understanding where Googlebot spends most of its time on your website allows you to position content to make it more or less likely to be crawled. It also allows you to learn the organic search value of different areas of your website, as well as to uncover a number of other crawling issues such as the proportion of bandwidth being consumed by each section.
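
A minimal sketch of that kind of analysis (assuming a standard combined access-log format and a hypothetical file name; Resnik’s posts above cover the full process) could count Googlebot requests per URL path like this:

    import re
    from collections import Counter

    # Matches the request path on lines whose user agent identifies Googlebot.
    LOG_LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[^"]*".*Googlebot')

    def googlebot_hits_by_path(logfile):
        """Count Googlebot requests per URL path in an access log."""
        hits = Counter()
        with open(logfile) as f:
            for line in f:
                match = LOG_LINE.search(line)
                if match:
                    hits[match.group("path")] += 1
        return hits

    # Hypothetical log location, purely for illustration.
    for path, count in googlebot_hits_by_path("access.log").most_common(10):
        print(count, path)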

 

Content Centric Searches

It’s pretty clear that Google wants to remove poor, thin content from the web. This isn’t just about devaluing sites with bad content, it’s also about rewarding those whose content is good.

What many websites have failed to realise is that initiatives such as Google Panda can continue to penalise websites for poor content they have published in the past. Bruce Clay, in an interview with Murray Newlands for the Search Engine Journal, claimed:

 

“it isn’t enough to have 100 great pages if you still have 100 terrible ones, and if you add another 100 great pages, you still have the 100 terrible ones dragging down your average. In some cases we have found that it’s much better, to improve your ranking, to actually remove or rewrite the terrible ones than add more good ones.”


This is an important point to consider. Starting a new content strategy is not enough. It is also vital to ensure past content is edited or removed if you want to experience the greatest possible results of a revitalised web presence.

 

 

Google doesn’t care how it ranks your site

When Google changes its algorithms, it is never going to be targeting a particular site. It will be targeting a particular issue, much in the way that Google Penguin targeted sites using black hat SEO techniques. The outcome of these changes can have an effect much wider than the specific issue, and while Google will work to keep this impact minimal, there are always going to be casualties.

If you experience a decrease in site exposure as a result of an algorithmic change, it is likely that you share one or more similar qualities with the websites Google was targeting. This doesn’t mean you need to change everything, it just means you need to determine what these qualities are, and why Google is viewing them in a negative light.

This way, you can easily alter your web site to meet the goals of Google. Yes, it sucks that they can do this- but not everyone will review the change in depth. Understanding the aim of the algorithmic changes allows you to conform to them. If you do, and your competing websites do not, you can easily gain a very competitive edge.

Google Panda Update

Starting July 18th, Google rolled out a new Panda update over a period of ten days. A couple of changes have been noted by webmasters;

  • An increase in the number of impressions displayed in Google Webmaster Tools, despite a stable rate of traffic. This could mark a change in how the software measures impressions. It could also be the sign of something bigger.
  • Information sites experienced the greatest impact. Critics believe this is due to changes in the way differences between spam sites and authoritative websites are conceived.
  • Naturally, a number of websites using Google+ were rewarded. While the extent to which Google+ impacts upon rankings is still unclear, it clearly plays a role, and this should be considered by SEO practitioners.

 

We have discussed a number of other recent changes to Google this week. These included:

 

  • Google’s revised link scheme. Link building is being phased out, with Google sanctioning increased penalties for a much wider definition of manipulative practices. “Nofollow” ads with links as well as any of your own content you repost on another platform. Also, stop creating content specifically for keyword rich anchor text.

 

  • Google’s ‘similar site’ feature. This feature can be used to see if pages share a source of inbound links. It is also a good way to review your website’s own online neighbourhood.

 

  • The multi-week algorithm update. Google have been making extensive, unpredictable changes to their algorithms, and this has been happening almost daily. In particular, these changes seem to reduce the prevalence of partial match domains in search engine results, a trend critics believe may continue.

 

  • Press releases are considered a source for unnatural links. As a result, links within press releases must be nofollowed, lest their associated pages be devalued.

 

  • Google doesn’t like the misuse of ccTLDs. Some country code top level domains are cheaper than others, and some stingy webmasters are choosing their ccTLD based on cost rather than the location of the website. Their pages are likely to be penalised, as Google sees this practice as hampering geographically targeted search.

 

  • Local searches are becoming more relevant. Google has patented a means of merging local and web search results when both pages come from the same source. Only the local listing will remain. This could improve the overall power of local web pages in a search.

 

We also suggested a few strategic ideas for SEO practitioners.

  • Integrate SEO across marketing practices. The internet has become an integral part of many businesses, and SEO can no longer be considered a stand-alone task. Performing optimisation separately limits the opportunities online marketing can offer a business, whilst making the strategy appear more manipulative.
  • Think semantically. Link building schemes can’t last forever. You should consider your own online neighbourhood a matter of prime importance. Stop excessively promoting your content and find your information niche; this will allow it to promote itself whilst minimising the risk of being penalised on a SERP.
  • SEO super signals. Five interdependent factors should be the foundation of any SEO strategy; quality, uniqueness, authority, relevance and trust.
  • Sometimes, you can repost content. Some viable methods include reflecting on past posts, using rel=canonical, and 301 redirects.
  • We also offered some tips for websites getting started.

1. Encourage people to ‘share’ your Facebook page. If you can also get them to discuss the website in the descriptive terms you’re targeting, even better.

2. Build some hype before the site goes live. The launch is excellent content for your social media pages.

3. Prior to launch, have your site reviewed by an interested web community. This can also be a way of improving any elements you have overlooked.

4. Submit your website to web design galleries and competitions. You may not win, but it will help build your online presence and grow your network.

Google News



August 1st, 2013

Google says press releases contain unnatural links

John Mueller, a Google webmaster trends analyst, has just hosted a video conference in which Google’s position on press releases was further clarified. Mueller stated that press releases should be considered “advertisements” and that links contained within them should be “no-followed”.

This seems to be a response to SEO practitioners over-utilising press releases to artificially promote their web pages. While Mueller did claim that all links within press releases should be “no-followed”, he placed particular emphasis on nofollowing links with optimised anchor text. It could be risky business, but there is a chance sites won’t be penalised for plain URLs in a press release that carry no anchor text.
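
To make this concrete, here is a minimal sketch of the markup involved, using hypothetical URLs and anchor text rather than anything from a real press release; the rel="nofollow" attribute is what stops a link from passing PageRank.

    <!-- Optimised anchor text: nofollow it so the link passes no PageRank -->
    <a href="http://www.example.com/blue-widgets" rel="nofollow">affordable blue widgets</a>

    <!-- A plain URL with no descriptive anchor text carries less risk, but nofollowing it is still the safest option -->
    <a href="http://www.example.com/" rel="nofollow">http://www.example.com/</a>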

 

ccTLDs shouldn’t be used as an alternative to traditional domains

ccTLDs, or country code top level domains, have become an attractive option for web developers as .com domains become increasingly expensive. While this practice is yet to become widespread, Google’s Matt Cutts has identified that websites using the ccTLD of a location other than their own are doing a “disservice” to the domain, and will be penalised as a result.

This makes sense. Google uses ccTLDs to help provide users with personalised, geographically relevant results. Using a .in domain when your business is located in Australia may result in Indian users being directed to a web page offering services they cannot access, decreasing the quality of their search results.

 

Merging local and web search results

Google has actually patented a means of merging local and web search results, although signs of its use are yet to be seen. The change is likely to mean that web pages listed twice on the same SERP, both as an authority page and as a local search result, will receive only a single, local listing.

The patent suggests this may occur based on the searcher’s geographic location, when the local search results are considered highly relevant. This change could merge an organic search result with a local listing, improving the ranking of the local listing against others. Theoretically, this could improve the ranking of local web pages in general.

SEO super signals

A recent article published by Alan Bleiwess through The Search Engine Journal highlighted five major factors that contribute to the success of an SEO strategy.

  1. Quality. Content should be written with its intended audience firmly in mind. Providing relevant, accurate content is a foundational component of building trust and becoming a credible information source.
  2. Uniqueness. Giving people information they can’t get anywhere else keeps people coming to your site.
  3. Authority. This is not just about linking to other credible sources, it is also about your authors and their presence on the web. However, it is important to think strategically, as relying on one or two individuals for an entire brand can be risky.
  4. Relevance. Does your website allow users to access information or perform a transaction? What specific pieces of information are being sought? Determining the intent of users and catering to that desire is vital in the success of a web page.
  5. Trust. Trust requires SEO practitioners to consider the social norms of an online community. Trustworthy sources don’t shamelessly push their own website; they also promote others and participate in an information exchange. Everything you post online contributes to how you are perceived, and building trust requires online activity that serves users, not just the web page itself.

 

Strategic Suggestions

When you have great content, find a way to use it again. Reposting content is normally a bad thing, as it can be a marker of low-quality, manipulative SEO practices. There are, however, a few ways you can use your best content a second or third time.

  • Link to the content with a summary, along with an update and a discussion of its original context. It is natural for a website to reflect upon content it has posted in the past, especially when something has changed.
  • Using rel=canonical. This is particularly advantageous to the owners of multiple websites, allowing them to determine which version Google shows on its SERP while still obtaining PageRank. This is a great way of republishing content from associated websites (see the snippet after this list).
  • 301 redirect. While this strategy directs users away from your website, it is a potential way of reposting content whilst enhancing credibility.
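
As a rough sketch, with hypothetical URLs standing in for your own, the republished copy declares the original as canonical from within its <head>, which tells Google which version to show on its SERP:

    <!-- Placed in the <head> of the republished page; points Google at the original article -->
    <link rel="canonical" href="http://www.example.com/original-article">

A 301 redirect achieves a similar consolidation at the server level, but sends visitors straight to the original page instead of showing them the copy.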

 

Starting a new website? One of the hardest parts of establishing yourself is gaining links. Here are a few strategies that can help you attract links during the establishment phase of your website.

  • Encourage friends, employees and everyone you know to ‘share’ the launch on Facebook and inform them of the descriptive terms you are targeting in order to strengthen ties with social media.
  • The launch itself can be promoted via social media and press releases. This can generate hype and also creates ties between the future website, people, and other content.
  • There are a number of web communities that are interested in evaluating the quality of a web page. Having these groups review your web page before it goes live can provide valuable insight into the page’s operation, contribute to hype, and connect the web page to legitimate content.
  • Submit your page to web design galleries and award websites. Whether you think the site is ground-breaking or not, this strategy is a means of strengthening ties with an existing, credible web presence. Remember, the sites you submit to will reflect upon your own, so choose carefully; you don’t want the page to be viewed as spam.

What’s Google Doing?



July 31st, 2013

Well, unfortunately, no one really knows. Google is mysterious and complex, to the point of being convoluted. Whether their algorithm improves the quality of results returned by the search engine, or simply guards against anyone actually understanding its true functionality, is a subject for debate.

What is Elevate doing? We want to solve the Google mystery. Search engine functionality is changing daily, so that’s how often we’re going to give you updates on what’s going on. We pride ourselves on being on top of every algorithmic change; in many cases we are actually able to foresee changes and use them to our advantage.

Listen to us, and maybe you can gain an edge too.

 

Google’s revised link scheme

While link building has been a strong SEO strategy in the past, Google’s webmaster guidelines have recently been updated in relation to this issue, suggesting increased penalisation of websites using link building. The definition of link schemes now includes three more examples;

  • “Large-scale article marketing or guest posting campaigns with keyword-rich anchor text links”
  • “Advertorials or native advertising where payment is received for articles that include links that pass PageRank”
  • “Links with optimised anchor text in articles or press releases distributed on other sites”

How does this affect your optimisation strategy?

  1. Don’t create content for the purpose of keyword-rich anchor text. Guest posts used in aid of this strategy are likely to be targeted and devalued by Google.
  2. Don’t use advertising to build links for your website. To avoid Google accusing you of this, use ‘nofollow’ for ads with links.
  3. If you post articles from your website on another, links with optimised anchor text must also be nofollowed.

 

Google’s ‘similar site’ feature

In Google’s search results there will be a small green triangle next to a website’s URL. Selecting this triangle will always give the option to review the ‘cache’, but will occasionally provide an additional ‘similar’ option. Selecting this option provides the searcher with a number of results identified as similar to the original URL by performing a “related:www.ORIGINALURL.com/” search.

While the exact method by which Google identifies related websites is unknown, it can be a means of discovering pages that share a source of inbound links. Webmasters have also discussed using the similar site feature to understand the environment in which the page is seen to belong, which is most important when Google perceives this environment to be of low or spammy quality. If this is the case, it is important to change your linking to distance yourself from these pages, so their lack of quality does not reflect upon your own webpage.

Google’s Update on EMDs and PMDs

SEO practitioners have been up in arms over a tweet from Matt Cutts, the leader of Google’s webspam team. Cutts wrote “Multi-week rollout going on now, from next week all the way to the week after July 4th.”

Now this rollout has already taken place, but what actually happened? Wouldn’t we just love to know? Those monitoring Google’s search results to determine the impacts of these changes, such as Moz, have been unable to pinpoint how the changes are affecting websites. What they do know is that the changes have been big, and there have been a lot of them. The change in website rankings also suggests that not all of the changes made in the rollout were kept. Some of these temporary alterations may have been tests for future Google updates.

While there has been a recent downward trend in the influence of partial match domains (PMDs) and exact match domains (EMDs) upon search results, during the rollout period both PMDs and EMDs experienced an additional plunge in influence. This drop was more significant for PMDs, with influence reduced by almost a full percentage point over the last two weeks of June. Whilst this change was corrected, the impact of both PMDs and EMDs upon searches continues to decline at a consistent rate. This steady trend could imply the drop was a test of algorithmic updates intended to reduce the power of PMDs and EMDs entirely.

Peter Meyers has hypothesised that these changes may “mitigate the impact of one-day updates or make the update process more opaque.” Should it be the latter, SEO practitioners are likely to find their optimisation work increasingly difficult. Even if Google are simply trying to soften the impact of future updates, those updates aren’t going to make SEO work any easier.

What Should You Be Doing?

 

Integrating marketing with SEO

Stop viewing SEO as a competition between yourself and the SEO teams of other websites. The web is here to stay, and as a result, SEO will be around as long as there are search engines to optimise for. The web is quickly becoming an integral part of business, as is SEO. Why, then, is SEO so frequently dismissed in the strategy of established businesses?

One of the most powerful elements of the internet is its ability to generate mass popularity in a very short period of time. If a company already has this much popularity, the role of the internet, and hence SEO, can be undervalued accordingly. These businesses often have ingrained cultures and practices in their marketing activities, whether in branding, PR or exposure. Integrating marketing with SEO practices can be difficult, particularly when marketers do not understand or value the intricacies of the process.

As a result, SEO practitioners must learn about these marketing practices and hone their ability to explain SEO functionality in plain, simple language. They have the knowledge to create vast growth within a business; their goal should be to help marketers understand why, or even to become marketers themselves.

 

Think laterally: semantics

While we can’t predict the exact changes Google are going to make, they have recently released the Knowledge Graph, which expresses some of the ways in which Google wants to change its search functionality. While nothing is concrete, a majority of critics believe that future searches will be all about semantics.

By using the content associated with a link, paying attention to the data formats incorporated by websites, social signals, and the types of sites a webpage is associated with, Google is able to provide more personalised search results to individual users. Simon Penson has gone as far as to identify Penguin as the origin of this trend, believing it may be a “process of cleaning up the link graph so that Google can actually work out your relevancy.”

Rather than resigning themselves to the unknown, SEO practitioners are encouraged to integrate metadata where they can, consider their position relative to other authorities in their field, and focus heavily upon creating quality content.

Google is bigger than us, and from a commercial perspective it is far more sensible to play by their rules, rather than try to skirt around them.

How To Effectively Measure SEO Success?



July 23rd, 2013

There are a number of factors that are important to keep track of when attempting to determine the success of your optimisation efforts. First, and perhaps most importantly, is the role search engines have played in referring people to your web page. Visitors can also reach your web page via other means, such as typing your address directly into the URL bar or following links from associated media. Knowing the percentage of your audience that has reached your web page from search engines, and how this percentage has changed over time, is an effective way of reviewing the impact of your SEO activities. For example, if 6,000 of 10,000 monthly visits arrive via organic search, search engines account for 60 per cent of your traffic; watching that figure month on month shows whether your optimisation is paying off.

Comparing how much traffic each search engine has sent you is another valuable method of measuring success. A drop or rise in traffic to your site is likely to occur in relation to a specific search engine. By comparing the traffic that has come from each individual search engine, you can determine the areas of your SEO strategy where shortfalls or advantages are likely to reside.

It is also important to determine the conversion rate of the search terms for which you are optimising. It is very possible that the search term with your highest conversion rate is also where you experience the most competition from other websites optimising for the same term. Focusing on the most rewarding search terms is likely to result in increased traffic and is an important way of measuring your website’s success against that of its competitors.

Search Engine Optimisation is a fiddly business, and there are a good number of other techniques with which you can evaluate your website. However, to be truly competitive it is best to hire a professional SEO company, or at least engage in some form of SEO consultation. Contact us today to discuss the impact we could have on your web presence.

Get in touch!

Find us at:
Suite 203, 109 Alexander St,
Crows Nest, NSW 2065


Call: 02 9126 8993
E-mail: sydney@elevateconsulting.com.au