|
Internet addresses registered in new gTLDs are holding their own against—and in some cases outperforming—comparable addresses registered in legacy domains like .COM, according to new data that provides the best window yet into the operational functionality of new gTLD addresses.
A question on everyone’s mind in the run-up to new gTLDs was: how would new domains perform in the wild against legacy domains on the key criterion of search? Even though Donuts bet on the hypothesis that new gTLDs would be better than their limited and outdated legacy counterparts by virtually every measure, we could not be sure how new domains would perform in search until we began to see them in use. Now, less than a year after the public availability of the first-launched gTLDs, we’re getting early indications of just how effective new gTLDs can be.
Several recent studies and data analyses by search experts and domain investors studying new gTLD performance suggest that the first generation of new gTLD addresses are either matching or outperforming legacy gTLDs and ccTLDs in search performance.
Search company Total Websites published the results of a series of targeted case studies on keyword-rich domains registered in new gTLDs and found “it’s clear to see that new gTLD domains do boost SEO rankings.”
In an interview on Reddit, Globerunner’s SEO Expert Bill Hartzer remarked on his own qualitative and quantitative analysis of new domain performance:
“Based on the New gTLD vs. Dot Com research that I did, Google tends to prefer the new gTLDs versus the .com, at least in Google AdWords. In fact, they gave the new gTLDs we tested more impressions for less money. And new gTLDs convert just as well as a traditional .com domain.”
Finally, German SEO pros Searchmetrics conducted an apples-to-apples comparison of new domain addresses vs. legacy ccTLD addresses and found that, on average, .BERLIN addresses show up 1.18 spots higher in search results than comparable .de addresses.
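For readers who want to see what an average gap like that 1.18 figure means in practice, here is a rough Python sketch of how such a number could be computed from paired keyword rankings. The keywords and positions below are invented purely for illustration; they are not Searchmetrics’ data or methodology.

```python
# Illustration only: computing an average ranking-position gap between
# matched new-gTLD and ccTLD results. All keywords and positions below
# are made-up sample values, not the Searchmetrics data set.

paired_rankings = [
    # (keyword, position of .berlin result, position of comparable .de result)
    ("hotel mitte",       4, 6),
    ("friseur kreuzberg", 3, 4),
    ("anwalt",            7, 8),
    ("yoga studio",       5, 6),
]

# A positive delta means the .berlin address appears higher in the results
# (a smaller position number) than the matched .de address for that keyword.
deltas = [de_pos - berlin_pos for _, berlin_pos, de_pos in paired_rankings]
average_gap = sum(deltas) / len(deltas)

print(f"Average positions higher for .berlin: {average_gap:.2f}")
```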
Supporting this research, we also see additional marketplace evidence that new gTLDs perform well.
Obviously it’s still early in the life cycle of new domains to make definitive statements about their performance, and reading Google’s SEO tea leaves is notoriously difficult, but the data so far paints a very encouraging picture for the millions of users who have already registered new gTLD names, and the millions more that are soon to follow.
It’s also encouraging to us that Amazon and Google applied to operate almost 200 new gTLDs. These are savvy Internet companies who obviously understand search and have concluded that new gTLDs will perform very well.
Virtually all of the researchers noted that the current crop of new gTLD names is extremely keyword rich—both before and after the dot—a factor that no doubt plays a pivotal role in their search boost. But while that keyword richness may be one of several major drivers in the success of early adopter new domains, it is also a factor that won’t change anytime soon.
The value proposition of new gTLDs is that they provide real semantic value and relieve the artificial scarcity that has stultified the legacy namespace for more than a decade. Even as new gTLDs grow exponentially in popularity, we are many years away from any scenario in which registrants have difficulty finding available, keyword-rich names in the new gTLD space.
One thing is clear, though: if these search engine trends continue, then even before the most appealing of the new gTLDs hit the market, 2015 shapes up to be a record-breaking year for the global Domain Name System.
Hi Paul
It’s normal courtesy to link to the post that inspired you to include all three cases. Just saying ;)
Apologies, Christopher, for the oversight. Here’s your link:
http://blog.europeandomaincentre.com/does-a-new-gtld-rank-higher-in-google-some-seo-cases/
Paul,
If all is held equal, and assuming adoption, then I would agree. However, the way most of the current new gTLDs are launched, registered, and used is not consistent with what Google (or Bing) looks for in domains that perform well in search. Adoption, content, and usage are the biggest issues that new gTLDs face today. New gTLD success stories seem to be outliers thus far, and I’m looking forward to seeing stronger adoption patterns in 2015.
One problem relevant to search is parking pages, and policies that allow the kind of low-quality content search engines dislike. This is why, as part of our policies for .MUSIC, we prohibited parking pages: they encourage low-quality content. This is a real problem for new gTLDs. For example, as of today, 70% of Donuts’ .GURU domains are parked (see http://ntldstats.com/tld/guru).
Parked domains never rank well in search results. In fact, both Google and Bing penalize parked pages because they are not sticky and offer the average Internet user nothing useful enough to warrant a repeat visit.
Search is complicated and in many cases heavily influenced by personalization. This means the results you might see from your location are likely not the same results that I would see. You quote the Coffee.Club example. If you search for “coffee club” in Google Australia (https://www.google.com.au/?gws_rd=ssl#q=coffee+club), the top result will be an exact-match domain on the local .AU ccTLD: coffeeclub.com.au.
The truth is that personalized search results vary depending on many factors (a rough sketch of how such signals might combine follows the list):
1) Reported location
2) IP location
3) Language
4) Device (e.g. desktop vs. mobile vs. TV)
5) Search history (e.g. cookies; signed in vs. not signed in)
6) Previous searches
7) GPS location (e.g. on mobile)
8) Social networks (e.g. number of likes, friends, and shares)
9) Trust
10) Security (HTTP vs. HTTPS)
11) Relevancy
etc.
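To make that concrete, here is a toy sketch in Python of how signals like these could nudge a base relevance score up or down for a particular user. This is purely illustrative: the signal names and weights are my own assumptions, not Google’s or Bing’s actual ranking algorithm.

```python
# Toy model only: how personalization signals could adjust a base relevance
# score so the same page ranks differently for different users. Weights are
# arbitrary, illustrative assumptions.

from dataclasses import dataclass

@dataclass
class UserContext:
    country: str          # reported / IP / GPS location
    language: str
    device: str           # "desktop", "mobile", "tv"
    visited_before: bool  # search history / cookies
    social_shares: int    # likes, friends, shares

def personalized_score(base_relevance: float, result_country: str,
                       result_language: str, uses_https: bool,
                       ctx: UserContext) -> float:
    score = base_relevance
    if result_country == ctx.country:
        score += 0.2          # local results favored (e.g. coffeeclub.com.au in Australia)
    if result_language == ctx.language:
        score += 0.1
    if ctx.visited_before:
        score += 0.15         # prior engagement boosts familiar results
    if uses_https:
        score += 0.05         # security as a light ranking signal
    score += min(ctx.social_shares, 100) * 0.001  # social signals, capped
    return score

# The same page scores differently for a local and a remote user:
au_user = UserContext("AU", "en", "mobile", visited_before=True, social_shares=40)
us_user = UserContext("US", "en", "desktop", visited_before=False, social_shares=40)
print(personalized_score(1.0, "AU", "en", True, au_user))  # higher for the local user
print(personalized_score(1.0, "AU", "en", True, us_user))
```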
Certain search results are also served based on freshness. Parking pages are never updated with fresh, relevant content that gets clicked on, so their quality score is weak. Fresh content will even outrank pages that would otherwise rank higher.
While one can easily “game” Alexa rankings, it is impossible to “game” all of the most important ranking signals that Google takes into strong consideration:
1) Content
2) Freshness
3) Weighted links (according to Google, the most important sites are likely to receive links from other relevant, important sites)
4) Link diversity
5) Author or publisher identification
6) Quality signals (e.g. bounce rate or time spent on site)
etc.
Google and Bing will not give preferential treatment to TLDs unless there is compelling evidence that those TLDs are trusted, authoritative, and relevant. Such evidence would consider content and use, security, and how users perceive those TLDs. For example, ccTLDs have been a success because they are trusted for “local” searches and have been adopted by their corresponding communities. Other TLDs such as .EDU and .GOV have done well because they are trusted and considered relevant, and you rarely see parking pages on .EDU or .GOV domains.
If the premise of this article is that new gTLDs will do better in search results, then it is only because of natural anchor-text linking. This applies generally to exact-match domains, which can boost your search rankings if all else is held equal. If your subject matter’s keyword is part of your domain name, then those linking to you will naturally include the gTLD string’s keyword in the anchor text pointing to your site, which helps your ranking relative to your competitors, all else being equal. This is a positive ranking factor, but alone it will not achieve top search results for any competitive term.
Constantine Roussos
.MUSIC (DotMusic)