Category Archives: Search Engine Technology


Keyword Density Percentage and Google SEO


What Exactly is Keyword Density?

 

If you are writing a 600-word article on Google AdWords tips and you place the three-word keyword phrase "Google AdWords tips" in the article 10 times, you would have a keyword density of roughly 1.7% for that article or page: 10 occurrences of the phrase out of 600 total words. The title tag, description tag, and content are all considered when determining keyword density; you could say the entire page is considered.

The focus keyword for this article is "keyword density percentage." As of January 31, 2014, it is the second organic result on page 1 of Google. The keyword density percentage is 0.4%, and my SEO plugin for WordPress is telling me it's too low; the keyword is found only 3 times. If you are not familiar with the Yoast SEO plugin, check it out. It's a great tool. To rank this article on page 1 nationally, I did nothing more than write the article and make the Yoast dots turn green. Nothing "Black Hat" was done in order to get to #2 on page 1.

(Nkr / Tkn) * 100

The formula for keyword density is: Nkr, the number of times the focus keyphrase appears on the page, divided by Tkn, the total number of words on the page, multiplied by 100. For this post, that works out to (3 / 739) * 100, or roughly 0.4 percent. (The length of the phrase doesn't enter into it; a three-word phrase found 3 times still counts as 3 occurrences.)

In this post, there are 739 words. The focus keyword contains 3 words, and it is found 3 times throughout the entire page, including the URL. The meta description, content, headings, and alt tags are all included. Here's the SEO analysis for this post; notice the keyword density, and the fact that I did not use the focus keyword in any subheading.
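To make the arithmetic concrete, here is a minimal sketch of that calculation in Python. The `keyword_density` helper is illustrative only (it is not Yoast's implementation) and uses the occurrences-per-total-words definition that the numbers above imply:

```python
import re

def keyword_density(text: str, keyphrase: str) -> float:
    """Density = (Nkr / Tkn) * 100, per the formula above."""
    words = re.findall(r"[A-Za-z0-9'-]+", text)   # Tkn: total words on the page
    nkr = text.lower().count(keyphrase.lower())   # Nkr: keyphrase occurrences
    return (nkr / len(words)) * 100 if words else 0.0

# This post's own numbers: 3 occurrences in 739 words.
print(round((3 / 739) * 100, 1))  # 0.4

# A synthetic 603-word page with the phrase used 3 times:
sample = "keyword density " * 3 + "filler " * 597
print(round(keyword_density(sample, "keyword density"), 1))  # ~0.5
```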

Overdoing Keywords Is Bad for SEO

Recently, a friend of mine saw his rank in the Google SERP drop dramatically (from page 1 to page 13), and he called and asked my opinion of his SEO. The first thing I did was crawl his site so I could look at its SEO as a whole. Besides several broken links, images with un-optimized titles and alt tags, and generally poorly written meta tags, I noticed he had used the word "shuttle" 6 times in the homepage's title tag alone. That is textbook "keyword stuffing," and while it wasn't the only reason Google buried his site in the search results, I'm willing to bet it contributed heavily. There were lots of different problems with this particular site. Either way, his site's keyword density as a whole was well over what it should have been, and he paid the price for it in the Google SERP.

What is the Ideal SEO Keyword Density for Google SEO?

There is really no such thing as an "ideal keyword density." On different occasions, webmasters have contacted me and asked what I felt the ideal keyword density was for their particular site or web page. That question can only be answered on a site-by-site basis; however, more often than not, a keyword only has to be used a few times per page. When I optimize a website, I usually place the page's focus keyword in the title tag once, in the page content 3-5 times (depending on the subject), and once in the meta description, as in the sketch below. If you have good, solid content, this is usually enough. Matt Cutts, the head of Google's Webspam team, has said that good content will trump SEO every time, so the smart thing to do is focus on your copy and on creating good content instead of on keyword density percentages.
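Here is a rough sketch of that rule of thumb as a checklist. The `check_placement` helper and its thresholds are illustrative assumptions of mine, not part of Yoast or any real plugin:

```python
def count_phrase(text: str, phrase: str) -> int:
    """Case-insensitive count of a keyphrase in a block of text."""
    return text.lower().count(phrase.lower())

def check_placement(title: str, description: str, body: str, phrase: str) -> dict:
    """Once in the title, once in the meta description, 3-5 times in the body."""
    return {
        "title_once": count_phrase(title, phrase) == 1,
        "description_once": count_phrase(description, phrase) == 1,
        "body_3_to_5": 3 <= count_phrase(body, phrase) <= 5,
    }

print(check_placement(
    title="Keyword Density Percentage and Google SEO",
    description="What keyword density percentage is right for Google?",
    body="... keyword density percentage ..." * 3,
    phrase="keyword density percentage",
))  # {'title_once': True, 'description_once': True, 'body_3_to_5': True}
```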

Here’s Matt Cutts of Google Explaining Keyword Density & How it Pertains to Google SEO


It’s always best to create good content with descriptive titles and tags and throw your page’s keywords in wisely and well placed. “Grey/Black Hat” SEO techniques like “keyword stuffing” worked years ago but are now obsolete after Google released their latest algorithm updates like Penguin and Hummingbird.

 



Google Webmaster Tools Disavow Feature


Disavow “Spammy Links” with Google’s Disavow Links Tool

 

When webmasters attempt to drive traffic to a website by means of spammy backlinks, it can hurt the site's ranking in Google Search. In the video below, Matt Cutts explains how to have the almighty Google ignore those spammy backlinks so that these rank-degrading links are not held against your website. When I say messy or bad backlinks, I'm referring to links to your site from porn sites or other notoriously spammy sources; mass blog commenting and article marketing/spinning are a couple of ways to end up with a messy backlink profile.

Please keep in mind that unless you or your company practiced improper or poor internet marketing strategies and ended up with a negative backlink profile that is hurting your rank, using the tool is in most cases not necessary. Improper use of the Disavow Links tool can hurt your site's current rank with Google, so I recommend that only advanced users who are knowledgeable on this topic use it. Most of the time it isn't even necessary; simply cleaning up the bad links will help improve your site's ranking with most search engines. The Google Webmaster Tools disavow feature is a last resort for webmasters whose backlink problems are directly hurting their site's rank.
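For reference, the file you upload through the tool is a plain text file with one entry per line: lines starting with "#" are comments, a full URL disavows a single page, and a "domain:" prefix disavows every link from that domain. A minimal sketch of the format (the domains below are placeholders, not real spam sources):

```
# disavow.txt - uploaded through Google's Disavow Links tool
# Disavow one specific spammy page:
http://spam.example.com/blog-comments/page1.html
# Disavow every link from an entire domain:
domain:shadyseo.example.com
```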


Disavow Links with Google Webmaster Tools | Video by Matt Cutts



Google Hummingbird and the Future of Search


By William Gannon

Providing Nectar to Google Hummingbird

 

There has been a great deal of concern within the SEO community recently over the changes implemented in the Google Hummingbird update. The new algorithm is a planned response to innovations in voice recognition and in wearable devices such as Google Glass. Although the changes rolled out over the course of 2013, many concerns have been raised as legacy systems are decommissioned.

One of the legacy systems that SEO consultants are particularly concerned about is Google's PageRank. While search engines run on many algorithms, social media signals and simple links to high-value sites have tended to produce higher rankings. Backlinking is a popular method used by SEO companies to let search crawlers and traffic flow through a collection of sites. Keywords, in this scenario, ultimately become the goal, which leads to nefarious keyword manipulation and poor search results. Hummingbird integrates new technology that blocks keyword cheaters and rewards quality content.

Prior to Google Hummingbird

Prior to the Hummingbird update, in April of 2012, Google released Penguin, which targets techniques that cheaters use to manipulate search rankings. Earlier, in February 2011, Google rolled out the Panda update, which incorporates human search quality raters who assist in assigning URL ratings. Websites are judged according to the intent and utility they provide; other factors include the website's contact location, language, query interpretation, necessary content, and video content relevance. Beyond the automated queries, there is now a manual rating system: pages are rated on a scale of Vital, Useful, Relevant, Slightly Relevant, Off-Topic, Useless, and Unratable.
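That ordered scale reads naturally as an enumeration. Here is a small sketch; the class and member names are my own illustration, not Google's, and the numeric ordering is an assumption:

```python
from enum import IntEnum

class RaterScore(IntEnum):
    """Hypothetical ordering of the manual rating scale described above."""
    UNRATABLE = 0
    USELESS = 1
    OFF_TOPIC = 2
    SLIGHTLY_RELEVANT = 3
    RELEVANT = 4
    USEFUL = 5
    VITAL = 6

print(RaterScore.VITAL > RaterScore.USEFUL)  # True: Vital outranks Useful
```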

The idea of the long-tail search remains the focus of any site that seeks to provide information to its visitors. You never know what the end user is going to type into the search query box, so you are best served by concentrating on SEO basics, design, and relevant content. During testing of the new systems, we have found that the focus remains on identifying the people or businesses that give the internet the best profile of the target search.

 

 

When you provide content to a Google-listed domain, the index targets users, pictures, article content, videos, and any links to further content on the website. The old tricks are still the best if you want to tell the internet about your website: making your content relevant in social media circles can certainly provide more opportunities to be found by the major search engines. It is best to know your target audience and serve your community with a simple, catchy URL; the expansion of new generic top-level domains (gTLDs) like .ninja adds a whole new layer to the game of word play and site structure.

 

The Future of the Search Query

The Hummingbird update ultimately points us toward the future of searching and queries: conversational search patterns. As business owners, we would all like to be at the top of the list when someone types in swimming pools, carpentry, landscaping, web design, computer repair, and so on. In these cases, you should try to foresee what queries might bring users to your page. It is very important to include your locations and target business areas, along with relevant content, so that the new rating systems can index your site accordingly. The main idea is to let SEOs publish relevant material and deliver query results to mobile and voice recognition systems.

 

Hummingbird is All About Improvement

The improvements following Hummingbird will allow Google to rank your page against the millions of other pages out there on the internet. The best way to attract attention to your website is to use all the tools available to the developer community: start with the basics of simple web page design, apply the techniques of ad marketing, and always focus on networking. The major advancements in search engine technology are arriving so quickly that one can easily find oneself perpetuating old, out-of-date methods, so it's important to keep up with the major changes that enhance our visibility and customer base. Allowing a professional SEO to optimize your website will provide Hummingbird with vital information, improving your standing in the internet community and drawing more customers to your business.