Category Archives: Search Engine Technology

Personalized Web Search & How to Disable It (Kinda)

Category: Search Engine Technology, Uncategorized


What's Personalized Web Search?

 

“Google introduced personalized search in 2004 and it was implemented in 2005 into Google search. Google has personalized search implemented for all users, not only those with a Google account. There is not much information on how exactly Google personalizes their searches; however, it is believed that they use user language, location, and web history.” Source: Wikipedia

 

Recently, Google said in a blog post that it's "transforming Google into a search engine that understands not only content, but also people and relationships." So, if your Google+ friends like a certain restaurant, it's more likely to rank higher in your search results. And if you search for a person and that person has a Google profile, that's what's likely to pop up. Personalized search uses profiling and user tracking with artificial intelligence to custom-tailor your search results pages, your advertising preferences, and other aspects of your internet experience. Many find it an invasion of privacy, and some even consider the techniques a rights violation. But this post doesn't go into those points of view, as we are trying to keep this short while staying neutral to the mighty GOOG.

 

personalized-search

Image courtesy of dontbubble.me

 

How to Turn It Off (Sort Of)

 

Unfortunately, the only way to completely turn off Google personalized search is to avoid Google, Google products, and sites that use the Google advertising network. However, there are some changes you can make to limit its impact on your online privacy and on the search results you see. First, if you use the Chrome browser, block websites from accessing your device location, and clear your browser cache, history, and cookies at least once a day. I don't store cache or cookies beyond the session, so once my browser closes, everything gets deleted and the disk space is overwritten.
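If you'd rather script that than click through settings, here's a minimal sketch using Python and Selenium. It assumes you have Selenium and a matching chromedriver installed; the geolocation preference key is the commonly documented ChromeDriver one, not something from this post.

    # Minimal sketch: a throwaway Chrome session with geolocation blocked.
    # ChromeDriver spins up a fresh temporary profile by default, so the
    # cache, history, and cookies vanish when the session ends.
    from selenium import webdriver

    options = webdriver.ChromeOptions()
    options.add_experimental_option("prefs", {
        # 2 = "block" for the site location permission (documented ChromeDriver pref)
        "profile.default_content_setting_values.geolocation": 2,
    })

    driver = webdriver.Chrome(options=options)
    try:
        driver.get("https://www.google.com")
        # ... search as usual ...
        driver.delete_all_cookies()  # belt and suspenders before closing
    finally:
        driver.quit()  # the temporary profile is discarded here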

Virtual Private Networks

Google will do a reverse lookup on your IP address and use that location, even if your browser settings block sites from accessing it. I'm referring to Google.com, which sees your IP regardless and will usually infer your location from it. This is where Virtual Private Networks, or VPNs, come in handy. A VPN service that anonymizes your device's IP address, rotating it randomly and on the fly, keeps your apparent identity constantly changing. Here's a link to a post that points out 10 reasons you might want to use a VPN.
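If you're curious what a server on the other end can infer from your IP alone, here's a quick sketch. It assumes the free ipinfo.io JSON endpoint (rate-limited, no API key required) and the Python requests library.

    # Sketch: what any remote server can guess from your IP address alone.
    import requests

    info = requests.get("https://ipinfo.io/json", timeout=10).json()
    for key in ("ip", "city", "region", "country", "org"):
        print(key, ":", info.get(key))

    # Run it once on your normal connection and once over a VPN: with the
    # VPN up, the city and org should belong to the exit server, not to you.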

 

Cache, Cookies, & Private or Incognito Browser Mode

Using Private or Incognito mode as much as possible can also minimize the impact personalized search has on your internet activity. Over time, everywhere you go is stored in a big folder on a Google server with your IP on it. That data is then analyzed and, using artificial intelligence, Google shows you what it thinks you should see based on the websites you visit and your typical locations. There are programs that automatically delete tracking cookies and the other search and history details that get stored in your profile and on your devices. Several good applications make this task a breeze and are available for most operating systems and browsers, free of charge with limited functionality. CCleaner for Windows works with most major browsers, including Opera, Firefox, and Chrome, and offers a free version that does everything except clean your browser automatically; for that, you'll have to buy the full version.
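If you'd rather script the cleanup than rely on a third-party tool, here's a rough sketch. The paths are Chrome's default locations on Linux and are an assumption on my part; they differ on Windows and macOS, and Chrome must be fully closed before you delete them.

    # Sketch: wipe Chrome's disk cache and cookie store by hand.
    # Paths below are the Linux defaults (an assumption); close Chrome first.
    import shutil
    from pathlib import Path

    home = Path.home()
    cache_dir = home / ".cache" / "google-chrome"                            # disk cache
    cookies_db = home / ".config" / "google-chrome" / "Default" / "Cookies"  # SQLite cookie jar

    if cache_dir.exists():
        shutil.rmtree(cache_dir)
    if cookies_db.exists():
        cookies_db.unlink()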

 

Whenever you're using https://www.google.com, be sure to open the "Search settings" page and turn off private results. You might also want to pause all of your internet activity in your Google account's activity controls. There's even software that can change your device's MAC address; if you already understand why that would be significant, your knowledge is beyond the scope of this post.


Some folks use Tor or another encrypted, anonymous network connection to hide their internet activity, but Google blocks most anonymity networks from using search and most of its other services. I'm pretty sure anonymity networks are heading toward becoming illegal, the way things are going here in America, the land of the free.

At the rate internet technologies are evolving, this post will most likely be obsolete within a year, tops. Anyone hoping to remain nothing but a sequence of 1's and 0's is going to have to do much more than this very soon, or, God forbid, stop using search altogether.



Keyword Density Percentage and Google SEO

Category: Google SEO, Search Engine Technology

What Exactly is Keyword Density?

 

If you are writing a 600-word article on Google AdWords tips and you place the 3-word keyword phrase "Google AdWords tips" in the article 10 times, you would have a keyword density of about 1.7% for that article or page: 10 occurrences divided by 600 words, times 100. The title tag, description tag, and content are all counted; you could say the entire page is considered. The focus keyword for this article is "keyword density percentage." As of January 31, 2014, it is the second organic result on page 1 of Google. Its keyword density is 0.2%, and my SEO plugin for WordPress is telling me that's too low; the keyword is found only 2 times. If you are not familiar with the Yoast SEO plugin, check it out. It's a great tool and definitely makes optimizing WordPress easier. To rank this article on page 1 nationally, I did nothing more than write the article and make the Yoast analyzer dots turn green. Nothing was done off-site, and I didn't submit the URL to Google or Bing; it's in this website's sitemap, and that's where search engines first crawled it.

 

 

The Formula for Keyword Density

Density = ( Nkr / Tkn ) * 100

where Nkr is the number of times the keyword or keyphrase appears and Tkn is the total word count of the page. 'Keywords' that consist of several words artificially inflate the total word count of the document. The purest mathematical representation adjusts the total word count (Tkn) downward by removing the excess keyphrase words from the total:

Density = ( Nkr / ( Tkn - ( Nkr * ( Nwp - 1 ) ) ) ) * 100

where Nwp is the number of terms in the keyphrase. If the keyphrase is a single term, the total word count is unaffected and the adjusted formula reduces to the original one.
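Here are both formulas as runnable Python, checked against the numbers used in this post:

    # Both density formulas from above, applied to this post's numbers.
    def density(nkr, tkn):
        """Simple density: occurrences / total words * 100."""
        return nkr / tkn * 100

    def adjusted_density(nkr, tkn, nwp):
        """Remove the extra (Nwp - 1) words per occurrence before dividing."""
        return nkr / (tkn - nkr * (nwp - 1)) * 100

    print(round(density(10, 600), 2))              # 1.67 -- the 600-word example
    print(round(adjusted_density(10, 600, 3), 2))  # 1.72 -- adjusted for the 3-word phrase
    print(round(density(2, 810), 2))               # 0.25 -- this post, shown by Yoast as ~0.2%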

This post contains 810 words. The focus keyword contains 3 words and is found 2 times across the entire page, including the URL; the meta description, content, headings, and alt tags are all included. Here's the SEO analysis for this post. Notice the keyword density, and the fact that I did not use the focus keyword in any subheading.

keyword density percentage

Analysis from the Yoast SEO WordPress plugin

 

 

Overdoing Keywords Is Bad for SEO

Recently, a friend of mine's rank in the Google SERP dropped dramatically (from page 1 to page 13), and he called and asked my opinion of his SEO. The first thing I did was crawl his site so I could look at its SEO as a whole. Besides several broken links, images with unoptimized titles and alt tags, and generally poorly written meta tags, I noticed he used the word "shuttle" 6 times in the homepage's title tag alone. That is textbook "keyword stuffing," and while it wasn't the only reason Google buried his website in the search results, I'm willing to bet it contributed heavily. There were lots of problems with this particular site, but either way, its keyword density as a whole was well over what it should have been, and he paid the price for it in the Google SERP.

 

 

Taking Google’s Advice is a Safe Bet

What’s the Ideal Keyword Density for Google SEO?

There is no such thing as an "ideal keyword density." Webmasters have contacted me on many occasions questioning their SEO and asking what I felt the ideal keyword density was for their particular site or page. Every site is different, but more often than not, a keyword only needs to be used a few times per page. It's more about where your keywords sit on the page than the number of times they appear. If you read your post and find yourself repeating the same phrase over and over, you're actually hurting your SEO. Focus on creating good content that answers questions and addresses the needs of your target users instead of chasing keyword density percentages. Put your effort into making sure your site is useful and original; if you try to force anything else on your users, they will go elsewhere every time, and once you lose them, they're not coming back. You can read more on the fundamentals of online marketing here.

Don’t Focus on Keyword Density

 

"Grey/black hat" SEO techniques like keyword stuffing with white text on white backgrounds worked years ago, but they became obsolete once Google released algorithm updates like Penguin and Hummingbird. Google has evolved into a highly intelligent system and is much smarter than people give it credit for when they obsess over keyword density. Here are 5 SEO copywriting tips to help you get started writing quality content instead of counting keyword density percentages.

 


Google Webmaster Tools Disavow Feature

Category: Search Engine Technology, Uncategorized

Disavow “Spammy Links” with Google’s Disavow Links Tool

 

When webmasters attempt to drive traffic to a website by means of spammy backlinks, it can hurt the site's ranking on Google Search. Here, Matt Cutts explains how to have the almighty Google ignore those spammy backlinks so it does not hold these rank-degrading links against your website. By messy or bad backlinks, I mean links to your site from porn sites or other notoriously spammy sources; mass blog commenting and article marketing/spinning are a couple of ways to end up with a messy backlink profile. Please keep in mind that unless you or your company practiced improper or poor internet marketing strategies and now have a negative backlink profile that is hurting your rank, the tool usually isn't necessary, and improper use of the Disavow Links tool can hurt your site's current rank with Google. I recommend that only advanced users knowledgeable on this topic use it. Most of the time, simply cleaning up the bad links will improve your site's ranking with most search engines; the Google Webmaster Tools Disavow feature is a last resort for webmasters with backlink problems that are directly hurting their site's rank.

disavow links

Disavow Links with Google Webmaster Tools | Video by Matt Cutts
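For reference, the file you upload through the tool is plain text with one entry per line; lines starting with "#" are comments. The domains and URLs below are placeholders, not real spam sources.

    # disavow.txt -- uploaded through the Disavow Links tool
    # Drop every link from an entire domain:
    domain:spammy-link-farm-example.com
    # Or disavow one specific page:
    http://example.com/spam/page.html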


Google Hummingbird and the Future of Search

Category: Google SEO, Search Engine Technology

By William Gannon

Providing Nectar to Google Hummingbird

 

There has been a great deal of concern within the SEO community recently over the changes implemented in the Google Hummingbird update. The latest algorithm comes as a planned response to innovations in voice recognition and wearable technology devices such as Google Glass. Although the changes rolled out over the course of 2013, many concerns have been raised as legacy systems are decommissioned.

One of the legacy systems that SEO consultants are particularly concerned with is Google's PageRank. While search engines run on many algorithms, social media signals and simple links to high-value sites have tended to lift a website's rank. Backlinking is a popular method used by SEO companies to let search crawlers and traffic flow through a collection of sites. Keywords, in this scenario, ultimately become the goal, which leads to nefarious keyword manipulation and poor search results. Hummingbird integrates new technology that blocks keyword cheaters and rewards quality content.

Prior to Google Hummingbird

Prior to the Hummingbird update, in April of 2012, Google released Penguin, which targets the techniques cheaters use to manipulate search rankings. Earlier, in 2011, Google rolled out the Panda update, which draws on human search quality raters to help assign a URL rating. Websites are judged according to the intent and utility they provide; other factors include the website's contact location, language, query interpretation, necessary content, and video content relevance. Beyond the robot queries, there is now a manual rating system: pages are rated on a scale of Vital, Useful, Relevant, Slightly Relevant, Off-Topic, Useless, and Un-rateable.

The idea of the long-tail search remains the focus of any site that seeks to provide information to its visitors. You never know exactly what the end user is going to type into the search query box, so you're best served by concentrating on SEO basics, design, and relevant content. During testing of the new systems, we found that the focus remains on identifying the people or businesses that give the internet the best profile of the target search.

 

 

google hummingbird update

When you provide content to a Google-listed domain, the index targets users, pictures, article content, videos, and any links to further content on the website. The old tricks are still the best if you want to tell the internet about your website: making your content relevant in social media circles can certainly provide more opportunities to be found by the major search engines. It is best to know your target audience and serve your community; a simple, catchy URL helps, and the expansion of new generic top-level domains (gTLDs) like .ninja adds a whole new layer of naming possibilities.

 

The Future of the Search Query

The Hummingbird update ultimately points us toward the future of searching: conversational query patterns. Every business owner would like to be at the top of the list when someone types in swimming pools, carpentry, landscaping, web design, computer repair, and so on. So you try to foresee the queries that might bring users to your page. It is very important to include your locations and target business areas, with relevant content, so the new rating systems can index your site accordingly. The main idea is to let SEOs publish relevant material that serves query results to mobile and voice recognition systems.

 

Hummingbird is All About Improvement

The improvements following Hummingbird will allow Google to rank your page against the millions of other pages out there on the internet. The best way to attract attention to your website is to use all the tools available to the developer community: start with the basics of simple webpage design, apply the techniques of ad marketing, and always focus on networking. Search engine technology is advancing so quickly that one can easily find oneself perpetuating old, out-of-date methods, so it's important to keep up with the major changes that can enhance your visibility and customer base. Allowing a pro SEO to optimize your website will provide Hummingbird with vital information, improve your standing in the internet community, and draw more customers to your business.

 

 

