Web Hosting News
Written by Taylor Hawes
Monday, December 16th, 2013
Google has changed. The Internet has changed. And because Google is so ubiquitous, changes to the search giant’s algorithms fundamentally reshape how we approach content discovery. This change can be scary, but knowing how it works, what to expect, and how it affects you will make all the difference as these revisions hit your site. In this post, we’re outlining 5 things you need to know about Google Hummingbird.
1. The Search Query Has Changed
In the beginning, search engines indexed information using a rather primitive method of keyword indexing. These indexes did not understand human language; they simply represented an amalgamation of terms associated with locations, weighted by popularity and inbound links. To accommodate this format, those searching for information had to truncate full, intelligent sentences into keywords and phrases that rubbed the algorithm the right way. Doing so would yield results, but with limited success.
Hummingbird throws that playbook out the window. Many of the old factors still exist, including keywords and PageRank, but these contribute to a formula that accounts for 200 different factors when returning results. In doing so, the engine works to incorporate long-form queries and human speech patterns to influence the relevance and quality of search results. What this means for you: no longer will your pages be judged simply on primitive factors. Relevant, original, and interesting information is, for the first time, being revealed and shuttled forth to interested eyes in dynamic new ways.
2. Blame Fluff For The Changes
The changes are not baseless; this isn’t simply revision for revision’s sake. Google’s efforts are born of an era of Internet content in which traditional methods could be exploited, placing unoriginal, uninteresting, and unengaging (though keyword-dense) content in front of curious viewers, to the detriment of their search efforts and of the reputation of websites offering compelling work.
3. Hummingbird Works in a Series
In this fact lies, perhaps, the greatest change to Google’s underlying engine. Previously, queries were submitted and results were returned based on a number of factors, but each query represented a fresh start, effectively limiting the ability to drill down into a topic when further insight was sought. The Hummingbird engine takes a new approach to the process of search, incorporating human behavior as a central tenet.
Continued searches are now viewed with a combination of order and context based on previous searches. If this sounds confusing, here’s a breakdown: each search in a series is understood by the engine in a different way. Initial queries are viewed as browsing, offering surface information and broad responses. A follow-up search related to the topic reveals more in-depth information. This series continues, retrieving information to a greater degree of specificity based on the search order and length of specific queries. In doing so, the engine emulates the human research process, seeking broad concepts and then working down to the details, in order to facilitate knowledge acquisition.
For commercial firms, this procedural search opens new doors to information previously buried in the hierarchies of corporate websites. Until recently, pages needed carefully crafted keywords to delineate their use as a more robust and authoritative resource. Now, however, the series cuts the guesswork out of the process. Those searching for “umbrellas” will receive a ranked list of relevant firms. A further search for “canvas umbrellas” will offer product pages and information matching the description, the engine understanding the greater refinement of the request. Another search for “waxed canvas umbrellas for under $100” will narrow the product recommendations and the information provided, comprehending that, at this point in your journey, you are likely ready to buy a specific product. Beyond this step in the funnel lies information for existing customers: technical specifications, how-to instructions, and maintenance references, to name a few.
4. Original, Informative Content is the Future
This series of steps and refined keyword comprehension means one thing: original, engaging content is the future. Rote, keyword-dense answers aimed at courting site traffic are no longer the ringleaders. In particular, Hummingbird favors authoritative, information-rich sources, piggybacking off Google Plus authorship and publishership to tailor results to a fatigued and discerning public.
Since the engine is based on the promise of delivering answers to questions, this, above all else, should drive future content efforts. Offer FAQ pages, Q&A blog content, how-to posts, and interviews that focus on questions and answers to assert your authority in a particular avenue. Offer industry debates and “ask the expert” posts in order to drive your traffic as a firm that offers valuable information. In all things, remember that users are asking questions. Your job is to have the answers.
5. SEO is Evolving
In this way, SEO isn’t disappearing but evolving. As mentioned, Google’s revision comes largely in response to users’ desire for more relevant content, tired of disappointing front-page entries that simply “played the game”. Traditional methods of link-mining, keyword stuffing, and cheap, overly sensationalist titles will receive less reward than ever before.
In place of these methods is a combination of traditional keywords and long-tail keywords. When embedding information in your page, your prior expertise in researching relevant keywords will still play a part, but stuffing the box will not. Focus on the integral terms that hone your page down to its proffered expertise and value, and in addition to these single keywords, incorporate longer terms that effectively answer questions. In particular, observe the algorithm’s treatment of single keywords as indicative of broad information, 2-3-word keywords as more in-depth research and learning, 3-4-word keywords as detailed information, and 4+-word keywords as specialist information for customers and experts.
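As a rough illustration of the tiers described above, a query could be bucketed purely by word count. This is a simplified sketch; the tier names and exact cutoffs are our own interpretation of the pattern, not anything Google publishes:

```python
def classify_query(query: str) -> str:
    """Bucket a search query by word count, mirroring the
    broad-to-specialist tiers described above. The labels and
    thresholds are illustrative, not part of any Google API."""
    words = len(query.split())
    if words <= 1:
        return "broad"        # single keyword: surface information
    elif words <= 3:
        return "research"     # 2-3 words: in-depth learning
    elif words == 4:
        return "detailed"     # ~4 words: detailed information
    else:
        return "specialist"   # longer: customer/expert queries

print(classify_query("umbrellas"))                              # broad
print(classify_query("waxed canvas umbrellas for under $100"))  # specialist
```

Mapped onto the umbrella example from section 3, each successive query in the series would land one tier deeper in the funnel.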
Hummingbird’s changes are unlikely to lose you traffic, but the science behind search engines has changed profoundly, necessitating adaptation. Gone are the days of gaming the system, and here is an era of authority and originality. Series of queries will yield more robust results, as unearthing helpful content and answers is the goal. Optimize your site for the new format by including single-word terms and longer, more robust keywords in tandem. The combination may hurt impostors, but as a genuine vendor of valuable information, consider a ticker-tape parade and a bottle of Champagne.
Written by Taylor Hawes
Wednesday, November 6th, 2013
In May of 2013, former National Security Agency contractor Edward Snowden fled the United States with classified documentation revealing some of the most sophisticated and prolific spying on the public in American history. The PRISM program he divulged is an extensive campaign that utilizes classified intelligence directives to acquire “metadata” from major Internet players like Google and Yahoo. Since then, Snowden has brought to light myriad directives of similar ilk, geared toward data collection in the name of intelligence efforts.
In a recent leak, however, it was revealed that PRISM’s scope pales in comparison to that of the NSA’s international data-mining project, known by the codename MUSCULAR and run in tandem with the British GCHQ. The program, it was shown, taps the linkages between Google and Yahoo data centers, mining entire data flows and shipping the data back to NSA warehouses in Fort Meade.
The NSA program utilizes a structural flaw in the two companies’ architecture. Yahoo and Google maintain high speeds through decentralized data centers spanning multiple continents and connected by thousands of miles of fiber optic cable. In order to maximize performance, these data centers continuously sync information between repositories, including whole user accounts and email indexes.
To obtain the information it desired, the NSA needed to circumvent exemplary data-security protocols, including 24-hour guards, biometric identity verification, and heat-sensitive cameras at the data centers. According to the Washington Post’s article, company sources had reason to believe that their internal networks were safe.
Despite these measures, a weakness was uncovered. An internal NSA slide show leaked by Snowden contained a hand-drawn diagram outlining the transition point between Google internal networks and user computers. The drawing highlighted Google front-end servers as the weak point, noting that these servers actively decrypted information and could be exploited for data acquisition purposes.
Neither company was aware of the backdoor intrusion. Both acknowledged and acquiesced to front-end requests for data but maintained that their internal networks were secure. Google’s vice president for security engineering, Eric Grosse, had even announced plans to encrypt the linkages between data centers, while still presuming those networks were secure.
Since the leak, both companies have reacted in outrage. Google’s chief legal officer, David Drummond, remarked on the subject: “We have long been concerned about the possibility of this kind of snooping, which is why we have continued to extend encryption across more and more Google services and links, especially the links in the slide.” Yahoo commented: “We have strict controls in place to protect the security of our data centers, and we have not given access to our data centers to the NSA or to any other government agency.”
Legally speaking, the NSA is exploiting a loophole related to international espionage practices. While Congressional oversight has limited domestic spying, international monitoring remains less inhibited. Because the data centers of the two Internet giants span multiple continents, interception of these data flows is technically permitted under Section 702 of the FISA Amendments Act of 2008.
This international monitoring occurs with the cooperation of the British GCHQ. The UK agency maintains a data cache that can hold three to five days of traffic data before recycling storage. During this time, NSA software utilizes search terms to sift desirable data from the dregs. This data, once identified, is shipped via fiber-optic cables to the data warehouses in Fort Meade. The information, the agency claims, has produced intelligence leads against “hostile foreign governments.” So far, this assertion of intelligence value remains largely unsubstantiated, likely due to the classified nature of such leads.
The scope of the MUSCULAR program is evident in the volume of search terms used while sifting through acquired data. According to records, these inquiries include 100,000 terms, more than twice the number used in the PRISM program. The volume indicated in the Washington Post’s documents topped 181 million records over a 30-day period. The data acquired includes who sent or received emails, their subjects, when and where they were sent, and the text, audio, and video content of these messages.
The program strikes a chord with both companies due to its unique nature. Both organizations were willing participants in the collection of data through front-end means, but the back-end intrusion remains uncharacteristically aggressive. Google, as mentioned, will move to encrypt its internal networks; Yahoo has not indicated whether it will do the same.
The ramifications of these revelations are yet to be seen, but it is likely that, in the wake of the negative public reaction to the PRISM documents, the sentiment will be similar. Ultimately, the continued exposure of agency programs demonstrates the interconnected and heavily monitored nature of our digital communications, a fact that can no longer go unacknowledged.
Written by Duncan Cumming
Wednesday, September 11th, 2013
Although Google Glass won’t be on sale until 2014, and it will be years before the futuristic technology penetrates the device market far enough for advertisers to invest in its instant, interactive capabilities, Google has just been granted its “Pay Per Gaze” patent, so it’s certainly on the minds of your PPC agency, website designers, and webmasters.
AdWords was introduced in 2000, and PPC as we know it began in earnest in 2002. It is now a multi-million-dollar industry whose advertisers and digital marketing specialists are keen to explore every new opportunity, and each new device and platform translates to just that: a new opportunity.
The Google Pay Per Gaze Patent
Google’s Pay Per Gaze patent was filed back in 2011 for a “head mounted gaze tracking device” that would send images from the direction of the wearer’s gaze to a server; the server would then identify relevant adverts and charge the advertiser.
The patent is not limited to online advertising; it can also cover advertisements in the user’s physical environment that they view and interact with. Google, however, has been a little non-committal about whether it will be implemented at all, noting that not all patents get developed into products.
The patent also suggests the capability to assess a user’s emotional response to an advert and react accordingly.
If you’re not prepared to wait for Pay Per Gaze to become a reality, if it ever does (although, realistically, it’s only natural to expect Google to want to make a few million from advertising through the next step in wearable computing), there are other Pay Per Click alternatives to discuss with your digital marketing agency.
Will it just be Google Glass?
It remains to be seen whether Google’s Pay Per Gaze patent will give it a complete monopoly on the head-mounted-device PPC industry, but there are certainly smart eyewear competitors to Google Glass emerging, as Digital Trends reports:
Sony Smart Glasses
Sony already produces 3D glasses for gaming, but in 2012 it filed patents for devices capable of transmitting information to others and for a pair of glasses with displays for both eyes.
Microsoft filed a patent in 2011 that included layering information on top of live-action events, and its other patents have covered Xbox and gaming smart eyewear.
The competitor round-up would not be complete without an addition from Apple, which has filed much vaguer patents suggesting it has been researching the area but is more likely to hit the market with an iWatch sooner.
The report also includes potential products from lesser-known players, some with capabilities already on the market, like gaming glasses and eyewear incorporating digital cameras, but with far less potential so far to send you running to your digital marketing agency to initiate a Pay Per Gaze campaign.
Having established his career in digital sales and marketing, Duncan Cumming formed his own digital marketing agency, Cayenne Red. Along with the running of his business, Duncan spends time writing informative and helpful articles about the different areas of online marketing.
Written by Sean Valant
Friday, August 23rd, 2013
Yesterday (August 22nd, 2013), a massive number of IP addresses used for email gateways on virtually every web host in the world became blacklisted on multiple networks. The result was a global inability for email to be received any time it originated from one of the blacklisted IPs and was destined for one of the blacklisting networks.
The issue is ongoing at the time of this writing, and some customers are still affected at this moment, but HostGator was one of the first companies to successfully mitigate the situation, and we have since been assisting other companies with the issue. As it stands, we are now working to get our IPs removed from the blacklists and restore full worldwide email deliverability from our network.
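For administrators who want to verify whether a given gateway IP appears on a public blacklist, most such lists are DNS-based blackhole lists (DNSBLs): you reverse the IP’s octets, prefix them to the list’s zone, and a successful A-record lookup means the IP is listed. A minimal sketch (`zen.spamhaus.org` is one example of a real public DNSBL; which lists any given mail provider actually consults varies):

```python
import socket

def dnsbl_query_name(ip: str, zone: str) -> str:
    """Build the reversed-octet hostname used to query a DNSBL,
    e.g. 192.0.2.1 against zen.spamhaus.org -> 1.2.0.192.zen.spamhaus.org"""
    reversed_octets = ".".join(reversed(ip.split(".")))
    return f"{reversed_octets}.{zone}"

def is_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    """True if the DNSBL returns any record for this IP, i.e. it is
    blacklisted on that list; NXDOMAIN means the IP is clean."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, zone))
        return True
    except socket.gaierror:
        return False
```

Checking each gateway IP against the major lists this way is a quick first step before filing delisting requests.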
This situation resulted from a combination of multiple factors stretching back a few months. Before we explain the circumstances, we want to once again stress the importance of keeping all scripts on all hosting accounts updated. Failure to update scripts, as well as not exercising basic security practices, is what allows situations like this to continue to occur. An outdated script on a hosting account is akin to an unlocked car left in a parking lot… it’s an invitation for maliciousness by unscrupulous individuals.
Unlike the situation back in April that affected WordPress, this time the target was Joomla. Back in May, there was a string of exploits against known vulnerabilities in Joomla. These vulnerabilities, related to a component called JCE, had previously been addressed via certain mod_sec rules. However, a workaround was discovered that allowed malware to be installed, and later activated, to enable the uploading and execution of mailing scripts.
These mailing scripts were activated en masse yesterday, beginning a massive spamming campaign resulting in the blacklisting of email gateway IPs worldwide. One of the largest networks with users reporting issues initially was AOL, resulting in us creating this forum post.
As with all issues of this nature, there are lessons to be learned. The most important lesson here is to (again) keep all scripts on your hosting account up-to-date. Most scripts have a one-click feature to update them anytime a new version is released. Keeping scripts up-to-date is paramount in ensuring a secure hosting account.
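There is no single way to audit every script at once, but the core of any such check is comparing an installed version against the latest release, and numeric comparison matters because plain string comparison gets multi-digit components wrong. A small sketch (the version numbers below are illustrative; each CMS exposes its installed version differently):

```python
def version_tuple(v: str) -> tuple:
    """Turn '3.1.5' into (3, 1, 5) for safe numeric comparison;
    naive string comparison would rank '3.10' below '3.2'."""
    return tuple(int(part) for part in v.split("."))

def is_outdated(installed: str, latest: str) -> bool:
    """True if the installed version is strictly older than the latest."""
    return version_tuple(installed) < version_tuple(latest)

print(is_outdated("2.5.9", "2.5.14"))  # True: time to run that one-click update
```

A cron job applying this comparison across each account’s installed scripts is one simple way to surface updates before attackers do.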
HostGator has now added additional monitoring capability to our systems that will alert us to situations like this even faster than yesterday. Our work is ongoing, though we should have the majority of the blocks resolved by tomorrow (spam lists move slowly, with good reason). But remember: there is no better way to keep your car safe than to lock it. Please take this moment to log into your hosting scripts’ back-ends and ensure they are up-to-date. Don’t give the bad guys an open door to walk through.