Small Business SEO | “Google’s Hummingbird Algorithm From A Content Strategist’s Perspective”

Source     : Scholars and Rogues
By              : Sam Smith
Category : Small Business SEO, Local SEO

Marketing and Search aren’t different things anymore, if they ever were. Google recently implemented their new “Hummingbird” organic search algorithm, perhaps the company’s most significant overhaul in more than a decade. Thomas Claburn at Information Week explains that Hummingbird is an expansion of Google’s Knowledge Graph, which was “introduced last year as a way to help its search engine understand the relationships between concepts rather than simply matching keywords in documents. The Knowledge Graph structures data, so that a search for, say, Marie Curie, returns facts about her contributions to science, her life, her family and other related information, not all of which are necessarily contained in the same document.”

Danny Sullivan’s FAQ at Search Engine Land noted the drive toward understanding how people actually articulate what they’re interested in (especially as we move beyond text-based searches with other interfaces, such as voice search and Google Glass). Searchers will continue to move from keywords to conversations. Danny used an example search: “what’s the closest place to buy the iPhone 5s to my home?” Rather than simply finding pages that include the words “iPhone 5s” and “buy,” Danny notes that with Hummingbird, Google can better understand that you’re looking for a physical store (the “place”) near your current location that carries iPhones (and that an iPhone is an electronic device).
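Danny’s example hints at the basic shift: from scoring raw word overlap to extracting structured intent from a conversational query. A toy sketch in Python makes the contrast concrete (purely illustrative; the slot vocabularies and function names here are my own invention and bear no resemblance to Google’s actual systems):

```python
# Toy illustration (not Google's algorithm): keyword matching vs.
# pulling structured intent out of a conversational query.

def keyword_match(query, page_text):
    """Score a page by simple word overlap with the query."""
    q = set(query.lower().split())
    p = set(page_text.lower().split())
    return len(q & p)

# Hypothetical slot vocabularies for an "intent"-style interpretation.
LOCATION_WORDS = {"closest", "near", "nearby", "home"}
TRANSACTION_WORDS = {"buy", "purchase", "price"}

def parse_intent(query):
    """Map a conversational query to structured slots instead of bare keywords."""
    words = set(query.lower().replace("?", "").split())
    return {
        "wants_physical_store": bool(words & LOCATION_WORDS),
        "wants_to_buy": bool(words & TRANSACTION_WORDS),
    }

print(keyword_match("buy iPhone", "iPhone 5s in stock buy today"))  # 2
intent = parse_intent("what's the closest place to buy the iPhone 5s to my home?")
print(intent)  # {'wants_physical_store': True, 'wants_to_buy': True}
```

The point of the sketch: the keyword matcher only counts shared words, while the intent parser recognizes that “closest” and “home” signal a local, physical-store query, which is roughly the kind of inference Hummingbird is described as making at vastly greater sophistication.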

Until recently, Google’s infrastructure didn’t have the ability to understand an entire query; it could only evaluate individual words. If you revisit the video Google published last year of an internal search quality review meeting, you’ll see that part of the discussion centered on the fact that for longer queries Google’s algorithms could only evaluate portions at a time, rather than the whole. Those of you with a longing for deeper technical considerations will probably appreciate Bill Slawski’s analysis at SEO by the Sea. For me, in my new career in the search industry, this update comes at an interesting time. I’ve long advocated for greater customer focus and integration across audience touchpoints, and while this seems a simple and obvious enough goal, obstacles abound (from new and evolving communications technologies and channels to traditional internal political dynamics). For too many organizations, search remains on the periphery, unintegrated into the strategic fabric of marketing and communication development. Search pros are frequently relegated to the role of tactical specialist, brought in after the fact to lipstick the pig.

Viewed from this perspective, Hummingbird represents an incredible opportunity. Google’s basic philosophy seems pretty simple: help searchers quickly find exactly what they’re looking for. On the Web publishing side of the equation, that means Google works to mitigate spamminess and gaming tactics, penalizing sites that fail to answer user questions and provide meaningful, useful content, and elevating those of high quality and utility. On the searcher interface side, the problem is considerably more complicated. How can the zillions of mind-numbingly complex ways in which a human brain frames a query be translated into a workable mathematical language? First the searcher has to understand what he or she wants. Sometimes this is easy – who delivers pizza in West Seattle? – and other times not so much – I want something different for dinner tonight.

Then the searcher has to figure out what word or sequence of words, in what order, best articulates the query. There’s occasionally some effort involved in translating the desire into Google’s language, and most of us have probably encountered cases where we had to try several times before we found the right term.
The search engine has to interpret the request and route it to the pages that best address it. Finally, the searcher has to hope that the Web site with the right answer/information is presenting it in such a way that the appropriate page pops up in the search results, preferably at the very top.

Each of these steps embodies potential signal:noise challenges, and at the core of things is an essential technological reality: Google speaks a little bit of human and humans have learned to speak a bit of Google, but at present the two do not have a shared fluency in any language. Each is getting better, but the conversation isn’t yet as fluid as we’d all like it to be.

Back to Hummingbird and what it tells us about the future. The hypothetical perfect search experience blows right through Semantic Web 3.0 and on into what, for lack of a better term, I’ll call the Intuitive Web – Web X.0. In this idealized, sci-fi world search is an almost telepathic process. The user wants X, types/speaks the query, and up comes exactly what he/she asked for. Feel free to think of this in terms of Amit Singhal’s famous Star Trek story. Larry Page once described the perfect search engine as understanding exactly what you mean and giving you back exactly what you want. In Singhal’s words: “It’s very much like the computer I dreamt about as a child growing up in India, glued to our black-and-white TV for every episode of Star Trek. I imagined a future where a Starship computer would be able to answer any question I might ask, instantly. Today, we’re closer to that dream than I ever thought possible during my working life.” (More here.)

The results transcend the limitations of articulation (the system noise associated with the user finding the right words), addressing what was meant. They may even include something that’s better than what the searcher was envisioning – truly intuitive search is going to include a healthy serendipity function. An example. Say I’m a little fuzzy about that night in Kauai a few years ago and I’m trying to tell my friend about this incredible drink I had. I know it was some sort of rum punch and we were at this fantastic steak place. That much I remember. Intuitive Google could quickly sift through all of my confusion and figure out that I’m probably talking about the Tai Chi at Brennecke’s Broiler. (Which, btw, I can’t recommend highly enough.)

In other words, thanks to Hummingbird and its progeny, Google has learned to speak human. If we’re deep enough into a perfectly integrated big-data/CRM world, the intuitive/predictive potential for Google is staggering, and I’ll leave it to you to decide if you’d like to be thrilled or, having seen Minority Report, terrified.

After reading numerous articles about Hummingbird over the last couple of weeks, I think I’m beginning to see the general shape of Google’s grand vision. They want to migrate from our current model, where search is a process comprising several discrete steps, each rife with static, into something that’s so responsive it’s completely transparent. From the user perspective, this is a really attractive proposition. Search companies, though, are faced with the task of evolving, preferably in a strategic fashion. While technical optimization is explicitly about minimizing noise and maximizing signal, the content side has some unfortunate history re: spamminess – link farming, irrelevant on-page content, duplicate content, keyword stuffing, etc. These sorts of manipulative tactics are pure noise generation. In an ideal, organic world, there’s a theoretical one-to-one link between what I want and what your company provides. As soon as you and your competitors begin gaming the process, though, you introduce artificiality: the goal is no longer to hook me up with what I need, it’s to hook me up to your bottom line. If I ask who you think the best band in town is, I don’t need some guy I don’t know (who’s being paid by Citizen Dick’s management agency) “optimizing” your answer, right?

Given this, you’re probably just as happy as I am when Google smacks down the spammers. With Hummingbird (and Penguin and Panda and 100% [not provided]) Google is moving inexorably in the direction of perfecting the signal:noise ratio. In a world where its algorithms are increasingly intuitive and predictive, there will be less and less call for ad hoc, tactical SEO as we currently know it. The task will no longer be about optimizing content after the fact – it will be about producing helpful content that tells a client’s story in the most useful manner for the searcher. Google X.0 wants content that’s compelling and honest from the get-go. As the graphic in Eric Enge’s piece at Search Engine Land put it, Google wants you to act online like search engines don’t exist. I’m a storyteller, and there have been plenty of times when bad SEO marching orders got in the way of how I instinctively wanted to describe the value of a service or product. I was encouraged to communicate poorly, frankly, because the system had to be manipulated. And this manipulation necessarily imposed barriers to the free flow of communication between myself and my audience.

Google X.0? Tell your story, clearly articulating your offering’s benefits and value. Act like search doesn’t exist.

In other words, Hummingbird is great news for everybody. It will be wonderful for those of us searching – you won’t have to figure out what words Google will understand because now it speaks your language fluently. It will be great for businesses – instead of worrying about how to game their sites to deal with keyword tyranny, they can simply get back to clearly describing their features and benefits. It will be great for other (that is, non-commercial) Web publishers (like S&R and your favorite political, music, sports or hobby blog) because intuitive search levels the field by favoring content that provides the audience with information it cares about. It will even be good for search firms, although yeah, the game is changing for them. This is the core idea that Vanessa Fox was articulating, way back in 2010, in Marketing in the Age of Google.

In a nutshell, the search industry needs to be envisioning its own future in terms of integration into the broad marketing and communications process. Once upon a time the Internet was this new thing that companies didn’t understand. PR outreach to online-only publishers and journalists was non-existent. Then came social media, and again we saw a long, painful process as confused businesses struggled to see how it fit into their marketing strategy. Mobile? Forget about it. The same is true for search. At this point, SEO is still generally treated as a separate thing from marketing and communication. It’s related, sure, but is rarely, if ever, treated as a fully integrated piece of the marcom function. This is going to change, and Google will make sure of it. Over time, it will be increasingly essential that search principles be embedded in the marketing organization from the point of conception. Instead of being an external function brought in after the fact to “optimize” the storytelling process, search will strategically inform the storytelling, product development and upstream marketing/research processes.

People who currently know nothing about search need to adopt a search-savvy view of their jobs. And on the other side of the equation, I think it’s going to be important for search people to begin conceiving of themselves as marketing professionals instead of as search specialists. This last bit is critical, because specialists don’t get invited into the strategic decision-making process very often. They’re regarded as tacticians only, and are rarely solicited for their opinions on anything other than technical concerns. Search-infused marketing in a Hummingbird world must be strategic and it must have access to the C-level. If not, it’s going to have a hard time in the Google X.0 future.


Local SEO Search Orlando | “When The Government Shuts, Even Web Sites Go Down”

Source      :
By               : VIKAS BAJAJ
Category   : Local SEO Search Orlando, Best Orlando SEO Company, Local SEO Search

When the federal government last shut down in 1995, most agencies and departments had either no presence on the Internet or a very basic Web site. Since then, agency Web sites have become the primary public face of the government for most Americans who do not live in or near Washington. So it’s perhaps not surprising that the decision by agencies such as NASA and the Library of Congress to take down their sites this week has gotten a lot of attention.

In keeping with the senseless nature of the shutdown, some Web sites are down while others are still up. The Federal Trade Commission, for instance, has blocked access to its site. It has posted a notice online saying that it’s closed indefinitely, as are its systems for people to register complaints or enter telephone numbers on the do-not-call list. By contrast, the Department of Education has left its site up with a notice informing visitors that it will not be updated during the shutdown. Sites for the White House, Treasury and the Internal Revenue Service are being updated at least in part. (Here’s a pretty comprehensive list of which sites are up and which are not.)

Each department and agency has had to decide what to do with its Web site based on its interpretation of federal laws and rules. In a memo (PDF) written last month, the Office of Management and Budget offered some guidance to officials trying to figure out what to do. It said in part:

The mere benefit of continued access by the public to information about the agency’s activities would not warrant the retention of personnel or the obligation of funds to maintain (or update) the agency’s website during such a lapse. However, if maintenance of the website is necessary to avoid significant damage to the execution of authorized or excepted activities (e.g., maintenance of the IRS website may be necessary to allow for tax filings and tax collection, which are activities that continue during an appropriations lapse), then the website should remain operational even if its costs are funded through appropriations that have lapsed.

In further keeping with the truly bizarre nature of government shutdowns, the O.M.B. also reminded government officials that they should pay no attention to whether it will cost more to shut down their Web site than it does to keep it going.


Orlando Small Business SEO | “Yahoo In Recycled Email Privacy Row”

Source       :
By                : Jane Wakefield – Technology reporter
Category   : Orlando Local Search Engine Optimization, Orlando Small Business SEO

Yahoo email addresses reassigned to a new owner are receiving personal emails intended for the previous owner. One man told news site Information Week that he had received emails with some highly sensitive information in them. In June the web firm announced Yahoo addresses and IDs would be reassigned if they had been inactive for a year. Privacy experts called on Yahoo to address the issue “immediately”. Yahoo says it has taken a series of measures to overcome privacy and security fears. “Before recycling inactive accounts we attempted to reach the account owners [in] multiple ways to notify them that they needed to log in to their account or it would be subject to recycling,” a Yahoo representative told the BBC.

“We took many precautions to ensure this was done safely – including deleting any private data from the previous account owner, sending bounce-backs to the senders for at least 30-60 days letting them know the account no longer existed and unsubscribing the accounts from commercial mail.” It is also in the process of rolling out a feature called “Not My Email” where users can report an email that is not intended for them.

The process will come as little comfort to the previous owner of an email account now owned by Tom Jenkins, an IT security professional. Mr Jenkins told Information Week: “I can gain access to their Pandora account [online radio] but I won’t. I can gain access to their Facebook account, but I won’t. I know their name, address and phone number. I know where their child goes to school. I know the last four digits of their social security number. I know they had an eye doctor’s appointment last week and I was just invited to their friend’s wedding.” Other users have revealed that they have also received messages that contain personally identifiable information.

Intimate data
“I recommend logging into your Yahoo account every six months or so in order to ensure that you retain control over it,” said security expert Lee Munson.  Privacy experts said that the issues were inevitable. “These problems were flagged by security and privacy experts a few months ago when Yahoo announced their intention to recycle old emails, and cautioned that Yahoo’s plan created significant security and privacy risks. Yahoo downplayed these risks, and ignored critics, but now we see these concerns were legitimate,” said Mike Rispoli, spokesman for Privacy International. “This email recycling scheme, an effort to re-engage old users and attract new ones, is resulting in some of our most intimate data being accessed by someone we don’t know and without our knowledge.

“We’re talking about account passwords, contacts for friends and families, medical records – this issue needs to be addressed immediately by Yahoo if they care about the privacy of their users and want them to trust the company with sensitive information.”


Local SEO Services | “2013 Google SEO Ranking Factors”

Source    :
By       : Krystal Shannon | Business 2 Community
Category  : Local SEO Services

SEO insiders at Moz recently released their annual study charting Ranking Factors for 2013. Below, I simplify the study’s conclusions and explain what the data means to you. With Moz data and ORM expertise, you’ll get a clear picture of the future of Google rankings — in 2013 and beyond.

Here are five essential insights for 2013:

Backlinks are rising, and Domain Keywords are losing ground.

Moz says: The Page Authority model within the Mozscape index addresses ranking ability based on links. Links were the study factor most strongly correlated with high Google rankings. When comparing exact match and partial match domains, Moz concluded that Google was still adjusting its ranking guidelines. The study analysts speculate that Google may be removing low-quality exact match domains, which would account for the remaining domains ranking higher in the search engine results pages despite being less prevalent.

Social Signals are (still) essential.

Moz says: Google +1s and Facebook shares remain vital. Both were highly correlated with strong Google rankings. Social signals in general held one of the strongest correlation positions of all SEO dimensions examined in the Moz study. As Moz noted, Google+ is edging out Facebook and Twitter, primarily due to the intimate link between the network and its parent search company. We’ve known for years that social signals are essential attributes. Well-positioned URLs almost always exhibit a high volume of shares, likes and re-Tweets. When users are looking at top-ranking search results, URLs with high amounts of social activity stand out clearly. True, social networking activity is increasing across the board. But regardless of general activity levels, highly-shared content continues to correlate strongly with search rankings. Don’t overlook social signals – their correlation with rankings isn’t going anywhere but up.
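One caveat worth keeping in mind: Moz’s numbers describe correlation, not causation. As a refresher on what a correlation figure actually is, a Pearson r can be computed in a few lines of Python (the share counts and rank scores below are invented for illustration and are not Moz data):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented example: social share counts vs. ranking scores for five URLs
# (higher rank_score = better position in the results).
shares = [1200, 800, 450, 90, 10]
rank_score = [10, 8, 7, 3, 1]

print(round(pearson_r(shares, rank_score), 2))  # 0.96
```

A value near 1.0, like the one here, is the kind of “strong correlation” the study reports; it says the two quantities move together, not that adding shares causes rankings to rise.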

Strong Content is your number one asset.

Moz says: Based on 2013 findings, Moz predicts that traditional ranking factors, including anchor text and exact match domains, will be dwarfed by new site assessment guidelines. A site will be judged by its perceived value to readers, content authorship and social signals, among other factors. Content must be engaging and original at all times. As Google aims to optimize the user search experience, the engine will attempt to determine which sites deliver useful, relevant and engaging content and which sites fall short. Plant yourself firmly in the first category by maintaining content that is fresh, meaningful and of high quality. Promote your engaging content with social signals. As outlined above, social signals remain an essential ranking tool. Encourage users to read, respond and engage with effective social management.

Don’t let On-Page Standards slip.

Moz says: The study found a relatively high correlation between rankings and title tags, HTML body text, H1 tags and meta descriptions. In other words, on-page keywords still correlate significantly with search engine rankings. As you weigh on-page keyword saturation, ask whether your site reflects optimum on-page implementation. Basic technical requirements must be met if you expect a site to rank. But keep in mind that fulfilling the criteria for on-page and tech-oriented keyword usage will not inherently secure strong rankings. Web pages that FAIL to meet the criteria, however, are guaranteed to fall short in rankings. Consider on-page tech and keyword building to be building blocks that MUST be established before you move on to more significant ranking approaches.
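As a sketch of how those on-page basics can be audited mechanically, the snippet below uses Python’s standard html.parser to check a page for a title, an H1 and a meta description. This is my own illustration, not a tool from the Moz study:

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Record whether a page contains the basic on-page elements
    the Moz study correlated with rankings."""

    def __init__(self):
        super().__init__()
        self.found = {"title": False, "h1": False, "meta_description": False}

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self.found[tag] = True
        # A meta description looks like <meta name="description" content="...">
        if tag == "meta" and dict(attrs).get("name") == "description":
            self.found["meta_description"] = True

audit = OnPageAudit()
audit.feed("""<html><head><title>Rum Punch Recipes</title>
<meta name="description" content="Classic tiki drinks, explained."></head>
<body><h1>Rum Punch Recipes</h1></body></html>""")
print(audit.found)  # {'title': True, 'h1': True, 'meta_description': True}
```

A page missing any of these elements would show False for that key; as the study notes, passing such a checklist is necessary but nowhere near sufficient for strong rankings.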


Anchor Text still matters.

Moz says: Anchor text continues to display a vital correlation with higher search rankings. Moz notes that this correlation continues in spite of Penguin updates. While Google attempts to cut down on over-optimized anchor text options, correlations remain high for both partial anchor text and exact match anchor text.

In short, 2013 search ranking factors indicate the strengthening of some elements (backlinks, brand privilege, social signals) while others fade (domain keywords). Evolve with the times. Cover the basics by implementing strong and effective on-page keyword standards, and let quality content be your guide as you enhance your search engine status.
