Local SEO Services | “Google Clarifies: Guest Blogging Is OK, But ‘Guest Blogging For SEO’ Is Not”

Source : Marketing Land
By : Danny Sullivan
Category : Local SEO Services, Best SEO Company

Relax, publications that use guest bloggers and contributors. Google’s not planning to penalize you under its new “guest blogging equals spam” warning put out yesterday. Rather, the company says, guest blogging is only bad if the main reason a post is run is to gain links meant to influence rankings in Google. Yesterday, the head of Google’s web spam team, Matt Cutts, declared that “guest blogging is done; it’s just gotten too spammy.” As a result, some worried that having guest posts meant they could look forward to a future of being penalized by Google.

Worries That All Guest Posts Are Bad

One especially notable example was award-winning science fiction author Charles Stross, who wrote on Hacker News: “I’m spending three weeks on the road in the next month, so I’ve got three hand-picked guest bloggers taking over the mike on my site, for the duration. Emphasis on hand-picked, i.e. invited because they’re interesting and I’m hoping my readers will enjoy what they’ve got to say. I get to take some time off, they get access to a new audience, and the audience get some new and thought-provoking material — because from my PoV it’s not about SEO, it’s all about the quality of the content. (Hint: I’m a novelist, one of the guests is a film-maker, the other two are other novelists. We all pretty much live or die by the quality of our writing.)”

Guest Posts For More Than SEO Are OK
To deal with such concerns, Cutts updated the title of his post to add the bolded words:

The decay and fall of guest blogging for SEO

He also added more explanation to stress that not all guest blogging is bad:

“I’m not trying to throw the baby out with the bath water. There are still many good reasons to do some guest blogging (exposure, branding, increased reach, community, etc.). Those reasons existed way before Google and they’ll continue into the future. And there are absolutely some fantastic, high-quality guest bloggers out there. I changed the title of this post to make it more clear that I’m talking about guest blogging for search engine optimization (SEO) purposes.

“I’m also not talking about multi-author blogs. High-quality multi-author blogs like Boing Boing have been around since the beginning of the web, and they can be compelling, wonderful, and useful. I just want to highlight that a bunch of low-quality or spam sites have latched on to ‘guest blogging’ as their link-building strategy, and we see a lot more spammy attempts to do guest blogging. Because of that, I’d recommend skepticism (or at least caution) when someone reaches out and offers you a guest blog article.”

In short, the post from Cutts was a continuation of what Google’s long been saying: it wants to reward sites that have “earned” links, rather than sites that have gained links without any real effort. Places that publish anything submitted to them as “guest posts” are just one example of links that aren’t really earned.

No, There’s No Automatic Bad Guest Post Algorithm

Stross also had this comment, which I’ve seen many other people echo: “The question I’m asking is, how do google’s algorithmic processes figure out whether a post is a guest post? Are they doing style analysis on prose now? Or counting outbound links, or looking for anomalous keywords? Or is it just a matter of looking for spam-flavoured advertorial?”

The answer is that it doesn’t, and that it can’t, not easily. Nor did Cutts say that was the case, but it’s easy to see how some might assume that’s what he meant. Cutts is warning people who accept guest blog posts, or those who do a lot of guest blogging, that they might find themselves with a spam penalty that would almost certainly be applied manually, if Google’s spam police did a review. It’s similar to how Google warned against advertorials last year, following a penalty Google applied to Interflora and several UK newspapers over these.

There are some cases where Google has algorithms designed to automatically detect behavior it considers spam, such as the “thin content” that the Panda filter goes after or the link spamming that the Penguin filter attacks. Potentially, Google could try to figure out a way to tell whether a guest post is done “for SEO reasons” or not. But that would be very difficult, and it’s not the case now (or else Cutts would have announced some new filter with an animal name). Rather, Cutts seems to be saying that if you’ve been accepting or doing guest blog posts solely for ranking purposes, be warned: if your site gets flagged for a closer look by the spam team, such activity is now deemed part of the many link schemes it might decide to manually penalize you over.

Source : marketingland.com/google-clarifies-guest-blogging-71201

Best SEO Company | “Emerging Channels Challenge SEO for Mobile Marketing Dollars”

Source : Search Engine Journal
By : John Boitnott
Category : Local SEO Services, Best SEO Company

As more consumers switch to smartphones, emerging channels are gaining importance in a marketing space that SEO has dominated. For years, brands have used SEO strategy to market their products, but it has become harder to get found as an increasing number of search results return unrelated content. Also, according to one entrepreneur, in 2013 organic search results made up only 13 percent of screen space; the rest of the screen was dominated by “ads and junk.” Problems with an SEO-led strategy are only magnified as more consumers view content on mobile devices. For instance, vertical and native search on mobile is continually threatening traditional search. Google’s traditional search traffic had declined 3 percent by the end of 2012.
Emerging Channels Explained

Advertisers are turning to emerging channels for help. These are marketing pipelines that lack established ad exchanges or ad units but hold massive audiences. They include social networks and sites beyond Twitter and Facebook, such as Quora, StumbleUpon and Tumblr. With a strong emphasis on data, advertisers such as Unilever, Rhino Linings, and others are testing out emerging channels too. These methods are especially strong in the mobile ecosystem. Native applications such as Pinterest, Instagram, Pheed, and Vine are designed to encourage organic sharing — even of relevant brand content targeted at specific age ranges and demographics.

The need for a mobile advertising solution is only becoming more pressing. Mobile advertising is growing at 61 percent year over year and will be an $18 billion market by 2015, according to an eMarketer forecast. SEO and SEM will likely remain dominant players in the near future, but mobile has a unique appeal to advertisers.
“The Next Big Thing” – Take Rhino Linings as an example. A leading manufacturer of spray-on truck bed linings, Rhino was looking to engage consumers across mobile channels. Working with San Francisco-based startup Buzzstarter, Rhino got consumers to share infographic content across Instagram, Pinterest, and almost 280 other channels. “Putting Rhino content in the hands of consumers is important to us,” said Russell Lewis, founder of Rhino Linings. “Looking at mobile, we are seeing great engagement at the right place and the right time.” Unilever, the global consumer products company that sells personal care and food products, has recently been exploring emerging channels like Pinterest as a means to engage mobile consumers. Working with their media agency Mindshare, Unilever is finding mobile engagement opportunities closer to the point of sale. All of these initiatives steal away dollars traditionally given to SEO and SEM.

“We are always on the lookout for the next big thing,” said Lou Paik, Shopper Marketing Manager at Unilever.
Entertainment Brands Take Notice

Rhino Linings and Unilever are not alone, as entertainment brands are seeking out emerging channel marketing too. Some of the first advertisers to dive into the space are film studios like CBS Films. They are using more nascent channels to drive shares and views of trailer content, behind-the-scenes information, and film-specific homepage traffic. It culminates in helping drive ticket sales for films’ opening weekends. For a field that has relied on SEM and mostly upstream marketing opportunities for new-release awareness, emerging channels are a welcome, downstream driver of demographic-specific marketing. “Emerging channels allow content creators the opportunity to reach specific demographic subsets at a fraction of the cost,” said Michael Tringe, co-founder of CreatorUp!, a web-series and entertainment marketing firm based in Los Angeles. “Rather than casting a wide net, emerging channels allow entertainment marketers the opportunity to specialize messages and tie ROI more tightly into their marketing budgets.”

Considering the cycle of entertainment offerings and output, we may see more experimentation from other entertainment brands. There is a wide array of options for doing this. In some instances, both marketing and consumption can be mixed, and some pioneers have attempted this. One instance is “Heroes” creator Tim Kring’s online series with production firm The Company P. Called Conspiracy for Good, the program attempted to put viewers in the show across multiple channels. But now, the multitude of channels just starting to take hold is going to open even better possibilities. “What’s really exciting,” said Tringe, “is the opportunity to go where our audience goes.” Using these alternative platforms, creators may be able to reach people much more organically across disparate communication formats. That, in turn, could tie much better into the story they are trying to tell with content products. This type of marketing experimentation also creates a potential new path for nearly all advertisers.

Source : searchenginejournal.com/emerging-channels-challenge-seo-mobile-marketing-dollars/85981/

Local SEO Services | “The 10 Most Common On-Page SEO Gaffes”

Source : Business 2 Community
By : Jason Williams
Category : Local SEO Services, Best SEO Company

What are some common SEO Mistakes to avoid?
That is a great question, and probably the most common question I get asked by friends, family and clients. First off, search engine optimization isn’t brain surgery or quantum mechanics; it’s hundreds of little details that align to make a bigger picture. Many people put tons of time and money into building a beautiful website that is responsive and useful to visitors, but then quit fine-tuning the details.

Here are some common On-Page SEO Mistakes to avoid

1. Not Having Unique Title Tags
Your title tag is the title of the page that shows up on search results pages as well as at the top of your browser. If your website has the same title for every single page, you’re missing out on some valuable SEO real estate. The title tag should explain to the visitor what the page will be about, and it should contain your keyword phrase. (If your webpage was a filing cabinet, your title tag would be on the outside explaining what’s on the inside.)
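For instance, a unique title for a hypothetical product page might look like this minimal sketch (the page and site names are placeholders, not from any real site):

```html
<head>
  <!-- Unique to this page: describes its content and includes the keyword phrase -->
  <title>Blue Widget Pricing &amp; Reviews | Example Widget Co.</title>
</head>
```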

2. Having Title Tags that are too long…
On the search engine results page, there is a limit to how long your title should be. You may have seen how Google truncates titles that are too long, cutting them off with a simple “…” Many people say to keep it below 70 characters, but search engines actually use pixel width to determine when to cut off words. If your title tag is too long, it’s useless, since the visitor won’t even be able to see all of it. Best practice is to keep your title tag to about 65 characters so it won’t get cut off.

3. Not Having Unique Meta Descriptions

Far too often I see clients’ websites that use the exact same meta description for every single page. The meta description is not a ranking factor but rather a quick sentence or two about what information the visitor will find on a page. You might look at it as a quick sales pitch to get visitors to click on your site. The meta description should be written in readable form, with your keywords as close to the beginning as possible.
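A minimal sketch of a page-specific description, again using placeholder content:

```html
<head>
  <!-- Not a ranking factor, but shown as the SERP snippet:
       a short sales pitch with keywords near the front -->
  <meta name="description"
        content="Compare blue widget prices, read customer reviews, and find the right widget for your project.">
</head>
```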

4. Having Meta Descriptions that are too long

Just as the length of your title tag is important, so is the length of your meta description. Try to keep it to about 156 characters so it doesn’t get cut off. Many websites I work on have meta descriptions that are longer than the page content. That does no good, since the visitor will never see the part that gets cut off.

5. Not having H1 tags or having too many H1 tags

Your h1 tag is your page header tag that helps define what your page is about.  A page without an h1 tag is just like a filing cabinet without files in it.  A properly written h1 tag is one of the most important page elements to help with search optimization.  The h1 tells the visitor and search engines what the page is about.  It should support and be relevant to your title tag without saying the exact same thing.  (If your webpage was a filing cabinet, your h1 tag would be the green dividing folder that helps you find what you’re looking for.)

6. Not using h2-h6 header tags

Why should I use h2 tags?  Well, h2 tags not only help break up your page content so a visitor can skim your page to find the information they need, they also help search engines define what your page is about.  Best practice is to use no more than six h2 tags per page; a sample heading hierarchy is sketched below.  (If your webpage was a filing cabinet, your h2-h6 tags would be the manila folders that help you find what you’re looking for.)
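Putting points 5 and 6 together, a page’s heading structure might look something like this hypothetical outline (topic and headings invented for illustration):

```html
<body>
  <!-- One h1 per page: supports the title tag without repeating it word for word -->
  <h1>A Buyer's Guide to Blue Widgets</h1>

  <h2>How Blue Widgets Are Made</h2>
  <p>...</p>

  <h2>Choosing the Right Size</h2>
  <!-- h3-h6 nest under their h2 to break long sections into skimmable chunks -->
  <h3>Small Widgets</h3>
  <p>...</p>
</body>
```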

7. Having links to pages that don’t exist (404 errors)

When we run our free SEO audit for clients, we crawl each page looking for broken links and 404 errors (page not found).  When a search engine spider crawls your site, hitting these dead ends causes it to turn around.  Having too many 404 errors sends a signal that the site may not give visitors a good experience.  Why would these errors even exist?  Usually it’s because a link points to a page that has been moved or deleted.  Fixing the links to point at the proper (or new) page is the first step.  Then, 301 redirecting the URL that no longer exists to the proper page is ultimately what needs to be done.

8. Having poor content on your site

You’ve spent time and money building your site, making it user-friendly and optimized for search engines, and then put up poor content that doesn’t give the visitor anything valuable.  It would be like opening a filing cabinet and finding the right folder, only to discover there isn’t much inside.  Great content provides the visitor with a better experience.

9. Not having an XML sitemap

Of course, search engines can crawl your site link by link without the need for a sitemap, but having an XML sitemap makes it much easier for them.  Creating and submitting an XML sitemap helps search engines easily discover every page of your site.  There are plenty of programs to help you do this; one easy, free XML sitemap generator is www.xml-sitemaps.com.
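Whatever tool you use, the output follows the sitemaps.org protocol. A minimal sketch, with placeholder URLs and dates, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to discover -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/blue-widgets/</loc>
  </url>
</urlset>
```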

10. Not having social sharing buttons

Your site is awesome and people want to tell others about it, but not having social sharing buttons makes sharing your site more difficult.  Adding social sharing buttons allows people to spread the word about your awesome site.

Source : business2community.com/seo/10-common-page-seo-gaffes-0728697

Best SEO Company | “2014 SEO Playbook: On-Page Factors”

Source : Search Engine Land
By : Tom Schmitz
Category : Best SEO Company, Local SEO Search

Welcome to part 2 of my annual SEO playbook. (Click here for part 1.) I have to thank Danny Sullivan and the Search Engine Land team for giving me the perfect outline for the 2014 playbook, the Periodic Table of SEO Success Factors. Part 2 will cover on-page factors, including content, HTML and architecture. You’ll find more than enough food for thought and some very actionable steps. This is not a step-by-step SEO guide, and it’s pretty informal. Before embarking on a search engine optimization campaign, do your research or consult with an expert.

Content: Quality

Quality was a big discussion item during 2013, especially topics like word count and deep content. After Panda, you’d think we would be well past the age of producing short “fluff” articles. However, too many websites, especially business sites that struggle to post fresh content, continue the practice. Recently, I saw a corporate blog post listing 10 must-read books on a topic — the article consisted of 10 thumbnail images and the names of the books, linked to an online bookstore. You can’t afford to keep putting out cut-rate articles like that; in bulk, they are perfect Panda-penalty bait.

On the opposite end is deep content — pages or articles of around 1,500 words or more. Websites have seen success with this content, so it may make sense to take the time spent creating lots of short, “fluffy” posts and use it instead to produce a few longer, more meaningful articles. Whatever you do, make sure content is well written, with attention to grammar and spelling. Don’t just say something; back it up with thoughtful opinion or researched facts. Put some meat on the bones. Personally, when it comes to article content, if I cannot easily pass 450 words, I will combine it with other content or deem it not worth writing about. As for e-commerce descriptions, I used to deem 250 words the sweet spot. Nowadays, I am less concerned about word count and more focused on creating a great list, matching features with benefits.

Content: Keywords

Keyword research is not going anywhere and is still the foundation of all on-site SEO. The difference is, after the Hummingbird update, we are discussing the role of entities, where topics take the place of keywords in the result pages. Google has made great strides in synonym identification and concept grouping — some have even called it the death of the long-tail keyword. (But, as with all the supposed death knells in our field, this, too, is probably an exaggeration.) My advice is to make sure each page stands on its own as a topic. Do not create multiple pages about the same exact thing in order to optimize for different keywords. Instead, stick to single, well-written, citation-worthy, topic pages and optimize them for multiple keywords. This can be another good reason to use long-form content.

Content: Engagement

Engagement is about whether visitors spend time reading your content or bounce quickly away. Once again, meaningful content is key. It’s amazing how it all comes back to quality. Are you publishing something your audience or target personas will want to read, or are you just filling holes in an editorial calendar — or perhaps publishing out of guilt because you have not published anything recently? Engagement isn’t just limited to text content, either; Web page design is equally important. Words don’t just have to read well to be engaging — they have to look good. Readability includes everything from page layout to font selection to letter and line spacing. Additionally, pay attention to navigation and the presentation of links to other content, as these elements can have a huge impact on bounce rates and other visitor engagement metrics such as time on page and time on site.

Content: Ads

Another part of layout is the placement of ads. Search engines will not ding you for having advertisements — that would be hypocritical. What they will penalize is too many ads or inappropriate ad placements. I do not foresee big changes in this area beyond the enhancement of current search engine policies. In addition to display ads, be especially wary of text link ads. Make certain they are content-appropriate or matching, and that you nofollow them. If you still use automated phrase-link advertising inside your content, I strongly suggest you consider removing it. If you use interstitial or pop-up advertising, make sure it doesn’t interfere with the ability of search engines to crawl your pages.

Content: Freshness

I am a big proponent of fresh content — this includes not just posting about hot topics, but also ensuring that you are publishing new content on a regular or frequent basis. Not only is new content important to attract readership, it also improves crawl frequency and depth. Earlier, I wrote that you should not create content just to check off your editorial calendar. Not to backtrack, but if you do not have an editorial calendar in place, you probably should create one and get to work creating content. Think of your content as a tool to generate awareness and trust. This means you must get beyond writing about just your company and its products or services. Go broader and become a resource — a real, viable, honest-to-goodness resource — for your target market and the people or companies that your target market serves. Taking this broad approach will give you more to write about, allowing you to focus on topics of interest to your target market. This is the kind of content you can build an audience with. In my opinion, if you are not trying to build an audience at the top of the marketing funnel, you are probably doing it wrong. Obviously, there are exceptions to this, though I think far more companies fail here than can safely ignore it.

HTML: Titles & Headers

Title tags are interesting right now. The usual rules for writing optimized title tags and headers have not changed. I do foresee search engines (Google especially) rewriting more title tags algorithmically. If you see Google rewriting your title tags, test changing your HTML to the same text Google presents in the SERPs. By test, I mean change a judicious few, then observe what happens to performance indicators. If you see improvement, a broader title tag optimization program could prove worthwhile. Going back to entity search and optimizing for multiple keywords… when you are doing topic optimization, you must be cognizant of which keywords you use in the title and H1 tags. I wish I could give you a surefire formula, but one does not exist. As you look at synonyms, pay attention to which words or phrases received the most exact match searches and trust your intuition when it comes to popular language use.

HTML: Description

I don’t see anything changing with meta description tag optimization. Write unique descriptions for every page. They will not change your rankings, but well-written descriptions can increase click-through rate. I always pay attention to length — around 150 characters. In reality, the actual length depends on the combined pixel width of all characters, but from a practical standpoint just make sure your descriptions are not getting cut off when they appear in the results. For pages that appear in sitelinks, be sure that the portion of the description that appears beneath each link forms a coherent thought. This is a place where many enterprise sites and brands can improve.

HTML: Structured Data Markup

Each year, it seems, structured data markup is a big topic. First is the question of whether or not you should use it for organic search engine optimization. Some long-time experts do not like structured markup or machine-readable language because they do not want to help the search engines present information in a format that does not generate visits. For example, if you type in the name of your favorite NFL team, Google will show you information about that team, including its next scheduled game, right on the SERP. Here’s an example I fondly remember: someone once asked, if you ran a zoo website, would you want Google to show your business hours at the top of the search results, or would you want people to visit the website, where they will learn more about current exhibits and events? This is a fair question — to which I think the fair answer is, whatever will get the most bodies through the door.

Google, Bing and Yahoo are going to show the data they want, in the format they desire, regardless of how you or I feel. Personally, I’d much rather be a trusted source, even if it means my website information is made available in the SERPs. For this reason, I am a huge proponent of structured data markup like schema.org and RDFa. Other forms of structured markup, like the author and publisher tags, are not controversial and have entered the realm of best practices. Use them.
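To make the zoo example concrete, here is a minimal, hypothetical schema.org microdata sketch (the zoo name and hours are invented for illustration):

```html
<!-- schema.org microdata marking up a zoo's opening hours so engines can read them directly -->
<div itemscope itemtype="http://schema.org/Zoo">
  <h1 itemprop="name">Example City Zoo</h1>
  <meta itemprop="openingHours" content="Tu-Su 09:00-17:00">
  <p>Open Tuesday through Sunday, 9:00 a.m. to 5:00 p.m.</p>
</div>
```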

HTML: Keyword Stuffing & Hidden Elements

Negative ranking factors like keyword stuffing and hidden text are so old that many of us practitioners brush them off as search engine optimization 101. Unfortunately, nothing is ever so easy. Stuffing is definitely a factor in e-commerce shopping cart optimization. It can be tricky not to use the same word or phrase over and over again when it appears in categories or descriptions for products. Different shopping carts have different levels of control; some are more easily optimized than others. On category pages, it may be as simple as limiting the number of products you display on each page. Without going into an entire lesson on shopping cart optimization, what I will tell you is this: if you have not done a shopping cart review in the last two years, it is time. Make certain your e-commerce platform is keeping up.

It still surprises me how often I see unintentional cloaking. Usually, it’s the result of a template writer getting around a quirk of the content management system. But I have also seen static links in a template that are cloaked using display: none on some pages while they appear on others, depending on something such as the category. The bottom line is this: if it appears on the page, it should be in the HTML. If it does not appear on the page, it should not appear in the HTML.
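As a hypothetical illustration of that anti-pattern (class names and URLs invented), a template might ship links in every page’s HTML and then hide them with CSS on some pages — exactly the kind of unintentional cloaking described above:

```html
<!-- Anti-pattern: the link exists in the HTML but is invisible to visitors on this page.
     If it does not appear on the page, it should not appear in the HTML. -->
<style>
  .seasonal-nav { display: none; } /* hidden on this category, shown on others */
</style>
<div class="seasonal-nav">
  <a href="/holiday-widgets/">Holiday Widgets</a>
</div>
```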

Architecture: Crawl

Not enough search engine optimizers pay attention to crawl. I realize this is a pretty broad statement, but too many of us get so caught up in everything else that this becomes one of the first things we ignore unless there are red, flashing error messages. Obviously, you want to make sure that search engines can crawl your website and all your pages (at least the ones you want crawled). Keep in mind that if you do not want to botch the flow of PageRank through your site, use the meta noindex, follow tag to exclude pages, not robots.txt.  The other concern you should have is whether or not search engines crawl and capture updates to existing pages in a timely manner. If not, it could be an overall domain authority issue or that PageRank is not flowing deep enough in sufficient quantities. There are tricks to resolve this, such as linking to updated pages from your homepage or a level-one page until the updated deep page gets reached. The more wholesome approach is to make sure that the content which gets updated is naturally close to content or sections of content with higher authority, or to build legitimate internal links from related content that has its own off-site PageRank. I am not telling you all your content should be crawled all the time. Search engines budget crawl frequency and depth for good reasons. What I am saying is manage your website crawl budget and use it well; don’t just leave everything up to chance.
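For reference, a minimal sketch of the meta robots approach recommended above, as it would appear on an excluded page:

```html
<head>
  <!-- Keeps this page out of the index while still letting crawlers follow its links,
       so PageRank continues to flow; a robots.txt block would stop crawling entirely -->
  <meta name="robots" content="noindex, follow">
</head>
```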

Architecture: Duplicate Content

Earlier this year, Matt Cutts stunned the search engine optimization community by telling us not to worry about duplicate content. He assured us that Google will recognize duplicate content, combine the dispersed authority, and present one URL in the SERPs. This is really not a big surprise, as Google has been working toward this for quite some time: Webmaster Tools has had automated parameter identification, and Google spokespersons have discussed duplicate content consolidation, for a while now.
To repeat what I have written before, Google is not the only search engine out there and reality does not always work the way Google says it does. The bottom line is: keep managing your duplicate content by preventing or eliminating as much as possible, and as for the rest, put your canonical tags in place. Speaking of canonical tags, I know a popular hack has been to use one canonical URL, improperly, on all the pages of multipage articles. There are other canonical hacks out there, as well. I’d be wary of these. If you’re using canonical tags, machine-readable content or advanced meta-tags, you’re basically waving a big red flag telling search engines that your website is technically savvy and using search engine optimization. In other words, you’re begging for additional scrutiny. It would not surprise me if Google becomes more fierce in penalizing websites for this type of technical misdirection. Search engines tend to use a soft touch on levying penalties algorithmically for fear they will burn innocent websites. But as we have seen with Panda and Penguin, as they become more confident, they also become more aggressive. If you are optimizing for an employer, keep it clean.
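For the legitimate use, a duplicate URL (a tracking-parameter variant, a print view, and so on) declares its preferred version like this minimal sketch, with a placeholder URL:

```html
<head>
  <!-- On http://www.example.com/blue-widgets/?sessionid=123 (a duplicate),
       point engines at the preferred version of the same content -->
  <link rel="canonical" href="http://www.example.com/blue-widgets/">
</head>
```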

Architecture: Speed

Most websites are not going to see an SEO benefit from increasing the speed of their website. Google has always said only a small fraction of sites are affected by this part of the ranking algorithm. This view seems to be borne out by correlation studies. Honestly, the best test of speed is to take your laptop to the local café and surf around your website. If you are not waiting for pages to load up, then you are probably okay. The exceptions (sites that should be concerned about speed) are large enterprise and e-commerce websites. If you optimize for one of these, shaving a few milliseconds from load time may lower bounce rates and increase conversions or sales.

Architecture: URLs

The current best practices for URLs should hold true throughout 2014. Simple and easily readable URLs are not just about search engine optimization. With today’s multi-tabbed browsers, people are more likely to see your URLs than they are your title tags. I will also add that, when seen in the search engine results pages, readable URLs are more likely to get clicked on than nonsensical ones. If your content management system cannot create readable URLs based on your title tags or will not let you customize URLs, it is probably time for a CMS review. This is now a basic search engine optimization feature, so if your CMS cannot handle it, I wonder about the rest of your CMS’s SEO efficacy.

Architecture: Mobile

2013 was an interesting year for mobile SEO. Google and Bing agree that the ideal configuration is for websites to have a single set of URLs for all devices and to use responsive Web design to present them accordingly. In reality, not all content management systems can handle this, and Web designers have presented case studies of situations where the search engine standard is neither practical nor desirable. If you can execute what Google and Bing recommend, do so. However, if you cannot or have a good reason not to, be sure to use canonical tags that point to the most complete version of each page (probably your desktop version) and employ redirects based on browser platform or screen size. You will not risk a penalty from the search engines as long as your website treats all visitors equally and doesn’t make exceptions for search engine spiders. Basically, this is similar to automatically redirecting visitors based on their geographic location or language preference.

That about wraps it up for on-page SEO factors in 2014. Be on the lookout for Part 3 of my 2014 SEO Playbook, which will cover off-page SEO factors relating to link building, local search and social media.

Source : searchengineland.com/2014-seo-playbook-on-page-factors-178546

Local SEO Services Orlando | “Shoppers Warm Up To Retail On Thanksgiving Day”

Source : Freep
By : Georgea Kovanis
Category : Local SEO Services Orlando, Best Orlando SEO Company

Stores opened earlier than ever Thursday evening for post-Thanksgiving sales, and the shoppers shopped. And shopped. And shopped. Lines at Toys R Us, which opened at 5 p.m., and Best Buy, which opened at 6 p.m., initially seemed shorter than they’ve been in previous years. Amanda Reeser, general manager of the Best Buy in Madison Heights, said it took longer this year for the very long line to form outside her store than it has in years past. But the crowds finally showed up. Minutes before the opening, security personnel figured at least 300 people were standing outside, waiting to get in.

Jessica Berry, 27, of Ferndale said she wouldn’t normally have been able to make the Toys R Us opening at 5 p.m. Thursday. But this year, only a few members of her family gathered for Thanksgiving, and they had brunch, so she had time to shop. She dropped her husband off at a Walmart before driving herself to the Madison Heights Toys R Us, where she waited in line outside the store to get a train set and other toys. Jackie Council, 29, of Warren said she got in line outside the Best Buy in Madison Heights two hours before its 6 p.m. opening. She managed to get the Kindle and game system she wanted, she said, but felt like she’d had to rush her Thanksgiving celebration.

“Just starting early is a lot more hectic,” she said. “You don’t get to spend much time with family.” Nationally, retailers expected 33 million people to shop in stores or online on Thanksgiving Day, according to a National Retail Federation survey. On Friday, 97 million are expected to shop. For the full holiday weekend — Thursday through Sunday — up to 140 million people planned to shop. Last year, 147 million people surveyed said they planned to shop over the holiday weekend. Around metro Detroit, shoppers seemed in good spirits. Almost a half hour before Target’s 8 p.m. opening, people standing in the long line outside the Madison Heights store began singing “Rudolph the Red Nosed Reindeer.” “I love it!” Dee Farmer, 47, of Brighton said of Thanksgiving shopping. She was outside the Brighton Target store, where she’d been warmly ensconced in a tent since Wednesday night.

Farmer, a flight attendant, flew into Detroit Metro Airport from Germany about 3:30 p.m. Wednesday. She rushed home, changed clothes, grabbed her tent and propane heater and was set up in front of the store by 7:30 p.m. — 24 hours and 30 minutes before the store was to open Thanksgiving night. She wanted a 50-inch TV. And an iPad. And a camera. And probably some pajamas. And spending the night in a tent — and lunch from McDonald’s — was worth the wait, she said.

“You become one of these weird people,” she said, the sort of person who has an insulated tent and a propane heater. After Target, she said she would go home and have dinner — unless she decided to go to more stores. Which, she admitted, was a distinct possibility. By 11:15 p.m., there was ample parking at Oakland Mall in Troy, and checking out at Macy’s, which opened at 8 p.m., was relatively quick. The store’s busiest departments: handbags, jewelry, women’s shoes and the home department. The biggest draw? Appliances for $9.99. It seemed no one could resist them, as almost every shopper in the store’s home department seemed to be walking away with a toaster oven or mini vacuum.

Source : freep.com/article/20131128/NEWS05/311280096/Black-Friday-Thanksgiving-Day-shopping

Best Orlando SEO Company | “Don’t Believe The Hype: Google+ Does Not Mean Great SEO”

Source : E-consultancy
By : Henry Elliss
Category : Local SEO Services, Best Orlando SEO Company

Yesterday, I had a rather heated debate with a fellow online marketer on one of the most popular topics within SEO at the moment: namely, the impact of Google+ (and its +1s) on search rankings – or the lack thereof, to be more precise. Let me start this post with a couple of caveats. First up, whilst I’m very much on record as not being a fan of Google+ (I *may* have called it ‘The King’s New Clothes of Social Networking’ a few times), my opinion about the topic in question is entirely unrelated to this.

I may not be a fan, but I certainly recognise the impressive offering Google have developed in the fight against Facebook. I have a Google Plus profile, I encourage our clients to use it too and I pop on there at least once a week to see what’s what. Secondly, and this one goes without saying I suppose,  this post is based on my opinion. But frankly, most of the opposing arguments are also based on opinion. Search all you might (‘scuse the pun), there is almost no plausible or credible proof that +1s have any impact on SERPs or search visibility.

Whilst I firmly believe that, I should also point out what I am not saying. I’m not saying brands (or webmasters) shouldn’t have Google+ profiles. Far from it, as the branded-search coverage alone makes this worthwhile. After all, who doesn’t want a little extra Google real estate when people search your brand? I’m also not saying that Google doesn’t factor ‘social signals’ into search rankings. Far from it, as I’ve long been an advocate of the integration of Search and Social, and can give you plenty of examples of when the two have worked together to give a better result. Finally, I’m not saying that you shouldn’t keep a close eye on how Google develops Plus or +1s in the algorithm in future. What I am saying is that you shouldn’t let passionate or argumentative SEOs convince you that Google+ should be a key part of your site’s strategy. Sadly, this is exactly what I see happening far too much. As a blogger myself (I’m one of the dreaded ‘dad bloggers’ in my spare time!), I’m a member of a number of blogging communities, where my fellow bloggers and I can discuss our plans, opinions, tactics and ideas.

Not a week goes by without one of them reporting that they’ve been told (or they’ve read) that Google+ needs to be central to their strategy, sometimes even going so far as to say they shouldn’t bother with any other social networks in their social strategy. Upon further research, very few of these blog posts, slideshares or stories ever actually include any evidence or facts. They’re essentially hearsay and opinion painted as fact. Get into an argument with one of these passionate writers and you’ll be told things like “Everyone says it, it must be true” and “Prove it doesn’t make a difference!”

The second of these comebacks is baffling to me. As the person saying it doesn’t have an impact, surely the onus isn’t on me to provide proof. That would be like asking an atheist to ‘prove’ that there isn’t a God: can you ever truly prove something doesn’t exist? No, you can’t. As a hardy sceptic (and a casual atheist), I take the same view of Google+ as I do of God: prove it to me categorically and I’ll take you seriously.

Of course, I wouldn’t be any better than them if I didn’t provide any proof of my own. And what better proof than Google itself?
The evidence against…

Despite the fact that Google probably benefits massively from this argument (after all, marketers are telling anybody who will listen that they need to use Google+ more, what isn’t there to love for Google in that?), it has actually been very open and honest about this.

Just last week at PubCon in Las Vegas, Matt Cutts explained that social signals like Likes, retweets and +1s will have no short-term impact on your search performance. They won’t help you rank better, in other words. While he did say that a long-term haul of these social signals ‘could’ have an impact on your influence, the fact remains that Cutts clearly stated social signals from Facebook and Twitter would be just as likely to have this effect as +1s. In other words, Google+ doesn’t need to be the central hub of your social strategy. There have also been a number of third-party studies looking into the effects of Google Plus on rankings. One, by Cyrus Shepard on Moz.com, finds a strong correlation between +1s and rankings though, as the author says, it doesn’t constitute proof in itself. Also, a study reported on Search Engine Land could find no evidence that G+ means better rankings.

Google Plus isn’t big enough to use as a ranking signal

And let’s face it, in all honesty, how could Google hope to maintain a respectable and competitive search engine if it took data from what is undoubtedly still a very small social network?

Google claims to have almost 250m ‘active’ users – compared to Facebook’s 1bn+ (though how it defines ‘active’ is clearly very debatable) – but even the most anecdotal of evidence will tell you that Google+ is used by only a small proportion of users. Take my own Facebook friend list, for instance. I’ve spent some time working out how many of my 700+ Facebook friends are also active on Google+. By basic logic alone, you’d assume it must be at least 150 – if not 200. But how many of them are actually active on Google+? My best estimate said no more than 20, and I’m being quite generous in my definition of active there. The simple truth is, very few people are using Google+ in any great capacity at the moment, so Google would be utterly bonkers to make data from it anything but the tiniest, most inconsequential factor in search rankings. And if anybody tells you otherwise, ask them for proof. And I mean proper proof – not signed-in, short-term boosts – I mean long-term, available-to-all ranking changes which have actual impacts on the average webmaster.

Source : econsultancy.com/np/blog/63665-don-t-believe-the-hype-google-does-not-mean-great-seo?utm_medium=feeds&utm_source=blog

Local SEO Services | “Lenovo Announces “Convertible” Yoga Tablets With 18-Hour Battery Life”

Source : News Lenovo
By : Brandon Hill (Blog)
Category : Local SEO Services, Orlando Local SEO

Lenovo’s new 8″ and 10″ Android tablets feature three operating modes. The tablet wars are starting to heat up. In the past few weeks, we’ve seen a couple of fresh Windows 8.1-based tablets enter the market, along with the second-generation Surface 2 and Surface Pro 2 tablets. We’ve even seen a Windows RT-based entry from Nokia, along with the new iPad Air and iPad mini with Retina Display from Apple. Now it’s Lenovo’s turn (again), and the hardware maker is looking to turn a few heads with its new Yoga Tablet. The Yoga Tablet lives up to its namesake by including three modes of operation:

Hold Mode: Makes it easier to handle the device when reading, making the device more akin to holding a magazine or a book.

Stand Mode: By rotating the cylindrical portion of the tablet’s body, a stand pops out that provides an adjustable viewing angle from 110 degrees to 135 degrees. This mode is beneficial when watching movies or interacting with the tablet on a hard surface.

Tilt Mode: In this mode, the Yoga Tablet can be placed on a desk to allow for easier typing, internet surfing, and playing games.

Under the hood, the Yoga Tablet packs a 1.2GHz Cortex-A7-derived quad-core processor, 1GB of RAM, and your choice of either 16GB or 32GB of onboard storage (a microSD slot is included for additional storage expansion). Other features include a 5MP rear camera, a front-facing camera, and optional 3G connectivity. Unfortunately, the Android-based tablets — which are available in 8” and 10” varieties — only come with a 1280×800 display. Also, the Android 4.2.2 operating system is a step behind Google’s most recent offerings.

The Yoga Tablet weighs in at 1.35 pounds for the 10” model and 0.88 pounds for the 8” model. Both are good for up to 18 hours of battery life. The Yoga Tablets will be available on October 30, with the 8” model going for $249 and the 10” model coming in at $299.

Source : news.lenovo.com/news+releases/yoga-tablet.htm

Local SEO Search Orlando | “Become a Leading SEO Mechanic with Both Google & Bing Webmaster Tools”

Source : searchenginewatch
By : Amanda DiSilvestro
Category : Local SEO Search Orlando, Orlando Local Search Engine Optimization

Webmaster Tools offerings from both Google and Bing can offer a wealth of insight to business owners. In order to get the whole spectrum of insights, marketers must learn just what they can do with both Google and Bing Webmaster tools. Using both together allows you greater insight into the factors contributing to the success—or lack thereof—of your SEO strategy. Internet Marketing Ninjas COO Chris Boggs and Grant Simmons, director of SEO and social product at The Search Agency, shared their advice on better integrating data from Google Webmaster and Bing Webmaster Tools earlier this year at SES San Francisco.

Google Webmaster Tools: Proactively Monitor and Have a Plan in Place to React (P.R.E.P.A.R.E.)

Internet Marketing Ninjas COO/CMO and SEMPO Chairman Chris Boggs started the presentation with the topic everyone really wanted to hear: Google Webmaster Tools (GWT). He started with SEO diagnostic principles and explained that you need to be both proactive and reactive when monitoring SEO. Marketers need to have a plan as well as the ability to manage from a reactive perspective, he said. If you come across something in your diagnoses, your analytics are going to be a good second opinion. Without tools, it’s just a guessing game.

Once you have this in mind, you can start digging into GWT by focusing on a few things first:

1. Quick Barometers

Boggs referred to the “Brand 7 Pack” as a company’s homepage and six sitelinks that appear in search results. If you don’t have seven, you have an SEO problem, he said. Your social entities such as Google+ should also be ranking, with titles that are clear and easy to understand. If you want to see what your domain looks like from Google’s perspective and check the cleanliness of your page titles, type in “site:” and then your domain name without the “www.” You can then go to your Webmaster Tools account to diagnose any problems you may see and determine exactly where the problem lies and how to fix it. From a reactive-mode perspective, look at your analytics and verify; it’s very important for SEOs to live by this mantra. Webmaster Tools isn’t something to take for granted. Have an agency or consultant monitor the findings in GWT and relay information to design, development, and marketing teams.

2. HTML Improvements

Visit the HTML Improvements category to determine if your titles and descriptions look bad on a Google SERP. You can see if Google agrees, then click on anything with blue writing to learn more about the problem. Boggs was asked after the presentation what tool might get users in trouble if they don’t understand it, and this was his answer. He explained that almost every site is going to have some duplicate descriptions and titles, so he wouldn’t try to get that number down to zero. You don’t need to remove every single warning from GWT. How to Find the Tool: Located under Search Appearance.

3. Sitelinks

You can visit the sitelinks tab to demote a certain sitelink (one of the links under your company homepage shown on a search results page like in the screenshot above). Google is going to automatically generate links to appear as your sitelinks, but you can tell Google if you don’t want something there. How to Find the Tool: Located under Search Appearance.

4. Search Queries

Here, you can look at the top pages as well as the top queries for your site. Most people will just take the default information, but Boggs stressed that there are tabs for a reason. Look at the top queries as well as use those “more” tabs to get more information. How to Find the Tool: Located under Search Traffic.

5. Links

You can click on “links to your site” to get a full list of those linking back the most, but the tool that many forget to use is the “internal links” tool. Internal links are very important; Boggs explained it’s worth the time to go through and look at the number of these internal links and then download the table so you can really slice it and dice it. How to Find the Tools: Located under Search Traffic.

6. Manual Actions and Malware

With this tool, no news is good news. If you get a manual action warning, it means you need to do something, probably substantial, in order to keep your rankings where they are. Malware is another report you can check, and another place where you don’t want to see anything. How to Find the Tools: Manual Actions is under Search Traffic; Malware is under Crawl.

7. Index Status

If your index count is 10x the number of pages your site actually has, you might have a problem. The advanced tab here gives you a much better look at that data.
How to Find the Tool: Located under Google Index.

8. Content Keywords

What you want to look for here are the words you are using in your content. You don’t want to see a lot of “here” or promotional phrases. Identify where your gaps are or where you have too much content. How to Find the Tool: Located under Google Index.

9. Crawl Errors

Google now has a feature phone tab to help you with crawl errors. You have to understand any crawl errors that might occur, and remember that you should provide data that is very specific to mobile as well. You can also take a look at your crawl stats, such as time spent downloading a page, and make sure there are no spikes.
How to Find the Tools: Both located under Crawl.

Finally, Boggs explained that Google Webmaster Tools should be thought of proactively by pairing it with Google Analytics. What kinds of things is GWT telling you when it comes to your analytics, and how is that data affected?
In the end, Boggs explained that expertise is knowing the most basic things about SEO and doing them repeatedly, perfectly, every time. You’re going to come across situations where there are a lot of hooks and changes in the algorithm. Something someone might have done one to five years ago could be a very bad move now. That’s part of the game.

Bing Webmaster Tools: Bing Stands for “Bing Is Not Google”

Director of SEO and social product at The Search Agency Grant Simmons began his presentation with the quote “Bing stands for Bing Is Not Google,” and the laughter amongst the marketers and SEOs just about said it all. It’s true; Bing is often not taken as seriously as Google because it just isn’t as popular, yet Bing Webmaster Tools (BWT) does offer some good insights that Google does not.
Once you’re signed up and logged in, consider the top things that you should look at first to really get a handle on BWT:

1. Dashboard

You want to make sure that the pages you think you have are the ones Bing has indexed. If that number isn’t what you expected, ask yourself a few questions: Are they crawling my site frequently? Am I not updating my site? These are all quick things you can see right from the dashboard, and you can even look at search keywords to see how people are finding you.
Quick Fact: Bing doesn’t use Google Analytics.

2. Diagnostic Tools

The Diagnostics & Tools category comprises seven subcategories: keyword research, link explorer, fetch as Bingbot, markup validator, SEO analyzer, verify Bingbot, and site move.
How to Find the Tool: This is a category all on its own!

3. SEO Analyzer

This tool works great when analyzing just one URL. You simply type in the URL and hit “Analyze” to get an overview of the SEO connected with that URL on the right hand side of the page. The tool will highlight any issue your site is having on the page; if you click on that highlighted section, Bing will give you the Bing best practice so you can make improvements.
How to Find the Tool: Located under Diagnostics & Tools.

4. SEO Reports

This tool shares a look at what is going on with your whole site (as opposed to just one URL). You will get a list of SEO suggestions and information about the severity of your issue, as well as a list of links associated with that particular error. The tool runs automatically every other week for all of the sites you have verified with BWT (so not your competitor’s sites).
How to Find the Tool: Located under Reports & Data.

5. Link Explorer

You can run this tool on any website to get an overview of the top links associated with that site (only the top links, however, which is considered one of the limitations of the tool). Export the links into an Excel spreadsheet and then slice and dice the information as you’d like.
How to Find the Tool: Located under Diagnostics & Tools.

6. Inbound Links

Link Explorer is probably one of the more popular tools when it comes to BWT, so it’s certainly worth mentioning. However, according to Simmons, Inbound Links is a better tool that doesn’t have as many limitations. This tool will show you trends over time so you can really see if there is value on deep page links. You can see up to 20,000 links, as well as the anchor text used, with the ability to export.
How to Find the Tool: Located under Reports & Data.

7. Crawl Information

It’s important to remember that the Bing bots are different than the Google bots, and the crawl information tool can help give you insight. From a high level, Simmons explained that when the tool gives you the stats, you should be looking at the challenges you might have from the migration you did last year. Are your 301s still in place? Are they still driving traffic? From the 302 pages, should they be made permanent? It’s also a good idea to look at the last time your site was crawled. If it’s been a while, remember Bing likes fresh content and you may need to make some updates. Again, this information is exportable.

How to Find the Tool: Located under Reports & Data.

8. Index Explorer

Simmons said this is one of the coolest things found in BWT, one reason being that Google doesn’t really have anything like it. You can see stats for a particular page, which can be good to see based on a subdirectory or section of your site. The tool has great filters and offers an awesome visual representation of crawled and indexed pages.

How to Find the Tool: Located under Reports & Data.

Of course, there is a lot more to BWT than just the eight features listed above, including the keyword research tool, geo-targeting, the disavow tool (Bing was the first to offer one), and crawl control. Its features are very comparable to Google’s, the navigation is excellent, and there are even a few extra capabilities. Simmons concluded the presentation by saying that we should really focus on BWT to make a difference.

Source : searchenginewatch.com/article/2302345/Become-a-Leading-SEO-Mechanic-with-Both-Google-Bing-Webmaster-Tools

Local SEO Services – Orlando | “How Google Updates Will Prompt SEO Strategy Changes”

Source : itworldcanada.com
By : Nestor E. Arellano
Category : Local SEO Services, Orlando Local SEO

Web site operators will have to recalibrate their search engine optimization strategies as the impact of three recent big updates issued by Google Inc. begins to kick in. The upgrades include Google Hummingbird, the encryption of all search data, and the shift to a new Keyword Planner tool. These three changes accentuate the increasing significance of optimizing for natural language and mobile search, avoiding emphasis on keywords for SEO, and regularly publishing high-quality content. The changes make it more important for firms to develop a solid content marketing strategy, according to Jayson DeMers, founder of SEO and social media services company AudienceBloom.

Here’s how the updates will impact existing SEO practices:

Hummingbird – Google called its reworked search engine algorithm Hummingbird because, the company says, it’s “fast and precise.” According to DeMers, the upgrade does a better job of understanding the intent of long-tail search queries (queries that include more than a few words) as well as spoken and natural language search queries (like questions asked by users on their smartphones). For example, if a user asked, “What’s the closest place to buy an iPhone 5s to my home?” a traditional search engine might focus on the words “buy” and “iPhone 5s.” Hummingbird will likely better understand the actual location of the user’s home (if the user shared that with Google) and that “place” means an actual brick-and-mortar store, said Danny Sullivan of online technology publication Search Engine Land. He said Hummingbird goes beyond finding Web pages with matching words.

Google’s SEO dictum of original, high-quality content remains, but this could mean that Google will be looking for content and websites that deliver a better mobile experience, said DeMers. This means companies need to up their mobile content game, he said.

Content is still king, said Adam Stetzer, a writer for search engine online publication Search Engine Watch, but it is important for Web sites to be able to answer specific questions, because Hummingbird is good at matching long-tail queries.

Search data encryption – Except for clicks on Google AdWords ads, Google is now encrypting all search query data. This means that keywords typed into a Google query are protected by SSL encryption even if the user is not signed in to a Google account. The move could be aimed at blocking spying, but some SEO experts also believe Google is curtailing access to its free keyword data and encouraging SEO professionals to take out paid AdWords campaigns instead. Before the move, Google Analytics showed the number of visits each keyword or search phrase brought to a site during a given period, the percentage of new visits resulting from the keyword phrase, the bounce rate, and other data.
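The mechanics are easy to see in code. Before SSL search, the query rode along in the referrer URL’s q= parameter; encrypted search strips it, which is why analytics tools now report “(not provided)”. Below is a minimal sketch of that difference (my illustration, not from the article; the referrer strings are hypothetical).

```python
# Illustrative only: extract the search keyword from a Google referrer URL.
# Pre-encryption referrers carried the query in "q="; post-encryption they
# arrive without it, so analytics falls back to "(not provided)".
from urllib.parse import urlparse, parse_qs

def keyword_from_referrer(referrer: str) -> str:
    params = parse_qs(urlparse(referrer).query)
    q = params.get("q", [""])[0]
    return q if q else "(not provided)"

# Pre-encryption style referrer (hypothetical values):
print(keyword_from_referrer("http://www.google.com/search?q=local+seo+services"))
# -> local seo services

# Post-encryption referrer with the query stripped:
print(keyword_from_referrer("https://www.google.com/"))
# -> (not provided)
```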

Source : itworldcanada.com/post/how-google-updates-will-prompt-seo-strategy-changes

Local SEO Search Orlando | “6 Major Google Changes Reveal The Future Of SEO”

Source       : searchenginewatch.com
By               : Eric Enge
Category   : Local SEO Search Orlando, Orlando Local Search Engine Optimization

The last few weeks have been amazing. Google has made some big changes, and they are all part of a longer-term strategy with many components. In short, Google is doing a brilliant job of pushing people away from tactical SEO behavior and toward a more strategic approach. You could argue that “tactical SEO is dead”, but that’s not quite right. And don’t run around saying “SEO is dead” because that is far from the truth, and I might just scream at you. Instead, let’s take a few steps back and understand the big picture. Here’s a look at the major developments, some of Google’s initiatives driving this change, and the overall impact these changes will have on SEO.

1. ‘(Not Provided)’
Google made the move to make all organic searches secure starting September 23. This means we’ve lost the ability to get keyword data for users arriving at our websites from Google search. Losing Google keyword data is sad for a number of reasons. It affects publishers in many ways, including losing a valuable tool for understanding the intent of customers who come to their site, for conversion optimization, and much more. For tactical SEO efforts, it just means that keyword data is harder to come by. There are ways to work around this, for now, but it just won’t be quite as simple as it used to be.
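One workaround practitioners have discussed is to redistribute the “(not provided)” bucket across the keywords you can still see, in proportion to their visible traffic. Here is a rough sketch under my own assumptions; the sample numbers are made up, and this yields an estimate, not an official method.

```python
# Rough sketch: spread the "(not provided)" visits across visible keywords
# proportionally. Sample data is invented for illustration.
visits = {
    "blue widgets": 120,
    "buy widgets online": 80,
    "(not provided)": 300,
}

known = {kw: n for kw, n in visits.items() if kw != "(not provided)"}
hidden = visits.get("(not provided)", 0)
total_known = sum(known.values())  # assumes at least one visible keyword

estimated = {
    kw: n + hidden * n / total_known  # scale each visible keyword up
    for kw, n in known.items()
}
for kw, n in sorted(estimated.items(), key=lambda kv: -kv[1]):
    print(f"{kw}: ~{n:.0f} estimated visits")
```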

2. No PageRank Update Since February

Historically, Google has updated the PageRank numbers shown in the Google Toolbar every 3 months or so, but those numbers haven’t been updated since February. This means 8 months have gone by, or two updates have been skipped. In addition, Google’s Distinguished Engineer Matt Cutts has said Toolbar PageRank won’t be updated again this year, leading many to speculate that PageRank is going away. I won’t miss it because I don’t look at PageRank often and I normally don’t have a Google toolbar in my browser. However, a lot of people still use it as a crude measurement of a site’s prominence. For sites with a home page that has a PageRank of 7 or higher, it may in fact be reasonable to assume that the site has some chops. Correspondingly, if a site’s home page has a PageRank of 3 or lower, the site is either new or probably a low-quality experience. For stuff in the middle, you just don’t know.

If Google shuts off this data flow entirely, which wouldn’t be surprising, then we will have to rely on other real-world (and better) measurements instead. That would actually be an improvement over using PageRank, because Google says they don’t use the toolbar number that way themselves, so why should we?
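For readers who only ever saw the 0–10 toolbar number, a toy refresher on what it roughly derived from: the underlying PageRank score is the stationary distribution of a “random surfer” over the link graph, which the toolbar compressed onto a coarse logarithmic scale. The following is a textbook power-iteration sketch on a made-up three-page graph, not Google’s implementation.

```python
# Toy PageRank via power iteration on a hypothetical three-page link graph.
links = {  # page -> pages it links to
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}

damping = 0.85
pages = list(links)
rank = {p: 1 / len(pages) for p in pages}  # start uniform

for _ in range(50):  # iterate until roughly converged
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```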

3. Hummingbird

There are a few elements to Google’s Hummingbird algorithm, announced in time for Google’s official birthday, but like Caffeine before it, this is really a major platform change. Google has built a capability to understand conversational search queries much better than before.
For example, submit a query to Google such as “show me pictures of Fenway Park”, and it does just that, returning Knowledge Graph images of the park. Then you can follow that query with “who plays there”, and Google understands that “there” refers to Fenway Park and returns the teams that play at the stadium. Both of these show conversational search at work (but note that the Boston Beacons folded in 1968 after just one season, so that result contains an error – it shows that they have much work to do!).

Hummingbird really changes the keyword game quite a bit. Over time, exact keyword matches will no longer be such a big deal. The impact of this algorithm is likely to be quite substantial over the next two years or so. Net-net, Google has drastically reduced access to the raw data and is rolling out technology that changes the way it all works, all at the same time!

4. Google+
OK, this one isn’t new. Google launched Google+ on June 28, 2011. While it seemed to get off to a slow start, many argue that it has developed a lot of momentum and is growing rapidly. The data on Google+’s market share is pretty hard to parse, but there are some clear impacts on search, such as the display of personalized results. You can also see posts from people on Google+ show up in the results, even if you perform your search in “incognito” mode.

And, while I firmly believe that a link in a Google+ share isn’t treated like a regular web link, it seems likely to me that it does have some SEO value when combined with other factors. How Google+ fits into this picture is that it was built from the ground up to be a content sharing network that helps with establishing “identities” and “semantic relevance”. It does this quite well, and in spite of what you might read in some places, there is a ton of activity in all kinds of different verticals on Google+.

5. Authorship
OK, authorship also isn’t new (it launched on June 7, 2011), but it is a part of a bigger picture. Google can use it to associate new pieces of content with the person who wrote them. Over time, this data can potentially be used to measure which authors write material that draws a very strong response (links, social shares, +1s, comments) and give them a higher “Author Rank” (note that Google doesn’t use this term, but those of us in the industry do). We won’t delve into the specifics of how Author Rank might work now, but you can read “Want to Rank in Google? Build Your Author Rank Now” for my thoughts on ways they could look at that. That said, in the future you can imagine that Google could use this as a ranking signal for queries where more comprehensive articles are likely to be a good response. Bottom line: your personal authority matters. I also should mention Publisher Rank, the concept of building a site’s authority, which is arguably more important. Getting this payoff depends on a holistic approach to building your authority.
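For context, authorship at the time was established with a rel="author" link from the article to the writer’s Google+ profile. Here is a naive sketch for auditing whether a page carries such a link; the URL is a placeholder, and a production audit would use a real HTML parser rather than regular expressions.

```python
# Naive sketch: scan a page's HTML for rel="author" links.
import re
import requests

def find_author_links(page_url: str) -> list[str]:
    html = requests.get(page_url, timeout=10).text
    hrefs = []
    # Scan <a> and <link> tags; crude, but fine for a quick audit.
    for tag in re.findall(r"<(?:a|link)\b[^>]*>", html):
        if re.search(r'rel=["\']author["\']', tag):
            m = re.search(r'href=["\']([^"\']+)["\']', tag)
            if m:
                hrefs.append(m.group(1))
    return hrefs

# Placeholder URL for illustration:
print(find_author_links("http://example.com/some-article"))
```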

6. In-Depth Articles
Google announced a new feature, in-depth articles, on August 6. The Google announcement included a statement that “up to 10% of users’ daily information needs involve learning about a broad topic.” That is a pretty big number, and I think this feature will become a big deal over time. Effectively, it is an entirely new way to rank in the SERPs. It also increases the payoff from Author Rank and Publisher Rank – there is a lot to be gained by developing both, assuming that Google actually does make them a ranking factor at some point. Note that I wrote some thoughts on how the role of in-depth articles could evolve.
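Google’s guidance for in-depth articles pointed publishers at schema.org Article markup. As a hedged illustration, here is one way to generate such a block, serialized as JSON-LD for readability; the field values are placeholders and the property set is illustrative rather than a definitive list.

```python
# Illustrative sketch: build a schema.org Article block and emit it inside
# a script tag. Field values are placeholders, not required properties.
import json

article = {
    "@context": "http://schema.org",
    "@type": "Article",
    "headline": "6 Major Google Changes Reveal the Future of SEO",
    "datePublished": "2013-10-15",
    "description": "How recent Google updates push SEO from tactics to strategy.",
    "author": {"@type": "Person", "name": "Eric Enge"},
}

print('<script type="application/ld+json">')
print(json.dumps(article, indent=2))
print("</script>")
```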
Is There a Pattern Here?

Yes, there is. The data Google has taken away was historically used by publishers to optimize their SEO efforts in a very tactical manner: How do I get higher PageRank? What are the keywords I should optimize for? Taking these things out of the picture will reduce the focus on those types of goals. On the other side of the coin, the six major Google changes listed above are all moves that encourage more strategic behavior. Note that I didn’t bring up Google Now, which is also a really big deal and another big piece of the Google plan, just not a major driver of the point I’m trying to make today. All of these new pieces play a role in getting people to focus on their authority, semantic relevance, and the user experience. Again, this is what Google wants.

For clarity, I’m not saying that Google designed these initiatives specifically to stop people from being tactical and make them strategic. I don’t really know that. It may simply be the case that Google operates from a frame of reference that it wants to find and reward outstanding sites, pages, and authors that offer outstanding answers to users’ search queries. But the practical impact is the same. The focus now is on understanding your target users, producing great content, establishing your authority and visibility, and providing a great experience for the users of your site. Properly architecting your site so that the search engines can understand it, including using schema and related markup and addressing local search (if that is relevant to you), still matters, too.

But the obsession with tactical items like PageRank and keywords is going to fade away. As Google tweaks the way its service operates and looks for ways to capture new signals, it does things that naturally push you in that direction. It isn’t going to stop. Expect more of the same going forward!

Source : searchenginewatch.com/article/2301719/6-Major-Google-Changes-Reveal-the-Future-of-SEO