Best Orlando SEO Company | “First Computer Made Of Carbon Nanotubes Is Unveiled”

Source      :
By               : James Morgan – Science reporter
Category  : Best Orlando SEO Company, Orlando Local Search Engine Optimization



The first computer built entirely with carbon nanotubes has been unveiled, opening the door to a new generation of digital devices. “Cedric” is only a basic prototype but could be developed into a machine which is smaller, faster and more efficient than today’s silicon models. Nanotubes have long been touted as the heir to silicon’s throne, but building a working computer has proven awkward. The breakthrough by Stanford University engineers is published in Nature. Cedric is the most complex carbon-based electronic system yet realised. So is it fast? Not at all. It might have been in 1955. The computer operates on just one bit of information, and can only count to 32.
“In human terms, Cedric can count on his hands and sort the alphabet. But he is, in the full sense of the word, a computer,” says co-author Max Shulaker.  “There is no limit to the tasks it can perform, given enough memory”. In computing parlance, Cedric is “Turing complete”. In principle, it could be used to solve any computational problem. It runs a basic operating system which allows it to swap back and forth between two tasks – for instance, counting and sorting numbers. And unlike previous carbon-based computers, Cedric gets the answer right every time.
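The task-swapping described above can be illustrated with a toy round-robin scheduler. This is purely an illustrative sketch in Python, not the Stanford team's code: two generator "tasks", one counting and one sorting, are advanced one step at a time, the way a minimal operating system interleaves work between them.

```python
# Toy illustration of swapping between two tasks (counting and sorting),
# the way the article describes Cedric's basic operating system working.

def counter(limit):
    """Count upward one step at a time."""
    n = 0
    while n < limit:
        n += 1
        yield ("count", n)

def sorter(items):
    """Insertion sort, yielding the list after each pass."""
    data = list(items)
    for i in range(1, len(data)):
        j = i
        while j > 0 and data[j - 1] > data[j]:
            data[j - 1], data[j] = data[j], data[j - 1]
            j -= 1
        yield ("sort", list(data))

def round_robin(tasks):
    """Interleave tasks one step at a time, like a minimal scheduler."""
    results = []
    while tasks:
        task = tasks.pop(0)
        try:
            results.append(next(task))
            tasks.append(task)   # task not finished: put it back in line
        except StopIteration:
            pass                 # task done: drop it
    return results

steps = round_robin([counter(3), sorter([3, 1, 2])])
```

Each entry in `steps` alternates between the two tasks until both finish, which is the "swap back and forth" behaviour the paragraph describes.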

“People have been talking about a new era of carbon nanotube electronics, but there have been few demonstrations. Here is the proof,” said Prof Subhasish Mitra, lead author on the study.  The Stanford team hopes their achievement will galvanise efforts to find a commercial successor to silicon chips, which could soon encounter their physical limits.  Carbon nanotubes (CNTs) are hollow cylinders composed of a single sheet of carbon atoms. They have exceptional properties which make them ideal as a semiconductor material for building transistors, the on-off switches at the heart of electronics.  For starters, CNTs are so thin – thousands could fit side-by-side in a human hair – that it takes very little energy to switch them off. “Think of it as stepping on a garden hose. The thinner the pipe, the easier it is to shut off the flow,” said HS Philip Wong, co-author on the study. But while single-nanotube transistors have been around for 15 years, no-one had ever put the jigsaw pieces together to make a useful computing device.

So how did the Stanford team succeed where others failed? By overcoming two common bugbears which have bedevilled carbon computing. First, CNTs do not grow in neat, parallel lines. “When you try and line them up on a wafer, you get a bowl of noodles,” says Mitra.  The Stanford team built chips with CNTs which are 99.5% aligned – and designed a clever algorithm to bypass the remaining 0.5% which are askew.  They also eliminated a second type of imperfection – “metallic” CNTs – a small fraction of which always conduct electricity, instead of acting like semiconductors that can be switched off. To expunge these rogue elements, the team switched off all the “good” CNTs, then pumped the remaining “bad” ones full of electricity – until they vaporised. The result is a functioning circuit.

The Stanford team call their two-pronged technique “imperfection-immune design”. Its greatest trick? You don’t even have to know where the imperfections lie – you just “zap” the whole thing. “These are initial necessary steps in taking carbon nanotubes from the chemistry lab to a real environment,” said Supratik Guha, director of physical sciences for IBM’s Thomas J Watson Research Center.  But hang on – what if, say, Intel, or another chip company, called up and said “I want a billion of these”. Could Cedric be scaled up and factory-produced? In principle, yes: “There is no roadblock”, says Franz Kreupl, of the Technical University of Munich in Germany.  “If research efforts are focused towards a scaled-up (64-bit) and scaled-down (20-nanometre transistor) version of this computer, we might soon be able to type on one.” Shrinking the transistors is the next challenge for the Stanford team. At a width of eight microns they are fatter than today’s most advanced silicon chips.

But while it may take a few years to achieve this gold standard, it is now only a matter of time – there is no technological barrier, said Max Shulaker. “In terms of size, IBM has already demonstrated a nine-nanometre CNT transistor. And as for manufacturing, our design is compatible with current industry processes. We used the same tools as Intel, Samsung or whoever. So the billions of dollars invested into silicon has not been wasted, and can be applied for CNTs.” For 40 years we have been predicting the end of silicon. Perhaps that end is now in sight.



Search Engine Optimization-SEO | “Amazon’s Kindle HDX Tablets Get Remote Mayday Help”

Source      :
By               : Leo Kelion – Technology reporter
Category  : Small Business SEO , Search Engine Optimization SEO

Amazon’s latest tablets will include the ability to call up round-the-clock tech support via a video box at the press of a button. The Kindle Fire HDX’s Mayday feature will allow one of the firm’s employees to explain how to work the machine or take control if necessary. There is no additional cost involved. One analyst said the facility should help Amazon to stand out from other tablets, including Google’s Nexus and Apple’s iPad. “The new Mayday feature is a clever way to reach out to new tablet users,” said Thomas Husson, principal analyst at tech advisers Forrester. “Coupled with an affordable price for the lower-end Kindle Fire HD and new entertainment content and features, Amazon is clearly willing to appeal to the masses.” In addition to Mayday, Amazon is also introducing the ability to download selected movies and TV shows from its Prime Instant Video service so that they can be viewed when the user does not have an internet connection. Rival on-demand services Netflix and Hulu do not offer this facility.

While the new features should help Amazon attract new customers, one expert warned that some might have security concerns about the firm building in software that allowed a third party to take over the tablet. “With any device that has any kind of remote access on it, there is always going to be that risk that it could be either hacked or abused,” said Chris Green, from the Davies Murphy Group consultancy. “But it’s such a small danger that the benefits outweigh the risks, and the average consumer or business user shouldn’t be put off from storing useful information because they fear it could be compromised.” Amazon has said that the Mayday feature can be disabled and stressed that it is a one-way video feed, so that the adviser cannot see the tablet owner.

The firm has not revealed how many assistants it has employed, but said it was aiming for a response time of 15 seconds or less. Chief executive Jeff Bezos added that it would be “ready for Christmas morning” – likely to be one of the busiest days. Mr Bezos previously told the BBC that his company sold its Kindle devices at cost, but one industry-watcher said investing in pop-up advisers still made financial sense. “Amazon’s strategy has always been not to make profits from hardware sales but from selling content on those devices,” said Nitin Bhas from Juniper Research, a consultancy. “Adding round-the-clock tech support is a winning strategy but a long-term one and provides Amazon with a platform to expand further.” Upgraded system – Amazon also announced it was introducing the third version of its Fire operating system, codenamed Mojito, which will power the Kindle Fire HDX. The OS is a variant of Google’s Android system, but is designed to highlight recently downloaded items on the home screen as well as other Amazon-specific services.

These include:

X-Ray, which offers access to synchronised lyrics for songs, details of music played during films, and background information about characters in books. The US firm said that developers would be able to offer software already developed for Google’s system with “little to no [additional] work”. But experts warn there may still be snags.

“We know there have been some instances of compatibility problems with apps that wouldn’t run on the existing Kindle Fire devices because they were taking advantage of functionality and shortcuts that are in the standard Google build that weren’t present or were broken in the Amazon version,” said Mr Green. “It sounds like Amazon has tried to do its best to mitigate the number of instances where this is likely to occur, but the simple fact is that the Android software has forked. There will still be issues where applications written for ‘normal’ Android will be expecting features, functions and more importantly other apps to be on the system that are simply not there, and that is going to cause some issues.” The Kindle HDX – which comes with either a 7in (17.8cm) or 8.9in (22.6cm) display – will be released in the US on 18 October. It is powered by Qualcomm’s Snapdragon 800 quad-core processor, which Amazon said was three times as powerful as the Texas Instruments chip in its earlier model. It will cost between $229 (£143) and $579 (£362) depending on the amount of storage, screen size and whether or not a 4G data chip is included. Amazon has not announced when it will become available in other markets.


Orlando Local Search Engine Optimization | “Oracle Makes Java More Relevant Than Ever – For Free”

Source      :
By               : Robert McMillan
Category  : Orlando Local Search Engine Optimization, Orlando Small Business SEO

When Oracle bought Sun Microsystems four years ago, it quickly and ruthlessly started tearing out the unprofitable stuff. Sun was a company run by engineers, a Xerox-PARC-like outfit where a cool idea about the next big thing was all it took to get a budget. But Oracle is run by the accountants. Inside Larry Ellison’s company, either the numbers add up, or your project dies. As it turns out, many of the key ideas behind today’s hottest trends were thought up by Sun engineers, but Sun was forced to watch as other companies — Amazon, Google, and so on, and so forth — reaped the rewards. Oracle wasn’t going to let that happen again.

Except that it has.

As WIRED reported today, the Java development platform is experiencing a renaissance of sorts, as hot web companies grow out of their mid-2000s programming tools and look for something that can help them more effectively juggle tens or even hundreds of millions of users. Invented by Sun, Java is now overseen by Oracle, and yet, as those big web companies embrace Java in such a big way, Oracle is on the outside looking in. When it was founded back in 2006, Twitter’s programmers used Ruby on Rails. But as the service grew, it became clear that Ruby wasn’t the best way to juggle tweets from millions of people across the globe. Now Twitter runs on Java, as do large parts of Google, Foursquare, and LinkedIn.

Inside these companies, there are thousands of servers running the Java Virtual Machine, or JVM, a piece of software that executes programming code. And the JVM is built by Oracle. But it’s available under an open source license, which means the company is fostering one of the hottest trends on the internet while missing out on licensing fees. Take LinkedIn. It uses the free JVM, but that doesn’t help Oracle’s bottom line. “We don’t actually use many Oracle Java tools other than Java itself,” says Jay Kreps, a principal staff engineer with LinkedIn. “They seem to target enterprise development, which has a pretty different set of needs.”

Oracle clearly likes licensing fees. It launched a high-profile (and, amongst developers, unpopular) lawsuit against Google, saying that the search giant should pay Oracle copyright licensing fees after building a copy of the Java virtual machine. Oracle lost that case, but it’s appealing the verdict. LinkedIn’s Kreps, like others we’ve interviewed for this story, thinks that Oracle has done a pretty good job managing its Java open source project since it shelled out $7.4 billion for Sun back in 2010. “To their great credit, Java’s only gotten more valuable under Oracle’s stewardship,” says Jonathan Schwartz, the former CEO of Sun Microsystems.

Oracle has actually opened up Java even more — getting rid of some of the closed-door machinations that used to be part of the Java standards-making process. Java has been raked over the coals for security problems over the past few years, but Oracle has kept regular updates coming. And it’s working on a major upgrade to Java, due early next year. But it’s hard to tell how much dough Oracle actually makes from the platform. To be sure, Oracle does have a financial interest in Java. The company makes a lot of money selling an expensive and widely used Java middleware server called the Oracle Weblogic Server. And it makes money licensing Java to companies such as IBM so they can ship it with their servers.

But the widely used open source JVM is not a big money maker. Oracle can make some money from companies that want bug-fixes for obsolete versions, but that’s about it. We asked Oracle for a comment on its Java plans on Friday, but by press time Tuesday night, the company still couldn’t find anyone willing to discuss this. For David Blevins, the CEO at Java developer Tomitribe, Oracle’s limited financial opportunity is nothing but a good thing. “If it was a bigger money-maker for them, they would lock it down like crazy,” he says. “It’s almost to our advantage that it isn’t a primary path to their revenue stream.” So, at least one small part of Oracle is run like Sun.


Orlando Local Search Engine Optimization | “Apple Sets New Record For iPhones Sales Launch”

Source      :
By               : AFP
Category  : Orlando Local Search Engine Optimization, Orlando Small Business SEO


NEW YORK (AFP) – Apple said Monday it sold a record nine million iPhones in the three days after launching two new versions of the smartphone. “This is our best iPhone launch yet — more than nine million new iPhones sold — new record for first weekend sales,” said Apple chief executive Tim Cook in a statement. The figures from Apple appeared to defy predictions from its critics that the company is losing momentum in the smartphone market and in innovation.

“Apple gets the last laugh,” said Roger Kay, analyst at Endpoint Technologies Associates, in a tweet. Apple said demand has exceeded the supply for the new handsets, and that some customers will have to wait. “The demand for the new iPhones has been incredible, and while we’ve sold out of our initial supply of iPhone 5s, stores continue to receive new iPhone shipments regularly,” Cook said.

“We appreciate everyone’s patience and are working hard to build enough new iPhones for everyone.” Apple began worldwide sales Friday of the high-end iPhone 5S and a lower-cost iPhone 5C, drawing crowds from Australia to Tokyo to Paris to New York.

The new phones are being sold in the United States, Britain, Australia, Canada, China, France, Germany, Hong Kong, Japan, Puerto Rico and Singapore. Apple also said more than 200 million of its smartphones and tablets are now running the redesigned operating system iOS 7, “making it the fastest software upgrade in history.” The new operating system has a bolder look, and includes the free iTunes Radio launched by Apple. It is a free upgrade for a number of iPhones and iPads sold in the past couple of years. Apple said in a filing with the Securities and Exchange Commission that with the strong response to the new iPhones, it now expects revenue for the fourth fiscal quarter to be near the high end of its range of $34 billion to $37 billion, and that profit margins will also be near the high end of its estimate of 36 to 37 percent. Apple shares, which have been under pressure in recent months, shot up 4.97 percent to close at $490.64.

The news from Apple “implies a message from management that the company is back on track,” said Ben Reitzes at Barclays. Telecom analyst Jeff Kagan called the sales “stellar” but said Apple may have boosted the numbers by not allowing pre-orders for the iPhone 5S.

“Traditionally users could pre-order devices. Not this year,” Kagan said. “This year anyone who wanted a new iPhone must get into line. That bolstered the lines and strengthened opening weekend from a PR perspective. This upset users, but made for great numbers.”

Apple faced criticism for not cutting the price of its iPhone 5C as much as some had expected to appeal to emerging markets and budget-conscious buyers. The lower-cost iPhone sold for $99 in the United States with a carrier subsidy, but $549 without one, and more in other countries. Walter Piecyk at BTIG Research praised Apple for “an incredible manufacturing feat” in getting so many devices to market and said he expects the company to sell 34 million iPhones in the fourth fiscal quarter. All of this gives Apple strong momentum, Piecyk said: “The positive impact on Apple from the press around the product launch, record sales and recovering stock price cannot be underestimated.”

A survey by the research firm Localytics said the more expensive iPhone 5S was outselling the 5C in the US market by a margin of more than three to one, and by a five-to-one margin in Japan. “This makes sense, since those who feel the need to buy a new device the very weekend it launches are most likely the power users who want the highest-end phone experience,” said Localytics analyst Bernd Leger.

“It’s not altogether clear whether poorer countries are buying more 5c’s compared to 5s’s, but it will be good to keep a close eye on this data in the next few days to see if there is any pattern emerging.” The news came the same day BlackBerry, which just a few years ago was near the top of the smartphone market, said it had reached a deal to sell the company for $4.7 billion, after weak sales of its new handsets which led to losses of nearly $1 billion.


Search Engine Optimization SEO | “Global Warming ‘Hiatus’ Puts Climate Change Scientists On The Spot”

Source      :
By             : Monte Morin
Category  : Search Engine Optimization SEO, Best Orlando SEO Company

It’s a climate puzzle that has vexed scientists for more than a decade and added fuel to the arguments of those who insist man-made global warming is a myth.  Since just before the start of the 21st century, the Earth’s average global surface temperature has failed to rise despite soaring levels of heat-trapping greenhouse gases and years of dire warnings from environmental advocates. Now, as scientists with the Intergovernmental Panel on Climate Change gather in Sweden this week to approve portions of the IPCC’s fifth assessment report, they are finding themselves pressured to explain this glaring discrepancy. The panel, a United Nations creation that shared the 2007 Nobel Peace Prize with Al Gore, hopes to brief world leaders on the current state of climate science in a clear, unified voice. However, experts inside and outside the process say members probably will engage in heated debate over the causes and significance of the so-called global warming hiatus.

“It’s contentious,” said IPCC panelist Shang-Ping Xie, a professor of climate science at the Scripps Institution of Oceanography at UC San Diego. “The stakes have been raised by various people, especially the skeptics.” Though scientists don’t have any firm answers, they do have multiple theories. Xie has argued that the hiatus is the result of heat absorption by the Pacific Ocean — a little-understood, naturally occurring process that repeats itself every few decades. Xie and his colleagues presented the idea in a study published last month in the prestigious journal Nature.

The theory, which is gaining adherents, remains unproved by actual observation. Surface temperature records date to the late 1800s, but measurements of deep water temperature began only in the 1960s, so there just isn’t enough data to chart the long-term patterns, Xie said. Scientists have also offered other explanations for the hiatus: lack of sunspot activity, low concentrations of atmospheric water vapor and other marine-related effects. These too remain theories. For the general public, the existence of the hiatus has been difficult to reconcile with reports of record-breaking summer heat and precedent-setting Arctic ice melts. At the same time, those who deny the tenets of climate change science — that the burning of fossil fuels adds carbon dioxide and other greenhouse gases to the atmosphere and warms it — have seized on the hiatus, calling it proof that global warming isn’t real.

Climate scientists, meanwhile, have had a different response. Although most view the pause as a temporary interruption in a long-term warming trend, some disagree and say it has revealed serious flaws in the deliberative processes of the IPCC. One of the most prominent of these critics is Judith Curry, a climatologist who heads the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology. She was involved in the third IPCC assessment, which was published in 2001. But now she accuses the organization of intellectual arrogance and bias. “All other things being equal, adding more greenhouse gases to the atmosphere will have a warming effect on the planet,” Curry said. “However, all things are never equal, and what we are seeing is natural climate variability dominating over human impact.”

Curry isn’t the only one to suggest flaws in established climate models. IPCC vice chair Francis Zwiers, director of the Pacific Climate Impacts Consortium at the University of Victoria in Canada, co-wrote a paper published in this month’s Nature Climate Change that said climate models had “significantly” overestimated global warming over the last 20 years — and especially for the last 15 years, which coincides with the onset of the hiatus. The models had predicted that the average global surface temperature would increase by 0.21 of a degree Celsius over this period, but they turned out to be off by a factor of four, Zwiers and his colleagues wrote. In reality, the average temperature has edged up only 0.05 of a degree Celsius over that time — which in a statistical sense is not significantly different from zero.

Of course, people don’t actually spend their entire lives subjected to the global average temperature, which is currently about 15 degrees Celsius, or 59 degrees Fahrenheit. Those who fixate on that single measurement lose sight of significant regional trends, particularly in the Northern Hemisphere, climate scientists say. Xie and Yu Kosaka, an assistant project scientist at Scripps, used computer models to simulate the Pacific decadal oscillation, a phenomenon related to the El Niño and La Niña ocean temperature cycles that lasts for 20 to 30 years. The model suggested that the northern mid-latitudes — an area that includes the United States and most of Europe and China — were “insulated” from the oscillation’s cooling effect during the summer months, as was the Arctic region. “In the summer you’ve basically removed the Pacific cooling, so we’re still baked by greenhouse gases,” Xie said.

As a consequence, 2012 marked two climate milestones, he said. The U.S. experienced its hottest year on record, while ice cover in the North Pole shrank to the lowest level ever observed by satellite. Other climatologists, such as Bill Patzert of NASA’s Jet Propulsion Laboratory in La Cañada Flintridge, say sea level rise is “unequivocal proof” that greenhouse gases are continuing to heat the planet, and that much of this added heat is being absorbed by the oceans. As ocean water warms, it expands and drives sea levels higher, Patzert said. Currently, oceans are rising at an average of more than 3 millimeters, or 0.12 of an inch, per year. This pace is significantly faster than the average rate over the last several thousand years, scientists say.

“There’s no doubt that in terms of global temperatures we’ve hit a little flat spot in the road here,” Patzert said. “But there’s been no slowdown whatsoever in sea level rise, so global warming is alive and well.” Whether that message is communicated successfully by the IPCC this week remains to be seen. In the days leading up to the meeting, the organization has found itself on the defensive. A draft summary that was leaked to the media reported that scientists were “95% confident” that human activity was responsible for more than half of the increase in average global surface temperature between 1951 and 2010. But critics openly scoff, considering the IPCC’s poor record for predicting short-term temperature increases.

“This unpredicted hiatus just reflects the fact that we don’t understand things as well as we thought,” said Roger Pielke Jr., a professor of environmental studies at the University of Colorado in Boulder and vocal critic of the climate change establishment. “Now the IPCC finds itself in a position that a science group never wants to be in. It’s in spin management mode.”


Small Business SEO | “SEO Tips for Business Bloggers”

Source       : Small Business Patch
By               : Tim Fasano
Category  : Small Business SEO , Search Engine Optimization SEO


Often when people think about SEO, they think they are optimizing their content for a robot. They try to strike a balance between being Google friendly and being user friendly.  The truth is that what search engines want is for  content to be fantastic for users. Web users are their customers, too, and they want to please them. So when thinking about SEO, think about your readers, how they might find you and what they want and need from you. If you do that, the search engines will love you.

Keywords: Think about what you’re writing about and make sure you use very clear, specific words in your headline and the first 1-2 sentences of your content plus throughout. Keep your potential readers in mind. What will they be searching for when they are on the web? If you’re a hot wings restaurant in Chicago, for example, you should use the keywords “wings,” “restaurant,” “Chicago,” “buffalo wings” and other search-friendly words prominently and frequently.
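As a rough illustration of the advice above, a few lines of Python can check whether your chosen keywords actually appear in a headline and in the opening sentences. This is a hypothetical helper, not a tool from the article; the sample headline and keywords follow the hot-wings example:

```python
# Illustrative sketch: report which target keywords appear in the
# headline and in the first two sentences of the body copy.

def keyword_coverage(headline, body, keywords):
    """Return, per keyword, whether it appears in the headline and opening."""
    opening = " ".join(body.split(". ")[:2]).lower()
    head = headline.lower()
    report = {}
    for kw in keywords:
        kw_l = kw.lower()
        report[kw] = {
            "in_headline": kw_l in head,
            "in_opening": kw_l in opening,
        }
    return report

report = keyword_coverage(
    "Best Buffalo Wings Restaurant in Chicago",
    "Our Chicago restaurant serves classic buffalo wings. Stop by for lunch.",
    ["wings", "restaurant", "Chicago"],
)
```

If any keyword comes back `False` on both counts, the paragraph above suggests reworking the headline or the opening before worrying about anything else.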

Link, link, link: Links are an important part of what search engines are looking for, and they’re good for users, too! Link to your past content on related topics to create a strong internal experience and keep people engaged with your site. Link to outside content that would be useful to your readers to show you support their experience above all else and that you engage with the web ecosystem. Also, place your link over the words that best show what you are sending the reader to. For example, here’s an article about SEO for small businesses. Do you see what I did there? My link is placed directly over the words that describe the article.
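The anchor-text advice above boils down to putting descriptive words inside the link itself. A minimal sketch (the URL and the helper function are invented for illustration):

```python
# Illustrative sketch: descriptive anchor text vs. a bare "click here" link.

def link(text, url):
    """Wrap anchor text in an HTML link."""
    return '<a href="{}">{}</a>'.format(url, text)

# Descriptive: the linked words tell readers (and search engines) what's behind the link.
good = link("SEO tips for small businesses", "http://example.com/seo-tips")

# Vague: the linked words say nothing about the destination.
bad = link("click here", "http://example.com/seo-tips")
```

Both links point to the same page; only the first one carries any signal about what that page is.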

Get others to link to you: Inbound links are extremely important for SEO. They show that other people and organizations value and trust your content enough to link to it. It’s especially helpful when sites with strong traffic link to you. Sites with .gov or .org in the domain are viewed as especially important by search engines and will help your content rise in search rankings. Whenever you have a connection with another site, talk to them. Ask them to link to you in exchange for linking to them. Explain the benefit to both parties and support each other in the web ecosystem.

Think about photos: Most search engines (cough, Google, cough) index photos separately. So when you put photos on your content, think about search-friendliness. Give the photo file a descriptive name, since search engines capture it. Write a good caption. This will help users put your photos in context and will enhance your SEO!

Be an expert: When you create lots of content around a specific topic, search engines and your users say, “Hey! They’re a real expert on that topic.” That’s fantastic for SEO. So break up one piece of content into multiple pieces and link them all to each other. Voila! Expertise established.

Be recent: Being fresh and relevant is very important in search! Keep your content up to date as often as you can. If you can embed something that will be continually updating, like a Twitter stream or a Storify, all the better!

Have a history: Your old content is just as important as your new content. Keep your content live and archived. Link back to posts from the past, especially if they were well-trafficked. Even on the web, it’s true that time establishes credibility!

Be credible: Be what you say you are. Always. Send people to good web pages, internally and externally. Be correct. Use good grammar and spell things correctly. Seriously. It matters.

Be shareable: How often content is shared socially is becoming more and more intertwined with how it is ranked in search. If you have content that inspires your users to share it on their social networks, everyone benefits! Don’t forget to share it on your own social pages, too. Viral posts have to start somewhere.

Let Google help you: Google Trends is a great site to look at what’s trending and to analyze keywords. Checking out this site helps you think like a user! But don’t create content SOLELY for matching up to search trends. Being credible is more important in the long run than a quick win. You want your users to come back again and again!


Search Engine Optimization SEO | “Google’s Disavow Process and Penguin 2.0 Recovery”

Source      :
By              : Michael Harper – Your Universe Online
Category : Best Orlando SEO Company

Many webmasters have felt the wrath of the Penguin 2.0 algorithm updates from Google. They’ve seen rankings drop, and links that once performed well for them thrown into the ineffective pile. We gathered four top SEO industry experts – Bill Hartzer, Doc Sheldon, Jim Boykin, and Jody Resnick – to discuss the recovery process they’ve followed for clients and the successes and pitfalls they’ve discovered along the way. Listen in to catch the tips and tricks they’ve unveiled in this month’s Google Hangout.

Arnie: Hi everyone, I’m Arnie Kuenn with Vertical Measures, and welcome to another edition of our monthly Google Hangouts. Really excited about this one today. We’re going to talk about the disavow process and Penguin 2.0 recovery. We’ve got four experts here with a lot of experience and knowledge on the subject. I think if my screen’s lined up correctly, we’ve got Bill Hartzer, Doc Sheldon, Jim Boykin, and Jody Resnick. Hey everybody, how are you doing?

Bill: Hi, how are you?

Jim: How are you today, Arnie?

Arnie: I’m doing great. Sunny day in Phoenix, Arizona. It’s only 105. We’re happy with that.

Jody: It’s a dry heat though, right Arnie? That’s what we keep telling everyone.

Arnie: Yes, yes.

Jody: More like an oven. Here in Florida it’s more like a sauna, so.

Arnie: Yes, yes, that’s right. You do have the humidity. I was just there not too long ago. It was hot.

Jody: Absolutely. You get wet going outside, no question.

Arnie: Alrighty. So I’m going to dive right into it, because we do have four panelists. We’ve got a lot to talk about, and we do try to keep these within 20 to 30 minutes. But anyway, I’m just going to throw a question out, and the first one that wants to answer it — by the way, introduce yourself when you do answer the first time. But our very first question is, does anybody really feel like they’ve had a decent success with the whole Google Disavow process? Anybody want to just jump in and . . .

Jody: I’ve had some success pretty recently. My name’s Jody Resnick, I’m the CEO and Founder of Trighton Interactive. We’re a full-service digital agency here in Orlando, Florida. My specific expertise is in search engine optimization, search engine marketing, social media. We build websites because we have to to make our campaign successful. Enough about that.

The success that I’ve had with the Disavow tool specifically, I’ll talk very quickly about two examples. One was for a law firm that had a shady SEO. They had built a lot of links, probably 15,000, on a bunch of spam directories and article sites, and those were really penalizing them.

In that specific instance, we used a lot of tools to identify the domains that were certainly not adding any value, and addressed a letter to Google through the Disavow process specifically stating how this had occurred, what we were doing to remedy the situation, and that we were removing some of these links manually, which is important. Google likes to see that you’re actively pursuing this. It’s not enough to just use the Disavow tool. In my opinion, you also must go in and actually get some of these removed manually, if possible. It really helps, whether by contacting the website or doing whatever you can. In many instances they’re not going to remove the links, but you’ll find some people who will, and that earns a lot of respect from Google.
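The Disavow submission Jody describes is a plain-text file uploaded through Webmaster Tools, with one URL or `domain:` entry per line and `#` for comments. A minimal sketch, using made-up domains:

```text
# Directory links built by the previous SEO; removal emails sent, no response
domain:spammy-directory-example.com
domain:article-farm-example.net

# A single page we could not get taken down
http://link-network-example.org/law-firm-article.html
```

Entries prefixed with `domain:` disavow every link from that host, which is usually the safer choice for obvious spam networks than listing URLs one by one.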

Let me jump to the second example, because I want to give other people the chance to talk as well. It was a large Internet retailer client, actually in the printing vertical, and they got slapped with a manual Google penalty. They hired Trighton Interactive, and we are still working with them, but the penalty has been lifted, and we used the Disavow tool. Similar situation: these guys had a content writer producing articles, and the articles were actually very high quality, but the tactic being used was extremely black hat. The bio section of each article carried the exact same bio for this author, and she had two keyword-centric links in the bio pointing back to our client’s website.

It was a very unnatural, undistributed link profile. So in that situation we were able to manually eliminate a lot of the links, and then, with the Disavow tool, the success is that we’re seeing the traffic and the organic rankings really start to ramp back up for them. So those are my examples.

Arnie: Super. Alright. Anybody else want to jump in?

Bill: Sure, I’ll talk a little bit about what we specialize in here. I’m Bill Hartzer from Standing Dog Interactive, Director of SEO. One of the things that we specialize in is working to disavow and clean up some large sites that have millions of links. One of the things that we’ve particularly talked about, and what we’ve done in the past, is really treat sites separately.

In some particular cases, we have sites that have not received any unnatural link warnings. The way we deal with sites that have never received an unnatural link warning is a little bit different from a site that actually has gotten one before the Disavow. A lot of it still comes down to really identifying all the links. I think a lot of companies and a lot of people doing the Disavow will do one and not be successful because they’re not actually getting all of the links.

If you’re using something like Link Research Tools, you have to realize that it will pull links from a lot of different sources; however, you should be feeding even more links into it, for example by taking the Majestic SEO links in particular and passing those on to Link Research Tools or whatever other tool you’re using. You’ve got to find all the links, including all the historic links that are out there. There are links on sites that have not been crawled in two years that are still in there, links that we need to get rid of.

So as far as successes, in one in particular we did have an unnatural link warning. The client came to us specifically wanting to get that removed. We did the initial Disavow. One major keyword for their site brought just ungodly amounts of traffic. They had gone down to eighth place in the search results for that keyword. They were always hanging around third or fourth; they could never crack the top three for that keyword. We did a Disavow, and after that Disavow they jumped to number one, because we had cleaned up a lot of bad links. We did the reinclusion request, and they still had the penalty. Google came back and said, even though you’ve cleaned up all your links and everything’s looking good, at this point you still have the penalty.

So we’ve actually identified specifically what that penalty was, and we’re just about ready to do another reinclusion request, but we need to get certain links removed, and that’s taking a little bit longer than we had thought. It’s related to some blog posts and guest blog posts and so forth that we want to get removed.

So if there’s one point that people really need to understand, it’s that doing a Disavow on its own is completely different from, or at least needs to be handled differently than, cleaning something up in response to an unnatural link warning or some kind of notice from Google. If Google has not notified you at all, then you would probably handle the Disavow a little bit differently; you may not have to go through all of the steps to prove that you have made this huge effort to disavow these links and so forth.

Arnie: Super. Hey Doc, I think we’re going left to right then. Why don’t we give you a chance and then we’ll move on to Jim?

Doc: Okay. Well first off, I’m Doc Sheldon, co-founder of Top Shelf Copy. To point to Bill’s comment, I agree completely, but I would add a third category in there. Where we have seen a manual warning, absolutely, you had better go through the motions. You had better do your homework before you bother them with the Disavow tool, or you’re going to get nothing out of it.

Where we have seen what appears to be an algorithmic dampening, as opposed to using the term “penalty”, they seem to accept some effort, but they’re not asking you to get drastic. We are seeing some relief by going after maybe 20 or 25% of the bad links with emails, and then going ahead and submitting everything on a Disavow.

But the third aspect is just proactive housecleaning: when we’ve got a site that doesn’t appear to have taken a hit yet, but suddenly we find they’ve got an extra 60,000 links showing up that are all going to Japanese porn sites and pill sites and whatnot, real quick, we disavow them. We don’t go through any effort to get the links removed whatsoever. We put them on a Disavow list, and I have seen those sites actually jump in rank, which makes me feel there’s at least some correlation there: they may have been getting dampened somewhat, and after getting those links disavowed, a lot of them jumped three, four, five places in the rankings.

I think you really do have to handle them differently; there are really three categories. If you’re just doing some proactive housecleaning, there doesn’t seem to be any necessity to go through extensive effort to get links removed before you do the Disavow.

Arnie: Super. And Jim, I know you and I corresponded a little bit and I think you said you’ve done at least 20 of these, and I teased you because I thought, don’t you have a business to run? So you may have done more than all of us, I don’t really know, but I’ll hand it off to you and let you tell a little bit of the story that you’re running into.

Jim: Sure. I’ve probably done about 45 Disavows over the past few months, a lot of them after Penguin 2.0, which you and I have been speaking about a lot. A lot of it is developing the tools to analyze the back links: which ones are good and which ones should go. In a perfect world, the client would just say, “Hey, we bought these links from this person. Here’s the spreadsheet. This is all that we ever did, and this is bad,” but for some reason that never happens. It’s more like, here are all the back links, whatever. There are a lot of public tools to analyze the back links; we have one as well.

There are times when I do the Disavow and people have tons of unnatural back links, but, how do I say it, I feel that their chances for recovery, and where they’re going to recover to, are a lot better. But there are other times where I do the Disavow and none of their links are natural. It’s like, oh my God, you played a game of buying a bunch of links, you moved up in traffic, you ranked for a long time, you made a lot of money, but now there’s a penalty and none of your links are natural. Where do I even start to disavow? You’re starting from a blank website. You have to rethink your whole business and your marketing.

We’ve seen some recovery. Of course, it’s not as much as I think anyone would like to see. We all want it to be: you submit this, and boom, the following day it’s back and everything has been indexed. I think a lot of it comes down to which things to disavow, what your chances for recovery are, and where you will be when you reach that point.

Arnie: Yep. And I don’t know about everybody else, but I’m having a little bit of a hard time hearing your audio. I don’t know if that’s the same for everybody else or not.

Doc: Same here. It was kind of garbled.

Arnie: It was a little bit garbled, but we’ll try to clean it up when we do the transcript. One question I do have, kind of a follow-up for everybody: when you have actually seen successes from submitting the Disavow, and we’re just talking about the Disavow, not necessarily a reinclusion, does it actually take until there’s another Penguin update to see the results, or has anybody seen anything within the next several business days, that kind of thing?

Bill: There definitely is some lag time. I’m not seeing it in a few days; I’m seeing it more in about a couple of weeks or so, up to about a month. That’s what I’m typically seeing.

Doc: Yeah, and I have seen as briefly as two weeks and as long as two months. But not waiting for another Penguin update though.

Jody: I absolutely agree, I have seen it much faster and no correlation to a Penguin update at all and I would echo the same time period. You know, two weeks to six weeks or eight weeks is not uncommon based on my experience.

Arnie: Yeah, and we’ve pretty much seen the same thing, and we were trying to figure out if it was at all correlated to a refresh or an update or whatever. But Jim, I don’t know if you want to test your sound? In case you didn’t hear the question, we’re asking how long it’s been taking after submitting a Disavow before you’ve actually seen positive results, for those cases where you are seeing positive results.

Jim: For those that have had the recovery, gosh, I think most of those were all from the initial Penguins, before 2.0, and those took three to four months. The other part of that was adding in new pages and new content and seeing it move up as well. So I’d say three to four months.

Doc: Can I throw something else in, Arnie?

Arnie: Yeah sure, Doc, yep.

Doc: You know, one of the things that I see is that a lot of people, SEOs, seem to lose sight of the fact that once you’ve taken some sort of a hit, whether it be algorithmic or not, the odds are that a lot of those bad links are already devalued. So if they expect that they’re going to recover their rankings just by removing them, I’m sorry, there’s nothing there. You’ve got a filter on you, you’ve got the dampening effect; removing the bad boy is not going to suddenly shoot you back up to where you were. That’s one of the things that I’m seeing as a common problem.

So it’s very difficult, really, to determine if you’ve seen an improvement because typically we’re also doing a lot of other things. We’re not just taking care of some bad links, we’re trying to take care of thin copy, we’re going after architectural and technical issues, a number of optimization efforts that are going to improve the entire site’s overall ranking ability. So at what point can you say, this was because of the Disavow tool? I think it’s hazy at best.

Arnie: Yeah, Doc, that’s an excellent point, and I’m really glad you brought it up. You probably saw all of us nodding. You’re absolutely right about making multiple changes at the same time, and that Google has probably already discounted those links.

Bill: Well, it depends. It really depends on what types of links you’re disavowing, because certainly with the really low-quality type of link, disavowing that, you run into that situation. But I have a client that was hit by some negative SEO. Basically, their site is on one particular topic, and there are these 750,000 links that were built that are actually on really good sites.

So whichever tool you throw the links into, you’re looking at a link on a site where someone hacked in and put a text link in the footer that says Aleve, Viagra, Cialis, pointing to your site. That’s actually on a very good, PageRank seven site, a link I’d love to have, but the fact of the matter is that there are 500,000 of these pill anchor-text links pointing to the client that appeared over the past two to three years. Those are links that I think ultimately really are hurting rankings, and that’s the stuff I’m getting rid of.

So in this case, getting rid of that negative SEO stuff and those types of links can be a little bit different from a situation where, say, you had 500 e-zine articles with the same exact anchor text. When we’re talking about a Disavow, and timing and so forth, there are so many different variables that it can really depend on what’s going on.

Jim: It basically depends upon the intent of how that link got on the page, you know? Anything in bulk, of course, you have to assume has patterns and is being filtered. But part of the problem is some people think that some things are good, like article syndication, where it’s, gee, I have this article and I submitted it to a thousand article syndication sites, and the purpose was to get a thousand back links.

Anything done in bulk, especially in patterns where the intent of the website is to give out links, is a problem. Take blog reviews: there are so many sites set up for only one purpose, which is to do blog reviews. There’s no real author on the website producing anything of quality, but a lot of people think this is good marketing. “Gee, I want to get blog reviews,” and the clients look at the page and say, oh, it’s relevant, it’s a good page. But they’re not looking at the site as a whole, as part of this big network of things, like the article syndication and the directories and such. That’s where I think some people may be getting into trouble: all that stuff in bulk, and what the intent of the page is, or why your link is there. Sometimes people think things are safe that really aren’t.
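Jim’s “anything in bulk has patterns” test can be sketched as a simple frequency count over a back-link export. The function name, threshold, and sample data below are illustrative, not any panelist’s actual tooling:

```python
from collections import Counter

def flag_bulk_anchors(backlinks, threshold=3):
    """Return anchor texts that repeat across many links, the kind
    of bulk pattern the panelists associate with Penguin trouble."""
    counts = Counter(anchor for _url, anchor in backlinks)
    return sorted(a for a, n in counts.items() if n >= threshold)

# Hypothetical back-link export: (linking URL, anchor text) pairs.
links = [
    ("http://blog1.example.com/review", "best orlando lawyer"),
    ("http://blog2.example.com/review", "best orlando lawyer"),
    ("http://blog3.example.com/review", "best orlando lawyer"),
    ("http://news.example.com/story", "Smith & Jones LLP"),
]
print(flag_bulk_anchors(links))  # → ['best orlando lawyer']
```

A real audit would pair a check like this with manual review, since a repeated brand-name anchor is natural while a repeated money keyword usually is not.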

Jody: If I could real quick, I just wanted to add one thing that I hear from a lot of our clients and from some people who have come to us: they’re trying to determine what is a bad link and what isn’t. That’s something we’ve helped our clients out with. In the one instance I mentioned, these guys had really great, original content, but it was the way their content author had linked those keywords in the bio. Some of the articles were on sites with very high domain authority and page authority, even on PageRank fours, but the way the links were embedded was very non-relevant to the subject matter. It took some convincing that, hey, these are actually really hurting you, where they thought, look, these are credible websites, those aren’t really bad links. I had to show them that, no, they certainly are hurting them. So I think a lot of the time it’s about educating our clients as well, to really help them understand why all these links need to be removed. I’m not sure if you guys have run into that.

Arnie: Well, yeah, I think it speaks to what Jim just said about intent. If the intent looks like you wrote that article, no matter how good it is, to get a slick anchor-text link down in the author bio, then Google frowns upon that.

Yeah, I have another question I want to throw out to you guys. We have got some clients where every 90 days we’re doing a back link analysis and a Disavow and all of that; we’re just setting it up on a routine basis. Are you guys doing anything similar, where you’re constantly working with the same client to monitor their back links and refresh the Disavow process?

Jim: I haven’t been, because I figure if someone was hit by Penguin, I want to look at the back links prior to then; basically, they should stop whatever they were doing. Stop all that stuff, and then let’s analyze it. But what’s interesting is, do you need to continually monitor it? Ideally, if someone does stop at one point because of Penguin and realizes, “Hey, I’ve got to stop,” I wouldn’t continue to monitor it, unless you’re looking at doing a preemptive Disavow, which is a whole other topic. Do you do preemptive Disavows for people? Do you monitor people’s back links and look for stuff that’s bad?

Arnie: Well, one of the reasons, the logic we have, is like what Bill was talking about earlier: not every tool is going to find every link. We keep finding that out using a variety of different tools. We’ve only been doing this for maybe the last six months on a routine basis, maybe gone through three cycles with a single client, but we keep finding new stuff that is old. So that’s one of the reasons we’re doing it.

Jim: Yeah. We tend to pull from Moz and Majestic, and kind of combine that. But to be honest, when I’ve been doing the Disavows, I’ve been just analyzing the back links from Google, which I know is different from what Bill feels. I guess there are two different ways to do it. We do have another tool where we can take everything that we know, and Google, and merge that and run it, but I’ve been drinking the Google Kool-Aid: they keep saying that the Webmaster Tools back links are, in theory, the only thing you need to use to analyze your back links.
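Combining exports the way Jim describes (Moz plus Majestic, deduplicated down to linking domains) can be sketched like this; the URLs are made up, and real data would come from each tool’s CSV download:

```python
from urllib.parse import urlparse

def merge_backlink_sources(*sources):
    """Combine back-link URL lists from several tools and return
    the unique linking domains, sorted alphabetically."""
    domains = set()
    for urls in sources:
        for url in urls:
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):   # fold www. into the bare domain
                host = host[4:]
            if host:
                domains.add(host)
    return sorted(domains)

# Hypothetical exports; real ones would be CSVs downloaded from each tool.
majestic = ["http://www.spamdir.com/page1", "http://blog.example.org/post"]
moz = ["http://spamdir.com/page2", "http://another-site.net/"]

print(merge_backlink_sources(majestic, moz))
# → ['another-site.net', 'blog.example.org', 'spamdir.com']
```

Deduplicating at the domain level mirrors the panel’s advice to evaluate the linking site as a whole rather than individual pages.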

Bill: Well, it depends. I mean here’s a problem. You’ve got a client that has seven million back links. Google Webmaster tools will show you a random sampling of 100,000. There’s no way you’re going to see all of the back links in that, you know?

Jim: I totally agree, and I definitely agree that analyzing everything is certainly the most thorough. I guess I’m just drinking the Kool-Aid of, your average webmaster isn’t going to be able to do that and I’m going on the theory that there is enough within the links that Google shows to untrip it. But your way is certainly much more thorough.

Bill: Yeah, most sites have less than 100,000 links that I run into. We’ve just been really successful with doing very large link Disavows and doing that.

Jim: Yeah. But I would just say that the average site that may have been hit by this doesn’t have access to all that, so the question is, does Google put the magic solution in those crappy sample of back links that they do give you?

Arnie: Well, actually, I wanted to ask one more question, and then we’ll go into wrap-up, where I’m just going to see if any of you have some tips you’d like to throw out. But speaking of sample links, one of our staff actually submitted a question to Matt Cutts, and it was the topic of his video on July 31st, just a couple of days ago. It was basically asking whether Google will give some samples. Matt keeps talking about that, but we’ve never seen one yet. Have any of you ever received an email from Google with example bad links?

Bill: No, and in fact the only response that I had actually gotten was, I believe, Matt saying specifically that if you’re having an issue, go over to the Google product forums and hope that John Mueller or one of the webmaster people is in there; ask the question about your particular site and they will supposedly answer it. That’s the only option I’ve heard of at this point. They had alluded before to giving you a sample of some bad links in their response in Webmaster Tools, but I have not seen that.

Arnie: Has anybody?

Doc: Yes, I have. On one reinclusion request, of multiple with one client, they sent back two samples: “We see you made some progress, however your penalty still stands; here’s a couple of samples of the types of links,” blah blah blah. Then a couple of months later, on a subsequent reinclusion request, they gave three samples. That was one of those instances where the client was desperately trying to hang on to every link.

He was going through this mental process of, well, this link can be justified because he has the same last name as my sister-in-law. So he was being extremely gentle with his knife and he wasn’t really doing much. Had we been doing it we would have chopped probably 30,000 or 40,000 more links off in the first go. He wanted to go through them himself one by one and we saw on the first occasion, two samples, on the follow-up occasion, three.

Arnie: Real recent?

Doc: No, this was about four months ago.

Arnie: Wow. Jim or Jody, have you gotten any good examples like that?

Jody: I have not received any examples like that at all.

Jim: Yes, I got one about two months ago, and it really, really surprised me, and here’s kind of a tip for everyone too. At first I was much stricter on stuff from within the past two years, and a little bit lenient on stuff older than that. I was going on the theory from Matt Cutts’ first video, where he said that if you got an email from Google or were hit by Penguin, you should look at some of the links that may have led up to that, the more recent ones rather than the older ones. So I was going harder on the past two years, thinking maybe the older stuff they’ve already discounted, or they’re a little more forgiving about it. Of the three links they gave as examples, two of them were from January of 2009, and the third one was from February of ’11. I’m like, ooh, there’s no being more forgiving on older stuff; you’ve got to go right back to the beginning. So, lesson learned.

Doc: You know, one of the things that might play into that, Jim: I have a theory, at least, or I’ve always believed, that they have thresholds for all these different kinds of links, whether algorithmic or manual. You may be on the safe side of one given threshold and the safe side of another, but because you’re in the 90th-percentile range on both, they may combine to put you over an overall penalty threshold. So maybe a 2005 link that appears 18,000 times is so decayed that it wouldn’t hurt you by itself, but because you also have somewhat thin content, and very little copy above the fold, perhaps now you’re in penalty mode, and you do need to go back and look at all of them to get back underneath that overall threshold. That’s a theory; I can’t back it up with any case studies, but I see an awful lot of correlation there.

Arnie: Well, hey, guys, we’re heading towards the end here; we’re actually running a little bit into overtime. I know this is tough, and I know a lot of us are hesitant at times to give out a really nice little tip from a find we’ve had, but I wonder if I could just start with Bill and go across the bottom, and we’ll give Jody a chance to finish his lunch. If you have just one tip that relates to the whole Disavow or reinclusion process, something that was maybe a real surprise to you that our viewers might be interested in, we’d really appreciate it. So Bill, does anything come to mind that you could offer?

Bill: You really need to be as thorough as you can, and don’t be afraid to really spend the time going through the links. It’s not an easy process, and if you have the resources, use several people to do the reviewing. There are tools out there that specifically claim they will tell you whether a link is bad or not. That’s great, but it comes down to manually looking through the links and spending the time to clean this stuff up. It’s not an easy task, and you can screw things up real easily, so it’s really good to hire somebody who has done this before and not just someone who’s pitching. Hopefully people will hear this and realize that you’ve really got to get somebody who knows what they’re doing, because you can really screw things up real quick.

Doc: Hear, hear.

Arnie: Alright, Doc, how about you? Can you give us a little closing comment from you?

Doc: Well certainly I agree with what Bill said. I think if you’re going to go through a link pruning process, you better bring a sharp knife. My rule of thumb, what I always tell my clients is if you feel like you can justify this link, then get rid of it, because if you need to justify it, you’re out of luck already. It shouldn’t need justification. That’s probably the biggest problem that I run into with my clients. They’re always trying to save every possible link. If you’ve been hit, odds are that link is already worthless to you anyway.

Arnie: Yep. Super. Alright, Jim, how about you?

Jim: You know, at times you really have to bring a machete in there and get rid of all the bulk stuff that is either absolutely worthless or fits patterns. I think finding patterns is a big part of it, and it’s usually a very large percentage of people’s links. Along with that, you need to be making changes to the website to send additional new signals, which brings up the over-optimization of websites as well. And the whole approach to link building in bulk has to change; people’s thinking needs to change. It’s really not a link-building game anymore; it’s about making connections and community building, and trying to get real citations from real places through all sorts of methods. It’s not about running out and getting bulk links through anything anymore. It’s over.

Arnie: Yep, yep. I totally agree. You bet. Alright, Jody, I’ll let you wrap it up.

Jody: Yeah, thank you. I would echo the sentiments of everyone else, and, you know, sorry, it is a little bit later here. With that, I would say it’s very important that you really remove the links and do as much manually as possible before going through the Disavow. I would also echo the point about doing some really holistic link building and getting social media signals to combat the effects and help increase the rankings. We’ve had success specifically in Google+, trying to get engagement, and seeing that it’s not about the number of +1’s, but about getting really authoritative people to engage with your content on specific pages. Doing that in combination with removal and Disavow will really help get the rankings back up and get traffic levels back to where they were prior to a penalty. So that’s been successful for us.

Arnie: Great, super. Well listen, I want to thank all four of you. I know we ran a little bit long but I think this is really valuable information for everybody who is watching this now or is going to watch it in the future. So again, thanks for your time guys, just hang in there for a couple seconds while we wrap this up and we’ll see you at some conference somewhere.


Search Engine Optimization (SEO) | “6 Quick Tips for International Websites”

Source        :
By                :  Jens O. Meiert and Tony Ruscoe, Tech Leads, Google Web Studio
Category    :  Search Engine Optimization (SEO)


Note from the editors: After previously looking into various ways to handle internationalization for Google’s web search, here’s a post from Google Web Studio team members with tips for web developers. Many websites exist in more than one language, and more and more websites are being made available in more than one language. Yet building a website for more than one language doesn’t simply mean translation, or localization (L10N), and that’s it. It requires a few more things, all of which are related to internationalization (I18N). In this post we share a few tips for international websites.

1. Make pages I18N-ready in the markup, not the style sheets
Language and directionality are inherent to the contents of the document. Hence, if possible, you should always use markup, not style sheets, for internationalization purposes. Use @lang and @dir, at least on the html element:
<html lang="ar" dir="rtl">
Avoid coming up with your own solutions like special classes or IDs. As for I18N in style sheets, you can’t always rely on CSS: The CSS spec defines that conforming user agents may ignore properties like direction or unicode-bidi. (For XML, the situation changes again. XML doesn’t offer special internationalization markup, so here it’s advisable to use CSS.)

2. Use one style sheet for all locales
Instead of creating separate style sheets for LTR and RTL directionality, or even each language, bundle everything in one style sheet. That makes your internationalization rules much easier to understand and maintain. So instead of embedding an alternative style sheet like

<link href="default.rtl.css" rel="stylesheet">
just use your existing
<link href="default.css" rel="stylesheet">
When taking this approach you’ll need to complement existing CSS rules by their international counterparts:
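As a minimal sketch of what such a complementary pair might look like (the .nav class here is hypothetical; the [dir='rtl'] selector is covered in tip #3):

```css
/* Existing LTR rule */
.nav { padding-left: 2em; }

/* Its RTL counterpart, in the same style sheet */
[dir='rtl'] .nav {
  padding-left: 0;
  padding-right: 2em;
}
```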

3. Use the [dir=’rtl’] attribute selector
Since we recommend sticking with the style sheet you have (tip #2), you need a different way of selecting the elements you need to style differently for the other directionality. As RTL contents require specific markup (tip #1), this should be easy: for most modern browsers, we can simply use the [dir='rtl'] attribute selector.

Here’s an example:

aside {
  float: right;
  margin: 0 0 1em 1em;
}

[dir='rtl'] aside {
  float: left;
  margin: 0 1em 1em 0;
}


4. Use the :lang() pseudo class

To target documents of a particular language, use the :lang() pseudo class. (Note that we’re talking documents here, not text snippets, as targeting snippets of a particular language makes things a little more complex.) For example, if you discover that bold formatting doesn’t work very well for Chinese documents (which indeed it does not), use the following:

:lang(zh) strong,
:lang(zh) b {
  font-weight: normal;
  color: #900;
}

5. Mirror left- and right-related values

When working with both LTR and RTL contents, it’s important to mirror all the values that change with directionality. Among the properties to watch out for are everything related to borders, margins, and paddings, but also position-related properties, float, and text-align.

For example, what’s text-align: left in LTR needs to be text-align: right in RTL.
There are tools to make it easy to “flip” directionality. One of them is CSSJanus, though it has been written for the “separate style sheet” realm, not the “same style sheet” one.
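A hand-mirrored pair following this tip might look like the following (the .caption class is illustrative):

```css
/* LTR default */
.caption { text-align: left; margin-left: 1em; }

/* Mirrored for RTL */
[dir='rtl'] .caption {
  text-align: right;
  margin-left: 0;
  margin-right: 1em;
}
```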

6. Keep an eye on the details

Watch out for the following items:

Images designed for left or right, like arrows or backgrounds, light sources in box-shadow and text-shadow values, and JavaScript positioning and animations: these may need to be swapped and adjusted for the opposite directionality.

Font sizes and fonts, especially for non-Latin alphabets: Depending on the script and font, the default font size may be too small. Consider tweaking the size and, if necessary, the font.

CSS specificity: When using the [dir='rtl'] (or [dir='ltr']) hook (tip #3), you’re using a selector of higher specificity. This can lead to issues. Just keep an eye out, and adjust accordingly.
