Let’s face it, even with our best efforts to make navigation clear and accessible, many websites are not as easy to navigate as they could be.
It doesn’t matter if you are a first-page superstar or a mom-and-pop blog with low traffic; most efforts are simply no match for the diversity of our visitors.
When I first started blogging on SEO topics for Beanstalk I put a lot of effort into making my posts as accessible as I could, with a bunch of different tricks like
<acronym> tags (now replaced by
<abbr> tags) and hyperlinks to any content that could be explored further.
Like a good SEO, I added
rel="nofollow" to every external link, because that totally fixes all problems, right?
External links, when they are actually relevant to your topic and point to a trusted resource, should not be marked nofollow. This is especially true for discussions or dynamic resources, where you could be referencing a page that was recently updated with information on your topic. In that case you *need* the crawlers to see that the remote page is relevant now.
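For instance (the URLs here are placeholders), a genuinely useful reference should stay followed, while a paid or untrusted link is where nofollow belongs:

```html
<!-- Relevant, trusted resource: leave it followed -->
<a href="https://example.com/fresh-research/">recent research on this topic</a>

<!-- Paid or untrusted link: this is what nofollow is for -->
<a href="https://example.com/sponsor/" rel="nofollow">our sponsor</a>
```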
Internal links are also a concern when they become redundant or excessive. If all your pages link to all your pages, you’re going to have a bad time.
If you went to a big new building downtown and asked the person at the visitors’ desk for directions, and the fellow stopped every few words to explain what he meant by each word, you might never understand the directions, at least not before you were late for wherever you were headed.
Crawlers, even smart ones like Googlebot, don’t really appreciate 12 different URLs on one page that all go to the same place. It’s a waste of resources to keep feeding the same URL to the spiders as they crawl each of your pages.
In fact, in some cases, if your pages have tons of repeated links to more pages with the same internal link structures, all the bots will see are the same few pages/URLs until they take the time to push past the repeated links and get deeper into your site.
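As a quick way to audit this yourself, you could count how many times each destination appears among a page’s links. A minimal sketch (the `hrefs` array stands in for whatever you scrape from the page; the function names are my own):

```javascript
// Count how often each href appears in a list of link targets.
// In the browser you might build `hrefs` with something like
// $('a').map(function () { return this.href; }).get() (jQuery assumed).
function countHrefs(hrefs) {
  var counts = {};
  hrefs.forEach(function (href) {
    counts[href] = (counts[href] || 0) + 1;
  });
  return counts;
}

// Any entry greater than 1 is a link the crawler will see again and again.
function duplicates(hrefs) {
  var counts = countHrefs(hrefs);
  return Object.keys(counts).filter(function (href) {
    return counts[href] > 1;
  });
}
```

Run that against your own templates and you may be surprised how many of your links point at the same handful of pages.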
The boy who cried wolf would probably be jumping up and down with another analogy, if the wolves hadn’t eaten him, just as your competition will gladly eat your position in the SERPs if your site is sending the crawlers to all the same pages.
Dave Davies has actually spoken about this many times, both on our blog, and on Search Engine Watch: Internal Linking to Promote Keyword Clusters.
You could handle navigation by setting the window.location variable via JavaScript when desired, and your pages would still work, but how would the robots get around without a sitemap? How would they understand which pages connect to which? Madness!
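To make that concrete, here is a hypothetical sketch of pure-JavaScript navigation; the visitor gets where they’re going, but there is no <a href> for a crawler to follow (the function name is my own):

```javascript
// Hypothetical pure-JS navigation: fine for visitors, invisible to crawlers.
function navigateTo(path) {
  if (typeof window !== 'undefined') {
    window.location = path;  // browser only: actually navigates
    return true;
  }
  return false;              // outside a browser there is nowhere to go
}
```

Wire that to a click handler on any styled element and visitors will never notice the difference, but a crawler parsing the HTML finds no link at all.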
Say I wanted to suggest you look at our Articles section (because we have so many articles there), but I didn’t want our articles page linked yet again?
Just tell jQuery to first find a matching anchor (<a>) element:

$('a[href="/articles/"]')

Then tell it to add an ID to that link (an ID without slashes saves you from escaping it in selectors later):

.attr('id', 'articles-link');

And then tell your clickable element to send a click to that ID:

$('#articles-link')[0].click();

Finally, make sure that your element’s style clearly matches the site’s style for real hyperlinks (e.g. cursor: pointer; text-decoration: underline;).
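Putting the pieces together, a minimal sketch of the whole trick might look like this (jQuery is assumed to be loaded; the articles-link ID and pseudo-link class are names I’ve made up for illustration):

```javascript
// Build the jQuery attribute selector for the one real anchor to `path`.
function anchorSelector(path) {
  return 'a[href="' + path + '"]';
}

if (typeof $ !== 'undefined') {  // only runs where jQuery (and a DOM) exist
  $(function () {
    // 1. Tag the single real anchor; a slash-free ID needs no escaping.
    $(anchorSelector('/articles/')).first().attr('id', 'articles-link');

    // 2. Forward clicks from every styled pseudo-link to the real anchor.
    //    jQuery's .trigger('click') does not follow the link, so the
    //    element's native click() is used instead.
    $('.pseudo-link').on('click', function () {
      var real = document.getElementById('articles-link');
      if (real) real.click();
    });
  });
}
```

The page now contains exactly one crawlable link to /articles/, while any number of pseudo-link elements still get visitors there.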
UPDATE: For Chrome browsers you need to either refresh the page or include the following in your page header: