I’ve been thinking a lot about my linking strategy lately: trying to earn incoming backlinks, making sure I have good internal links…
But one area that I think is too often overlooked is outbound links.
HTML5 microdata is a WHATWG HTML specification used to nest semantics within existing content on web pages. The purpose is to allow crawlers and screen readers to understand the meaning of a piece of text. This information is accessible only to such software and is invisible to the user.
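For example, here is a minimal sketch of microdata markup using the schema.org Person vocabulary (the name and company are made-up placeholders):

<!-- itemscope starts an item, itemtype names its vocabulary, itemprop labels each property -->
<div itemscope itemtype="https://schema.org/Person">
  <span itemprop="name">Jane Doe</span> works as a
  <span itemprop="jobTitle">web developer</span> at
  <span itemprop="worksFor" itemscope itemtype="https://schema.org/Organization">
    <span itemprop="name">Example Corp</span>
  </span>.
</div>

A crawler that understands microdata can extract “Jane Doe”, her job title and her employer as structured data, while a regular visitor just sees the sentence.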
A 301 redirect is the most efficient and search-engine-friendly method for webpage redirection. It’s not hard to implement, and it should preserve your search engine rankings for that particular page. If you have to change file names or move pages around, it’s the safest option. The status code 301 is interpreted as “moved permanently”.
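On Apache, a sketch of what this looks like in an .htaccess file (assuming mod_alias is enabled; the paths and domain are placeholders):

# Permanently redirect a single moved page
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Permanently redirect a whole renamed directory
Redirect 301 /old-directory http://www.example.com/new-directory

Browsers follow the redirect transparently, and search engines transfer the old URL’s ranking signals to the new address.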
Using an .htaccess file you can speed up your site’s loading time with the Header and ETag directives.
Header unset ETag
FileETag None

# 21 DEC 2012
<filesMatch "(?i)^.*\.(ico|flv|jpg|jpeg|png|gif|js|css)$">
    Header unset Last-Modified
    Header set Expires "Fri, 21 Dec 2012 00:00:00 GMT"
    Header set Cache-Control "public, no-transform"
</filesMatch>

# 3 HOUR
<filesMatch "\.(txt|xml|js|css)$">
    Header set Cache-Control "max-age=10800"
</filesMatch>

# NEVER CACHE
<filesMatch "\.(html|htm|php|cgi|pl)$">
    Header set Cache-Control "max-age=0, private, no-store, no-cache, must-revalidate"
</filesMatch>
It’s worth running pages through a text-only browser, or a text-browser emulator, to see what a blind person using a text-to-speech converter, for example, will encounter. It will help you pick up on badly chosen or missing ALT texts. It also shows you the site pretty much as a search engine sees it. Incidentally, the Opera browser has a built-in text-browser emulator.
Lynx Viewer is a web service that allows web authors to see what their pages will look like when viewed with Lynx, a text-mode web browser.
The Robot Exclusion Standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention to prevent cooperating web crawlers and other web robots from accessing all or part of a website which is otherwise publicly viewable. Robots are often used by search engines to categorize and archive web sites, or by webmasters to proofread source code.
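A robots.txt file is just a plain-text file at the root of the site. A minimal sketch (the directory and bot name are made-up placeholders):

# Let every robot crawl everything except one directory
User-agent: *
Disallow: /private/

# Block one specific crawler from the whole site
User-agent: BadBot
Disallow: /

Note that the standard relies on cooperation: well-behaved crawlers honor these rules, but nothing actually prevents a rogue robot from ignoring them.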
Google PageRank Checker is a website that lets you check your Google PageRank.
I found this well-written article that explains what canonical URLs are and how to use them, with an easy example.
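The short version: when the same content is reachable at several URLs, a link element in the page’s head tells search engines which version is the preferred one. A sketch with a made-up URL:

<!-- Placed in the <head> of every duplicate or parameterized version of the page -->
<link rel="canonical" href="http://www.example.com/product-page.html">

Search engines then consolidate ranking signals onto the canonical URL instead of splitting them across the duplicates.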