UNDERSTANDING SEARCH ENGINES
Search engines can be broken down into three elements: crawling, indexing and searching. Get the crawling right and the rest will follow.
Crawlers, also called spiders or (ro)bots, are automated programs designed to browse the internet in a structured fashion, indexing the content they find.
The information the spiders return to the data centres is then evaluated, based both on the content itself and on how it is presented within the HTML. If your site mentions 'animation' frequently in its text, and that word appears prominently in header tags, the data centres determine that your site is about animation.
The end result is that when someone searches for the word 'animation', your site is considered alongside all the other sites about animation for display in the search engine results page.
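The idea of weighting words by the tags they appear in can be sketched in a few lines of Python. This is purely illustrative: the tag weights here are invented for the example, not the values any real search engine uses.

```python
from html.parser import HTMLParser
from collections import Counter

# Illustrative weights: words inside header tags count for more than body
# text. These numbers are made up for the sketch, not any engine's real values.
WEIGHTS = {"h1": 5, "h2": 3, "h3": 2}

class TermWeigher(HTMLParser):
    """Tally term weights, boosting words that appear inside header tags."""
    def __init__(self):
        super().__init__()
        self.stack = []          # currently open tags, i.e. our context
        self.scores = Counter()  # term -> accumulated weight

    def handle_starttag(self, tag, attrs):
        self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            self.stack.remove(tag)  # tolerate sloppy markup

    def handle_data(self, data):
        # Use the heaviest weight among the open tags (default 1 for body text).
        weight = max((WEIGHTS.get(t, 1) for t in self.stack), default=1)
        for word in data.lower().split():
            self.scores[word] += weight

page = "<h1>Animation</h1><p>Tutorials on animation and design.</p>"
weigher = TermWeigher()
weigher.feed(page)
print(weigher.scores["animation"])  # header hit (5) + body hit (1) = 6
```

A word that appears once in an H1 and once in body text scores far higher than one that appears twice in a paragraph, which is the intuition behind putting your key terms in header tags.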
Crawlers navigate the web by moving from link to link. Because crawlers and bandwidth are finite, they do not follow every link, and pages deeper within a site are less likely to be indexed.
Only pages that receive a large number of votes, in the form of inbound links, are likely to be crawled on a regular basis. Links also form the basis of much optimisation for the web, and the best way to earn links is to have good content.
The more pages of good content you have, the more links you can get and the more people will visit your site. Simple.
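The depth effect described above can be sketched as a breadth-first crawl with a depth budget. The link graph and URLs below are hypothetical, and real crawlers use far more elaborate scheduling, but the consequence is the same: pages beyond the budget simply never get indexed.

```python
from collections import deque

# Toy link graph standing in for the web: page -> pages it links to.
# (Hypothetical URLs, purely for illustration.)
LINKS = {
    "/": ["/about", "/blog"],
    "/about": [],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": ["/blog/post-1/comments"],
    "/blog/post-1/comments": [],
}

def crawl(start, max_depth):
    """Breadth-first crawl that stops at max_depth; deeper pages go unindexed."""
    seen = {start}
    queue = deque([(start, 0)])
    indexed = []
    while queue:
        page, depth = queue.popleft()
        indexed.append(page)
        if depth == max_depth:
            continue  # crawl budget exhausted: don't follow links further down
        for link in LINKS.get(page, []):
            if link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return indexed

print(crawl("/", max_depth=2))
# The comments page sits three links deep, so it is never indexed.
```

With a depth budget of two, everything down to `/blog/post-1` is indexed but `/blog/post-1/comments` is not, which is why flattening your site structure and linking to important pages from the home page matters.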
The rest of this article assumes an understanding of SEO techniques. If you do not understand SEO then Aaron Wall’s SEO BOOK is a good place to start.
Flash is a standard for delivering high-impact, rich Web content. Designs, animation, and application user interfaces are deployed immediately across all browsers and platforms, attracting and engaging users with a rich Web experience.
Over 98% of users on the internet use browsers capable of displaying Flash, and because the platform is standardised there are no interoperability factors to consider. Interactive and visually stunning sites can be created with Flash, and many designers like to develop for it.
So far Flash sounds great. However, as mentioned above, to get into search engines their spiders must be able to read and follow your pages, and Flash has traditionally exposed very little to crawlers: until July 2007 they could read only some plain text, and the latest update extends this to dynamic text within the SWF files.
Flash is currently not a W3C standard, and elements such as header tags simply do not exist for the data centres to evaluate what a site is about. For this reason, much of what is processed is unordered and carries little relevance when it comes to getting your site into the results pages.
WHAT ISN’T BEING PROCESSED?
Currently Google has the most advanced methods of crawling Flash, yet it is still seemingly unable to:
- Read text that has been converted into symbols
- Follow links (although it can at least find deep links through swfaddress)
- Make sense of specific content and determine its weight (Adobe has not proposed any new standard for the W3C to accept, and Flash has no feature mimicking currently accepted tags that search engines could adopt)
Other search engines can do even less than Google, with Ask and Live unable to read even static text. Currently Adobe is working only with Google and Yahoo, so it would seem that relying on Flash would immediately eliminate up to 20% of your potential search audience.
CAN FLASH BE MADE SEARCH ENGINE FRIENDLY?
To a limited extent, Flash can be made almost as competitive in the search engine results as a standard HTML page. However, the work required to get it there is several times greater.
THE EASY WAY