We all know the importance of optimising our site for search engines: typically making sure we use the right keyword density, a sensible number of keywords, quality content and so on.
When a search engine sends out its robots, the interactive nature of Ajax works against you. Because Ajax relies on using as few pages as possible and fetching content from the web server in the background, the robot does not get the same content a human user would; in fact, Ajax-loaded content is not visible to search engines at all. So the robot reports back with little, if any, information about your site.
Yahoo’s Amit Kumar added that while the technology is impressive, simplicity is also crucial so that engines can understand the content of a page. Google’s Dan Crow explained that using CSS, Ajax and Web 2.0 technologies with workarounds will accommodate search engines in their current state of understanding, while also preparing you for the future, when search engines are better able to comprehend these technologies.
One way around this is to create a normal, static page for each Ajax page, though this is not an ideal solution because of the extra work required to create and maintain those pages.
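If you do go down this route, the extra work can be reduced by generating the static pages from the same data your Ajax views use. The sketch below is only an illustration, assuming a hypothetical `pages` data object and a `renderStaticPage` helper; in practice the data would come from wherever your Ajax endpoints read it.

```javascript
// Hypothetical page data, standing in for whatever your Ajax views fetch.
const pages = {
  'products': { title: 'Our Products', body: 'Full product listing goes here.' },
  'contact':  { title: 'Contact Us',   body: 'Address and phone details go here.' }
};

// Render one crawlable, plain-HTML page per Ajax view. A search-engine
// robot can index this even though it never runs your JavaScript.
function renderStaticPage(slug, page) {
  return [
    '<html><head><title>' + page.title + '</title></head>',
    '<body><h1>' + page.title + '</h1>',
    '<p>' + page.body + '</p>',
    '</body></html>'
  ].join('\n');
}

// In a real build step each string would be written out as slug + '.html';
// here we just report what would be generated.
for (const slug in pages) {
  const html = renderStaticPage(slug, pages[slug]);
  console.log(slug + '.html (' + html.length + ' bytes)');
}
```

The key point is that the static pages are a by-product of your existing data, not a second copy of your content that has to be edited by hand.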
A better approach, especially if your site is a large one, is to present the search engines with your optimised content directly. Make sure they can get to a unique page with valid, quality content for each view. You can still have dynamic pages, but you may want to consider using URL rewrites to create search-engine-friendly URLs.
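As a rough sketch of what such a rewrite might look like on Apache, the `.htaccess` rules below map a clean URL like `/products/widgets` onto a hypothetical dynamic script (`index.php` and its `section`/`item` parameters are assumptions, not part of any particular setup):

```
# Turn on the rewrite engine (requires Apache's mod_rewrite).
RewriteEngine On

# Map /products/widgets to /index.php?section=products&item=widgets.
# [L] stops further rules; [QSA] keeps any existing query string.
RewriteRule ^([a-z-]+)/([a-z-]+)/?$ /index.php?section=$1&item=$2 [L,QSA]
```

The visitor and the robot both see the clean, keyword-bearing URL, while your dynamic script carries on working behind the scenes.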