Hawaiʻi's Technology Community

Link from Pat Niemeyer:

Today we're excited to propose a new standard for making AJAX-based websites crawlable. This will benefit webmasters and users by making content from rich and interactive AJAX-based websites universally accessible through search results on any search engine that chooses to take part. We believe that making this content available for crawling and indexing could significantly improve the web. While AJAX-based websites are popular with users, search engines traditionally are not able to access any of the content on them. The last time we checked, almost 70% of the websites we know about use JavaScript in some form or another. Of course, most of that JavaScript is not AJAX, but the better that search engines could crawl and index AJAX, the more that developers could add richer features to their websites and still show up in search engines.
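The excerpt doesn't show the mechanics, but the full proposal describes pages opting in with "#!" (hash-bang) URLs, which a participating crawler rewrites into an equivalent URL carrying an `_escaped_fragment_` query parameter before fetching. A minimal sketch of that rewriting, assuming the mapping described in the proposal (the function name here is my own):

```python
from urllib.parse import quote

def to_escaped_fragment(url: str) -> str:
    """Rewrite a #! AJAX URL into the crawler-friendly
    _escaped_fragment_ form described in the proposal."""
    if "#!" not in url:
        return url  # not an AJAX-crawlable URL; leave unchanged
    base, fragment = url.split("#!", 1)
    # Append to an existing query string if there is one.
    sep = "&" if "?" in base else "?"
    return f"{base}{sep}_escaped_fragment_={quote(fragment, safe='')}"

print(to_escaped_fragment("http://example.com/page#!state=1"))
# http://example.com/page?_escaped_fragment_=state%3D1
```

The point of the rewrite is that fragments (everything after `#`) are never sent to the server in an ordinary HTTP request, so the state has to be smuggled into the query string for the server to see it.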

Full Article on Webmaster Central Blog

Views: 721

Replies to This Discussion

Recently saw a video of Google's new open-source JavaScript engine, V8. Looks pretty cool, and it could probably be applicable to new JavaScript-enabled crawlers.
Actually, the whole point of the standard is that search engines don't have to run JavaScript. What Google is proposing is that when a special request is made, the server hands the page to a headless HTTP client, which runs the JavaScript as needed and sends the resulting HTML back to the search engine under a rewritten URL; the search engine then presents that rewritten URL as the starting place for the user. You could include V8 in that headless client, but I don't think it is currently used. Google even suggested Apache's HttpClient as the headless creature.
I actually didn't finish reading the article and naturally assumed that a faster JavaScript processor could be used to digest pages. Thanks for the clarification :)
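For anyone curious, the server-side half of the scheme described above can be sketched roughly like this: if the incoming request carries the `_escaped_fragment_` parameter, reconstruct the original #! URL and render it with a headless client; otherwise serve the page as usual. This is only a sketch, and `render_headless` and `serve_static` are hypothetical stand-ins for whatever headless client and static file handler you actually use:

```python
from urllib.parse import urlparse, parse_qs

def handle_request(url: str, render_headless, serve_static):
    """Dispatch a request per the AJAX-crawling proposal.

    render_headless(ajax_url) -> HTML after executing JavaScript
    serve_static(url)         -> HTML served as-is
    Both callables are hypothetical placeholders.
    """
    parsed = urlparse(url)
    params = parse_qs(parsed.query)  # parse_qs percent-decodes values
    if "_escaped_fragment_" in params:
        # Reconstruct the original #! URL and render it headlessly,
        # so the crawler receives a static HTML snapshot.
        fragment = params["_escaped_fragment_"][0]
        ajax_url = f"{parsed.scheme}://{parsed.netloc}{parsed.path}#!{fragment}"
        return render_headless(ajax_url)
    return serve_static(url)
```

The design choice worth noting is that all the JavaScript execution happens on the site's own server, once per snapshot, rather than in every crawler, which is why the search engine itself never needs a JavaScript engine.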



web design, web development, localization

© 2024   Created by Daniel Leuck.
