Google is now one step closer to its mission of knowing everything. It has just launched Knowledge Graph, a tool intended to deliver more accurate information by analyzing the way users search.
With the drive for better search results comes a need for improved site-reading capability. Google's bots have been taught to behave more like humans when faced with interactive site content, running the JavaScript on the pages they crawl.
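To see why executing JavaScript matters, consider a page whose content is only filled in by a script after the page loads. The sketch below is hypothetical (the element id and the /api/products.json endpoint are made up for illustration): a crawler that never runs the script would index an empty container, while one that executes JavaScript sees the actual content.

```ts
// Hypothetical page script: the product list only exists after this runs,
// so a crawler that does not execute JavaScript indexes an empty <div>.
const container = document.getElementById("products");

fetch("/api/products.json")              // content lives behind an AJAX call
  .then((response) => response.json())
  .then((products: { name: string }[]) => {
    for (const product of products) {
      const item = document.createElement("p");
      item.textContent = product.name;   // visible only to JS-executing crawlers
      container?.appendChild(item);
    }
  });
```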
Google has previously offered proposals to make AJAX content more searchable, but these put the burden on Web developers rather than on Google's bots. Google has finally begun tackling the problem from its own side: its bots can now explore the dynamic content of pages in a limited fashion, crawling through the JavaScript within a page and finding URLs in it to add to the crawl.
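For context, the earlier proposal was Google's AJAX-crawling scheme: sites exposed "#!" (hashbang) URLs, and the crawler requested an equivalent "_escaped_fragment_" URL for which the developer had to serve a pre-rendered HTML snapshot. That is the developer-side burden mentioned above. A minimal sketch of the URL mapping follows; the example.com URL is hypothetical.

```ts
// Sketch of the old AJAX-crawling URL mapping: "#!fragment" becomes a
// "_escaped_fragment_" query parameter that the site must answer with
// a static HTML snapshot of the dynamic page.
function escapedFragmentUrl(hashBangUrl: string): string {
  const [base, fragment = ""] = hashBangUrl.split("#!");
  const separator = base.includes("?") ? "&" : "?";
  return `${base}${separator}_escaped_fragment_=${encodeURIComponent(fragment)}`;
}

// e.g. "https://example.com/#!page=products"
//   -> "https://example.com/?_escaped_fragment_=page%3Dproducts"
console.log(escapedFragmentUrl("https://example.com/#!page=products"));
```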
While digging through Apache logs, developers have uncovered evidence that the bots can execute JavaScript; the crawlers appear to imitate how users click on objects to activate them. This would give Google search better access to the cavernous Web content hidden in databases and other sources that have not been indexable before.
I would say… Wow! Now that's excellence. It seems Google can do anything.