Until now, Googlebot, the crawler that indexes web pages, could only understand textual content and not image-based content, CSS or JavaScript. All that is changing, thanks to a recent update to the Google Webmaster Guidelines. You will now have to optimize your site not just for its textual content but for its images and JavaScript as well. Your CSS (Cascading Style Sheets) will also need careful optimization, because Google's spiders will now recognize it.
If your robots.txt file is set up so that it prevents Googlebot from accessing your JavaScript or CSS files, you are at a disadvantage, and Google advises you to grant the bots access. The Google Panda algorithm may view your content negatively if its JavaScript or CSS files are not up to the mark.
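As a minimal sketch of what unblocking these resources can look like (the /js/, /css/ and /assets/ paths here are hypothetical, so substitute your site's actual directories), a robots.txt along these lines lets Googlebot fetch scripts and stylesheets:

```text
# Hypothetical robots.txt sketch: let crawlers fetch script and
# style resources so pages can be fully rendered.
User-agent: Googlebot
Allow: /js/
Allow: /css/

# If a broader directory was previously blocked, you can re-open
# just the rendering resources inside it. Googlebot applies the
# most specific (longest) matching rule, so these Allow lines win
# over the Disallow below for .js and .css files.
User-agent: *
Allow: /assets/*.js
Allow: /assets/*.css
Disallow: /assets/
```

Google's crawler supports the Allow directive and the `*` wildcard shown above, but other crawlers may interpret robots.txt rules differently, so test the file in Google Webmaster Tools before relying on it.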
If you are not sure whether your JavaScript and CSS are getting indexed, you can do a manual check with the following process.
First select the Fetch option under the Crawl tools, then enter the URL of the page you want to check; if you want to test the homepage, leave the field blank. Press the Fetch and Render button so that Googlebot begins to crawl your page. Once the crawl is complete, you can view the detailed rendering results, and press the Submit to index button to request indexing.
Previously, Google advised webmasters to check their content in text-only browsers like Lynx, but with this new update you should check your content in regular browsers to ensure the site is optimized with the new Google guidelines in mind.
Google offers the following advice for optimal indexing of your website.
Let us know how this new update from Google will affect your SEO efforts going forward.