SEO is not always compatible with the technologies used to build web pages, so you have to know the best way to use them without stopping crawlers from doing their job.
So basically, JavaScript is unreadable by crawlers, yet some people create links with JavaScript. Search engines cannot read links created in JavaScript. What can happen is that crawlers fail to index all the pages of a site, because JavaScript links stop them from moving from one page to the next.
That is why people create plain HTML links in the footer that duplicate the JavaScript links in the header; spiders can read the links in the footer.
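As a rough illustration of this technique (the page names, function name, and menu markup here are hypothetical), the header menu built by JavaScript is invisible to crawlers, while the footer repeats the same destinations as plain HTML links:

```html
<!-- Header navigation built by JavaScript: crawlers cannot follow these links -->
<nav id="main-menu">
  <script>
    // openPage() is a hypothetical script-driven navigation function
    document.write('<a href="javascript:openPage(\'about\')">About</a>');
    document.write('<a href="javascript:openPage(\'products\')">Products</a>');
  </script>
</nav>

<!-- Footer with the same destinations as plain HTML links: crawlers CAN follow these -->
<footer>
  <a href="/about.html">About</a>
  <a href="/products.html">Products</a>
  <a href="/contact.html">Contact</a>
</footer>
```

Users see and use the scripted menu, while spiders find every page through the footer.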
Flash is another technology incompatible with SEO. It might look great, but it is unreadable by crawlers, so a 100% Flash site will see its ranking in the SERPs suffer badly. Flash files are also large, so the pages open slowly. Users can get stuck on an opening Flash page, unable to move forward until the Flash has finished loading. That is frustrating for a user in a hurry, and he might never come back.
A Flash page can also stop a web crawler in its tracks: a crawler cannot crawl pages made 100% in Flash. Instead, it will simply move on to the next website on its list.
The easiest way to fix the issue with Flash is not to use it at all. But some companies still need to use Flash. If that is the case, the Flash can be embedded in HTML, with a check added that tests for the ability to display Flash before the Flash is executed.
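One common way to sketch this (the file name, dimensions, and text below are placeholders) is to embed the movie with an HTML `<object>` element and put plain HTML inside it as alternative content, which browsers without Flash display and crawlers can read:

```html
<!-- intro.swf, the dimensions, and the fallback text are all placeholders -->
<object type="application/x-shockwave-flash" data="intro.swf" width="800" height="600">
  <param name="movie" value="intro.swf" />
  <!-- Alternative content: shown when Flash is unavailable, readable by crawlers -->
  <h1>Company Name</h1>
  <p>A plain-HTML summary of what the Flash movie presents, with real links:</p>
  <a href="/products.html">Our products</a>
</object>
```

Visitors with Flash see the movie; everyone else, including search engine spiders, gets the HTML inside the `<object>` tag.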
This SEO technique might not be acceptable to every search engine, so be sure to research this issue further.
Most of the sites on the internet are built as static websites: they don't change beyond regular updates. Dynamic web pages, by contrast, are created on the fly according to preferences that users specify in a form or menu. Such sites can be built with different programming technologies, for example dynamic ASP.
These sites don’t technically exist until the user creates them. Because a web crawler can’t make
the selections that “build” these pages, most dynamic web pages aren’t indexed in search engines.
There is a well-known fix for this issue: static URLs. Dynamic URLs created on the fly can be rewritten as static-looking URLs that search engines can read.
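On an Apache server, for instance, this rewriting is often done with mod_rewrite rules in an .htaccess file (a sketch only; the file and parameter names here are hypothetical):

```apache
# Hypothetical example: the crawler-friendly URL /products/42.html
# is internally served by the dynamic script /product.asp?id=42
RewriteEngine On
RewriteRule ^products/([0-9]+)\.html$ /product.asp?id=$1 [L]
```

Search engines index the clean static-looking URL, while the server still builds the page dynamically behind the scenes.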
Dynamic ASP, like many of the other languages used to create websites, carries with it a unique set of characteristics. But SEO can still be implemented on these sites; it just needs to be a little different from SEO for HTML pages. It is not hard to do, and research on the internet will provide you with all the needed details.
The PHP programming language is another problem for search engine crawlers. Search engine spiders see PHP pages as yet another obstacle if the PHP is not properly executed.
That means PHP needs to be used with the search engines in mind. Badly written PHP stops or slows down search engine spiders, but with the right coding skills these problems can be fixed and the pages made SEO friendly.
The PHP code should be written so that the output pages look like ordinary HTML pages. This requires some skill, but it can be done: many content management systems use PHP code in the right way, and all of their pages are easily readable by search engine bots.
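For example (a minimal sketch; the variables and their values are hypothetical), a PHP page that simply echoes its content into ordinary HTML markup looks identical to a static page from the crawler's point of view, because all the PHP runs on the server before anything is sent:

```php
<?php
// Hypothetical content: in a real CMS this would come from a database.
$title = "Blue Widgets";
$body  = "Our full range of blue widgets, described in plain text.";
header("Content-Type: text/html; charset=utf-8");
?>
<html>
<head><title><?php echo htmlspecialchars($title); ?></title></head>
<body>
  <h1><?php echo htmlspecialchars($title); ?></h1>
  <p><?php echo htmlspecialchars($body); ?></p>
</body>
</html>
```

The spider never sees any PHP at all, only the finished HTML, which is exactly what "using PHP with the search engines in mind" means here.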