Although an Ajax-based environment does not inherently support the MVC pattern, the pattern can easily be implemented on top of it. Such an implementation can cover either the presentation layer alone or the entire browser window. Ajax also permits a cyclic or nested MVC model, in which individual elements of the web presentation layer each have their own controller and their own model.
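Such a per-element MVC triple can be sketched in plain JavaScript. This is a minimal, illustrative sketch; the class names and methods below are our own, not part of any standard API:

```javascript
// Model: holds the data and notifies observers when it changes.
class Model {
  constructor() { this.items = []; this.listeners = []; }
  subscribe(fn) { this.listeners.push(fn); }
  add(item) {
    this.items.push(item);
    this.listeners.forEach(fn => fn(this.items));
  }
}

// View: renders the model state (here into a string; in a real page,
// into a DOM fragment belonging to this one widget).
class View {
  constructor() { this.rendered = ""; }
  render(items) { this.rendered = items.join(", "); }
}

// Controller: handles user input and updates the model. In a real page
// this would be wired to DOM events and could feed the model from an
// XMLHttpRequest response.
class Controller {
  constructor(model, view) {
    this.model = model;
    model.subscribe(items => view.render(items));
  }
  userAdds(item) { this.model.add(item); }
}

const model = new Model();
const view = new View();
const controller = new Controller(model, view);
controller.userAdds("inbox");
controller.userAdds("sent");
console.log(view.rendered); // "inbox, sent"
```

Several such triples can coexist on one page, each widget carrying its own model and controller, which is exactly the nested arrangement described above.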
Using Ajax makes it harder for a web application to follow the WAI guidelines. Software developers therefore need to offer alternatives, for example when a website should be accessible to visually impaired users working with screen readers. This is necessary because the majority of Ajax applications have been designed for conventional graphical web browsers.
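One common way to offer such an alternative is progressive enhancement: the page ships an ordinary link that works without JavaScript, and a script intercepts it for the Ajax path. A minimal sketch (the ids and the URL are illustrative):

```html
<!-- Without JavaScript (e.g. with a screen reader that does not run it),
     the link simply navigates to /news.html as a normal page load. -->
<a id="news-link" href="/news.html">Latest news</a>
<div id="content"></div>

<script>
  document.getElementById("news-link").addEventListener("click", function (e) {
    e.preventDefault(); // Ajax path: load the content in place instead
    var xhr = new XMLHttpRequest();
    xhr.open("GET", this.href);
    xhr.onload = function () {
      document.getElementById("content").innerHTML = xhr.responseText;
    };
    xhr.send();
  });
</script>
```

Because the fallback is the link's own href, no separate "accessible version" has to be maintained for this simple case.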
Search engines / Deep Links
There are several ways to make an Ajax application accessible to a search engine. The approaches differ in the level of indexing that can be achieved and in how it is attained. For some websites, such as a university's course catalogue, it is necessary that every area can be found by a search engine; a website that provides a webmail service, on the other hand, will not require this.
The following strategies are known for making a website indexable by a search engine:
Tag Strategy
No structural changes are made to the actual website. Instead, existing elements such as meta tags or header elements are used for indexing.
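In practice this means filling in the descriptive elements the crawler can read even when the page body is populated by JavaScript. An illustrative fragment (titles and keywords are made up):

```html
<head>
  <title>Computer Science – Course Catalogue</title>
  <meta name="description" content="Overview of all computer science courses" />
  <meta name="keywords" content="computer science, courses, lectures" />
</head>
<body>
  <h1>Course Catalogue</h1>
  <!-- The content below this heading is loaded via Ajax and is
       therefore invisible to a crawler that does not run JavaScript. -->
  <div id="content"></div>
</body>
```

The crawler indexes the title, description, and headings, while the dynamic content remains unindexed.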
Extra Link Strategy
Additional links are placed on the website, which the search robot can follow in order to index the entire website.
These additional links should remain visible, even though they are primarily meant for the search engine's robot: modern search engines detect invisible links and regard them as deception.
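Such extra links might simply be a visible site index that ordinary users can ignore but the robot can traverse (the filenames here are illustrative):

```html
<!-- Visible supplementary navigation: each link points to a plain,
     crawlable page mirroring one section of the Ajax application. -->
<ul id="site-index">
  <li><a href="courses.html">Courses</a></li>
  <li><a href="staff.html">Staff</a></li>
  <li><a href="contact.html">Contact</a></li>
</ul>
```

Hiding this list with CSS (e.g. display:none) would defeat the purpose, since, as noted above, invisible links are treated as deception.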
Secondary Site Strategy
A second website is created that is fully accessible to a search engine. For a given search term it either provides the functions of the Ajax page itself or refers to them.
It should be noted that search engines may interpret both the Extra Link Strategy and the Secondary Site Strategy as an attempt at deception (cloaking). The best way to avoid this is to place a <meta name="robots" content="noindex" /> element on the original page.
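Combined with the Secondary Site Strategy, this could look as follows (the filenames are illustrative): the Ajax-driven page excludes itself from the index, while the plain secondary page remains indexable and points back to it.

```html
<!-- index.html: the original, Ajax-driven page -->
<head>
  <meta name="robots" content="noindex" />
  <title>Course Catalogue (interactive)</title>
</head>

<!-- catalogue-static.html: the crawlable secondary page -->
<head>
  <title>Course Catalogue</title>
</head>
<body>
  <h1>Course Catalogue</h1>
  <p><a href="index.html">Open the interactive version</a></p>
  <!-- Full, static course listing for the crawler goes here. -->
</body>
```

Since only one of the two pages competes for a place in the index, the engine has no reason to suspect that different content is being served to users and robots.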