With the dawn of Web 2.0, web developers have enjoyed building rich, desktop-like clients and fast, responsive websites they could never have constructed before. Web 2.0 brought us the ability to maintain state while making requests to the server, giving users an experience similar to desktop applications. What Web 2.0 didn't provide was a good model for taking the results of those requests and operating on them appropriately. All the model and controller code we wrote for Web 1.0 is now useless. While this doesn't sound like a problem at first, it is staggering when you consider how many controls are available and that all of them have to be reimplemented in JavaScript. I'll admit there are many new libraries out there, but they can be difficult to extend.
In review:

Web 2.0

Good
- Provides a way to make asynchronous requests
- Provides a way of requesting simple, lightweight data and translating it to the screen

Bad
- All the Web 1.0 models have to be rewritten
- Some operations can be quite weighty on the client end
- No clean model of what should happen on the server and what should happen on the client
I won't discuss the success of Web 1.0, but its problems were obvious: small state changes to a page required the user to re-download the entire page. For some websites this is still a clean model, but for most it causes bandwidth issues.
My solution, which takes the good from both and leaves the bad, is to build what I am dubbing "Web Synch". Web Synch is a way to take an existing, proven Web 1.0 architecture like CGI, PHP, ASP, or ASP.NET and give it the advantages of AJAX. It takes the models we built in Web 1.0 and the asynchronous technology we developed in Web 2.0 and combines them in a powerful new way. It operates on the principle of using the Web 1.0 models to render a page. Once the page is rendered, the server sorts out what has changed. The changes are then shipped to the client rather than the entire page.
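For instance, after a form post that adds an item to a cart, the server might ship only a small payload like the one below instead of the whole page. The shape of the records (`op`, `id`, `html`) is my own illustration of the idea, not a fixed format:

```json
[
  { "op": "update", "id": "cartCount", "html": "<span id=\"cartCount\">3 items</span>" },
  { "op": "add",    "id": "statusMsg", "html": "<p id=\"statusMsg\">Item added.</p>" }
]
```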
Below is a suggested implementation of Web Synch.

1. Initial request comes to the server
2. Page is rendered
3. The page's rendering is shipped as the response
4. The response is cached on the server
5. Page is submitted
6. Submit is captured and cancelled via the "onsubmit" event
7. Form is serialized
8. AJAX request is made, simulating a postback
9. Second request comes to the server
10. Page is rendered
11. Rendering is compared to the previous request's
12. Difference is serialized to JSON (add, update, delete, move)
13. Response returns from the AJAX request
14. JSON is deserialized
15. Changes are merged into the DOM
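The client half of the steps above can be sketched in plain JavaScript (the post targets Prototype, but nothing library-specific is needed). The function names and the `{ op, id, html }` change format are my own illustrative assumptions, not an existing API:

```javascript
// Turn name/value pairs into an application/x-www-form-urlencoded body.
function serializeFields(fields) {
  return fields
    .map(f => encodeURIComponent(f.name) + "=" + encodeURIComponent(f.value))
    .join("&");
}

// Merge the deserialized change list into the document.
function applyChanges(doc, changes) {
  for (const c of changes) {
    const el = doc.getElementById(c.id);
    if (c.op === "delete" && el) {
      el.parentNode.removeChild(el);
    } else if (c.op === "update" && el) {
      el.outerHTML = c.html;
    } else if (c.op === "add") {
      // A fuller diff would also name a parent and position; append for brevity.
      doc.body.insertAdjacentHTML("beforeend", c.html);
    }
  }
}

// Capture the submit, cancel it, and simulate the postback over AJAX.
function interceptForm(form) {
  form.onsubmit = function () {
    const fields = Array.from(form.elements)
      .filter(el => el.name)
      .map(el => ({ name: el.name, value: el.value }));
    const xhr = new XMLHttpRequest();
    xhr.open("POST", form.action);
    xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    xhr.onload = () => applyChanges(document, JSON.parse(xhr.responseText));
    xhr.send(serializeFields(fields));
    return false; // cancel the full-page postback
  };
}
```

Returning `false` from `onsubmit` is what cancels the normal Web 1.0 postback, so the server sees an ordinary form post but the browser never navigates.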
There are a few issues with this idea. One, it will require some memory management to store the old rendered page on the server. Two, it will take some processing power to determine what changed in the rendered page. Finally, the only way to know that an element has changed is to have well-formed HTML (everything must have an ID).
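The comparison step at the heart of the idea (and the source of the second and third issues) can be sketched by modeling a rendering as a map from element ID to rendered HTML. `diffRenderings` is an illustrative name, and move detection is omitted for brevity:

```javascript
// Compare the cached rendering against the new one and emit the change
// set to serialize as JSON. Renderings are modeled as maps from element
// ID to rendered HTML, which is why well-formed HTML with IDs on
// everything is a prerequisite.
function diffRenderings(previous, current) {
  const changes = [];
  // Elements present before but missing now were deleted.
  for (const id of Object.keys(previous)) {
    if (!(id in current)) changes.push({ op: "delete", id });
  }
  // New elements are adds; elements whose markup differs are updates.
  for (const id of Object.keys(current)) {
    if (!(id in previous)) {
      changes.push({ op: "add", id, html: current[id] });
    } else if (previous[id] !== current[id]) {
      changes.push({ op: "update", id, html: current[id] });
    }
  }
  return changes; // JSON.stringify(changes) becomes the AJAX response
}
```

For a cart page where only the count changed and a message appeared, this yields one `update` and one `add`, which is typically far smaller than re-shipping the full page.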
My first iteration will target ASP.NET on the server side and Prototype on the client side.