My last entry, "The AJAX response", generated a few interesting comments, as well as a thoroughly non-scientific and non-representative poll on the use of the various output formats.
I asked which output format people used. Only a minority of the commenters indicated a clear preference, and their "votes" break down as follows: XML 5 votes, JSON 5 votes, HTML snippets 2 votes, plain text 2 votes, and pure JavaScript 1 vote. So it's clear that XML and JSON are currently the most popular output formats. I'd expected JSON to end slightly below XML, but I was wrong.
In the rest of this article I'd like to reply to some points that were made: the name "AJAX", rendering speed, error handling, the "evilness" of eval() and innerHTML, and some other remarks.
On the whole XML and JSON have found equally vocal supporters. On the XML side of the debate I noticed one fallacy: the fact that the name AJAX has "XML" in it. Although some say this means that XML is the "best" output format, I fully side with their opponents. The name AJAX has been badly chosen, and although it's far too late to turn back the clock and pick a better name, please remember that the phrase was coined by a non-technical person who wanted to point out a useful trend in JavaScript, and not by someone who wanted to lay solid technical foundations for this trend.
Therefore, the fact that "AJAX" has an X for XML doesn't mean anything. In fact, this whole discussion is meant to see if the X is useful or not.
The comments also contain an interesting discussion (comments 10, 30, 32, and 51-54) between Dave Johnson and Jim Ley, where Jim says that JSON is "orders of magnitude faster" than XML, while Dave maintains that XML is faster than JSON, provided you use XSLT for processing it.
Since I haven't done any benchmark testing I cannot comment on this debate: I simply don't know who's right. What I do know is that I'd very much like to see the test scripts Dave and Jim have used to come to their conclusions. Without actually seeing them it's impossible to find out whether one of them is wrong, or whether both are right under certain conditions.
Dave, Jim, please publish your test pages, ideally including a quick introduction to the scripts you're using.
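For readers who haven't seen the XSLT approach Dave refers to, here is a minimal sketch of what client-side XSLT processing looks like in Mozilla (Explorer uses transformNode() instead; the element id is made up):

// Sketch only: xmlDoc is the XML response (req.responseXML),
// xslDoc is the stylesheet, loaded via a second XMLHttpRequest
var processor = new XSLTProcessor();
processor.importStylesheet(xslDoc);
// Transform the data and insert the resulting nodes into the page
var fragment = processor.transformToFragment(xmlDoc, document);
document.getElementById('writeroot').appendChild(fragment);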
Other commenters said that even if there is a clear speed difference, the average visitor isn't going to notice the difference between, say, 100 and 500 milliseconds of rendering time. That's of course true, but it's possible (though unlikely) that future AJAX applications will have to format thousands of XML nodes/JSON objects, and in that case the speed difference may become important.
Another important point is that XML documents or JSON strings may contain errors, and that a script that doesn't allow for such errors could encounter serious problems. Although this is undeniably true, I feel that any XML/JSON error that might exist is the responsibility of the server side programmer. In my case this is an important distinction, since I never create the server side programs that send the XML data to my scripts.
Even if you're creating both the client and the server side programs, though, an error in the XML or JSON means that there is an error in the templates that create the XML and JSON, and this error should simply be solved on the server side. Besides, I don't have the faintest idea how to solve an XML parse error or a JSON syntax error on the client side. One mistyped character may make an entire XML or JSON file unreadable, and that's something to take into account when you create the server side scripts.
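That said, you can at least detect such an error on the client, even if you can't repair it. A minimal sketch for the JSON case (handleError() is a made-up function that shows a generic error message):

function processResponse(req) {
    var data;
    try {
        data = eval('(' + req.responseText + ')');
    }
    catch (e) {
        // One mistyped character on the server ends up here
        handleError();
        return;
    }
    // data is now a normal JavaScript object
}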
HTML snippets are the one output format with some error tolerance, since browsers have been able to handle broken HTML throughout their history. If error handling is of supreme importance, you're almost forced to use HTML snippets exclusively.
A few other remarks deserve an answer.
What I wonder is: What is the best way for handling separation of concerns when using AJAX? How does a (CSS-) designer know which HTML is generated or how does the (HTML-/JS-) client developer know what CSS to use or what HTML to generate?
A good point. Fortunately the answer is simple. The HTML/CSS developer is responsible for creating the HTML structures, since that's part of his/her job. If the script is being written by someone else, the HTML/CSS developer should deliver templates that specify which HTML is used where.
Workflow might become a problem here. It's possible that the development of the HTML/CSS templates and the JavaScript starts at the same time, and in that case there are no HTML templates available yet. Semantic coding might come to the rescue: a good HTML/CSS developer can probably devise an HTML structure blind, without having to create all the CSS.
Therefore I feel that, if the HTML/CSS and the JavaScript are created by different developers, the HTML/CSS developer should create the HTML for the dynamic parts first, send it over to the JavaScript developer, and then both can concentrate on their tasks. This way the JavaScript developer can start coding immediately. If it turns out that the HTML needs, say, one extra <span> element for CSS reasons, it's not that hard to add it later on.
if you leave the rendering process client-side, what about mobile? Is there a way they can render XML themselves?
Interesting point. I just don't know. Do mobile phones support XMLHTTP? Do they support enough JavaScript to interpret and restructure XML documents or JSON strings? Can anyone comment on that?
Is it possible to use HTML snippets without using the innerHTML property?
Theoretically, yes, but it kind of defeats the main advantage of HTML snippets. And it's hard to code.
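The closest thing I know of to a DOM-based alternative is createContextualFragment(), which parses a string of HTML into a document fragment. A minimal sketch (not supported by Explorer; 'snippet' is the server's HTML string and the container id is made up):

// Parse the snippet into DOM nodes without touching innerHTML
var range = document.createRange();
range.selectNode(document.body); // give the range a context for parsing
var fragment = range.createContextualFragment(snippet);
document.getElementById('container').appendChild(fragment);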
PPK, is the HTML method at all useful for instances where you want data from the server that will change how multiple parts of the current page will be displayed?
Not really. You could split the HTML snippet into two and add the two parts to different parts of the page, but the format works best when you can grab the HTML and put it into some element's innerHTML.
One of the less pleasant surprises was that some people still feel that eval() and innerHTML are "evil". This is pure nonsense.
It's true that eval() may serve as a crutch for lazy programmers, for instance:

var formField = 'name';
var value = eval('document.forms[0].' + formField);
In this example you don't need eval(), since there's a better solution to the problem:
var value = document.forms[0].elements[formField];
However, in the case of JSON output from the server eval() is very useful, and any technique that doesn't use it would run into the dozens of lines, and would be more error-prone to boot. I don't see any reason to avoid eval().
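To show how little code the eval() route takes, here's a sketch (req is the XMLHttpRequest object; the property name is an example). Note the extra parentheses: without them eval() parses an object literal as a block statement.

// One line turns the JSON response into a JavaScript object
var data = eval('(' + req.responseText + ')');
alert(data.name); // then use it as a normal object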
As to innerHTML, it's an extremely useful and extremely powerful property, and I use it often. If you don't want to use it, be my guest, but please don't wax ideological about it. It's there, it works fine in all browsers, and it can add complicated DOM structures much more elegantly than pure DOM methods. Besides, it's faster.
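To illustrate the elegance difference, here's the same structure created both ways (the container id and the term variable are made up):

var container = document.getElementById('writeroot');

// innerHTML: one assignment
container.innerHTML = '<p class="warning">Not found: <strong>' + term + '</strong></p>';

// Pure DOM: seven statements for the same result
var p = document.createElement('p');
p.className = 'warning';
p.appendChild(document.createTextNode('Not found: '));
var strong = document.createElement('strong');
strong.appendChild(document.createTextNode(term));
p.appendChild(strong);
container.appendChild(p);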
1 Posted by beholder on 4 January 2006 | Permalink
One accidental advantage of innerHTML is that it lets the browser sort its own DOM out. If you want to write an element with an event handler on it (e.g., onmouseover), it's a fiddle because of the cross-browser DOM handling; build the same idea in a string as HTML first as in...
<img onmouseover="foo(bar)"...>
...and the client does all the work. Heh. Try doing that with the DOM and you find it's verbose and fiddly, and furthermore you can't load the function's argument easily when you set the event handler. So perhaps it's a lazy way of hiding DOM implementation, and laziness is one of the virtues, right?
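For instance (el, foo, bar and the image are placeholders):

// String version: the handler, argument included, rides along for free
el.innerHTML = '<img src="pic.png" alt="" onmouseover="foo(bar)">';

// DOM version: you need a wrapper function just to pass the argument
var img = document.createElement('img');
img.src = 'pic.png';
img.alt = '';
img.onmouseover = function () {
    foo(bar);
};
el.appendChild(img);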
2 Posted by Jehiah on 4 January 2006 | Permalink
Good thoughts about error handling.
I don't know the full answer, but I do know that I almost always use some extra JavaScript logic when parsing JSON responses to handle error conditions (not just transport errors, but validation errors, etc.). It's not really anything special to JSON, but I find it easier to do that way.
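For example, I wrap the real payload in an envelope along these lines (the field names and helper functions are just examples):

// The server always returns valid JSON, even when the request failed:
// { "status": "error", "errors": ["Name is required"], "payload": null }
var response = eval('(' + req.responseText + ')');
if (response.status == 'error') {
    showErrors(response.errors); // made-up display function
}
else {
    render(response.payload); // made-up rendering function
}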
Also, there is a scripting framework I was recently pointed towards (TACONITE) which wraps its HTML snippets in a simple XML envelope telling the client side what to do with them. This moves some JavaScript processing and error handling into a framework, and lets you specify it on the server side. Nice idea, and a similar framework for JSON parsing would be nice.
3 Posted by Jonathan Perret on 4 January 2006 | Permalink
About eval() on JSON data being evil: it's certainly not a big deal as long as your app is downloading JSON from its origin server.
However, it is easy to imagine scenarios in which the client-side code needs to download data from other sites (think mashups). I wouldn't want to run eval() on something I downloaded from a site I don't control! The problem with using eval() on JSON is that it blurs the line between data and code. And not being able to use eval() to load JSON suddenly makes JSON a lot less interesting in my book.
So I guess my advice would be:
* Use JSON (with eval()) in tightly-controlled, light-on-HTML scenarios (Google Suggest is an example). JS sucks for generating HTML! (Actually it does not suck particularly more than other imperative languages like Java/C#/Perl, but that's why we now have ASP/PHP/XSLT...)
* Use XML for interoperability or if you have heavy-duty HTML rendering, which XSLT is best suited for (last I looked Google Maps used XSLT).
4 Posted by Misha on 5 January 2006 | Permalink
I believe security issues with JSON will be solved pretty soon by an additional "secure" eval() which accepts only the subset of the grammar defined in JSON. This "secure" eval() will be part of a few functions for converting JavaScript data structures to JSON strings and back.
5 Posted by Nathaniel on 5 January 2006 | Permalink
Why do we need a secure eval()? Given that XMLHttpRequest already limits requests to the server the page originated from, I don't see that being a problem. And what's the difference between JavaScript that's downloaded and run on page load and JS that is downloaded and run later on the same page?
6 Posted by Vincenzo on 5 January 2006 | Permalink
Nathaniel, you have to read the previous article: The AJAX response: XML, HTML, or JSON?
"The most important advantage is that JSON circumvents JavaScript's same-source policy, if you import the JSON file as a new tag."
7 Posted by Dave Meehan on 5 January 2006 | Permalink
Surely this debate ought to be about appropriateness? If you're writing server side code for a single client side app, it would be appropriate to use the most efficient output format for that client. If updates are single blocks, then HTML is probably good. If additional processing is required in the client, i.e. for multi-part updates, then perhaps XML or JSON are appropriate (in your example, there seems to be little difference in the client handling code).
If you're writing a web service for public consumption, then XML would seem appropriate, as it's the most universally understood. However, you could, in the service call, request an alternative format.
//server/service?xslt=myfrag.xslt
It would even be possible to output JSON via a XSLT, would it not?
There is a downside here. If most calls to the server require conversion, there is additional overhead on the server to make the conversions. This could be offloaded by making more use of the client, of course, at the expense of more complicated coding.
8 Posted by Robert Nyman on 5 January 2006 | Permalink
I think X deserves to be in the name, not to necessarily symbolize the return format, but for using the XmlHttpRequest object and JavaScript in an asynchronous way.
And yes, innerHTML rocks! :-)
9 Posted by Alex Lein on 5 January 2006 | Permalink
Interesting thoughts on Error Handling, but I think that's not the correct term. Maybe "Page Rendering Fault Tolerance".
Personally I've started using JSON, and I'm parsing a string that is about 4.5 KB, including over 90 custom objects (which also include arrays and properties), and it loads literally in 0 milliseconds.
I'm also using a hybrid approach. I send back an XML document to my XHR, but it only contains one node (documentElement) which is either <json> or <error>. That way I can trap for server-side errors, and still eval() properly formatted JSON strings.
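In practice the client code looks something like this (the helper functions are mine):

// Check the wrapper node, then eval() the JSON inside it
var root = req.responseXML.documentElement;
if (root.nodeName == 'error') {
    showServerError(root.firstChild.nodeValue); // made-up display function
}
else {
    // <json> node: its text content is a JSON string
    var data = eval('(' + root.firstChild.nodeValue + ')');
    processData(data); // made-up processing function
}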
10 Posted by Eric on 5 January 2006 | Permalink
The problem of needing a "secure" eval has been solved on Crockford's site. He made a JSON object that has a parse function that will read JSON text. It checks for security with a regular expression.
http://www.crockford.com/JSON/json.js
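If it works as he describes, usage comes down to a single call:

// Instead of a raw eval() on the response text:
var data = JSON.parse(req.responseText);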
11 Posted by Memet on 6 January 2006 | Permalink
I have a question regarding innerHTML. I've used it many times now, not always with great success. I find that some browsers, sometimes, just decide they don't want to apply styles and classes.
Has anyone encountered this?
The other thing I've started using is creating a 'template' TR in a table, for example, that has a class of "hidden". I clone this template when I want to add rows and switch its class to "visible".
I prefer this method because I never get the styling issues, and on top of that, I can leave the HTML coder to do whatever he wants with the template.
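Roughly like this (the id, class names, and variable are mine):

// Clone the hidden template row and reveal the copy
var template = document.getElementById('rowTemplate'); // a <tr class="hidden">
var row = template.cloneNode(true);
row.id = ''; // don't duplicate the id
row.className = 'visible';
// Fill in the cells (assumes each template cell contains a text node)
row.cells[0].firstChild.nodeValue = patientName;
template.parentNode.appendChild(row);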
On two different notes:
Eric, there is no way to enforce security using a regular expression. It's just not possible. You can weed out certain possible values, but that doesn't mean you've implemented a security sub-system.
Alex: I use the same approach for sending either an error node, or a success node, which contains relevant XML.
I still personally believe that, as a matter of programming practice, code should stay local. A server should not have to know JavaScript details about a client. get-patient-list.php?doctor_id=12 should return just that: patients, not code. That way many people can use the same command.
I guess it's a question of principle.
12 Posted by Logic on 6 January 2006 | Permalink
Eric, depending on the amount of data being processed, JSON.parse could become impractical. I'm not sure of the exact test routines, but the difference here looks significant:
http://blogs.ebusiness-apps.com/dave/?p=45
13 Posted by Andrew Herron on 6 January 2006 | Permalink
As far as I can tell, at least from a Nokia developer standpoint, mobiles do not yet support XMLHttpRequest. At last check some mobile browsers did support _some_ JavaScript, but certainly not anything related to AJAX. Opera has come out with a mini version of its desktop browser, and it may support some form of DOM/AJAX, but I've not tested it.
As for output formats, I've always just used plain-text and JS to set the look and feel of the text that's received. It's fast, it's simple, and I don't have to worry about malformed returns from the backend since I can verify all the data that is returned.
Personally, I don't like the idea of JSON. What people call the 'advantage' could very well destroy the use of AJAX completely. Popups were cool... until advertisers got hold of them. The fact that data can be loaded from different servers means that people can maliciously attack a client. If enough of this happens, then people will become aware and disable AJAX or JS completely, rendering your application useless. The checks put in place are in place for a reason; circumventing them, in my opinion, can only lead to a death similar to the popup's.
14 Posted by Andrew Herron on 7 January 2006 | Permalink
For the record, Opera Mini does not support JS. While I am able to navigate and post, I just had to scroll through the entire menu on the left.
15 Posted by Agustín Fernández on 10 January 2006 | Permalink
Very interesting article.
I think JSON is ideal if you want to easily share internal information about the state of the application between the server and the client. With JSON (or similar) you can work with practically the same object in both (client and server) which reduces development time.
XML has its uses, though. Almost every program has been supporting it for years, and your data might even already be in XML. And as someone pointed out, it's not so hard to write a program that converts XML to JSON and JSON to XML (for your application). So there is nothing so special about either of the two.
I still find JSON much more readable. XML is a pain for complex data structures. And navigating the DOM for getting simple information when you could just have shared the information itself (as an array, object or string) seems silly to me.
16 Posted by James Packer on 11 January 2006 | Permalink
Another point in the XML vs JSON debate: it would seem that you could run into errors using JSON if the data being returned can contain apostrophes and quotes, whereas this is not a problem at all using XML. I find that these characters appearing in data are a never-ending source of hassle where JavaScript is concerned...
17 Posted by NoXi on 11 January 2006 | Permalink
@James: JSON is just as good/bad as XML at error handling.
18 Posted by Aaron Porter on 12 January 2006 | Permalink
An error in the JSON or XML does not necessarily mean there's an error in the template creating it.
If the JavaScript that requests additional data from the server passes invalid arguments, who knows what you'll get back. Garbage in/garbage out.
This may be avoided by having error responses from the server that indicate invalid arguments and handling them appropriately in JavaScript.
19 Posted by Kanashii on 12 January 2006 | Permalink
Just a small script to test the speed of XML and JSON processing: http://bemfkunud.com/kanashii/xmlvsjson.php
20 Posted by Paulo H. Lomanto on 12 January 2006 | Permalink
JSON is good, but it does not communicate with other technologies. With XML you can use web services that are already on the market. Another point: all modern languages have native support for XML parsing.
21 Posted by Paulo H. Lomanto on 12 January 2006 | Permalink
About innerHTML and eval(): evil? Only inexperienced developers say this. eval() is one of the most used functions in my AJAX applications, and I have never had trouble with it.
22 Posted by NoXi on 13 January 2006 | Permalink
@Paulo: all modern languages have support for JSON as well (http://www.crockford.com/JSON/). And it DOES communicate with other technologies just as well as XML. XML is just more widely used...
23 Posted by Harry Fuecks on 17 January 2006 | Permalink
Just some more confusion...
Google Suggest returns a response primed to call back a preloaded function, and it is fired off by eval(). Very easy to implement.
JPSpan returns a pure JavaScript response inside an anonymous function. The advantage is that there are no side effects, plus the server can send exceptions (or even complete classes/objects) to the client, e.g. http://jpspan.sourceforge.net/examples/postoffice_server.php/math
I also experimented with PHP's serialize format, but character encoding is an issue (multibyte strings), plus you need to watch out for security; more detail here: http://blog.joshuaeichorn.com/archives/2005/11/17/html_ajax-030-released-and-new-website/
One other point to consider, with XML responses, is that E4X (http://en.wikipedia.org/wiki/E4X) would make life a lot easier in Firefox 1.5.
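For instance, E4X lets you treat the response as a native object (the element names are placeholders; note that Firefox's E4X chokes on an <?xml?> declaration, so strip it first):

// Assuming a response like:
//   <people><person><name>Bob</name></person></people>
var people = new XML(req.responseText);
alert(people.person[0].name); // "Bob"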
24 Posted by TarquinWJ on 18 January 2006 | Permalink
Re: mobiles
Opera Mini does support JavaScript but only on the server, with some tricks to get basic event handlers working. This means that it cannot use XMLHttpRequest.
Regular Opera 8+ on devices (available for many mobiles, including Nokias) does support XMLHttpRequest - for instance, running Opera on my PocketPC or Nokia Series 60, I can use the full version of GMail.
Minimo in theory should be able to use XMLHttpRequest, but it crashes far too often on my PocketPC for me to test it (basically, it uses far too much memory and runs out before I can load any proper tests). Note that this is an alpha version, and is certainly not ready for public use yet - so you can't expect it to have any market share in the real world (yet).
The upcoming Nokia browser, based on Safari's WebKit (KHTML), may also be able to run XMLHttpRequest, since Safari can, but since they have not released any public builds, I cannot say for certain.
The other popular device browsers, such as PocketIE (no DOM support) and NetFront (no XMLHttpRequest, and poor/crashy DOM) are not able to use XMLHttpRequest.
In other words, right now, the only device browser that is widely used and can do XMLHttpRequest, is Opera 8+.
25 Posted by Andrew Herron on 18 January 2006 | Permalink
I would strongly suggest not using AJAX on mobile networks for some time. Everyone knows that you pay quite heavily here in the States for data access on mobiles, and if you've got AJAX doing something in the background, you could be racking up my bill without me knowing. Out of respect for the user, who may only have 1MB of data transfer, stay away from using such a technology on mobiles.
26 Posted by Eric Hobbs on 19 January 2006 | Permalink
JSON sounds like a great idea. String manipulation/allocation works well across all browsers and it tends to require less memory to represent the same data, meaning smaller transfers. However, XML has much better in-browser support. If you are working with small datasets it probably does not matter which method is used, JSON or XML. With XML you can use XSLT to transform very large datasets much faster. A lot of rendering speed issues are caused by bloated JavaScript object-oriented libraries used to generate HTML code. Internet Explorer in particular cannot handle creating a large number of JavaScript objects (apparently this has something to do with the garbage collector implementation; it was not designed to handle that type of use). This probably has a lot to do with why innerHTML is so much faster than the equivalent DOM methods.
27 Posted by Day Barr on 20 January 2006 | Permalink
There is something else to be aware of when designing Ajax applications accessed via a mobile phone - optimisation technologies designed to reduce bandwidth usage and improve responsiveness over the relatively slow network can cause your app to break under certain circumstances - see http://daybarr.com/blog/2006/01/16/ajax_content_type/ for a more detailed explanation.
Note that this doesn't just affect people using a browser on their actual mobile handset (as TarquinWJ points out, there is limited device support for Ajax techniques anyway) but also affects anyone using a fully-featured browser on a laptop connected via their mobile.
I would also like to address Andrew Herron's suggestion NOT to use Ajax on mobile networks due to cost issues. I say it depends. Of course this is true for constantly polling chat applications etc. but you can actually save your users some bandwidth by appropriate use of in-place updates instead of full-page refreshes.
Is there an *automatic* way to serve bandwidth optimised content to users when they surf on the move using their mobile, and then the whizzy but heavy version when they get back to the office and hook their laptop into the LAN? Detect IP ranges from mobile networks? Ich.
28 Posted by Lander P on 22 January 2006 | Permalink
Thanks for all the nice information on this site.
I'm new to the Ajax technology but am about to write a web application where I write both the server and client side. At the moment I'm choosing between JSON and Snippets as the server response. From my standpoint going with XML would require much JavaScript logic on the client; JSON/Snippets seem much easier client-wise.
If I decide to go with Snippets, can't I just as well use a hidden IFRAME as transporter instead of XMLHttpRequest? Is there any benefit to using Snippets/XMLHttpRequest that you don't get with Snippets/IFRAME?
29 Posted by Jacob on 24 January 2006 | Permalink
Lander P: IFRAMEs are not valid for strict XHTML 1.0 (and probably strict HTML 4.x too). They're fine in transitional variants. They are also better supported than AJAX based solutions, and so compatibility with older browsers can be maintained.
For loading large blocks of a page (or snippets), I would personally recommend IFRAMEs *unless* you need it to be strict or you know that only newer browsers will be used. AJAX-type development has been done for years using hidden IFRAMEs - it's tried and tested, and it works. AJAX allows a little more control over some things though, and is a "neater" solution that doesn't destroy semantics on the page.