From: RobG on 21 Jun 2010 20:04

On Jun 22, 4:40 am, "S.T." <a...(a)anon.com> wrote:
> On 6/18/2010 7:18 PM, Andrew Poulos wrote:
>
>>> The fact that most CLJ regulars (the vocal ones, at least) can't
>>> comprehend why there's such demand for libraries is the same reason
>>> their critiques of libraries are technically accurate yet largely
>>> ignored. They can't see in context, nor anything less rigid than a
>>> true/false view.
>
>> There's only a demand because people take on javascript projects that
>> are beyond their skill level.
>
> This is always what I suspected a portion of the animosity was based on.
> A fear that opening up DOM manipulation and AJAX to the masses cheapens
> a particular skill set.

Perhaps you are reflecting your own fears; as far as I know, that has never been used as justification for criticism of various libraries in this group.

More to the point, the publishers of various libraries say that they remove the various quirks of cross-browser scripting, yet only appear to do so for a small number of current browsers. Anyone with an "unsupported" browser may be left unable to access on-line resources for no reason other than the site developer's choice of development tools.

A fundamental principle of the WWW is that it provides universal access to information regardless of user agent or platform. Developing sites that only work in a small number of browsers is the antithesis of that ideal.

It is also very difficult to get library authors to improve their products - how long did it take jQuery to ditch browser sniffing? How many of the current crop continue to do it? Even the supposed champion of web devices, Apple, uses browser sniffing on their MobileMe site to *prevent* access by their own web-enabled products such as the iPhone and iPod Touch.
I expect it's because the site is based on a couple of script libraries (like SproutCore and prototype.js) and is so bloated that it is unusable on a typical web-enabled phone or similar device.

There are companies like Sencha with their massive ext.js framework who are pushing a library for touch devices that is over 200KB minified. It is full of useless functions like:

    isPrimitive : function(v) {
        return Ext.isString(v) || Ext.isNumber(v) || Ext.isBoolean(v);
    },

    ...

    isNumber : function(v) {
        return Object.prototype.toString.apply(v) === '[object Number]'
            && isFinite(v);
    },

    isString : function(v) {
        return Object.prototype.toString.apply(v) === '[object String]';
    },

so passing a String or Number object to isPrimitive() returns true. Why are such functions considered necessary at all? In what circumstance will isNumber or isString be better than using typeof? Why coerce a primitive to an Object to find out if it's a primitive? Why two function calls when none is necessary? What do the inclusion of such functions, and their use within the library, tell you about the architecture and mindset of the authors? These functions are pure ECMAScript; anyone with a basic understanding of the language should be able to see their silliness.

Javascript libraries and frameworks are promoted as making web development easier. A consequence of their adoption is bloated, slow and limited sites that work only for those with the latest and greatest devices and software and high-speed access to the web. The developers who use them trumpet the fact that they can knock up a site in no time at all, forgetting that for a good number of their prospective visitors the site so produced is dysfunctional. When alerted to that, they respond that they don't care about 0.1% of traffic (or some other fictional number); it's not important in "the real world". They also ignore the fact that if a visitor finds a site dysfunctional, they are unlikely to report it and will simply go elsewhere.
The lack of complaints is often held up as evidence that no one is having difficulty, a further demonstration of the cluelessness of the author.

The alternative to large, bloated libraries and obese frameworks like Cappuccino and Qooxdoo is small, concise libraries built to provide sufficient functionality and no more. That is the line that has been promoted here for many years; it has never been seriously challenged. The only opposing argument has been that not everyone has the skill to develop or collect such a library. But somehow such people *are* sufficiently skilled to select a library or framework instead. Go figure.

--
Rob
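Rob's question about when isNumber or isString would beat typeof can be made concrete. A minimal sketch in plain ECMAScript: the toString-based function mirrors the Ext code quoted above, while the typeof versions are hypothetical alternatives.

```javascript
// Plain typeof checks: one operator, no coercion, no function-call overhead.
function isNumberPrimitive(v) {
    return typeof v === "number" && isFinite(v);
}
function isStringPrimitive(v) {
    return typeof v === "string";
}

// The Object.prototype.toString approach quoted above, for comparison.
function isNumberExtStyle(v) {
    return Object.prototype.toString.apply(v) === "[object Number]" &&
        isFinite(v);
}

// typeof distinguishes primitives from wrapper objects; toString does not.
console.log(isNumberPrimitive(5));             // true
console.log(isNumberPrimitive(new Number(5))); // false: it is an object
console.log(isNumberExtStyle(new Number(5)));  // true: wrappers slip through
```

So the toString variant buys nothing for primitives and silently admits wrapper objects, which is exactly the behaviour Rob questions.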
From: Garrett Smith on 22 Jun 2010 00:13

On 2010-06-21 05:04 PM, RobG wrote:
> On Jun 22, 4:40 am, "S.T."<a...(a)anon.com> wrote:
>> On 6/18/2010 7:18 PM, Andrew Poulos wrote:
>>
>>>> The fact that most CLJ regulars (the vocal ones, at least) can't
>>>> comprehend why there's such demand for libraries is the same reason
>>>> their critiques of libraries are technically accurate yet largely
>>>> ignored. They can't see in context, nor anything less rigid than a
>>>> true/false view.
>>
>>> There's only a demand because people take on javascript projects that
>>> are beyond their skill level.
>>
>> This is always what I suspected a portion of the animosity was based on.
>> A fear that opening up DOM manipulation and AJAX to the masses cheapens
>> a particular skill set.
>
> Perhaps you are reflecting your own fears, as far as I know that has
> never been used as justification for criticism of various libraries in
> this group.
>
> More to the point is that the publishers of various libraries say that
> they remove the various quirks of cross-browser scripting, yet only
> appear to do so for a small number of current browsers. Anyone with an
> "unsupported" browser may be left unable to access on-line resources
> for no reason other than the site developer's choice of development
> tools.
>
> A fundamental principle of the WWW is that it provides universal
> access to information regardless of user agent or platform. Developing
> sites that only work in a small number of browsers is the antithesis
> of that ideal.

"Anyone who slaps a 'this page is best viewed with Browser X' label on a
Web page appears to be yearning for the bad old days, before the Web,
when you had very little chance of reading a document written on another
computer, another word processor, or another network."
-- Tim Berners-Lee in Technology Review, July 1996

> It is also very difficult to get library authors to improve their
> products - how long did it take jQuery to ditch browser sniffing?
The initial design was based around browser detection. Later on, it was retrofitted with feature tests.

Apple, Yahoo, and Google still use browser detection. The IEBlog mentioned how the content GMail sends to clients varies:
<http://blogs.msdn.com/b/ie/archive/2010/04/26/feedback-on-the-ie9-platform-preview.aspx>

> How many of the current crop continue to do it? Even the supposed
> champion of web devices, Apple, uses browser sniffing on their MobileMe
> site to *prevent* access by their own web-enabled products such as the
> iPhone and iPod Touch. I expect it's because the site is based on a
> couple of script libraries (like SproutCore and prototype.js) and is so
> bloated that it is unusable on a typical web-enabled phone or similar
> device.

Massive waste of effort. All the energy that went into developing those frameworks, yet the functionality Apple needs is not all that complicated.

> There are companies like Sencha with their massive ext.js framework
> who are pushing a library for touch devices that is over 200KB
> minified. It is full of useless functions like:
>
>     isPrimitive : function(v) {
>         return Ext.isString(v) || Ext.isNumber(v) || Ext.isBoolean(v);
>     },
>
>     ...
>
>     isNumber : function(v) {
>         return Object.prototype.toString.apply(v) === '[object Number]'
>             && isFinite(v);
>     },

That returns true for both number objects and primitive numbers.

>     isString : function(v) {
>         return Object.prototype.toString.apply(v) === '[object String]';
>     },

Returns true for String objects and string values. Typical type-checking antipattern stuff.

Of course, with Ext they don't use closures much. If they did, they could save a private variable as:

    var _toString = Object.prototype.toString;
    return {
        isNumber : function(v) {
            return _toString.call(v) === "[object Number]";
        }
    };

> so passing a String or Number object to isPrimitive() returns true.

What about null and undefined? They're not primitives to Ext JS developers, or what?
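The closure idea sketched above can be fleshed out into a runnable module: cache one private reference to Object.prototype.toString and share it between the type checks. The module shape and the name `typeUtil` are illustrative, not Ext code.

```javascript
// A sketch of the closure pattern described above: _toString is private
// to the immediately-invoked function and shared by both checks, so the
// property lookup happens once, at module definition time.
var typeUtil = (function() {
    var _toString = Object.prototype.toString;
    return {
        isNumber: function(v) {
            return _toString.call(v) === "[object Number]" && isFinite(v);
        },
        isString: function(v) {
            return _toString.call(v) === "[object String]";
        }
    };
})();

console.log(typeUtil.isNumber(3.14));              // true
console.log(typeUtil.isNumber(NaN));               // false: not finite
console.log(typeUtil.isString(new String("abc"))); // true: wrapper matches too
```

Note that this preserves the quoted library's behaviour, including the point criticised in the thread: wrapper objects still pass the checks.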
> Why are such functions considered necessary at all? In what
> circumstance will isNumber or isString be better than using typeof?

Where type-checking functions are used, it is usually indicative of bad design. Most often, they are used for fake overloading strategies. As for the circumstance: probably the one where the developers got tired of repeating typeof checks.

> Why coerce a primitive to an Object to find out if it's a primitive?

Obviously, coercing a primitive to an object does not tell you if it is a primitive, and if you don't know what it is, then this function won't tell you. Pretty useless and futile, isn't it?

[...]

> Javascript libraries and frameworks are promoted as making web
> development easier. A consequence of their adoption is bloated, slow
> and limited sites that work only for those with the latest and
> greatest devices and software and high-speed access to the web. The
> developers who use them trumpet the fact that they can knock up a site
> in no time at all, forgetting that for a good number of their
> prospective visitors the site so produced is dysfunctional. When
> alerted to that, they respond that they don't care about 0.1% of
> traffic (or some other fictional number), it's not important in "the
> real world". They also ignore the fact that if a visitor finds a site
> dysfunctional, they are unlikely to report it and will simply go
> elsewhere. The lack of complaints is often held up as evidence that no
> one is having difficulty, a further demonstration of the cluelessness
> of the author.

Right, all the classic arguments. I always get a kick out of the line "hardly any of our customers use that browser." It would seem obvious that a user would not return to a site that didn't work.

That's also what I like about IE9 coming out. I just wish it didn't have compatibility mode. I really want to see all those badly written sites break.
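Returning to the type checks: the earlier question about null and undefined can be verified directly. In this sketch, the first function is a simplified reproduction of the quoted Ext-style isPrimitive, and the typeof-based alternative is hypothetical, not from any library.

```javascript
// Simplified Ext-style isPrimitive, per the code quoted earlier
// (the isFinite clause of isNumber is omitted for brevity).
function isPrimitiveExtStyle(v) {
    var s = Object.prototype.toString.apply(v);
    return s === "[object String]" || s === "[object Number]" ||
        s === "[object Boolean]";
}

// A hypothetical typeof-based check: null and undefined are primitives too.
// (typeof null is "object", so null needs a special case.)
function isPrimitive(v) {
    return v === null || (typeof v !== "object" && typeof v !== "function");
}

console.log(isPrimitiveExtStyle(null));            // false, yet null is primitive
console.log(isPrimitive(null));                    // true
console.log(isPrimitiveExtStyle(new String("x"))); // true, yet it is an object
console.log(isPrimitive(new String("x")));         // false
```

The toString-based version gets it wrong in both directions: it rejects the primitives null and undefined and accepts wrapper objects.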
The easiest way for companies to learn the consequences of hiring such "developers" is for the projects to fail. Then they'll start figuring out different strategies and hire people who know how to develop cross-browser sites.

The other consequence of javascript libraries is at the other end of the stick. Self-promotion, hidden in the guise of generosity, often harmfully imparts bad practices and even lies to a very large number of unqualified web developers who don't read specs. These guys won't know when a blog is full of it, or whether a library they've been told is great actually isn't.

When these so-called experts censor their blog comments and hide any correction, they promote themselves fraudulently. They do this to fool all the dummies who won't ever read usenet into believing in them. In the process, they hurt the web by promoting misconceptions instead of the truth that it needs.

In the worst case, they react personally, as has happened to me a few times. They may even slander the character of the person who corrected them. It's really easy to do, too, because we all know someone who will do anything to "win" a fight, call names, etc. It's a likely and believable characteristic, so switching "he corrected me" to "he insulted me" comes off as believable, especially from one of those so-called experts.

Library teams tend to involve a lot more marketing than just blogs. They hire paid designers. Cappuccino did. The APE knockoff did, too. jQuery even has a PR team! PR, for code. So much effort goes into marketing, and the result is that it fools all the fools, just as can be expected.

As a user, I've found that reporting UI bugs to a site doesn't go over well. In many cases, the company will deny the bug, and in some cases they either blame me or take offense. Ironically, such reactions are about the worst thing they could possibly do for themselves.
The best thing would be to try to understand what the user was expecting and where that expectation was not met. Next would be to look into why. Own your mistakes and get more respect.

It's easy to complain to each other here, and much more difficult to reach out to the rest of the web by writing and publishing such things. There should be no reason for JS conferences to be full of BS. There should be no GWT or Google Closure library. There should never have been a Dojo, and Ajaxian promoting jQuery is one of the worst things they have done for my industry.

Part of why I wrote the code guidelines and code review documents is that people need to see good code reviews of these things.

http://jibbering.com/faq/notes/code-guidelines/
http://jibbering.com/faq/notes/review/

> The alternative to large, bloated libraries and obese frameworks like
> Cappuccino and Qooxdoo are small, concise libraries built to provide
> sufficient functionality and no more. That is the line that has been
> promoted here for many years, it has never been seriously challenged.
> The only opposing argument has been that not everyone has the skill to
> develop or collect such a library. But somehow such people *are*
> sufficiently skilled to select a library or framework instead.

Another argument that has been given is that there isn't enough time to develop a reusable code base. A good set of reusable abstractions (a library) saves time. The popular libraries today have such crazy designs that they provide counter-evidence to that argument. Then again, most wouldn't know any better anyway.

Garrett
From: Johannes Baagoe on 22 Jun 2010 01:28

Garrett Smith :
> "Anyone who slaps a 'this page is best viewed with Browser X' label on a
> Web page appears to be yearning for the bad old days, before the Web,
> when you had very little chance of reading a document written on another
> computer, another word processor, or another network."
> -- Tim Berners-Lee in Technology Review, July 1996

That is all nice and good, but should I wait till everybody has caught up with Google Chrome before I publish things like this?

http://baagoe.com/en/RandomMusings/hash/avalanche.xhtml

It is not a matter of using proprietary extensions - I don't do that as a rule, although that page is not valid XHTML, since it contains SVG. It is simply that, to the best of my knowledge, no other javascript engine is fast enough for the task. On my platform (Ubuntu), Firefox nearly chokes, Opera isn't much better, and I expect IE won't perform well on Windows, either (I can't test, not being a Microsoft customer). Safari might be all right, if I understand its workings rightly.

--
Johannes
From: Garrett Smith on 22 Jun 2010 02:38

On 2010-06-21 10:28 PM, Johannes Baagoe wrote:
> Garrett Smith :
>
>> "Anyone who slaps a 'this page is best viewed with Browser X' label on a
>> Web page appears to be yearning for the bad old days, before the Web,
>> when you had very little chance of reading a document written on another
>> computer, another word processor, or another network."
>> -- Tim Berners-Lee in Technology Review, July 1996
>
> That is all nice and good, but should I wait till everybody has caught
> up with Google Chrome before I publish things like these?
>
> http://baagoe.com/en/RandomMusings/hash/avalanche.xhtml

No, that's a test page.

There were plenty of sites that failed on the Opera 10 beta because its user agent string had "10" in it; on seeing that, they determined that the site could not work. Unsupported-browser pages were popular in the late 90s. Recently Robert Sayre made a blog post that used some of those old, nostalgic icons.

<http://blog.mozilla.com/rob-sayre/2010/06/04/check-out-these-html5-demos/>

This was in response to the recent example of the Apple "HTML 5" demos, which turned out to be using browser detection to determine "unsupported browsers".

> It is not a matter of using proprietary extensions - I don't do that
> as a rule, although that page is not valid XHTML, since it contains SVG.
>
> It is simply that to the best of my knowledge, no other javascript engine
> is fast enough for the task. On my platform (Ubuntu), Firefox nearly
> chokes, Opera isn't much better, and I expect IE won't perform well on
> Windows, either (I can't test, not being a Microsoft customer). Safari
> might be all right, if I understand its workings rightly.

It was fast on Firefox 3.6.3. I'm scared to try IE, but I'll try Opera... No freeze; completed in a few secs. I'm using Windows 7 on a three-year-old decent laptop.

Garrett
From: Johannes Baagoe on 22 Jun 2010 04:49
Garrett Smith :
> Johannes Baagoe :
>> http://baagoe.com/en/RandomMusings/hash/avalanche.xhtml
>> On my platform (Ubuntu), Firefox nearly chokes, Opera isn't much
>> better, and I expect IE won't perform well on Windows, either
>> (I can't test, not being a Microsoft customer).

> It was fast on Firefox 3.6.3. I'm scared to try IE but I'll try
> Opera...
> No freeze, completed in a few secs. I'm using Windows 7 on a 3yr
> old decent laptop.

Weird. On mine, both Opera 10.01 and Firefox 3.6.3 take a few seconds not to complete, but simply to display the first snapshot - that is, 1% of completion. Chromium 5.0.375.70 completes in about 20 seconds.

--
Johannes