Starred by 192 users


Issue metadata

Status: WorkingAsIntended
Owner: ----
Closed: Apr 2015
Cc:
HW: ----
NextAction: ----
OS: ----
Priority: ----
Type: ----




Wrong order in Object properties iteration

Reported by insidea...@gmail.com, Nov 27 2008

Issue description

Code:

var a = {"foo":"bar", "3": "3", "2":"2", "1":"1"};
for(var i in a) { print(i) };

produces following output:
foo
1
2
3

expected output:
foo
3
2
1
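For contrast, an editor's sketch (not part of the original report): non-index string keys do keep insertion order in V8; it is only array-index-like keys that get pulled ahead and sorted numerically.

```javascript
// Same shape as the report, but with non-numeric keys: V8 enumerates
// these in insertion order, so the "bug" only bites numeric-looking keys.
var b = { foo: "bar", c: "3", b: "2", a: "1" };
var seen = [];
for (var k in b) seen.push(k);
console.log(seen.join(", ")); // foo, c, b, a
```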

 
Showing comments 58 - 157 of 157
While a number of people have chimed in saying that they don't think this is a bug, not one person has said please don't fix it.

Because of course fixing the behavior wouldn't break anyone's sites.

So the stars represent a bunch of people who either don't care, or really really want the in-order iteration behavior back.

Whatever the ratio between these two groups, this is a crystal-clear message from Chrome's user base that this is a bad decision.

Comment 59 by timd...@gmail.com, Oct 28 2010

I've starred this because I'm interested in the subject. I'd actively prefer it if the current behaviour were retained. I've argued the case on here before so I won't bore you with it again now.
Really Tim??  In your comment #36 you say:

---
1. I agree that it would be preferable for there to be a standard in place for the iteration order of object properties, and I understand the benefits in terms of code expressiveness and performance. 
---

So you understand the benefits in terms of expressiveness and performance, and yet you don't want them?

Does it have something to do with getting more users for the JS collection classes you work on? (http://code.google.com/p/jshashtable/)

Regardless, I stand corrected.  There is one person who actively wants the random iteration behavior.  Given that I'm getting emails from customers thanking me for arguing the case here, I think the reality is that almost all the people starring this issue want the old behavior restored.  But if we very very very generously ballpark it at 5 wanting the random iteration behavior and 40 who subscribed but don't care, the rest (72) care.

Still more stars than any other enhancement, and more stars than all of the top bugs combined.

Comment 61 by timd...@gmail.com, Oct 29 2010

OK, I possibly deserved that for not at least restating my basic position.

I want there to be a standard and I want browsers to implement it. I'd like it to be an official standard so that future implementers of ECMAScript have a reference for how their JavaScript engine is required to behave rather than an unwritten de facto standard to which they are not required to adhere and may not be aware of. Whether or not pressuring the Chrome team to mimic the behaviour of some other browsers is going to help is unclear to me. I would have thought that engaging with the ECMAScript working group would be a better idea. The following URL might be useful: http://www.ecmascript.org/community.php

Regarding my own jshashtable project, the problem that addresses has nothing to do with property enumeration order so is unaffected by this issue. My having written jshashtable and my interest in this issue are just manifestations of my general interest in this area of the language.
Thanks for clarifying, Tim.  If I may restate: you do not actually prefer the current behavior as such; rather, you want an ECMA standard and you want the behavior to match it.

So we are back to not one person objecting to a fix, and a huge number of stars strongly implying that people find this far more important than other bugs and enhancements.
@pcxunlimited: I totally agree on this one. Since when has IE been the flag to follow? As developers, we make sure our code works first in the "other standard" browsers, and only later do we test whether it works in IE's many flavors and start putting in all kinds of hacks to stay compatible. We do this only because we have no choice: Microsoft shoveled IE onto every Windows computer. Now we will need to add Chrome and Opera to that pool too. I have not done it yet, to show our clients how ugly things can get with the wrong browser, and we probably will not, since Chrome does not have enough market share yet.

@charles.kendrick: As you asserted, this is not a bug, but it is what developers had become used to. Humans can eat meat: feed somebody cooked meat for years, then one day switch to raw meat, and what is the reaction going to be? This is ECMA's fault for not being specific in the standard, and now our meat as we know it is messed up.

In Issue 37404 Comment 16, I went into detail about the real issue: data comes from places we have no control over, and we would have to "taint" the object properties before they are sent in JSON messages just to make a Chrome hack work. If I receive "clean" JSON, I cannot use the built-in JSON functions; I have to write my own JSON parser that taints the properties before they go into a real object. Then, before any data is sent or committed, it must be untainted of hacks. The speed gained by V8 is lost on workarounds.

The most disturbing fact is that every other language I have within reach to test (correct me if there is some obscure language that doesn't) creates and keeps properties in the same order they were inserted.

The people who care about this issue could push ECMA to clear up this loose end, and make their opinion known to Opera and Microsoft (as is being done here), but we should start somewhere people are listening.

I stopped using Chrome as my main browser; I use it only to read my mail. We also stopped pushing Google Chrome Frame on our sites, as it has this same problem due to the V8 guts. My main browser, and my whole developer team's, is back to Firefox. Too bad; I really liked driving this fast bi-turbo "Chrome V8" car, but it keeps messing up whatever I put in the trunk.
@timdown: To your knowledge, has this issue been discussed on ecmascript.org? It would be good to put the effort into an open, ongoing discussion instead of starting yet another duplicate one.
THIS IS A BUG. BECAUSE:

Some languages, such as PHP, use associative arrays, and the JSON standard can send such an array as an object.
Example: {"2": "Value 1", "1": "Value 2"} => PHP array(2 => 'Value 1', 1 => 'Value 2')

The browser gets JSON data back from an AJAX request, and all browsers except Google's handle the object properties correctly.

Associative arrays with numeric keys are very convenient.

Example:

		if (json) {
			for (var index in json) {
				if (json.hasOwnProperty(index)) {
					add_option(el, index, json[index])
				}
			}
		}

I and many other developers have used this pattern for years. Iterating over an object's properties this way is very convenient, no matter what data type the keys are (integers or strings).
I really like the speed of the V8 platform and NodeJS, but I do not want to rewrite hundreds of projects, and many other developers do not want to either. Chromium is a very fast browser, but I have no desire to rewrite code just for the people who use Google's browser, since it is not that popular.

Comment 67 Deleted

Comment 68 by timd...@gmail.com, Nov 5 2010

Sigh. The fact that PHP implements ordering in its associative arrays is completely irrelevant to whether this is a bug in one browser's JavaScript implementation. Yes, it would be convenient if browsers and the specification agreed on an enumeration order for object properties.

If you've written hundreds of projects that rely on an enumeration order in JavaScript then I have little sympathy: ECMAScript specifically states that the order is implementation-dependent (i.e. undefined). At no point, as far as I'm aware (correct me if I'm wrong), has there ever been uniform behaviour across the dominant browsers, unless it was when Netscape 2 ruled the world.

For object iteration order, there has been uniform behavior across all versions of IE, all versions of Netscape Navigator, all versions of Mozilla/Firefox, all versions of Safari, earlier versions of Chrome, and all versions of Opera prior to 10.

In other words, there has been reliable behavior across 99.5+% of all browsers since 1994 (16 years now), until the introduction of Google Chrome and its unfortunate focus on synthetic benchmarks at the expense of real-world performance.
Tim, I love Google, but many users use projects that were written before Google Chrome existed. And if those projects use Prototype or jQuery or the JSON format in AJAX requests, there can be errors (invalid ordering in tables and lists: cities, countries, etc.).

JavaScript is an implementation of the ECMA standard and has its own functionality (the DOM etc.).
For many years browsers implemented the for-in loop in insertion order, just as PHP, Perl, Python etc. do.

If you want performance for this loop or for other operations, that's very well! But allow the old functionality (without numeric keys being sorted), and people will love Google and Chrome more :)
ADD a data structure that optimizes numeric keys with the new algorithm. Let Arrays optimize their indexes, but not Objects.
In fact, for arrays this really is very convenient (no need to sort by key).

But for JSON data received as an object, do not change the property order.
Let NodeJS change the numeric key order of objects; I and other developers will then use
[[2, 'Value 1'], [1, 'Value 2']] for JSON, and linked lists for best performance.
But at least do not change Object iteration in browsers (the established ideology of many years).
ECMAScript says nothing about the order of object properties or the order of for-in iteration. But the standard also does not specify that numeric keys will be sorted on any object (Array included). "No order" does not mean ascending order for numeric keys, with those keys pasted to the "beginning"; it rather suggests the order in which they were created (as in many languages). Legally, there are many open questions.
Backward compatibility would be manageable if the changes affected only the Array object, and all browsers changed the element ordering behavior for arrays only (not for other objects).
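A minimal sketch of the pairs-array alternative mentioned above (values are illustrative): arrays preserve element order, so the keys come back exactly as they were sent.

```javascript
// Pairs array instead of an object: order survives serialization and
// iteration because arrays keep their element order.
var pairs = [[2, "Value 1"], [1, "Value 2"]];
var order = [];
for (var i = 0; i < pairs.length; i++) {
  var key = pairs[i][0], value = pairs[i][1];
  order.push(key); // keys arrive in insertion order: 2 first, then 1
}
console.log(order); // [2, 1]
```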

Comment 75 by bret...@gmail.com, Dec 2 2010

Not to open up another can of worms here, but it seems that JavaScript 1.7 features (if others will eventually support them) would also be significantly reduced in value without a fixed order: https://developer.mozilla.org/en/New_in_JavaScript_1.7#Looping_across_objects

Comment 76 by bret...@gmail.com, Dec 2 2010

(Maybe try back later, seems as I posted this, the wiki just broke.)

Comment 77 by bret...@gmail.com, Dec 2 2010

Sorry for the spam, but now it seems it is already back again... Anyways, the code in question is this:

let obj = { width: 3, length: 1.5, color: "orange" };
for (let [name, value] in Iterator(obj)) {
  document.write ("Name: " + name + ", Value: " + value + "<br>\n");
}

Comment 78 by cor...@aldomx.com, Dec 28 2010

I have not seen this real-world case raised yet:

If I run the following MySQL query and export it to a JSON object

SELECT id, name FROM table ORDER BY name ASC

V8 will simply make the "ORDER BY" statement run in vain...

Is this a "Working As Intended" behavior?
@c78: Yes, for V8 you need to add some special character as a prefix to your "id" column, which turns the "id" into a plain String key. Then whenever you use this id, you need to strip the prefix off again. It's simple to work with, and very fast.

Comment 80 by bret...@gmail.com, Dec 30 2010

@c79: We should also point out that even this is not reliable for other browsers, even if they seem to adhere to it, since iteration order is unfortunately implementation dependent (as already mentioned, allowing IE, for example to have deleted then re-added properties appear in their old position rather than at the end).

Comment 81 by menage@google.com, Dec 31 2010

For a real-world example of why this would be good to fix: it breaks Facebook event attendee pages quite badly. The intention is that your friends are listed first and then non-friends, so you can see at a glance which of your friends are attending. With the Chrome behaviour (which I saw reappear starting today, on Chrome 8.0.552.231) they're effectively randomly ordered, forcing you to look through several pages of guests (on a big event) to find friends.

Comment 82 by timd...@gmail.com, Dec 31 2010

If Facebook's JavaScript developers are relying on a particular iteration order for object properties then more fool them. This is not Chrome's fault.
Yes, yes, Tim, we all get your drift after all your comments that everyone else is just being stupid for expecting browsers to retain the same basic logic over time.
A lot of things have become standards after being implemented the same way in different browsers, with people expecting them to keep working that way.
It's a shame that this has not happened for the Object in JavaScript, because right now the situation is, how to put it, kinda stupid.

There are lots of applications that depend on logically correct looping (first-added items get returned first: {20:'Some item 1',10:'Some item 2'}) because ALL browsers did it like that. Then along came the synthetic benchmarks that drive performance up and up, and at some point someone thinks: hey, this sort of ordering is not in the spec, let's toss it out and gain a few more points on a synthetic benchmark.

Then we end up with a years-long bug, with people complaining about the behaviour and a bunch of people telling them to basically shut up and accept it, all the while suggesting BAD workarounds that impact performance negatively (imagine that: a performance improvement for synthetic benchmarks causes a performance loss in real-life scenarios :P).

Now, why are the proposed workarounds bad?
One workaround is to prepend a non-numeric character to the key name (at the server), so 1 turns into '_1', and later just strip it away, either on the client side when building the select list (the usual case for needing ordering) or on the server side after submit. It involves changing server-side logic on a lot of sites, and possibly on the client side as well (even if you can manage to send _1 to the client and accept _1 back and convert it to 1, if client-side code depends on the SELECT tag having correct id values and does lookups on the page for those ids, all such code needs changes too). So, put shortly, this involves a lot of work and creates lots of bug conditions.

The second workaround (also suggested by Tim) is to use arrays, either in the form [[2,'element 1'],[1,'element 2']] or [{2:'element 1'},{1:'element 2'}]. Now what's wrong with that? Well, a lot, to be honest. Before, when you had the ID and wanted the value for it, you just did var val = map[id]; With such a structure you have to manually loop the entire array, check whether the id/key matches, and then return the value:
var val = undefined; for (var i = 0; i < map.length; i++) if (map[i][0] == id) { val = map[i][1]; break; }
Or, for the case using the [{},{}] style, the if part becomes: if (id in map[i]) { val = map[i][id]; break; }
And what's wrong with that? Well, looping 1000 times over a 1000-item list, for example, takes 1 000 000 iterations of checks, when letting the internals do a binary lookup or something like it would only sum to perhaps 20 000 internal iterations. Plus it requires changes on both the server and client side to send the information in a different structure and to process it correctly.

So, in conclusion: there is no viably fast workaround for getting the previous logic back. It's not fast or good to reimplement all the logic on the server side and in the browser. It is not possible to keep the correct id:value association AND the original order without mangling the id value or losing hugely on the performance of id lookups (you could argue: why not loop the "special" map and create a temporary copy that acts as a real map and does faster lookups; but again, that means changing all the code, doing unnecessary looping, etc.).

So, can someone explain to me again how exactly throwing out the ordering for synthetic benchmarks, and thereby changing the de facto behaviour, is logical and performance-wise sensible for real-life scenarios that need ordering AND fast id-based lookups at the same time? And by the way, it's recession time; making changes that globally involve hundreds of thousands of hours of manpower is kinda resource-wasteful, at a time when everyone is trying to be optimized and less wasteful, wouldn't you say :P
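For what it's worth, the "temporary copy" idea mentioned above can be sketched in a few lines (an editor's illustration, not from the comment): one pass over the pairs array builds a plain object for fast id lookups, while the array itself keeps the order.

```javascript
// Build a plain lookup object from an ordered pairs array in one pass:
// the array preserves order, the object gives fast key-based access.
var pairs = [[2, "element 1"], [1, "element 2"]];
var index = {};
for (var i = 0; i < pairs.length; i++) {
  index[pairs[i][0]] = pairs[i][1];
}
console.log(index[2]); // "element 1"
```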

Comment 84 by timd...@gmail.com, Dec 31 2010

I'm certainly no fan of the hoop-jumping that browser manufacturers have performed to look better on synthetic benchmark tests and I would love for object property iteration order to be predictable and reliable in all browsers. I also agree the workarounds you mention are bad (although there are others). What I'm certainly not saying is that everyone should shut up and accept it. I've suggested before that engaging with the ECMAScript working group (see http://www.ecmascript.org/community.php) might be a better idea in the long run than pressuring a single browser manufacturer.
Well, it seems to me that before all these synthetic benchmark races, all browsers worked predictably, even though the ECMAScript spec did not actually define anything about ordering during looping. Then the JavaScript race started (with Chrome in 2008, as the start date of this bug shows) and the Chrome team decided to speed up their JS engine and dropped the "ordered as added" behaviour, because it was not actually required by the ECMAScript rules at the time. After that, Opera dropped the ordering as of 10.5, and if I'm correct the latest IE beta and platform preview have dropped it too.

So it all started with the Chrome team dropping a feature that had been around a long time and that everyone expects to work, just to gain some advantage in synthetic benchmarks. And now everyone except Firefox and Safari is following the lead and breaking tons of webpages. I was starting to suggest people use Chrome because of its startup speed and nice HTML5 rendering, but after seeing this bug in action on many pages that used to work normally, I'm suggesting people use Firefox instead. The reasoning: I cannot fix pages I don't own, and the IE 6+7+8 combo together with Firefox holds a big percentage of browser share, so for medium-sized webpages it's not especially important if some users hit minor, non-critical issues.

Instead of using any code workarounds, the best workaround is to use another browser that does not try to outdo itself on synthetic benchmark scores but still retains some logic. Sorry, Chrome...

Comment 86 by timd...@gmail.com, Jan 1 2011

This is what we disagree on: I consider relying on a non-standard feature (such as an object property enumeration order observed in some browsers) that it is possible to work around to be poor programming practice, precisely because future implementations may not support the feature you're relying on. I don't really see how you can argue that this is not the case. Rejecting an implementation outright for not conforming to a non-existent standard is ludicrous.
But the point is, up to the point where Chrome decided to change the behaviour, all other major browsers did it that way. So the initial decision was, in my opinion, not the best one to make: changing something that worked the same way in every major browser (even Chrome did it that way before), seemingly purely for a better synthetic benchmark score. And the sad thing is, Opera and IE seem to be hopping onto the same wagon with their latest releases. And you know the saddest part: I tested IE platform preview 3, which still had the "loop in order of definition" behaviour, and its looping seemed to be almost 2 times faster than Chrome, which does the reordering on objects (for performance reasons...).

But yes, now that other browsers are also jumping on the same wagon of benchmarks over logic and compatibility, the only way to force browsers back to the previous behaviour is to get this into ECMAScript ASAP through their community feedback.
I think the people who comment on a thread like this are part of a much smaller community than we realize. Most JS programmers barely know what ECMA is, let alone its latest specs. Yes, Tim, it is bad practice to program against a non-standard feature. However, when all browsers behave a certain way, is there any reason for most programmers to think it's not standard?

Semantically I agree with you 100%. But with practicality in mind, it's kind of like saying <br> doesn't work anymore because we should all be using XHTML's <br />. It takes something a huge number of websites do and invalidates it, putting the brunt of the work on more developers than I'd like to count. I realize this ticket was first filed in 2008, but I honestly hadn't gotten any related bug reports about my sites until recently, as Chrome has gained bigger chunks of the market. We should really be thinking about how much more of the "internet" will break from this change as it becomes more widespread.

Comment 89 by timd...@gmail.com, Jan 3 2011

On a similar theme, how many sites do actually rely on the enumeration order of object properties? I've never seen it, except in examples posted in comments on this issue.
Tim, I think the major thing regarding that question is that none of us really knows or can know. I doubt there's a data source of X websites using objects this way. However, this issue hits me the same way every time, so I suppose we can extrapolate one instance where it would be common. I frequently use objects which I iterate through to print out a sortable list. I make the property an ID so it's easy to go back and forth between functions and access the right data. As mentioned earlier, it's much easier to do obj[x] than to loop through an array until finding the target. This makes the "underscore fix" the best choice for me, but that creates needless lines of code.

So now I wonder, how many people do this for that reason? What other reasons are there for a developer to code this way that would seem completely logical? There are probably countless real-world examples that can be given, but I don't know them. 

My question is, has anyone tested the speed difference of object iteration, including the extra lines to handle the differences? For instance, instead of just iterating, now I need to iterate and substr, and then append the _ back at a later point. It seems to me that the iteration itself may be faster, but not necessarily all the operations put together. I'm guessing the synthetic benchmarking must be that much faster for people to be so strongly for it, but I'm throwing this out there anyway.
@timdown
I remember creating a self-service page for a phone company a year ago that relies on the object iteration order. Back then it worked in all browsers, even in Chrome, but now a user will change his/her phone subscription incorrectly. I'm no longer with the company, as they outsourced all their development. I'm just saying: it's a big deal.
"I've never seen it, except in examples posted in comments on this issue." That has got to be one of the most common excuses in history.
I've never seen many things in my life, that doesn't mean that they don't exist and they don't have a logical background to them :P

Can I ask how many different websites have you developed at all? And how deeply do you investigate every site you visit? I bet you have seen many sites, without even realizing it, that depend on widespread iteration logical order. I have developed around 15-20 different websites on my own, and 2-3 of them that are most AJAX centric were built with autocomplete searchboxes and/or multi dependent selectboxes (where you select a value from one list and second list is populated according to first selected value.

As a side question, is anyone who's actively commenting here even related to chrome development?
Indrek, this is CCed to a bunch of Chromium devs, but none have answered. They may not even be reading this, because it was closed in Nov 2008. I don't know whether opening a new ticket and pointing it this way would annoy them more or get them discussing again...

I've developed over 100 websites, but this issue impacts only the ones I've done in the past 2 years or so; it's just a coincidence as to when this was created. Older websites were not nearly as JS-intense as those of more recent years. I would bet that more sites over time will be affected by this. Sites from before 2005 (just randomly picking that year), for instance, probably rarely run into it.
Opening another bug will be of no help; there were 2 or 3 bugs open at one point, and all the others were closed and marked as duplicates of this one, while this report just keeps sitting around.

I agree that the bug will more likely affect newer sites, which use much more JS and AJAX overall. Has anyone got any idea who exactly to ping to get a well-reasoned response from the developers (beyond: behaviour not in spec, we don't care that it was like that for years in all major browsers, will not fix)?
I'd just like to add, for all the frustration of current developers, that a great deal of this annoyance can be overcome by simply accepting reality. I agree that ordering should be maintained to allow for truly ordered associative-array-like objects. However, ordered associative arrays do not exist in JavaScript as they do in PHP or other languages.

What I have done in production on a regular basis is to use an array for ordering (the proper implementation):

    var res = [{id: 27, name: 'foo'}, {id: 12, name: 'bar'}, {id: 33, name: 'baz'}];

and then use a generated id lookup map:

    var map = {27: 0, 12: 1, 33: 2};

Results can then be looped in order:

    for (var i in res) {console.log(i,res[i]);}

or looked up as an associative array:

    console.log(map[27],res[map[27]]);

This requires minimal overhead in the form of a single additional hash, and can easily be generated by looping through the res object just once. 
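A sketch of generating that lookup map in a single pass, using the same sample data (assuming each row carries an `id` property):

```javascript
// One pass over the ordered results builds the id -> array-index map,
// so order (the array) and fast lookup (the map) coexist.
var res = [{id: 27, name: "foo"}, {id: 12, name: "bar"}, {id: 33, name: "baz"}];
var map = {};
for (var i = 0; i < res.length; i++) {
  map[res[i].id] = i;
}
// map is now {27: 0, 12: 1, 33: 2}
console.log(res[map[12]].name); // "bar"
```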

The status of this bug annoys me and I wish it were fixed, but some of the JS devs' complaints about it being "broken" or cumbersome to work around are a bit dramatic, considering the simplicity of the workaround. I'd still prefer that ordering be maintained, however, as the speed gain does seem to be somewhat synthetic, though who can really speak on behalf of all existing devs' implementations.
atcrabtree, I do like that solution better than underscores. It's relatively simple, but any solution to this improvement/bug does require extra logic and that means an extra chance for any developer to mess up. I thought we had enough bugs to deal with already. :)

Indrek, I'd assume the CCed chromium devs are the ones to ping if we wanted to hit them up directly. I'll probably end up drafting up an e-mail to em sometime soon. It's unfortunate the only Google responses have been merges and Comment #9:

"This is working as intended.  If you run the dev channel release on Windows, you will see the same behavior that you are seeing on Linux (returns keys numbers first and the rest in insertion order)."

With no counter arguments the community has brought up over the last 2 years and about 100 comments later.
Let me chime in also since everybody's throwing their two cents in the fire.

I wish Chrome would keep the order of iteration for properties as they were added. Throughout browser history, browsers have mimicked existing de-facto standards even when they weren't in the actual standards. Having said that, I'm glad I made a conscious decision not to rely on that 'feature' when I first noticed it, specifically because of things like this.

I always use arrays when I need to rely on order. If I also need quick access to objects by a key, I run the array through the following method to create a map keyed by specific property

/**
 * Given an array and a property name to key by, returns a map that is keyed by each array element's chosen property
 * This method supports nested lists
 * Sample input: list = [{a: 1, b:2}, {a:5, b:7}, [{a:8, b:6}, {a:7, b:7}]]; prop = 'a'
 * Sample output: {'1': {a: 1, b:2}, '5': {a:5, b:7}, '8': {a:8, b:6}, '7':{a:7, b:7}}
 * @param {object[]} list of objects to be transformed into a keyed object
 * @param {string} keyByProp The name of the property to key by
 * @return {object} Map keyed by the given property's values
 */
function mapFromArray (list , keyByProp) {
  var map = {};
  for (var i=0, item; item = list[i]; i++) {
    if (item instanceof Array) {
      // Ext.apply just copies all properties from one object to another,
      // you'll have to use something else. this is only required to support nested arrays.
      Ext.apply(map, mapFromArray(item, keyByProp));
    } else {
      map[item[keyByProp]] = item;
    }
  }
  return map;
};

I think the above is better than appending characters to keys from both the server and client side. 

To web devs: future proof your code, don't rely on non-standard behavior. 

To Chrome devs, don't be such hard a$$es about keeping an existing behavior that many are asking for.

--Juan

Comment 98 by timd...@gmail.com, Jan 4 2011

"Can I ask how many different websites have you developed at all?"

I have no idea. I've been a professional web developer for 13 years, so it's certainly in three figures. In common with I imagine almost everyone in the world, I don't look in detail at the JavaScript of many of the sites I visit, and I'm sure there are often many appalling pieces of code running in my browser under my very nose. None of which changes anything. I was curious because I've been surprised by how widespread the practice seems to be judging by the comments on this issue, seeing as I've never seen any tutorial, book or even Stack Overflow answer recommending relying on property enumeration order.
@timdown This post is cute:

http://stackoverflow.com/questions/280713/elements-order-for-in-loop-in-javascript

They quote John Resig saying Chrome will fix this, back in 2008. Then it's edited in 2010 saying not to rely on it because it's not going that way anymore. Another post in 2009 also says not to do it this way. It's funny that there seemed to be more hope a few years ago. Unfortunately for me, and many programmers, I have been doing JS for 10 years or so under this assumption, and therefore never looked for the "correct" answers posted in 2009+. (I also never read ECMA; I just saw it worked in every browser, and as I mentioned before, I doubt many people DO read that thing.) It's odd how something would not cause an issue for so many years, and then in the 8th year or so I find out I suck. I can't even think of what JS resources were available back then that taught good standards.
@mendesjuan: You say "future proof your code, don't rely on non-standard behavior." Are you saying that we should rely on "standard behavior"? If so, do you mean "standard" as defined by official standards bodies like W3C or Ecma?

In web development, we have learned NOT to rely on standards, because they are typically implemented in only a subset of the browsers our audiences use. Instead, we methodically test in every browser, and rely on resources like quirksmode.org to determine what behaviors to expect. We rely on what works in the majority of popular browsers at the time. For the most part, this is the safest approach, since browsers are typically backwards-compatible.

So the advice to developers is: Read the standards. Read the empirical data. Don't rely on either. Do the best you can, and make sure you keep a maintenance contract on all the sites you develop.

On the bright side, all this mess means job security for web developers, since it requires a vast amount of knowledge as well as practical experience.

On the other hand, it's bad news for users, because only the best companies can afford to hire exceptional developers.
Got more answers from someone at Chrome. They don't have actual numbers on the speed boost from this. The thing is, even if the operations to get data out are more complicated and take more time, it's still better overall for an AJAX application, because tracking order keeps extra data in memory, which slows down the app overall even after the iteration is done. Along with the additional RAM usage comes extra garbage collection and such. With that in mind, I highly doubt this behavior is going to be reverted. I'm most likely going with the mapping solution that atcrabtree and mendesjuan listed. So it's extra pain now, but it clears the way for better user experiences in the future.
@spamfry: Don't twist my words, I obviously didn't mean that you should read the standard and code according to it disregarding the actual implementations. What I said was that if a feature is not in the standard but is a de-facto standard, you should be careful. If there's an easy workaround, I would go with that, that's why I suggested that method instead of appending characters to numeric ids. 

We all use innerHTML and I also use document.activeElement. However, relying on order of object iteration seemed like too much. Actually, we used Rhino for scripting parts of our app and property iteration was out of order so we chose not to use it even within the scripts within a page.

Note that I'm totally against removing an existing feature and I'll blast chrome for it. I'm just saying being careful paid off for me.
I created a simple page that tests creation of objects in different styles, measures the time, and shows how things used to work and how they work now.
http://www.upload.ee/test/objtest.html

"They don't have the actual numbers on the speed boost for this." Well, if they don't have any actual numbers comparing before/after then it's kinda hard to just take their word for it, especially since none of them has commented here with any information.

"This is because tracking order keeps the extra data in memory, which slows down the app overall, even after iteration processes are done." Is this a direct quote from the developer who is responsible for that code ? In another bug thread there was sample of source code responsible for this behaviour and basically all it did was something like that:
if (is_numeric(key)) return value_by_int_index(key);
else return value_by_string_hash(key);

Same sort of logic for get and set. So I would love to hear from an actual Chrome developer why exactly does it take so much of an overhead to store numerical keys in added/defined order but it's ok to keep that overhead for pure string keys. When I tried {2:2} style where key was direct integer and looped it later on, typeof key returned string, so internally all keys are strings anyways...
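The "typeof key returned string" observation above is easy to verify: however a property is defined, for-in always yields its keys as strings. A minimal check (illustrative only):

```javascript
var obj = {};
obj[2] = 'two'; // defined with a numeric key

var keyType;
for (var key in obj) {
  keyType = typeof key; // for-in yields property names, which are strings
}
// keyType is 'string', even though the key was written as the number 2
```

So at the language level there is no distinction between `obj[2]` and `obj['2']`; both name the same string-keyed property, which is part of why engines feel free to special-case "numeric-looking" keys internally.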

And also, IE9 beta and Platform Preview 7 finish that test page in 165ms vs Chrome's 235ms, and IE also beat Chrome in such a test back with Platform Preview 2 or 3, whichever still retained the "loop as defined" order for numeric keys.

"So it's extra pain now but clears the way for better user experiences in the future." How exactly is better user experience to create double load on clients: first let the js engine parse the original json that is something like [[12,'12']] and then loop through it and create object element for lookup to be like {12:'12'}. All the memory usage concers go out of the window, right there. You just doubled the size of data in memory. And more likely [[12,'12']] takes more memory than {12:'12'}, even with "expensive ordering data" and whatnot, because there is an array, which holds another array, which has only 2 elements on it plus length counter (for every key value pair there is length counter stored).

How come it is logical to try to preserve an unspecified amount of memory and gain an unspecified amount of performance (which IE still did better even while keeping the order), and ALSO logical to duplicate stuff in memory? Saving memory just to use it up later on to gain the same effect, just with much more work involved...

As I stated before, all those "workarounds" come with memory and/or performance impacts, which all invalidate the performance/memory gain excuse/cause pointed out before. This sort of optimization only helps for synthetic benchmarks, but for real life cases this behaviour just plain sucks and does not gain anything except extra work for developers who need id => value mapping to retain the order.
An actual quote of what was said to me was:

"Not only does it use more RAM, but it also
makes things slower because there will be more memory to deal with
when performing garbage collections."

Not listing which dev put that since he was kind enough to answer an e-mail, I'd be a prick if everyone started bombarding him for it.

I think they are under the assumption that most uses of objects don't require ordered iteration. For instance, I have probably defined 50 objects on a certain page, but only a couple require order, so I only need an additional 2 arrays to make a proper mapping, instead of having all 50 objects care about order when it's not required at all times. So basically we have to put the extra legwork in when it's needed, which isn't (I'm assuming) the majority of the time. The reason Chrome didn't benchmark is that using up memory when a user doesn't need it was a practice they did not want to follow.

It's not just "saving memory just to use it up later." If your app isn't refreshing the page once, it's saving memory to have it available later. There is a limit to what can be done in a browser, and the chief limiter is memory. The more we have to work with the more the app can be doing at once.

Also the ECMA proposal sitting around is currently to make the way Chrome does it the standard, not the other way around. http://wiki.ecmascript.org/doku.php?id=strawman:enumeration

As noted by just about everyone here, no one on the Chrome team is reading/responding to this thread. No matter which side of the debate you are on I would believe our time would be better spent going over there than here. Otherwise we are just talking to ourselves.
The V8 devs are reading this, and discussing it.  You have almost convinced me,
for example.  The problem is that JSObjects and JSArrays are implemented with the
same code, and changing the behavior of one of them means that every access to any property of an object or array would need an additional check to see if the object
is a JSObject or JSArray.

Changing both to keep the insertion order would increase the size of all large
JSArray objects by a factor of 3, and plenty of application code relies on using
lots of large arrays.

A solution that might have minimal impact is based on the fact that elements with
numeric indices are stored in one of two ways.  A dense set of elements, like
elements with keys 1,2,3,4,5,6,7,8,9, and 10, are stored as "fast elements", which
take one word per element, and have optimized fast methods.  A non-dense set of
elements, like elements 2415436, 5403932, and 3060393, are stored as "slow elements",
a hash table which includes an insertion order field for each element.  The fact that
this insertion order is maintained, for "slow elements", is because V8 does maintain
the insertion order for almost all object properties, just not those with numeric
indices.  This is done for compatibility reasons - it just does not go the final step
to include properties with numeric indices.

If JSObjects only used "slow elements", and the for-in loop used insertion order for
JSObject slow elements, and sorted order for JSArray slow elements (and fast
elements), then JSObjects could maintain insertion order, without much slowdown.
The only disadvantage would be if people were using JSObjects like arrays.  A
JSObject with elements 1 through 1000 would now be much slower and bigger.

But, again, it is not just synthetic benchmarks that depend on large arrays, so we
cannot make all arrays in the system 3 times their current size.  V8 implements
JSArray and JSObject the same way, and making Arrays and Objects behave differently
is hard to do without slowing down everything in the system, because every use of
something would need to check if it is an array or object.  If this change was easy
to make, it would have been done already.

Hmm, would it be possible to check the data type upon insertion?
As I understand it, the fast and slow systems work internally in parallel (thus you get the numerical keys first, in key order, and after that the string keys in defined order).

So when doing obj[1] = 123; (typeof 1 == 'number') it would go to the fast system, and when doing obj['1'] = 123; (typeof '1' == 'string') it would go to slower system that stores ordering?

But since for(var i in obj)alert(typeof i); always shows string, looping would still be broken, unless the key returned retained its original type (number). I did not find anything in the ECMAScript specs on whether such behaviour would be allowed or not, plus it would again be different from the other browsers... Anyone else got any ideas?

Would it be possible to separate JSArrays and JSObjects code into two separate codeflows so that arrays would still be fast and small, but objects would still act as they were before?
Hey guys, look at another Google project:
http://code.google.com/p/google-gson/
They use LinkedList, LinkedHashMap, and LinkedHashSet, all to preserve the same order between the Java object and the JavaScript object.

Comment 108 by ryan...@gmail.com, Jan 26 2011

I just want to applaud google for, thus far, sticking to their principles.  This is NOT a bug.  People who expect key-value objects to preserve order of keys are bad coders.  Period.

If you want ordered data, use a standard array (numerically indexed starting at 0). If, instead, you look up data by key values one at a time, use a key-value pairing. It's very easy.

@theasp: LinkedList and LinkedHashSet are NOT key-value objects. The difference between arbitrary key-value pairings and indexed arrays exists conceptually, beyond languages. But it also exists explicitly in Java. Thus it makes sense for LinkedHashSet and LinkedList to be ordered and to preserve this. LinkedHashMap is the exception to most of the Java rules: ordinarily in Java, a Map implementation (HashMap, Hashtable, etc.) does NOT preserve the order of keys.

Comment 109 by jad...@gmail.com, Feb 6 2011

Let's get real! This discussion is all about high-and-mighty "good coders" finally getting a chance to take a shit on the heads of the "bad coders". Notice that what amounts to a moral sin (making an assumption without ECMA approval) is mentioned over and over as the justification for bringing massive punishment (breaking tons of code all over the world). Reminds me of this:

As the two travelers entered the town, the Puritan children looked up from their play--or what passed for play among those somber little kids--and spoke seriously to one another.

“Behold, verily, there is the woman of the scarlet letter; and, of a truth, moreover, there is the likeness of the scarlet letter running along by her side! Come, therefore, and let us fling mud at them!”

Sigh!
http://en.wikipedia.org/wiki/XMLHttpRequest
"The latest revision of the XMLHttpRequest Level 2 specification is that of 7 September 2010, which is still a working draft."
Seeing as XMLHttpRequest is still a draft and not a solid standard yet, so it can change anytime, can we expect some backward-incompatible tweaks and modifications to it in the near future as well, in order to speed up some benchmarks?

I'd like to see someone justifying breaking changes to AJAX with "well, it was not in a written and approved standard, so it was your own fault you used it like that, no matter that ALL the browsers worked in the same friggin way". According to such an attitude, and the state of actual specifications vs drafts in the current HTML5 era, developers should not be doing much of anything except fussing around and waiting.

In reality, browser makers are the ones who actually make up those standards boards and they are the ones who are actually implementing those features and improving the drafts/standards in the process. But instead of improving the jsobject spec, it is kept purposefully vague to allow browser vendors to get away with anything, even with breaking changes from all other browsers, with an excuse "well, it wasn't in a written standard actually, so you should not have expected it to work that way, no matter that all the other browser vendors, including us, did it the same way for years"...

And on a side note about memory consumption: I bet that using the [{key:value},....] or [key1,value1,key2,value2] style PLUS a lookup map for speed ({key1:value1,key2:value2,...}) will still take up more memory than using the "order of insertion" logic. So for actual implementations, the memory savings are sort of nonexistent. And things require more coding, because you have to keep two variables instead of one, and use one of them for ordered looping and the other for value lookups when you have the key...
There seems to be a widespread feeling that this used to work the way people expected it, but then the V8 team broke it in order to be mean.

What actually happened was that originally the order was completely arbitrary in V8. At a later point it was changed so that non-numeric indices were in insertion order, and numeric indices were sometimes in insertion order. Whether or not the numeric indices were in insertion order depended on internal V8 heuristics that decide whether to use an array or a hash map implementation for the numeric indices. Making heuristics in the V8 implementation visible in this way was felt to be undesirable, so it was normalized: numeric indices are always iterated in numeric order regardless of the internal representation. Numeric iteration order was always a possibility, but with the last change it was made predictable.

There has never been any difference between the internal representation or iteration order of arrays vs. other objects in V8.

Here is an independent test of the way arrays and objects perform in various engines (a little out of date now): http://news.qooxdoo.org/javascript-array-performance-oddities-characteristics  If this bug ever gets 'fixed' you can wave goodbye to some of the nice performance results in that graph.
It wasn't broken to be mean, in my opinion; it was broken to gain additional points in benchmark scores, as you yourself point out: "If this bug ever gets 'fixed' you can wave goodbye to some of the nice performance results in that graph". In IE9 Platform Preview 3, object creation and iteration seemed to be on par with the speed of Chrome, and it was iterating in the order of insertion. I don't know how it did it internally, but the fact is that it did, and did it with good speed already...
It was never any less 'broken' than it is now.

According to phistuck above IE9 iterates in the same order as V8.
Please see previous discussion:
1. No one is asking for array iteration order to be changed
2. Faster object iteration without order preservation is a net loss because it requires extra objects and extra instructions to be burned to handle this common use case

Comment 115 by jad...@gmail.com, Feb 7 2011

And so the utility of JSON takes a huge step backward in the name of synthetic benchmarks and ideology.
Obviously, the right solution is to let people keep the order if we want to.

People want SortedMaps. Either make objects behave like that (ok you don't want) or suggest another way.

Currently I'm using the { "_1": .., "_2": .. } hack. Ugly, isn't it?
But this variant is best in terms of code shortness and convenience.

Dear browser developers, please be nice and let me use SortedMaps and not continue hacking around.
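The underscore-prefix hack described above can be sketched as a pair of tiny helpers. This is an illustrative sketch only; the helper names are made up. It relies on V8 (and most engines) preserving insertion order for non-numeric string keys, which is exactly what this thread says holds for "almost all object properties, just not those with numeric indices":

```javascript
// Prefix numeric keys on the way in, so the engine treats them as plain
// string properties; strip the prefix on the way out.
function toOrderedKey(key) { return '_' + key; }
function fromOrderedKey(prop) { return prop.slice(1); }

// Shoe sizes in catalog order: "...12, 13, 1, 2..."
var sizes = {};
[12, 13, 1, 2].forEach(function (size) {
  sizes[toOrderedKey(size)] = 'size ' + size;
});

var order = [];
for (var prop in sizes) {
  order.push(fromOrderedKey(prop)); // recover the original key text
}
// order is ['12', '13', '1', '2'] in engines that keep insertion
// order for non-numeric string keys
```

The recovered keys come back as strings, so code that needs numbers has to convert them, which is part of why several commenters consider this a workaround rather than a fix.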
It's not necessarily bad coding to want your key-value pairs preserved. Consider the example of displaying data in a grid where you expect rows to be in order by their primary key from the database. Because we have the elegance of associative arrays where our keys can be meaningful, we SHOULD be able to utilize those keys in the code. So you create a bunch of data on the server side and create a JSON object out of those rows, KEYED by their primary keys in the database, instead of having the primary key be a property of each object. You SHOULD then be able to loop through your rows and have them come out in the same order as when you created the JSON object. However, because the key order is not preserved, in Chrome this doesn't happen.

So I have 2 solutions. Either add the _ and substr(1) it out later, or don't use the primary key as the key to my associative array. The latter isn't a good solution in my head, because using the key makes the data read correctly, and it's a very nice, elegant solution when looping: while looping, your key is your row id and your value is an object of the data. Done deal.

It's not bad coding to want your keys to be meaningful.
I just don't get it.  This has been practically a de facto standard for years, and whatever benchmark gains are made come at the steep price of usability, common sense and functionality.

"Real world benchmarks", you know... the ones that matter, not the ones used by the sales guys... will be slower once we have to rebuild functionality for order preserving behavior.

Comment 119 by timd...@gmail.com, Jun 28 2011

jadema: Not really: there has been no backwards step because the JSON standard has never had ordered object properties. It's right there on the JSON home page: "An object is an unordered set of name/value pairs". I think (and most would agree) that it would be preferable if the properties were ordered.
Hi all, may I know the solution for this issue, please?
 Issue chromium:32385  has been merged into this issue.
I managed an easy enough fix for my long suffering at the hands of this "issue"... my keys were all numeric for an object literal being returned by my API (see the attached file for a test file with sample API output). By adding an underscore to the beginning of each key, thus making all the numeric keys strings, the sort order fixed itself.

Easy to do in my position where the keys where there to avoid duplication of array items on the server side but might not be the right fix in your case.

I still believe this should be fixed and numeric keys iterated over in ascending order where possible.


chrome.html
5.3 KB View Download
I'm surprised to see such discussion on this. The order is unspecified by the standard, and no one should expect it or force it to be specified.

If you have a case where you want to iterate over values in specific order, then use following:

Object.keys(obj).sort(compareFn).forEach(function (key) {
  var value = obj[key];
});

compareFn is a function with which you determine the order you want (you may omit it if you want lexicographic order).
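For instance, to reproduce the descending numeric iteration the original report expected, a comparator like this would work (an illustrative sketch, using the object from the original report):

```javascript
var a = { foo: 'bar', 3: '3', 2: '2', 1: '1' };

// Keep only the numeric-looking keys, then sort them descending.
var numericDesc = Object.keys(a)
  .filter(function (k) { return !isNaN(k); })   // drop 'foo'
  .sort(function (x, y) { return y - x; });     // numeric, descending

var result = [];
numericDesc.forEach(function (key) {
  result.push(key + '=' + a[key]);
});
// result is ['3=3', '2=2', '1=1'], independent of engine iteration order
```

The cost is an extra array and a sort on every traversal, which is the trade-off several earlier comments object to; but it never depends on unspecified for-in behaviour.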

Whoever asked for people who don't want this fixed: here I am. When I want sorted map, I'll use sorted map. Until then, I don't want to be penalized in performance on every property access just because some people are as lazy as to never even read standard of language they use, as they directly admit in this conversation.
mn.medikoo: The problem is not that lazy developers can't think of a way to programmatically re-order their object properties. The problem is that we need to be able to specify values in a specific order (not necessarily an order that can be defined by a function) and trust that will be preserved.  A list of shoe sizes, for example, would be something like "...12, 13, 1, 2, 3...". I don't want to force an ordering, I simply expect the language to preserve the order as specified.

rowaasr13: The problem is not that lazy developers expect the JS engine to automagically sort our object properties for us.  We don't expect objects to behave like a "sorted map", simply an "ordered map". The problem also has nothing to do with whether a JS dev has or hasn't read the ECMA standard. JS is very different from other languages, in that the standards have been largely useless for most of the language's life. For years the World's Most Popular Browser violated the standard more often than not, yet developers were still tasked with writing functional applications for this platform and other more-compliant platforms using the same code. If the written standards aren't standard at all, what's the point in reading them? Even today, an application written using only the ECMA documentation as your guide would likely fail in some areas for some users on some browsers, which is unacceptable. Only when all the major ECMA implementations (eg, browsers) conform identically to the standard, and those implementations gain a majority in the worldwide userbase will it be useful for developers to start coding against it.  Until then, we will all be writing JS code that violates the standard (knowingly or not) in certain conditions or on certain browsers.  This isn't x86 where it's extremely rare to have the same instructions produce different results on different CPU vendors (which isn't a perfect counter-example, but you get my point).

The problem is that most JS developers for the past X-teen years have relied on JSObject to function like an Ordered Map with keys and values. The fact that the same structure also happens to represent objects and their properties is simply an unfortunate coincidence. Developers have been assuming all this time that a JSON "object literal" (either written inline or returned from AJAX) is actually a "map literal". This is a natural assumption: the entire HTML document is one giant ordered literal object.

We JS devs tend to view the world and our code in "document order":
  * Our code will execute top-to-bottom
  * If I enumerate the children of this div, they will be enumerated top-to-bottom
  * If I insert this new element X before element Y, X will appear visually above Y (where "above" is a relative term)
  * (natural, but apparently now-incorrect assumption) If I construct a JSON object and carefully order the members in a way that makes sense in the real world, that order will be preserved later during enumeration (for text insertion, object creation, whatever). This is exactly the problem described by the OP.

In HTML / DOM, order is very important, and not just because we want things in alphabetical order.  There are usually some other human elements involved.  Size 13 shoes should appear before Size 1 just as January should appear before February.  Forget performance or memory usage, this is a basic fundamental requirement.  This is something that JS devs have assumed the language is capable of for years, because in most historic implementations it was.  Whether JSON objects are the proper vehicles for this functionality or not is another discussion, but I believe enough precedent has been set that you cannot now say this cornerstone of functionality has been an accident all along and JS was never meant to support this.

We need the ability to provide a literal mapping object that preserves order. I understand there are several implementations of OrderedMap classes, but the whole issue is in declaring object *literals*. In today's world of JSON and AJAX, we need to be able to communicate this concept of an ordered map from server to client.  AJAX servers need to be able to communicate Map objects whose key/values are in a specific order and trust that the client will receive those key/values in the same order.  To have the JS engine step in mid-transfer and re-order the object values is completely unacceptable.

To a lesser extent, static literals defined inline should also always be preserved.  Just like the entire rest of the [X]HTML document is order-preserved, an object literal that reads in a specific order from top-to-bottom should enumerate in that same order so it can be rendered into the DOM in that same top-to-bottom order.

To my knowledge there isn't yet a special literal for this purpose, so JSON objects are still the only option.

My favorite suggestion so far is in c106: consider string-of-int keys as strings instead of ints. This allows developers to write ordered maps by ensuring that all keys are defined as strings (even if those strings are all-numeric). This also preserves the performance characteristics of arrays and objects with integer keys.

Comment 126 by dudy...@gmail.com, Jun 24 2012

I'm a developer who has hit this limitation with surprise too, and I desire v8 implement objects as ordered and associative arrays. I understand the ES standard does not require this, so my appeal is not to keep this item as an Issue, but to move this to a Feature Request.

The points I'd like to rebut include:

1. that this ticket's type and status are correct
2. that the "Issue" discussed helps should stay as implemented for performance and memory benefits
3. that the "Issue" can effectively be sidestepped to provide desired results

For the first item, I see this Issue's status and type as creating a "bicycle shed" and causing the item involved to become "paint". To lessen this effect, I propose changing this "Issue" from a bug to a feature request. Specifically, this is not "WorkingAsIntended", and I'd argue that's the biggest reason why everyone posts here, and not for the "Working" part as much as the "Intent" part. If V8 only grounds their intent on implementing ES standards, and those standards don't enforce a direction, then there plainly is no intent whatsoever. So if the "Issue" is working by any form of "intent", my only choice at present is to assume the V8 designers made their choice out of their own wisdom. From the discussion above, I only gathered performance, memory, and standards as a purpose for "intent". I question that wisdom in the subsequent points stated below.

As for item 2, I don't see this as helping performance when JSON is hurt. My grounds for this are that JSON performs best and conserves the most when its information density is maximized. When you need to post-process or otherwise "color" the data coming from JSON, you cause developers both to put more load on the wire and to increase the cost of TCP transfer (and that only worsens if you can't use WebSockets). I see copper and fiber as both being a costly part of any application, and managing post-processing of data is also costly to me as a developer. If you believe that post-processing is still viable and/or trivial, see my next point.

And lastly, item 3: I don't see workarounds, "hacks", or alternatives as being an argument. This is a discussion about whether this Issue should be a "Feature", a "Bug", or "WorkingAsIntended". Posts should be directed at arguing for or against that, not about whether so-and-so is a "good developer" or should "eat the cost". The true argument is: in supporting a language, would a given feature lend more or less to those utilizing the language? If you argue that I can, by my own effort, mitigate the limitations of the V8 object, I argue back that you can implement the feature in V8 without the limitation, and neither require additional effort from nor impose any cost on the end developer. Don't forget that while V8 is expected to implement ES5, it's not limited to it. Looking purely at V8 as a product, I challenge the choice to limit V8's object by dropping insertion order. This is substantiated by my own observations: you'd only lose the ability to create an "unordered" associative array, and this unordered flavor has only performance and size gains, no added functionality. These costs are very different from functional costs. I know the lunch isn't free, but performance and size on a client are still not as big a deal as performance over the wire (see the previous paragraph) or functionality that makes V8 plainly more competitive and productive.

Lastly, I'm aware I can point out these arguments inside an ES discussion, and so I'll voice my opinion there too. That said, I also find it meaningful and orderly to post my opinions, thoughts, and observations in this feature/bug report.


In summary, I'm not sure why this discussion is so long either. This is NOT a big deal and it's also NOT a bug. I'm a developer, I see this as a feature, and I'd like it. If you want to lessen the hype this Issue has gradually gathered, just file it in a different place. Bicycle shed no longer... there's nothing to add if this is plainly a feature request... but people will forever argue this if the discussion is about "intent". V8 always has the choice to implement this feature this way... it just makes me more likely to choose a different JS interpreter that does implement the features I desire.
 Issue 2353  has been merged into this issue.
What if preserving the order has an overhead that some devs don't want to incur? And given how the JS language is, we certainly don't want multiple data types, one for an ordered and another for an unordered list.

Comment 129 by jan...@gmail.com, Nov 9 2012

A workaround for this issue is adding a leading 0 to all integer indexes.

for example:

var a = {"foo":"bar", "03": "3", "02":"2", "01":"1"};
for(var i in a) { print(i) };

produces the keys in insertion order as intended (note that the keys come back as "03", "02", "01", so strip the leading zero before displaying):
foo
03
02
01

Comment 130 Deleted

Comment 131 Deleted

Thank you, #129! This works indeed!
Cc: dslomov@chromium.org
My comment to the developers who think this is all good and working as intended is that you must not be reading the books teaching people how to program, because to date every programming book I have read makes a point about the index of a value in an array.
If you create an array, you are not supposed to think: okay, now that I have my list, let's see what Chrome thinks of it.

For now I will have my systems check if the user is on Chrome and tell them: sorry, adaptive systems do not work on Chrome, because Chrome doesn't know how to count.

Snarky comments might not be very helpful =) Objects are essentially dictionaries, dictionaries are not ordered sets, and enumeration order through these dictionaries is not really specified. If your users expect ordered sets, the best thing to ask them to do is to use ordered sets. (There is a particular enumeration order of dictionaries, and it is interoperable in some ways, but it's not something you should rely on, given the nature of the data structure.)
Also, the best way to test would be something like this, which is completely platform-independent, and should run on nearly everything. I also made sure to make it as small as possible minified. 

```js
var iteratesObjectsInOrder = (function (list, prop) {
  for (prop in {3: 1, 2: 1, 1: 1}) {
    list.push(prop);
  }
  return list[0] == '3' && list[1] == '2' && list[2] == '1';
})([]);
```

But, as far as I can tell, V8 does currently iterate in order. It still is unwise to rely on it, as #135 said. Also, even though that is the case, if you add or subtract a key, anything could happen. 
Actually, this snippet would be better, since it's smaller. It should still run on everything. 

```js
var iteratesObjectsInOrder = (function (str, prop) {
  for (prop in {3: 1, 2: 1, 1: 1}) {
    str += prop;
  }
  return '321' == str;
})('');

// Minified:
var iteratesObjectsInOrder = !function(s,p){for(p in{3:1,2:1,1:1})s+=p;return'321'==s}('');
```
Just a quick note regarding #136 and #137: trying to detect whether the iteration is "in order" doesn't make sense; this is only semi-decidable from the outside. By testing the result of a few iterations you may detect that the underlying implementation doesn't return things in insertion order, but you will never be able to prove that it will always obey that order.

In your example, it's e.g. easy to come up with hash functions/table sizes which keep the illusion of obeying insertion order (whatever that means for object literals).
You guys are hilarious. FIFO, plain and simple. Whatever you've implemented is something else, and I really don't want to see the code.
Why are people quoting the spec as an excuse for not preserving the order of the keys? The spec doesn't say that the order of the keys should be altered. That means that you are free to implement the conventional and useful behavior of preserving their order. Doing that would not contradict the spec.
How much more impressive would it be if instead of hearing "we refuse to implement the conventional behavior because the spec says it's optional", we heard "while the spec regards the order of the keys as optional, Chrome guarantees that the order of the keys will be preserved". Which of those statements would make you love Chrome more?

Comment 142 by j...@6bit.com, Mar 30 2015

Geez people, use the right tool for the job. Object properties are not sorted. Use an ES6 Map shim, then when ES6 lands you can extract the shim and keep the shitty grin on your face when all your code is fast.

https://github.com/WebReflection/es6-collections
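To make the suggestion concrete: unlike plain objects, an ES6 `Map` guarantees insertion-order iteration for all keys, including numeric-looking ones. A minimal sketch (assuming a Map implementation, native or shimmed, is available):

```js
// Map preserves insertion order, even for numeric-string keys that
// a plain object would reorder.
var m = new Map();
m.set("3", "a");
m.set("1", "b");
m.set("2", "c");

var keys = [];
m.forEach(function (value, key) {
  keys.push(key);
});

console.log(keys.join(",")); // "3,1,2" -- insertion order, not numeric order
```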
I'd be more impressed to hear "we deliberately randomize order to encourage programmers to not rely on ordering of object keys since it is specified as unordered". This is what Go did to their map type when  people started asking about key order. Swift looks like it will do something similar.

To use an unordered structure and expect ordering is monumentally stupid.
For the record, the ECMAScript 2015 language specification clearly defines object property iteration order: http://tc39.github.io/ecma262/#sec-ordinary-object-internal-methods-and-internal-slots-ownpropertykeys

V8 implements that order as spec'ed. Feel free to rely on that.
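For reference, the spec'ed order (integer-like keys ascending first, then string keys in creation order) is directly observable on the original example from this issue, assuming an ES2015-compliant engine:

```js
var a = { foo: "bar", "3": "3", "2": "2", "1": "1" };

// Per OrdinaryOwnPropertyKeys: integer indices in ascending numeric
// order first, then the remaining string keys in creation order.
console.log(Object.keys(a).join(",")); // "1,2,3,foo"
```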
Since my last comment, I gave up on this Chrome "decision" five years ago (remember, it's "not a bug"), and we ended up deploying Firefox for our enterprise application. The reason: our application consumes JSON objects produced by third-party servers that we do not control. By the time JSON.parse was applied, it was already too late; the original ordering was lost. So it was not about bad programming or using the wrong tools.

Now that time has gone by, we have been able to dig into those black boxes and beg the vendors for solutions, sometimes involving consultants who are long gone.

Where we control the JSON creation on the server side, we started using the workaround from comment #129. However, this didn't solve the root of the problem when the JSON strings arrive pre-made with no way to "fix" them. We needed a universal process, so we now use a keys index array.

Where possible, we asked the server to also send a sorted keys index array for the object so we could regain the original order. In other cases we received the name of the property that was used to sort the object, so we used the following function to derive the keys index array (my 3 cents into this black hole).

```js
/**
 * Sort the object keys using the content of the keys, or a specific
 * property for complex objects.
 * Returns an array with the sorted object keys, or null on bad input.
 * @param {Object} o Object to be sorted.
 * @param {string} [p] Property to use as the sorting reference.
 * @param {boolean} [r] Reverse order.
 * @return {Array}
 */
function sortObjKeys(o, p, r) {
  if (o == undefined || !(o instanceof Object) ||
      (p == undefined && o[Object.keys(o)[0]] instanceof Object) ||
      (p !== undefined && o[Object.keys(o)[0]][p] == undefined)) return null;
  r = !!r;
  return Object.keys(o).sort(function (a, b) {
    var t;
    if (r) { t = a; a = b; b = t; } // swap operands to reverse the order
    a = ((p !== undefined ? o[a][p] : o[a]) + "").toLowerCase();
    b = ((p !== undefined ? o[b][p] : o[b]) + "").toLowerCase();
    return (a < b) ? -1 : ((a > b) ? 1 : 0);
  });
}
```

For example:
```js
// Simple, one-level object:
var o = JSON.parse('{"3":"a","1":"b","2":"c"}');
var i = sortObjKeys(o); // i will contain the array ["3","1","2"]

// Complex object:
var o2 = JSON.parse('{"3":{"n":"a","s":"b"},"1":{"n":"b","s":"c"},"2":{"n":"c","s":"a"}}');
var j = sortObjKeys(o2, "n");       // j will contain the array ["3","1","2"]
var k = sortObjKeys(o2, "s");       // k will contain the array ["2","3","1"]
var l = sortObjKeys(o2, "s", true); // l will contain the array ["1","3","2"]
```

Sometimes there was nothing we could do: we received no extra information at all. In those cases we cannot use the built-in JSON.parse; we need a specialized JSON parser that walks the string manually, keeps track of the element order, and also returns the required keys index array.

So... yeah... any speed and memory optimization gained by V8 or any other JS engine is spent right back. Welcome to the future... Peace.

Comment 146 Deleted

Comment 147 by spa...@gmail.com, Feb 1 2016

So many people here keep saying things like "any speed and memory optimization gained by V8 or any other JS engine, it is wasted back" while talking about one specific use case.

As someone who frequently parses JSON objects that actually represent *objects* in some typical programming language, I don't find myself iterating over keys with any significant frequency. I find myself writing a lot of code that looks like `var o=parsedjsonobject["somekey"]` almost exclusively. It is this code, the correct and normal usage of a json object, that the optimizations in question apply to. My code runs faster because of the decisions made here.

Comment 148 by bret...@gmail.com, Apr 18 2016

@jkummerow, perhaps I am not reading the spec correctly, but the current version at https://tc39.github.io/ecma262/#sec-enumerate-object-properties (reached from the for-in discussion at https://tc39.github.io/ecma262/#sec-runtime-semantics-forin-div-ofheadevaluation-tdznames-expr-iterationkind ) says "The mechanics and order of enumerating the properties is not specified but must conform to the rules specified below", and the only rule relevant to OwnPropertyKeys is "EnumerateObjectProperties must obtain the own property keys of the target object by calling its [[OwnPropertyKeys]] internal method", which doesn't suggest to me that for-in must iterate in the order of those keys.

However, if one needs the reliable iteration order, one could apparently use Object.getOwnPropertyNames (at least for own properties) as that refers to https://tc39.github.io/ecma262/#sec-getownpropertykeys which relies on OwnPropertyKeys which relies on "chronological order of property creation" as per https://tc39.github.io/ecma262/#sec-ordinary-object-internal-methods-and-internal-slots-ownpropertykeys which--I suppose (though I can't find confirmation)--would mean that object literals would be treated as having properties added in the order they were defined inside the literal.
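As a small illustration of that reading: `Object.getOwnPropertyNames` reflects [[OwnPropertyKeys]] directly, so it reports integer-like keys in ascending numeric order followed by the remaining keys in creation order, and (unlike for-in) it also includes non-enumerable own properties. A sketch, assuming an ES2015-compliant engine:

```js
var obj = { b: 1, "10": 1, a: 1, "2": 1 };
// Non-enumerable property: skipped by for-in, but still reported
// by Object.getOwnPropertyNames in creation order.
Object.defineProperty(obj, "hidden", { value: 1, enumerable: false });

console.log(Object.getOwnPropertyNames(obj).join(","));
// "2,10,b,a,hidden" -- integer indices ascending, then creation order
```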
9.1.11.1 of the spec clearly states that indexed properties are to be treated first.

Comment 150 by bret...@gmail.com, Apr 18 2016

True, but the rest are not. Although I know this is not relevant to the OP, for the sake of others unhappy about not being able to iterate objects such as {AZ: "Arizona", IL: "Illinois"} in a reliable order: that particular section does at least specify a reliable property creation order (after the integer keys).
The ECMAScript 6 specification's ordering rules make it possible to sign JS objects:
  https://cyberphone.github.io/openkeystore/resources/docs/jcs.html#ECMAScript_Compatibility_Mode

Most server-side JSON implementations can be used as well.

Comment 152 Deleted

> 9.1.11.1 of the spec clearly states that indexed properties are to be treated first.

Here’s a link to the relevant section (in case the chapter numbering changes in the future): https://tc39.github.io/ecma262/#sec-ordinaryownpropertykeys See step 2.

Comment 154 Deleted

Comment 155 Deleted

Dear Chromiumers,
Could you (only for my poor understanding) verify that the following interpretation of the current ES standard is correct?

ECMAScript's JSON.parse() internally rearranges order of properties with names expressed as integers, making a parsed JSON string like

 '{"2":"First","A":"Next","1":"Last"}'

actually serialize by JSON.stringify() as

 '{"1":"Last","2":"First","A":"Next"}'.


Properties with non-integer names, though, are kept and serialized in the order they are declared or parsed.
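That interpretation can be checked directly in any ES2015-compliant engine with a round trip through parse and stringify:

```js
// Integer-like keys ("1", "2") are reordered ascending; the
// non-integer key "A" keeps its relative (parsed) position.
var roundTripped = JSON.stringify(JSON.parse('{"2":"First","A":"Next","1":"Last"}'));

console.log(roundTripped); // '{"1":"Last","2":"First","A":"Next"}'
```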
Anders, yep, that is what happens, and it is what we have been discussing for years.

Now, ready for another round of ideas. Today, with most of ES6 implemented, Map is a better target than Object when parsing JSON notation. Previously, our customized JSON parser generated a separate array with the insertion order of the properties of an Object. Now that we have switched to Map everywhere an Object used to be required, we don't need the ordering array anymore.

With this switch we get a lot of advantages: keys keep insertion order, easy iteration, no prototype hell, the ability to remove elements, etc.

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map

The original pie in the sky of JSON notation was data-structure transfer (never class transfer). Unfortunately it inherited (the elephant in the room) Object, which, especially in JS, was never meant to be a data structure but has been used and abused as one. The correct data structure for this case would be Map.

Now... if there were an extension of native JSON to parse into Maps and stringify from Maps, the balance of the Force would be restored.
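There is no native JSON-to-Map parsing, but a small recursive converter gives a rough approximation. Note that the `jsonToMap` helper below is hypothetical (not part of any standard), and it operates on the already-parsed object, so integer-like keys have already been reordered by the engine; truly preserving the textual order would require a custom parser:

```js
// Hypothetical helper: convert a parsed JSON value into nested Maps.
// Caveat: integer-like keys were already reordered during JSON.parse,
// so only the post-parse key order survives.
function jsonToMap(value) {
  if (value === null || typeof value !== "object") return value; // primitives pass through
  if (Array.isArray(value)) return value.map(jsonToMap);         // arrays stay arrays
  var m = new Map();
  Object.keys(value).forEach(function (key) {
    m.set(key, jsonToMap(value[key]));
  });
  return m;
}

var m = jsonToMap(JSON.parse('{"A":{"x":1},"B":[1,2]}'));
console.log(m.get("A").get("x")); // 1
```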

Cheers,