
Issue 90

Starred by 56 users

Issue metadata

Status: Fixed
Closed: Sep 3
HW: ----
NextAction: ----
OS: ----
Priority: ----
Type: ----


V8 doesn't stable sort

Project Member Reported by, Sep 24 2008

Issue description

The WebKit test LayoutTests/fast/js/comparefn-sort-stability.html fails in 
Chrome using V8. The result is:
  FAIL arr[0] should be 2. Was 1.
  FAIL arr[1] should be 1. Was 2.

It uses === and I can get this to work by reordering the expected objects. 
Apparently SquirrelFish uses stable sorting and V8 and JSC don't. I'm told 
the spec doesn't require stable sorting, but I'm filing this bug to track 
it and any decision about it.

Comment 1 by, Sep 24 2008

The relevant Mozilla bug on this issue is:

Comment 2 by, Sep 24 2008

The Mozilla bug says they did it because IE is stable (I didn't verify this). If IE, Firefox, and soon Safari will all be stable, V8 should probably be stable.

There was some discussion about this earlier today on v8-dev.  The analysis there was that IE is *not* stable.
Leave it non-stable.

Some people in the Mozilla bugtracker wanted it stable so they could do two-pass
sorting to sort by two columns. Doing two-column sorting is possible without two
passes. Just write a comparison function like:
if ( == {
    return a.two < b.two ? -1 : 1;
} else {
    return < ? -1 : 1;
}
Status: WorkingAsIntended
We'll keep it as-is.
 Issue 426  has been merged into this issue.
I was asked to update this bug.  I have observed that Array.sort is stable in IE6 and 
FF, but not in Chrome.  Here's a test case:

It shows an alpha-sorted list of two-letter words.  Click on the list to sort it by 
second letter.  Click again to resort alphabetically.  In Chrome, the order of words 
ending in A (for example) is random.  In IE6 and FF, it is alphabetical.

A quick test suggests that Safari's sort is also stable.
@jmittle: can you put that test case somewhere that non-Google-employees can access?

Comment 10 by, Jul 15 2010

 Issue 779  has been merged into this issue.
In V8 2.3.0 (possibly earlier) sort seems to be stable. Is this by accident or on purpose?

node> ["Foo", "Bar"].sort(function(){ return 0; });
[ 'Foo', 'Bar' ]

(I ran this in Node v0.1.101 which uses v2.3.0).
Small arrays are sorted using a simpler algorithm which happens to be stable.
You need a larger array to trigger the quicksort algorithm which is non-stable.
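The cutoff behaviour described here can be probed with a small harness. Below, a deliberately unstable selection sort stands in for the non-stable large-array path; note that since ES2019 the spec requires Array.prototype.sort to be stable, so modern engines no longer show the old behaviour:

```javascript
// Check whether sortFn keeps equal-keyed elements in input order.
function isStable(sortFn, n) {
  const arr = Array.from({ length: n }, (_, i) => ({ key: i % 2, idx: i }));
  sortFn(arr, (a, b) => a.key - b.key);
  for (let i = 1; i < arr.length; i++) {
    if (arr[i].key === arr[i - 1].key && arr[i].idx < arr[i - 1].idx) {
      return false; // equal keys came out in a different order
    }
  }
  return true;
}

// Selection sort: sorts correctly, but is a textbook unstable algorithm.
function selectionSort(arr, cmp) {
  for (let i = 0; i < arr.length; i++) {
    let min = i;
    for (let j = i + 1; j < arr.length; j++) {
      if (cmp(arr[j], arr[min]) < 0) min = j;
    }
    [arr[i], arr[min]] = [arr[min], arr[i]];
  }
  return arr;
}

console.log(isStable(selectionSort, 11));       // false
console.log(isStable((a, c) => a.sort(c), 11)); // engine-dependent at the time
```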

Comment 13 by, Sep 11 2011

Correct.  IE, FF, Safari (JSC) are all stable.  V8 is the odd man out.
It seems like we should re-visit this bug now that new information has come to light.

Comment 15 by, Nov 21 2011

Are you kidding me?

I think it's absolutely unacceptable that sorting is unstable merely due to the size of the array. I have no problem with v8 switching to a different algorithm internally to take on different dimensions of arrays, but I do if that means different results in the generic sense. Especially since there's absolutely no need for sorting to be unstable.

If it's not stable, sacrifice speed to use a slightly slower algorithm that _is_ stable. Developers will assume that large arrays are slower to sort. They will not expect that to cause sorting to fail.
@comment #15:
As you can see, this issue is not a new one. As per the ECMA-262 spec (Array.prototype.sort), "The sort is not necessarily stable". Relying on something explicitly not required by the spec is risky to begin with. For the use case of sorting a table with multiple columns, there is the option of using a custom compare function.

That being said, we certainly are reconsidering how sorting should be implemented.
I think the argument being made is that having sort be stable for small arrays and unstable for large arrays is unnecessarily confusing.  It violates the principle of least surprise.  I think using an unstable sort for small arrays as well as large would be better than what we have now.

That said, I'd argue for switching to a stable sort.  There are certainly cases where an unstable sort is not suitable.  Doing a stable sort requires either implementing a stable sort in JS (inefficient) or creating a hacky wrapper (inefficient):
  function stableSort(array, compare) {
    var array2 =, i) { return { i: i, v: v }; });
    array2.sort(function(a, b) {
      var r = compare(a.v, b.v);
      return r == 0 ? a.i - b.i : r;
    });
    return { return v.v; });
  }

I can see the value in avoiding making people rely on unspecified behavior, but I think given that all of the other major browsers implement a stable sort and it could present a useful major speedup for code which can assume particular behavior (node.js, browser-sniffing client code), it would be better to switch to a stable sort.
For what it's worth, I did a few benchmarks:
Clearly v8 should switch to whatever sorting algorithm Safari is using - it appears to be 6x faster!
Array.prototype.sort is always unstable.  You can get lucky with an unstable sort and it can coincidentally leave the results in the same order as an unstable sort would.  It so happens you always get lucky for small arrays.  This does not mean the sort is stable.

As can be seen here there is no sorting algorithm that is O(n log n) in time, O(1) in space, and also stable.  You have to compromise when choosing your algorithm.  Given that you can get a stable sort using a map (see Seth's jsperf example) when you need it, there is no reason to use a worse algorithm in the common case where the sort does not need to be stable.

I made a new version of Seth's test that fixed a small typo (the unstable random test was using the ordered array).

I filed a separate issue on the performance:

"as an unstable sort would" should of course read "as a stable sort would" in the above comment.
"It so happens you always get lucky for small arrays.  This does not mean the sort is stable."

Um, actually that does mean the sort is stable for small arrays.  If you always get lucky, it's not luck.

I decided to find out which browsers do what with sort, since otherwise we're just operating on speculation.  See:

Sort is stable: Firefox 8.0, Safari 5.1.1, IE 6, 7, 8.
Sort is unstable: Opera 11.52, IE 9, Android 2.3 browser.
Sort is stable for length <= 10, unstable for length > 10: Chrome 15..17.

I guess stable sorts are less prevalent than I thought.

I'm done obsessing over this.  I think to avoid misleading people it'd be good to make Chrome's sort consistently unstable or stable.
 Issue 1882  has been merged into this issue.
I don't care if it's stable or not, but IMO it should use one tactic for both small and big arrays. The "feature" confused me a lot: all tests had been passing normally, and I didn't even think it would behave differently on an 11-element array.

Comment 24 by, Jan 1 2012

The algorithm used is Quicksort, which splits the array into smaller pieces and sorts those recursively, but switches to Insertion Sort when recursing on small sub-ranges. This is a common implementation of Quicksort, used because Insertion Sort has lower overhead despite worse asymptotic complexity, i.e., it's faster than using Quicksort all the way down.
That means that small arrays are sorted using Insertion Sort, but Insertion Sort is also used several times during sorting of larger arrays, on smaller element groups.

I.e., it's not because we single out small arrays for special treatment, it just happens that the base case of the Quicksort is also used for arrays that are so short that we don't need to split them at all.
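A minimal sketch of that structure, with a small-range Insertion Sort base case inside a Quicksort (the threshold of 10 is an assumption matching the reports above; V8's real implementation uses median-of-three pivoting and other refinements):

```javascript
const INSERTION_SORT_THRESHOLD = 10; // assumed cutoff, matching reports above

// Stable base case: sorts arr[from..to) in place.
function insertionSort(arr, from, to, cmp) {
  for (let i = from + 1; i < to; i++) {
    const el = arr[i];
    let j = i - 1;
    while (j >= from && cmp(arr[j], el) > 0) {
      arr[j + 1] = arr[j];
      j--;
    }
    arr[j + 1] = el;
  }
}

function hybridQuickSort(arr, from, to, cmp) {
  while (to - from > INSERTION_SORT_THRESHOLD) {
    // Simple last-element pivot (Lomuto partition).
    const pivot = arr[to - 1];
    let store = from;
    for (let i = from; i < to - 1; i++) {
      if (cmp(arr[i], pivot) < 0) {
        [arr[i], arr[store]] = [arr[store], arr[i]];
        store++;
      }
    }
    [arr[store], arr[to - 1]] = [arr[to - 1], arr[store]];
    // Recurse on the smaller half, loop on the larger (bounds stack depth).
    if (store - from < to - store - 1) {
      hybridQuickSort(arr, from, store, cmp);
      from = store + 1;
    } else {
      hybridQuickSort(arr, store + 1, to, cmp);
      to = store;
    }
  }
  // Ranges at or below the threshold fall through to Insertion Sort,
  // which happens to be stable.
  insertionSort(arr, from, to, cmp);
}

const a = [5, 3, 8, 1, 9, 2, 7, 4, 6, 0, 11, 10];
hybridQuickSort(a, 0, a.length, (x, y) => x - y);
console.log(a); // 0..11 in ascending order
```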

Comment 25 by, Dec 12 2012

So if someone wants a browser-independent stable sort, they're out of luck and have to write their own or use someone else's JavaScript implementation (which is likely to be much less efficient than a built-in implementation)?

Any chance Chrome could provide a stable sort algorithm in whatever implementation-dependent portion of the JavaScript namespace is appropriate?
Chrome changes won't help, since Opera and IE9 have unstable sort too. This invalidates the argument of "Chrome's sort should be stable because everybody else's is" that other people mentioned before.

The spec doesn't guarantee stability; don't rely on it. If you want it 'stable', pass a comparison function that never says two elements are equal.
The built-in implementation is not going to be hugely faster than one you write yourself.  The built-in one is written in JavaScript, after all.  Of course there is a cost to being stable.  See

If you want a stable sort in the platform, you should lobby the ECMAScript standardization committee to add it to the standard library.  Then, whenever IE11 is obsolete, you will have it available on the web platform.  V8 does not normally add stuff that is not part of the ECMAScript standard.

Comment 28 by, Jun 12 2013

The sort is annoying because it takes a sorted array and resorts it in a different order.  When playing Lord of Ultima and opening a Command Overview window and sorting on troop strength, the game redisplays the sorted arrays every second.  Because the sort is unstable, the rows constantly jump around.  Certainly the LoU developers can fix the problem by using a custom sort and sub-sorting on an internal unique identifier field.  But since they won't do that because it isn't a problem on firefox or IE, it would be nice if Chrome would provide a better sort.

Note that an already-sorted array can be sorted in O(N) time instead of O(N log N) time.  So the performance of Chrome is going to suck for this case as well.
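The already-sorted case can indeed be detected with a linear pre-scan; a sketch (not V8's actual code):

```javascript
// Sort only when an O(n) scan finds the array out of order.
function sortIfNeeded(arr, compare) {
  for (let i = 1; i < arr.length; i++) {
    if (compare(arr[i - 1], arr[i]) > 0) {
      return arr.sort(compare);
    }
  }
  return arr; // already sorted: no element moves
}

console.log(sortIfNeeded([1, 2, 3], (a, b) => a - b)); // [1, 2, 3]
console.log(sortIfNeeded([3, 1, 2], (a, b) => a - b)); // [1, 2, 3]
```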

Comment 29 by, Jun 12 2013

Can someone maybe disable comments on this issue?  It's been closed in 2009 with WorkingAsIntended.  Which is absolutely correct.

Even if V8 implemented a stable sort, it wouldn't change anything, since developers still couldn't rely on it, because -- as demonstrated -- some other JavaScript engines still don't do the same.  And clearly the spec doesn't demand a stable sort.  So V8 is not broken.

Anyone's software that relies on a stable sort *IS BROKEN* and needs to be fixed.  Please stop posting about your particular website or game that would benefit from changing v8's sort and contact those developers instead, so they get a chance at fixing their broken software.

Comment 30 by, Jun 12 2013

Other developers have been contacted.  Chrome should make it easy for people to use software on Chrome instead of turning people away to use other browsers.  Developers frequently do not fix their software.  Chrome looks bad when software works fine on Firefox and IE but cannot be used on Chrome.

Note that I raised two new points that were not previously raised.  I raised a performance issue: the sorting of a sorted array is likely to have poor performance, and I raised the issue that sort(sort(A)) != sort(A).

Not listening to the community seems unreasonable.  Disabling comments here will simply force people to create new bugs that need to be marked duplicates of this bug.  Responding to the specific issues raised by the community would be a more mature approach.

Comment 31 by, Jun 12 2013

There's simply no way to make everyone happy, and I believe V8 made a good decision about this.  The V8 developers aren't ignoring the community: think of all the software developers that *aren't* complaining, because sort()'s performance doesn't suck.  Their opinions count too, right?  A decision had to be made, and it's been made in favor of performance.  Again, you can't please everyone.  A lot of people are pleased, though.  V8 isn't here to scratch everyone's itch.

sort(sort(A)) != sort(A) is not very convenient, I agree, but it's something you'll have to expect when dealing with an unstable sort.

Chrome doesn't look bad if it has unstable sort.  After all IE9's and Opera's sort are unstable as well.  Software developers that can't code are the ones looking bad.

It's been said already and I'll say it again: Use your own sorting algorithm.  There's absolutely nothing wrong with that.

Comment 32 by, Jun 12 2013

The opinions of people who don't complain only count if you poll them.

If the decision was made in favor of performance, then there is an implementation bug.  Sorted and nearly sorted inputs are common.  Moving data around when it doesn't need to be moved is not the high-performance implementation.  If we can't have a stable sort, let's at least make sure that we don't lose performance if we modify the algorithm so that sort(sort(A)) == sort(A).
Complaining on this bug I suspect is only going to result in it being closed for comments. :)  (Not my decision, just speculating.)

The original issue tracked by this bug has long since been resolved.  Note that:
now PASSes in Chrome.

As noted in Comment 12, small array sorts are stable.  Only large array sorts are unstable in v8 (or at least were last I knew).

It's possible that the compatibility landscape has changed since this bug was originally filed.

I suspect that *if* all other modern browsers (Firefox, Safari, IE 10) use a stable sort already for large arrays there is an argument that v8 may need to do so for compatibility.  It's not clear in this bug if that's the case.  Instead of adding additional "me too" comments, I suspect it's more useful for someone to write (or find) a comprehensive test of sort stability (for various array sizes) and run it against other modern browsers and report the results.

Since the ECMAScript/JavaScript spec does not require Array.sort to be stable, there is no need for v8's sort to be stable *unless* web compatibility dictates otherwise.  So long as there is at least one other major browser with an unstable sort, I see no reason for v8 to change Array.sort to be stable.  I would personally like Array.sort to be stable, or at least for JavaScript to provide an optionally stable sort (as I've run into this bug writing web apps in the past).  But that's a personal preference, not my professional judgment.

Comment 34 by, Jun 12 2013

Clarification:  My comment #28 is *not* a "me too" comment.  I specifically included two additional points not previously mentioned relevant to the discussion.
In any case, V8 is completely spec-compliant on this issue. It implements the spec as performantly as possible while maintaining spec compliance. For additional requirements, it's not hard to add your own implementation. Additional JavaScript libraries exist like sand on the beach as well.

If you want to argue that the specification is incomplete, please involve yourself with the ECMA standards body, for example by posting to es-discuss. Maybe you can convince the committee to add stable sort to the next revision of the specification.

Arguing that moving elements indicates that the algorithm is less performant is a fallacy. Additionally, the average runtime in big-O notation may not even matter. You also want to consider worst case behavior, memory requirements, and fixed overhead. For example, we use insertion sort for arrays below a certain size because it runs faster even at O(n^2).

The V8 implementation of sort is written in JavaScript. If you can come up with a stable sort that performs equally well or better, we would be very happy to replace the current implementation. We accept patches!

Comment 36 by, Jun 13 2013

> Instead of adding additional "me too" comments, I suspect it's more useful for
> someone to write (or find) a comprehensive test of sort stability (for various
> array sizes) and run it against other modern browsers and report the results.

Such a test was already posted here in the comments[1]. Is this a good test or not? 


Comment 37 by, Jun 13 2013

I’ve re-opened the (old) sorting algorithms discussion on the es-discuss mailing list. Please discuss this issue there instead of in this thread. Thanks!
 Issue 4189  has been merged into this issue.
Well... I am normally not the guy to be polemic and re-raise old issues, but seriously, this is the quintessence I read out of this thread and some mails so far, and it's sort of weird and unexpected reading this here and not on the IE6 team forums of 199x:

"So the standard says there's no need for a stable sort and a qsort is a little faster. But there are some people complaining. Well, ok. Instead of being the ones having the influence to tell the standardizers that their standard lacks reality-compliance (at least acknowledged by nearly all other browsers, definitely the major ones in actual versions), lets do it faster instead of reliable and violate the  principle of least surprise, cause V8 shall not be best, but the fastest engine ever! At least for a count over 10, because being consistent is bad, mhkay...? Let's do our own thing instead of being web-compliant with nearly every other browser out there. lets be these guys! And if you little communityguy really need such weird thing as a stable sorting algorithm, write it yourself, man, you are a bad programmer if you can't write a hack, didn't you work with IE7? Man, THAT was a mess!"

As mentioned, I expected this behaviour from Microsoft's IE team in 199x or so, but definitely not from Chrome V8 devs in 2015. Sorry for the inconvenience. It was just my picture of how the "good" devs and architects work being ruined.

Please fix this. Last commenter said it best.
If "saying it best" is to use the old "you are just as bad as IE" then I'm not impressed.

Take a look at  The stable sorts are almost as fast as the unstable ones, and they are only a few lines.

V8 has always put a lot of weight on speed. This is why the stable sorts are so fast.  Consider the fastest, the stableSortPair.  It is only 23% slower than the unstable sort.  It's that fast because:

* Stable objects have their properties inlined
* The generational GC cleans up the Pair objects very quickly
* Property accesses like .obj are fast
* push() is very optimized.
* Array.prototype.sort is very fast (see what I did there?)

By prioritizing speed, we give you a tool where you can build your own stuff, and it goes fast.  You don't have to beg us for an accelerated library function, you have the freedom to build your own stuff and it will go fast.  In fact sometimes you can go faster than the library functions, because they have all sorts of code to take care of sick edge cases that you don't have in your web pages.  You don't do Array.prototype[2] = "fish" and then expect [,,,,,][2] to be "fish", right?  You don't sort sparse arrays, right?  Of course you don't.

And if everyone uses it, you can ask TC39 to put it in the standard library, just like C++, which has sort and stable_sort.  This will save a tiny bit of download.  Seriously tiny, check out the jsperf link.

By the way there are a lot of JS implementations of merge sort on the net, and they mostly use array shift.  Please don't use them.  array shift is fundamentally slow because it has to shift down all the elements. We optimized the hell out of a common case using a technique that causes cancer in small rodents, but if you try to do it on a large array, performance will collapse.

Comment 42 by, Jun 17 2015

Can you provide a link to specific instructions for patching a page to use a different sort?  (The instructions should provide an example of patching the sort with a specific "best" stable sort.)

Why is the benchmark used in the most representative benchmark?  Is it well known that most arrays use small objects?  Or that most arrays are unsorted most of the time?

"you have the freedom to build your own stuff"...  Poorly argued.  You have the freedom to build your own stuff because you have become an expert on this stuff as part of your job.  We would have to sacrifice time to achieve a similar level of expertise, and we would prefer to use that expertise on other subjects.

Can you point us at the source code for the unstable and stable sorts?  Can you point us at the benchmarks and unittests used to decide which sort should be used as the default sort?
This whole discussion is futile: Being stable or not is just one of many properties of a sorting algorithm, see e.g. Do you care about average performance? Memory? Do you care about worst behavior? Asymptotic complexity or raw speed on "your usual" input sizes (i.e. constant factors)? etc. etc.

Depending on your use case, any of these properties may be mission-critical or totally irrelevant, that's the reason why there are so many sorting algorithms. There is simply no "best" algorithm, so most language/library specifications offer you only a small handful (typically 1 or 2) of them and leave the rest up to application/library/framework writers, who only know what would be "best" for them.

Furthermore, using any kind of language or library is a game: There are strict rules, which you have to follow and on which you can base your assumptions. Everything beyond the rules is wishful thinking, and there are often good reasons for the rules, even if you don't see them. Of course you can rightfully complain about the rules, but those complaints should be directed to the authors of the rules, not to the manufacturers of the game. And if you don't like the rules, there are always other games to play... ;-)

Comment 44 by, Jun 18 2015

The makers of the game have a great deal of latitude.  They can provide documentation and advertise it well to make it easier for people to swap in and out different sorting algorithms.  They can provide a collection of algorithms that each target a different segment of the design space, again with good documentation and benchmarks to explain where each algorithm shines the best.  When you say that it is trivial to write, test, and install new sorting algorithms, you are allowed to provide examples that show parts of that, you are allowed to provide pointers to easy to use and relatively complete benchmarks, and you are allowed to make it easier for people to contribute to the collection.
Sample code.

[1,2,3,4,5,6,7,8,9,10,11].sort(function(){return 0;})
> [6, 1, 3, 4, 5, 2, 7, 8, 9, 10, 11]

This latitude in the specification is regrettable.

Comment 46 by, Jun 21 2015

One of the memes here is that it's so easy to create your own sort that you should.  This is strongly coupled with a meme of: since there are so many different sorting algorithms each optimal in its own niche, you should create your own sort.  This, of course, argues that the default sort should be most useful for those programs that don't want a sort highly optimized for a specific niche.  They want an easy to use sort.  They want a sort that won't produce O(N^2) behavior on some input; they want a sort that is fast on mostly sorted arrays; and they want a stable sort.

If there is a special consideration that is mission critical to an application, then that application should write and install its own sort.  If you're sorting huge arrays, or you know you're sorting certain kinds of data distributions, that's when you're willing to spend more time getting the details just perfect.  When you're doing day-to-day sorts on relatively small arrays with only the hundreds of elements you can display on a web page, then you don't care so much about every last detail and just want something easy-to-use.
Re #45: That's not regrettable, that's just what you asked for: Your comparison function just says "I don't care about anything, for me everything looks equal.", and the result is 100% consistent with that egalitarian comparator.

Re #46: Replacing "I" with "they" and thereby defining *your* use case as *the* use case everybody has doesn't really work, but nice try... ;-)

Stable sort is vastly overrated nowadays IMHO (unless you still have to handle magnetic tapes or punch cards and must do multi-phase sorts): if you want to sort in a stable manner, just make this requirement explicit, i.e. make the position in the input part of the sort key. If the position is not directly available from the objects to be compared, temporarily pair them with the positions. Note that it's not possible to do this the other way round: if a stable sorting algorithm using additional O(n) memory were used, everybody would have to pay the additional costs, even if "stable" is not an issue at all.
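A runnable sketch of the pair-with-positions approach just described (decorate, sort with a position tie-break, undecorate; the function name is illustrative):

```javascript
// Decorate each element with its input position, sort with a
// position tie-break, then strip the decoration again.
function sortStably(array, compare) {
  return array
    .map((v, i) => ({ v, i }))
    .sort((a, b) => compare(a.v, b.v) || a.i - b.i)
    .map((p) => p.v);
}

const words = ['bb', 'ab', 'aa', 'ba'];
// Sort by second letter only; equal second letters keep input order.
const out = sortStably(words, (a, b) => a[1].localeCompare(b[1]));
console.log(out); // ['aa', 'ba', 'bb', 'ab']
```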
If developers could select between a stable and a non-stable sort, and if there were almost no disadvantage, then my opinion is that for the future the stable sort would be the easier one to use.
 Issue chromium:502698  has been merged into this issue.
#48> Performance is the disadvantage. As the number of items in the array grows, stable sorts perform worse and worse for a variety of reasons, from algorithmic complexity to memory usage to cache misses, depending on which specific algorithm you compare and what criteria you focus on.

Now, given:
1) it is possible to get stable results with an unstable sort, using a good compareFn.
   (that's how it should be done anyway, performing several sorts in a row is inefficient).

2) it is *not* possible to get unstable sort performance with a stable sort.

…then it seems a rather logical choice to provide an unstable sort.

And about developers being surprised… you make a mistake, you realize it, you fix it, you learn, and you don't do it again. You don't blame the library just because you misread the documentation.
“I tried to use a knife to eat soup and cut myself; let's change knives so they don't cut anymore.” I don't think dumbing down and hampering tools every time someone trips on them would do any good in the long run.

#50: Oh come on, are we at this silly net <-> reality comparison level now? Please don't.

Or, in your words: "I always used a knife THIS way, and I saw that in every other family a knife is used this way. So I tried to use it this way, but I accidentally cut my finger because the knives in THIS house work the other way round"...

Sorry, I fixed my problem (which kind of resurrected this evergoing discussion) in minutes, but I just don't like the mindset of the guys defending nonsense here, so I had to post this.

Comment 52 by, Aug 14 2015

Speaking in programming parables: you make a function that reads input and returns output.  A function that reads nothing but its input, and in particular does nothing random nor time-dependent.  So you expect the given function to return exactly the same output for a given input.  You do repeated tests on all available platforms and all seems well.  Then you make the input just slightly bigger and one certain environment begins to behave stochastically.  I assume we could all agree that in general this is not good.

But parables aside, I think the V8 devs made a clear statement (paraphrasing): "Unless you convince the ECMA board to require a stable sort in the next version of the ECMAScript standard, don't bother us; we are covered and we are pushing performance here.  Interoperability is not our concern."

Then, well, ECMA is said to value consensus highly (and I think we have a pretty good consensus here), so this might be a possibility.
Here's an idea: if you want a stable sort implemented, it's not that hard to implement. Array.prototype.sort is implemented in pure JavaScript. 
#52> an unstable sort will still output the exact same output for the exact same input. There is nothing random nor stochastic in its behavior. Quite the opposite actually, sorting algorithms are probably among the most studied and formalized algorithms.

The consensus point is a good one though.

...except Internet Explorer sort went unstable since IE9.
...and still is in IE11 (it starts stable, then switches to unstable somewhere between 100 and 1000 items, kind of like Chromium does, but with a higher breaking point).
...and Opera's sort is also unstable (since version 11 if I remember well).

This change in IE9 teaches us something btw: even if one had all tests in place and passing, as long as one's assumptions are broken, one's code can still fail because the library one is using is not bound to those broken assumptions, and may be updated in a way that breaks one's code anytime.

That being said, I agree it would be great to have a stable sort available in the norm as well. The point should not be “make sort() stable” though, but “add a stable sort alongside sort()”. They are different tools, with different strengths and limitations.
My student directed me to this bug report.  I want to correct an important incorrect assertion that may be coloring the discussion.  Comment #19 asserts that "As can be seen here there is no sorting algorithm that is O(nlogn) in time, O(1) in space and is also stable.  You have to compromise when choosing your algorithm. "

This is false.  To see this, consider the sorting implementation (located at ).  Notice that any time this implementation compares two items, it knows the indices of the two items in the array.  Therefore, it is easy to natively implement the "wrapper" that Seth mentions in comment #17.  This wrapper is consistent with the original comparison function when the two items differ, but when they compare equal, instead of returning 0, it declares the item with the smaller index to be the smaller item.

The important thing is that this wrapper never returns 0.  This means it gives the sorting algorithm no flexibility; there is only one possible output: the one that puts smaller items before larger (ie, consistent with the original comparison order) and also puts items with *equal* values in the order consistent with their original index order (because that's what the modified comparison function demands).  In other words, the result is a stable sort.

This variant uses no more time (asymptotically) or space than the original sorting algorithm.  It is slightly slower because it does an extra comparison of indices given an equality outcome, but this index comparison is negligible compared to the function call overhead associated with the original comparison.  (There may be an argument for tolerating instability when you want to maximize performance in a low-level language sorting a dense array of numbers, but here the use of a function call for each comparison means this implementation just isn't worrying at that level.)  Also it is clear that the code isn't worried about this kind of extra work on equality, because, for example, at line 985 of the sorting implementation code that I linked to above, the algorithm does a swap when c02 >= 0.  This is "wasteful" since you'd get exactly the same outcome if you swapped on c02 > 0, so it obviously isn't worried about the equality case causing a little more work.  Finally, arrays with lots of equalities are going to be the fastest to sort (when you choose a pivot that has a lot of equal keys, all those equal keys are gathered around the pivot (between low_end and high_start in the code) and are not passed in to the recursive subproblems; this means the recursive subproblems shrink faster, so the overall runtime is less).  So it isn't important to optimize performance on equality outcomes.

In fact, the line 985 that does wasted work on equality appears to be the only origin of instability in the sorting algorithm.  As I mentioned above, instability only happens if you (unnecessarily) swap items that are equal.  It is exactly the c02=0 case that causes this to happen: it discovers that two items are equal, and it goes ahead and swaps them anyway.  On my quick skim of the code, it appears that every other swap takes place only when a comparison yields (strict) inequality---never equality.  Such comparisons should never cause instability in sorting.

I may have missed something on my quick skim, but the general principle I explained above still applies: any unstable sorting algorithm can be made stable with just a few small tweaks that don't affect running time or space.

The rest of the argument seems to be built on the idea that since the spec doesn't require stability, providing it would be a snare since it would lead people to rely on it and they would get bugs in other browsers with unstable sorts.  I would argue the opposite: that many programmers are going to assume stable sorting without checking for it, so are likely to write code that is buggy when sorts are unstable.  So you are likely to be protecting a lot of programmers from themselves by providing them with a stable sort.  Firefox has already done so ( ).  And there's hope that other browsers would follow suit and take us to a *de facto* stable sort standard even if the spec doesn't demand it.  A future spec could recognize this de facto standard and adopt it.

#55: Thanks for your contribution. Unfortunately it's not that easy :-)

Line 985 only deals with selection of the pivot element (median-of-three). You're right that the condition there could be changed, however doing so doesn't eliminate all sources of instability.

Another source of instability is in lines 1007/1008, where the pivot element is unconditionally stored at the beginning of the range of elements that is to be sorted, to get it out of the way of the later shuffling operations. This can be avoided at the cost of having to keep track of where it originally came from, and where it must go after the partitioning step.

However there's a third source of instability that I believe is inherent to Quicksort: during the partitioning step, the elements are divided into three groups: less-than-pivot, equal-to-pivot, greater-than-pivot. The sizes and, hence, indices of these three groups are not known up front, but are incrementally adjusted as the algorithm progresses. In particular, to avoid moving the middle group back and forth as more space is required for the first and last groups, Quicksort assumes that the equal-to-pivot elements can freely be reordered. In our implementation, this happens in lines 1026/1027, where the last not-greater-than-pivot element is swapped to the current scan position to make room for a newly discovered greater-than-pivot element.

(You could store the equal-to-pivot elements in a separate temporary array, but that would require O(n) memory in the worst case. The wrapper approach in comment #17 solves the problem by using O(n) memory to store the original index of each element.)
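For concreteness, that wrapper approach might look something like the following sketch (function and variable names are mine for illustration, not comment #17's actual code). It pairs each element with its original index, breaks comparison ties by index, and unwraps, which makes the output stable even when the underlying sort is not:

```javascript
// Make any comparison-based sort behave stably, at the cost of O(n)
// extra memory for the original indices.
function stableSort(arr, cmp) {
  // Decorate each element with its original index.
  const decorated = arr.map((value, index) => ({ value, index }));
  decorated.sort((a, b) => {
    const order = cmp(a.value, b.value);
    // On a tie, fall back to the original index, so equal elements
    // keep their original relative order.
    return order !== 0 ? order : a.index - b.index;
  });
  // Undecorate: return just the values, in stable order.
  return decorated.map((d) => d.value);
}
```

Because the tie-break comparator never returns 0 for distinct positions, the underlying sort is never free to reorder "equal" elements, so no stability guarantee is required of it.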

Whether it's "a few small tweaks" or a radically different implementation: if you can propose a stable sorting algorithm that's competitive with what we have in terms of speed and memory consumption, I'd love to hear about it (and so would, presumably, the world at large) :-)
Thanks for your detailed explanation, which clarifies the stability problem and exposes the flaw in my own reasoning: I assumed an item won't move until you put it into its pivot destination, which is what let me claim you'd always know an item's original position when you compared it. I guess this is why Firefox switched to mergesort.
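For reference, a minimal stable top-down merge sort in JavaScript (an illustration only, not Firefox's actual implementation): it trades O(n) auxiliary space for stability, which comes from always taking the left run's element on ties.

```javascript
// Stable top-down merge sort. Returns a new sorted array; O(n) extra space.
function mergeSort(arr, cmp) {
  if (arr.length <= 1) return arr.slice();
  const mid = arr.length >> 1;
  const left = mergeSort(arr.slice(0, mid), cmp);
  const right = mergeSort(arr.slice(mid), cmp);
  const out = [];
  let i = 0;
  let j = 0;
  while (i < left.length && j < right.length) {
    // "<=" keeps the left element on ties: this is what makes it stable.
    out.push(cmp(left[i], right[j]) <= 0 ? left[i++] : right[j++]);
  }
  return out.concat(left.slice(i), right.slice(j));
}
```
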

I understand the downside of allocating an extra O(n) space to hold location information, with consequences for garbage collection.  Does the same hold true for using O(n) space on the call stack?  Because it's pretty easy to write a recursive pivoting procedure that uses no additional space aside from the stack.

If you think of scanning through the array: if an element is small (less than or equal to the pivot), it needs to move to the index equal to the number of small items preceding it, while if it is large, it needs to be placed just before all the large items that follow it. The former quantity, the number of small items before a given small item, is known when the item is scanned, so it can be passed in to the recursive call. The latter quantity, determined by the number of large items following a given large item, can be returned by the recursive call. In pseudocode:

// Pivot items of a in the range [from, to), given that the small items
// scanned so far (positions [from, i)) have been packed into [from, low_end).
// Each frame saves a[i] in a local before anything is overwritten: small
// items are written on the way down, large items on the way back up.

Pivot_Helper(a, from, to, pivot, i, low_end) {
    if (i == to) {
        return to - 1;
    }
    var item = a[i];
    if (item <= pivot) {
        a[low_end] = item;
        return Pivot_Helper(a, from, to, pivot, i + 1, low_end + 1);
    } else { // item > pivot
        var hi_start = Pivot_Helper(a, from, to, pivot, i + 1, low_end);
        a[hi_start] = item;
        return hi_start - 1;
    }
}

Pivot(a, from, to, pivot) {
    Pivot_Helper(a, from, to, pivot, from, from);
}
Using O(n) stack space is out of the question, as the stack is way too limited (less than 1 MB). Observe how the existing implementation limits itself to O(log n) stack space even in the worst case (i.e. pathologically unlucky pivot elements) by avoiding the recursive descent into the larger half of the array (lines 1036-1042).
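The smaller-half trick mentioned here can be sketched generically. The following is an illustrative Lomuto-style quicksort, not V8's actual code: it recurses only into the smaller partition and loops on the larger one, so each stack frame at least halves the remaining range, bounding the stack at O(log n) frames even for pathological pivots.

```javascript
// Lomuto partition: moves elements < pivot to the front, returns the
// pivot's final index.
function partition(a, lo, hi, cmp) {
  const pivot = a[hi];
  let i = lo;
  for (let j = lo; j < hi; j++) {
    if (cmp(a[j], pivot) < 0) {
      [a[i], a[j]] = [a[j], a[i]];
      i++;
    }
  }
  [a[i], a[hi]] = [a[hi], a[i]];
  return i;
}

// Quicksort with O(log n) worst-case stack depth: recurse into the
// smaller half, then loop on the larger half instead of recursing.
function quicksort(a, lo = 0, hi = a.length - 1, cmp = (x, y) => x - y) {
  while (lo < hi) {
    const p = partition(a, lo, hi, cmp);
    if (p - lo < hi - p) {
      quicksort(a, lo, p - 1, cmp); // smaller left side: recurse
      lo = p + 1;                   // larger right side: iterate
    } else {
      quicksort(a, p + 1, hi, cmp); // smaller right side: recurse
      hi = p - 1;                   // larger left side: iterate
    }
  }
  return a;
}
```
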

Ran into a particular ES6 behavior where, when you provide a comparing function, the sort works for up to 10 elements but fails when passing in more than 10.


myArray.sort((l, r) => l.toLowerCase() < r.toLowerCase()); // works up to 10 elements but fails for more

Affected Chrome, node v5.10, webdriver chrome

Is this in any way related?

Comment #59: this is not related. You are using the sort function incorrectly. The comparator needs to return a positive, zero, or negative number; yours only returns true or false.

Try with (l, r) => l.toLowerCase().localeCompare(r.toLowerCase()) instead. (Note that subtracting two strings yields NaN, so subtraction only works as a comparator for numbers.)
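A quick sketch of a comparator that returns proper numeric values, on more than 10 elements (the word list here is just illustrative):

```javascript
// A comparator must return a negative, zero, or positive number.
// localeCompare does exactly that for strings.
const words = ["Pear", "apple", "Fig", "banana", "Cherry", "date",
               "Elderberry", "grape", "Kiwi", "lemon", "Mango", "orange"];
words.sort((l, r) => l.toLowerCase().localeCompare(r.toLowerCase()));
// words is now sorted case-insensitively, regardless of array length.
```
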

Next time please open a new issue and let developers decide whether to merge it with another issue.

Comment 61 by, Aug 10 2016

 Issue chromium:632329  has been merged into this issue.
 Issue chromium:689256  has been merged into this issue.
 Issue chromium:729468  has been merged into this issue.
 Issue 6851  has been merged into this issue.

Comment 65 by, Feb 24 2018

 Issue 7491  has been merged into this issue.
Is it worth reporting a similar issue that uses the Array.sort function?
Status: Started (was: WorkingAsIntended)
Good news, everyone! is now green. 🎉
Status: Fixed (was: Started)
Array.p.sort is stable as of
 Issue chromium:880061  has been merged into this issue.
