Thank you. I use the sorting-algorithms visualization and it is usually a great hook for getting students to grok the rest of the content. It'll be great to have more resources.
Thank you. When I was in college (many decades ago) I just thought insertion sort was it. Then bubble sort. I did not expect the mental flip I did when we started researching quicksort. That changed my perspective on obvious but incorrect assumptions forever.
These visualizers I think can get you there even faster. I just love this stuff.
I found out that my usual way of sorting cards as a kid was a variant of quicksort.
That is: first sort the cards into black vs. red, then the black cards into spades vs. clubs, then the spades into high vs. low, and finally sort the low ones by inspection (a kind of insertion sort, I guess), sort the high ones by inspection, sort the clubs similarly, etc.
Like most quicksorts, this definitely uses O(log n) space, since you have a deck of reds, a deck of clubs, and a deck of high spades sitting around while you're handling the low spades...
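The card scheme above is essentially a quicksort with fixed pivots (color, then suit, then high/low) and an insertion-sort base case. A minimal sketch, with cards encoded as integers 0-51 so the "pivots" are just range midpoints (function names and the encoding are my own, purely illustrative):

```python
def insertion_sort(cards):
    """Sort a small pile "by inspection"."""
    out = []
    for card in cards:
        i = 0
        while i < len(out) and out[i] <= card:
            i += 1
        out.insert(i, card)
    return out

def card_sort(cards, lo=0, hi=52, depth=3):
    # After three fixed splits (color, then suit, then high/low rank)
    # the piles are small enough to sort by inspection.
    if depth == 0:
        return insertion_sort(cards)
    mid = (lo + hi) // 2  # the fixed "pivot" for this level
    left = [c for c in cards if c < mid]
    right = [c for c in cards if c >= mid]
    return (card_sort(left, lo, mid, depth - 1)
            + card_sort(right, mid, hi, depth - 1))
```

The side decks waiting to be handled are the recursion stack, which is where the O(log n) space shows up.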
Just for fun: my friend and I 3D printed a model of SelectionSort: https://gfycat.com/HospitableTenderDiamondbackrattlesnake. Each successive layer behind the front one is one iteration of the algorithm. We also made mergesort and quicksort versions, and I'm working on a model of Dijkstra's right now.
To be honest the 3D model doesn't really confer a greater understanding of the algorithm as hoped, but it's nice to play with.
Interesting approach. If this was part of a series and perhaps in a different material + color-mapped (think Shapeways full color sandstone for instance), I could see someone very passionate about algorithms wanting to buy/own/print them.
I went through the list of algorithms in Wikipedia [0] and a reader pointed out that block sort [1] is an interesting case. It has all the best properties listed, except the worst case swaps, which I can't find in Wikipedia.
Well, spaghetti sort runs in linear time: O(n) time is spent setting up the device and O(n) reading out the result. The actual sorting step is O(1)!!!
Unfortunately, it's limited to sorting numbers, and uses O(n) additional space (to be precise, it uses O(n) additional volume).
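For fun, a toy software simulation of the read-out step (all names mine). Without the parallel physics of slamming the rods on the table, it degenerates into plain selection sort, which is exactly why the physical version is cute:

```python
def spaghetti_sort(values):
    rods = list(values)        # O(n): cut one rod per value
    out = []
    while rods:
        tallest = max(rods)    # the hand "feels" only the tallest rod
        out.append(tallest)
        rods.remove(tallest)
    return out[::-1]           # read out tallest-first, so reverse
```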
I once wrote a generic algorithm visualizer where you write your own algorithm, and a view function. Then the framework records the state of the algorithm as it changes and passes it to the view function. Then you can play the states like you play a video, go back and forth etc. Here is an example with bubble-sort: http://awal.js.org/alpg/?gist=8b5c6679edc3b85106fb (hit run, then play/rewind/etc.)
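The record-and-replay idea can be sketched in a few lines: the algorithm snapshots its state whenever it changes, and a UI scrubs through the recorded frames like a video. This is my own minimal illustration of the concept, not code from the alpg framework:

```python
def record_bubble_sort(data):
    """Run bubble sort, recording a frame after every swap."""
    a = list(data)
    frames = [list(a)]                   # frame 0: the initial state
    n = len(a)
    for i in range(n):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                frames.append(list(a))   # snapshot after the swap
    return frames

frames = record_bubble_sort([4, 2, 3, 1])
# frames[0] is the input, frames[-1] is sorted; a view function could
# render any frame, and play/rewind is just moving an index.
```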
It would be even nicer if there were some way to find out who made it and/or how to contact the author(s), and how or whether one can add more algorithms (e.g. I'd love to see timsort and introsort).
As it is, this site seems to be completely anonymous. Which is of course a valid choice by the author(s), but IMHO quite sad :-(.
This is interesting, I'd never thought about considering the number of swaps an algorithm does. It seems like algorithms that swap less should be better in multithreaded environments due to less locking.
There can be a lot more to consider than just the number of comparisons (the one thing a lot of sorting texts fixate on), and those factors can change which sort is better for a given circumstance.
If an exchange is very expensive then you might prefer an order of magnitude more comparisons in order to reduce the number of exchanges needed. The structure of your data makes a difference too especially if you are trying to sort in-place with little or no extra memory: an exchange by insertion is very efficient with a linked list (just rearrange the links) but can be very expensive with a fixed array (shifting the last element to the front involves moving every other element up one).
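To make the linked-list vs. array point concrete: moving the last element to the front is O(n) element moves in an array, but O(1) pointer updates in a linked list. A small sketch (the `Node` class and helper names are just for illustration; the linked version assumes at least two nodes):

```python
class Node:
    def __init__(self, value, next=None):
        self.value, self.next = value, next

def move_last_to_front_array(a):
    last = a.pop()       # O(1)
    a.insert(0, last)    # O(n): every other element shifts up one slot

def move_last_to_front_linked(head):
    # Walk to the second-to-last node, then rearrange two links.
    prev = head
    while prev.next.next is not None:
        prev = prev.next
    last = prev.next
    prev.next = None
    last.next = head
    return last          # the new head
```

The walk in the linked version is the comparison cost; the exchange itself is just the two pointer writes at the end.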
Sometimes the comparison might be rather expensive at times: if you are trying to sort data stored over many distributed nodes then you need to be careful to pick an algorithm that can constrain itself as much as possible to the local data on each node.
Concurrency can be a big issue even if not running on distributed data: some algorithms are much more "lock heavy" than others.
And even for a single-threaded, local-only sort on modern CPUs, cache use can make a big difference: an algorithm that you intuit should run quickly because it can move objects very far at each step might not be all that good, because it sees cache hits much more rarely than one that works on small local chunks in its inner loop.
No one method fits every use: sometimes you want an exchange-class sort, sometimes an insertion-class one, sometimes something else entirely...
Not sure how I feel about this, but the second link you posted is remarkably similar to my alpg[1] project. UI buttons, rendering, usage of GitHub gists for sharing, URL format of shared links, etc. I wrote alpg 3 months back, and that one has its first commit 13 days ago... :| A bit weirded out by this, but I am biased of course. Should I care?
To clarify, I'm not the author of either of the links; I simply shared what I found. If you do decide to care, you'd have to take it up with the repo owner.
I've always found selection sort to be the most intuitive. When I was in school, a professor mentioned that bubble sort is sort of the "easy" sort that people would discover on their own, but I always thought it seemed complicated compared to selection sort.
Selection sort is basically "Find the next smallest item in the remaining unsorted list, and use it as the next value."
Both seem to me like strategies you'd intuitively hit upon. I think bubble sort is a bit more mathsy, but a certain type of mind would hit upon it, because bubble sort is "go through the list swapping adjacent items that are out of order, and keep doing that until you get no swaps," which is pretty trivial as well.
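The two one-sentence descriptions above translate almost directly into code, which is maybe why both feel discoverable:

```python
def selection_sort(a):
    a = list(a)
    for i in range(len(a)):
        # Find the next smallest item in the remaining unsorted tail...
        m = min(range(i, len(a)), key=a.__getitem__)
        # ...and use it as the next value.
        a[i], a[m] = a[m], a[i]
    return a

def bubble_sort(a):
    a = list(a)
    swapped = True
    while swapped:          # keep doing passes until you get no swaps
        swapped = False
        for j in range(len(a) - 1):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
    return a
```

Both are O(n^2) comparisons, but note selection sort does at most n swaps while bubble sort can do O(n^2) of them, which ties back to the swap-counting discussion above.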
I remember heap-sort kind of blowing my mind when shown to me. When this kind of topic comes up I am always reminded of D. J. Bernstein's Crit-bit Trees: https://cr.yp.to/critbit.html
For visualizations of numerous other algorithms, check out http://pyalgoviz.appspot.com, where both the algorithms and the visualizations are editable. You can replay both the execution and the visualization step by step.
This page just goes to show how illuminating a visualizer can be to understanding something we all (programmers) do every day, either on our own or using a tool.
http://www.cs.usfca.edu/~galles/visualization/Algorithms.htm...
Many algorithms and nice visualizations.