bsder 6 days ago

Linked lists get a bad reputation because they wound up being a default data structure, and that's almost always a mistake on modern computers.

Back in the days of yore, small memories and slow CPUs meant that linked lists and arrays were really the only effective data structures. Allocation and computation were super expensive, so following a link or indexing into an array was really the only thing that worked.

On today's computers, the CPU is so much faster than memory that it is almost always better to recompute a value than to look it up. Locality is so important that linear actions trump asymptotic complexity until you get quite a lot more elements than you might expect. Memory is so cheap that it is worth exponentially overallocating in order to amortize the cost of growth. And so on.
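As an illustration, here's a minimal sketch of that overallocation trick: a hypothetical growable int array (the names are mine, not from any particular library) whose capacity doubles so that push is amortized O(1):

    #include <stdlib.h>

    /* Hypothetical growable array; initialize with v = (vec){0}. */
    typedef struct {
        int   *data;
        size_t len, cap;
    } vec;

    /* Amortized O(1): capacity doubles, so each element is moved
       only a constant number of times across all reallocations. */
    int vec_push(vec *v, int x) {
        if (v->len == v->cap) {
            size_t new_cap = v->cap ? v->cap * 2 : 8; /* exponential growth */
            int *p = realloc(v->data, new_cap * sizeof *v->data);
            if (!p) return -1; /* out of memory; old buffer still valid */
            v->data = p;
            v->cap = new_cap;
        }
        v->data[v->len++] = x;
        return 0;
    }

The slack space (up to half the buffer) is the price paid for doing only O(log n) reallocations over n pushes.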

At the end of the day, with modern programming you should almost always reach for something simpler than a linked list (an array/vector) or for an abstraction more complex than a linked list (a hash/dict, a queue/deque, etc.).

With modern CPUs, if you are using a linked list for anything other than building a more complex data structure, you're probably making an incorrect choice on almost all fronts.

Validark 6 days ago

> Locality is so important that linear actions trump asymptotic complexity until you get quite a lot more elements than you might expect.

The reverse of this can also be true. In the following paper, they found that Williams' heap construction (inserting one element at a time, O(n log n)) beat Floyd's heap construction (O(n)) once the input grew larger than the off-chip cache.

https://doi.org/10.1145/351827.384257
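For context, a minimal sketch of the two strategies on a max-heap stored in an array (the helper names are mine, not the paper's):

    #include <stddef.h>

    static void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

    /* Move element i up toward the root while it beats its parent. */
    static void sift_up(int *h, size_t i) {
        while (i > 0 && h[(i - 1) / 2] < h[i]) {
            swap(&h[(i - 1) / 2], &h[i]);
            i = (i - 1) / 2;
        }
    }

    /* Move element i down into the larger child until the heap
       property holds for the n-element heap. */
    static void sift_down(int *h, size_t n, size_t i) {
        for (;;) {
            size_t big = i, l = 2 * i + 1, r = 2 * i + 2;
            if (l < n && h[l] > h[big]) big = l;
            if (r < n && h[r] > h[big]) big = r;
            if (big == i) return;
            swap(&h[i], &h[big]);
            i = big;
        }
    }

    /* Williams: insert one element at a time, O(n log n). */
    void build_williams(int *a, size_t n) {
        for (size_t i = 1; i < n; i++)
            sift_up(a, i);
    }

    /* Floyd: heapify subtrees bottom-up, O(n). */
    void build_floyd(int *a, size_t n) {
        for (size_t i = n / 2; i-- > 0; )
            sift_down(a, n, i);
    }

Roughly: Williams' pass scans the array sequentially and most sift-ups stop after a level or two, while Floyd's sift-downs jump around the array, and that difference starts to dominate once the heap no longer fits in cache.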

xxs 4 days ago

> actions trump asymptotic complexity

Which operation on a linked list actually has better complexity? The only one I can think of is frequent removal of random elements from the middle, assuming you already hold a pointer to the node and opt not to use tombstones.
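That case does hold, with the caveat that finding the node by search is O(n) again. A sketch with a doubly linked node (illustrative struct, not any particular library's):

    #include <stddef.h>

    typedef struct node {
        struct node *prev, *next;
        int value;
    } node;

    /* O(1) removal given a pointer to the node itself; an array
       would need an O(n) shift. Caller must fix up the list head
       if n happens to be the first node. */
    void unlink_node(node *n) {
        if (n->prev) n->prev->next = n->next;
        if (n->next) n->next->prev = n->prev;
        n->prev = n->next = NULL;
    }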