xxs 4 days ago

Linked lists have a higher constant cost memory-wise, are much slower to iterate, and removing a random element from the middle is effectively as expensive as in a vector. When used as queues, ring buffers are better for most applications.

Their prime advantage is that adding new elements has no terrible worst case, aside from the fact that it puts more pressure on the memory allocator, which needs to be concurrency-friendly.

If you have full control over memory allocation (e.g. in kernels), linked lists make sense; for virtually all other cases there are better data structures.

boomlinde 4 days ago

I'd first like to point out that yours is an argument from the perspective that performance is crucial and dependent on the use or non-use of linked lists. This is not necessarily the case. Performance may not be crucial, and even when it is, linked lists can find use in areas where they're not even nearly the bottleneck, for being the most obvious data structure that satisfies the required operations on the collection. That pointer chasing is slow is not an argument against the use of linked lists in general.

What if I only ever iterate over a list of some hundred entries in the rare case of an error? Should I be reaching for the most performant data structure or the one that most obviously and clearly maps to the operations the program needs to perform on it at some infinitesimal cost?

> Linked lists have a higher constant cost memory-wise

Than a fixed vector with a known maximum number of elements, yes. Compared to something like a dynamic array, however, you are usually better off memory-wise with a linked list, unless you are willing to reallocate every few insertions or your node data is smaller than the link.

> removing a random element from the middle is effectively as expensive as in a vector.

You're conflating concepts. Seeking is O(n). Removal is O(1). There are cases where seeking for the purpose of removal is either amortized (e.g. you're already iterating over all the nodes, executing an associated handler for each which may or may not determine that the node should be unlinked in-place) or completely unnecessary (some other data structure already tracks a reference to the node and some code associated with that determines if/when it is removed).

One might similarly argue that I am conflating compacting (O(n)) with removal (O(1)) in the vector case. But for linked lists the distinction is a natural consequence of the design, while in a typical vector implementation compacting and removal are one and the same operation, and separating them requires an additional data structure to track the holes.

Moving elements around in memory also comes with a host of other issues: if there are many references to the elements, for example, you end up maintaining an index array and accessing the memory indirectly through it.

> aside from the fact that it puts more pressure on the memory allocator, which needs to be concurrency-friendly.

This is again a conflation of concepts: using linked lists is one thing; making space for the nodes, or preparing an allocator for concurrent use, is another. The nodes could live on the stack for some applications, in a static pool for others.