This exact demo will crash with vanilla JavaScript (in Chrome 134.0). A React version would also crash, unless the computation relies on WASM.
Make a demo with react-virtualized[0] and see if it crashes. Hint: it will not[1]. React can easily render 1 million rows with high performance, without relying on WASM.[2]
Here is a demo of react-virtualized[3], in which I entered 10M as the row count and scrolled to the bottom without crashing.
[0] https://github.com/bvaughn/react-virtualized
[1] https://www.youtube.com/watch?v=1JoEuJQIJbs
[2] https://medium.com/@priyankadaida/how-to-render-a-million-ro...
[3] https://bvaughn.github.io/react-virtualized/#/components/Lis...
*Update: Here I made a table with 1 million rows, with search, filtering, and pagination, in plain JavaScript:
https://htmlpreview.github.io/?https://gist.githubuserconten...
Could you give a code example? Also, by "crash", do you mean the stack overflow error mentioned above?
If so, why would the call stack be involved when we're talking about element count?
Because he constructs a giant JSON string by joining individual entries. Rendering that directly into the DOM will always cause performance issues (even at 10k entries). That's why you need a virtualized list; it can be done in plain JS or with libraries like react-virtualized.
This works: plain JS, 150k rows.
<style>
  #viewport {
    height: 600px;
    overflow-y: scroll;
    position: relative;
    border: 1px solid #ccc;
    width: 400px;
    margin: auto;
  }
  .item {
    position: absolute;
    left: 0;
    right: 0;
    height: 30px;
    padding: 5px;
    box-sizing: border-box;
    border-bottom: 1px solid #eee;
    font-family: Arial, sans-serif;
  }
</style>
<div id="viewport">
  <div id="content"></div>
</div>
<script>
  const viewport = document.getElementById('viewport');
  const content = document.getElementById('content');
  const itemHeight = 30;
  const totalItems = 150000;
  const items = Array.from({ length: totalItems }, (_, i) => ({
    id: i + 1,
    name: `User #${i + 1}`
  }));

  // Size the inner container as if all rows were rendered,
  // so the scrollbar behaves normally.
  content.style.height = `${totalItems * itemHeight}px`;

  // Render only the rows visible in the viewport (plus a small
  // overscan of 10 rows), each absolutely positioned at its
  // virtual offset. Everything else stays out of the DOM.
  function render() {
    const scrollTop = viewport.scrollTop;
    const viewportHeight = viewport.clientHeight;
    const start = Math.floor(scrollTop / itemHeight);
    const end = Math.min(totalItems, start + Math.ceil(viewportHeight / itemHeight) + 10);
    content.innerHTML = '';
    for (let i = start; i < end; i++) {
      const div = document.createElement('div');
      div.className = 'item';
      div.style.top = `${i * itemHeight}px`;
      div.textContent = items[i].name;
      content.appendChild(div);
    }
  }

  viewport.addEventListener('scroll', render);
  render();
</script>
The exact error is "Maximum call stack size exceeded" when the WASM engine is replaced with this JS engine:
https://github.com/nuejs/nue/blob/master/packages/examples/s...
There is currently no demo of the crash, but you can set this up locally.
`events.push(...arr)` puts every element of `arr` on the call stack as a separate argument before the method is called, which causes the error. Don't push tens of thousands of items at once.
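To make the failure mode concrete, here is a minimal sketch (the array size and the `pushChunked` helper are illustrative, not from the original code). Spreading a very large array into `push()` turns each element into a function argument, which can exhaust the call stack; pushing in bounded chunks avoids that:

```javascript
// A large source array; big enough that spreading it as arguments
// typically overflows V8's argument/stack limit.
const big = new Array(500000).fill(0).map((_, i) => i);

let spreadFailed = false;
try {
  const events = [];
  events.push(...big); // every element becomes a call-stack argument
} catch (e) {
  // In V8 this surfaces as RangeError: Maximum call stack size exceeded
  spreadFailed = e instanceof RangeError;
}

// Safe alternative: append in chunks so no single call
// receives more than chunkSize arguments.
function pushChunked(target, source, chunkSize = 10000) {
  for (let i = 0; i < source.length; i += chunkSize) {
    target.push(...source.slice(i, i + chunkSize));
  }
}

const events = [];
pushChunked(events, big);
console.log(events.length === big.length);
```

The exact threshold at which the spread version throws is engine- and stack-size-dependent, which is why relying on it for large inputs is a latent bug rather than a guaranteed crash.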
If your architecture is based on event sourcing, this is kind of the point.
It has nothing to do with architecture, but rather with understanding how the DOM works. The DOM is notoriously slow, so you should never render a huge number of rows at once. You can render millions of rows in plain JavaScript without impacting performance.
Here, I have recreated your JS example with searching and filtering, and it does not crash. It's trivial to reuse a similar approach with a real backend and real events from the event source.
https://htmlpreview.github.io/?https://gist.githubuserconten...
*Update: Here is a table with 1 million rows, with search, filtering, and pagination, in plain JavaScript:
https://htmlpreview.github.io/?https://gist.githubuserconten...
The point should not be to use the spread operator at all costs. There are other ways in JavaScript to push multiple elements into an array that are more efficient than the spread operator.
The fact that you didn't even stop to wonder why the error was a stack overflow when you weren't using recursive functions is also telling.
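A quick sketch of two such alternatives (array sizes are illustrative). Note that `push.apply(target, arr)` hits the same argument-count limit as spread, so for genuinely large inputs a plain loop or `concat` is the robust choice:

```javascript
const arr = new Array(200000).fill(0).map((_, i) => i);

// 1) Plain loop: no argument-count limit, mutates in place,
//    and is typically fast for large inputs.
const a = [];
for (let i = 0; i < arr.length; i++) {
  a.push(arr[i]);
}

// 2) concat: allocates a new array instead of mutating,
//    also free of any argument-count limit.
const b = [].concat(arr);

console.log(a.length === arr.length && b.length === arr.length);
```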
You're solving a problem nobody has. If you encounter this problem, you shouldn't think "ah, let's yeet the JS engine because it clearly isn't good enough for my awesome SPA"; you should think "hm, maybe I shouldn't render 10,000,000,000 records in the DOM".
What's next? "Oh, I have a memory leak; let's get a subscription to RAM modules and just keep adding them!"
No. Back when we supported IE 9, we had tables with a million rows and dozens of columns, and they ran fine.