What size data is "in class" for Xorq? Can it process data out-of-core?
Yes — xorq is out-of-core to the extent that the engines executing its deferred expressions are out-of-core (our "batteries-included" engine is a modified DataFusion).
We have previously demonstrated iterative batch training via the "batteries-included" engine. I'll try to post a reference later.
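To make the idea concrete, here is a minimal pure-Python sketch of batch-wise, out-of-core processing: the file is consumed in fixed-size batches so memory stays bounded, and a least-squares fit is recovered from sufficient statistics accumulated per batch. This is only an illustration of the pattern — xorq's actual iterative training runs through its engine, not this code.

```python
import csv
import os
import tempfile

def batches(path, batch_size=1024):
    """Yield batches of (x, y) rows; only one batch is in memory at a time."""
    with open(path, newline="") as f:
        batch = []
        for row in csv.reader(f):
            batch.append((float(row[0]), float(row[1])))
            if len(batch) == batch_size:
                yield batch
                batch = []
        if batch:
            yield batch

def fit_streaming(path):
    """Least-squares fit of y ~ w*x + b from sufficient statistics
    accumulated batch by batch, never materializing the full file."""
    n = sx = sy = sxx = sxy = 0.0
    for batch in batches(path):
        n += len(batch)
        sx += sum(x for x, _ in batch)
        sy += sum(y for _, y in batch)
        sxx += sum(x * x for x, _ in batch)
        sxy += sum(x * y for x, y in batch)
    w = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - w * sx) / n
    return w, b

# Write a synthetic dataset with y = 2x + 1, then do one out-of-core pass.
fd, path = tempfile.mkstemp(suffix=".csv")
with os.fdopen(fd, "w", newline="") as f:
    writer = csv.writer(f)
    for i in range(10_000):
        x = i / 10_000
        writer.writerow([x, 2 * x + 1])
w, b = fit_streaming(path)
os.remove(path)
print(f"w={w:.3f} b={b:.3f}")  # recovers w≈2, b≈1
```

The peak memory footprint is one batch plus five scalars, regardless of file size — the same principle that lets an out-of-core engine spill and stream past RAM limits.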
Here is an example of out-of-core processing: https://www.xorq.dev/posts/trino-duckdb-asof-join
Anecdotally, TPC-H at 10 TB is quite doable nowadays with DuckDB, so xorq goes as far as your engine can take you.