I can totally see how not having to manage a standalone RDBMS makes sense. But, what's the real-world advantage over something like SQLite?
I mean, the idea of an in-memory relational engine for things like games or embedded totally makes sense, but this seems to target large datasets and deep analysis.
As far as I understand with this model you pretty much re-ingest data from the "raw" source on startup every time. Is this correct?
Judging by the rise in interest, I'm sure there's an obvious use case I'm just not seeing.
Think BI tools, analytics dashboards, or even just exploratory analysis on the terminal with its rich query capabilities.
You can keep the analytics data in SQLite, but DuckDB will query it faster and more easily for those analytics use cases.
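To make that concrete, here's a rough sketch of what that "keep the data in SQLite, query it with DuckDB" workflow can look like from Python. It assumes the duckdb package and its sqlite extension; the file name ("app.sqlite") and the "events" table are made up for illustration:

```python
import duckdb

con = duckdb.connect()  # in-memory DuckDB instance, nothing to manage

# Attach an existing SQLite file via DuckDB's sqlite extension
# (hypothetical file/table names, adjust to your own data).
con.execute("INSTALL sqlite")
con.execute("LOAD sqlite")
con.execute("ATTACH 'app.sqlite' AS app (TYPE sqlite)")

# Run an analytical aggregation with DuckDB's engine over the SQLite data
con.sql("""
    SELECT event_type, count(*) AS n
    FROM app.events
    GROUP BY event_type
    ORDER BY n DESC
""").show()
```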
> Think BI tools, analytics dashboards, or even just exploratory analysis on the terminal with its rich query capabilities
I thought about that, but I'd never use DuckDB for it because DuckDB is locked into a single process. I can't figure out the benefit of being stuck with one core when I always have between 2 and 32 available to me.