feat: Stream SQLite writes #24
Conversation
In src/sqlite/write.rs:

@@ -156,9 +158,9 @@ impl DataSink for SqliteDataSink {
            sqlite.delete_all_table_data(&transaction)?;
        }
-        for batch in data_batches {
-            if batch.num_rows() > 0 {
-                sqlite.insert_batch(&transaction, batch, on_conflict.as_ref())?;
+        while let Some(data_batch) = batch_rx.blocking_recv() {
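For context, a minimal sketch (not the PR's actual code) of the consumer side this diff implies: record batches arrive over a channel and are inserted inside a single SQLite transaction on a blocking thread. It assumes rusqlite and a tokio mpsc channel; `insert_batch` here is a hypothetical stand-in for the helper referenced in the snippet above.

```rust
use arrow::record_batch::RecordBatch;
use rusqlite::Connection;
use tokio::sync::mpsc::Receiver;

// Hypothetical stand-in for the `sqlite.insert_batch` helper in the diff above.
fn insert_batch(tx: &rusqlite::Transaction<'_>, batch: &RecordBatch) -> rusqlite::Result<()> {
    // Translate the Arrow batch into parameterized INSERT statements here.
    let _ = (tx, batch);
    Ok(())
}

// Runs on a blocking thread (e.g. inside `tokio::task::spawn_blocking`), so
// `blocking_recv` does not stall the async runtime.
fn write_batches(
    conn: &mut Connection,
    mut batch_rx: Receiver<RecordBatch>,
) -> rusqlite::Result<u64> {
    let tx = conn.transaction()?;
    let mut rows = 0u64;

    // Insert batches as they arrive instead of collecting them all first.
    while let Some(batch) = batch_rx.blocking_recv() {
        if batch.num_rows() > 0 {
            rows += batch.num_rows() as u64;
            insert_batch(&tx, &batch)?;
        }
    }

    // The write lock on the table is held until this commit.
    tx.commit()?;
    Ok(rows)
}
```

Because everything happens inside one transaction, the write lock is now held for the duration of the stream, which is what the exchange below is about.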
@peasee - can we perform a read query while streaming results (refreshing)? I have a concern that we could lock the table for quite some time.
You can perform read queries on any table except the ones still accelerating. If you try querying a table that's accelerating, the query will hang until the acceleration finishes before returning values.
This is the existing behavior on main.
This is different: previously we collected all the data first and only wrote it afterwards, so the table was locked for less time.
Thanks,
Sergey
🗣 Description
In manual testing with 60-million-row tables, this improves write performance by 20-40% and reduces memory usage by 80%+ during write operations.
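To illustrate where the memory savings come from, here is a hypothetical sketch of the producer side, assuming a bounded tokio mpsc channel feeding a blocking writer task (the `write_batches` consumer sketched earlier); the database path and channel capacity are illustrative, and the real implementation may be wired differently. Peak memory is bounded by the channel capacity rather than by the full result set.

```rust
use datafusion::arrow::record_batch::RecordBatch;
use datafusion::error::{DataFusionError, Result};
use datafusion::physical_plan::SendableRecordBatchStream;
use futures::StreamExt;
use tokio::sync::mpsc;

async fn stream_into_sqlite(mut data: SendableRecordBatchStream) -> Result<u64> {
    // A small bounded channel: only a handful of batches are in flight at once,
    // so peak memory no longer scales with the size of the table.
    let (batch_tx, batch_rx) = mpsc::channel::<RecordBatch>(8);

    // The writer runs on a blocking thread; `write_batches` is the hypothetical
    // consumer sketched earlier, holding one transaction open for the stream.
    let writer = tokio::task::spawn_blocking(move || -> rusqlite::Result<u64> {
        let mut conn = rusqlite::Connection::open("accelerated.db")?;
        write_batches(&mut conn, batch_rx)
    });

    // Forward batches as they are produced instead of collecting them first.
    while let Some(batch) = data.next().await {
        if batch_tx.send(batch?).await.is_err() {
            break; // writer hung up (likely an error on its side)
        }
    }
    drop(batch_tx); // close the channel so the writer's recv loop ends

    writer
        .await
        .map_err(|e| DataFusionError::External(Box::new(e)))?
        .map_err(|e| DataFusionError::External(Box::new(e)))
}
```

The trade-off, as discussed in the conversation above, is that the single write transaction now stays open for the duration of the stream rather than only for a final insert pass after all data has been collected.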