dsbulk unload stuck when -maxConcurrentFiles (write concurrency) is greater than 1

Description

dsbulk version: 1.10.0

I'm unloading 10,000,000 rows from a C* table using a LIMIT query:

dsbulk unload -query "SELECT col1, col2 FROM keyspace.table LIMIT 10000000" -maxRecords 1000000 -header false -verbosity high --connector.csv.compression gzip -url table.csv.gz

The command results in a read concurrency of 1 and a write concurrency of 4. Checking the logs, I didn't find the usual "Operation UNLOAD_20230216-042948-286777 closed." line, and the dsbulk process is still visible when checking with ps aux.
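
For reference, a quick way to confirm the hang (this assumes dsbulk's default ./logs directory, where each operation writes to a subdirectory named after its ID; adjust the path if -logDir was changed):

tail logs/UNLOAD_20230216-042948-286777/operation.log   # no "Operation ... closed." line at the end
ps aux | grep dsbulk                                    # process is still alive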

This bug does not occur with dsbulk 1.9.1, nor when -maxConcurrentFiles is set to 1; see the workaround sketch below.
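
Based on that, a possible workaround until this is fixed is to pin write concurrency to a single output file; the same command as above with -maxConcurrentFiles 1 added:

dsbulk unload -query "SELECT col1, col2 FROM keyspace.table LIMIT 10000000" -maxRecords 1000000 -header false -verbosity high --connector.csv.compression gzip -maxConcurrentFiles 1 -url table.csv.gz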

This issue is synchronized with a GitHub issue by Unito.
Repository name: dsbulk
Issue number: 463


Created February 16, 2023 at 4:46 AM
Updated February 16, 2023 at 4:46 AM