# Glob Patterns & Multi-File Mode
Several commands support glob patterns for batch processing multiple files.
## Supported Commands
| Command | Glob Support | Output Behavior |
|---|---|---|
| `split` | ✅ | Creates subdirectory per input file |
| `analyze` | ✅ | Aggregates results across all files |
| `convert` | ✅ | Requires output directory |
| `validate` | ✅ | Validates each file, aggregates results |
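
A quick preview of what each command looks like with a glob; the same invocations, including the `-o` and `--to` flags, are covered in detail in the sections below:

```bash
# analyze and validate read the matched files directly
sql-splitter analyze "dumps/*.sql"
sql-splitter validate "dumps/*.sql"

# split and convert also need an output directory via -o
sql-splitter split "dumps/*.sql" -o output/
sql-splitter convert "dumps/*.sql" --to postgres -o converted/
```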
## Basic Patterns
```bash
# All SQL files in current directory
sql-splitter analyze "*.sql"

# All SQL files in a directory
sql-splitter validate "dumps/*.sql"

# Recursive (all subdirectories)
sql-splitter analyze "backups/**/*.sql"

# Compressed files
sql-splitter validate "dumps/*.sql.gz"
```

**Important:** Quote glob patterns to prevent shell expansion.
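
Quoting matters because an unquoted pattern is expanded by the shell before sql-splitter ever sees it. In particular, bash only treats `**` as recursive when the `globstar` option is enabled, so an unquoted recursive pattern can silently match fewer files. A quick comparison:

```bash
# Quoted: sql-splitter receives the pattern and expands it itself,
# including the recursive ** part
sql-splitter analyze "backups/**/*.sql"

# Unquoted: the shell expands the pattern first; in bash without
# `shopt -s globstar`, ** behaves like a single * and misses subdirectories
sql-splitter analyze backups/**/*.sql
```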
## Pattern Syntax
| Pattern | Matches |
|---|---|
| `*` | Any characters except `/` |
| `**` | Any characters including `/` (recursive) |
| `?` | Any single character |
| `[abc]` | Any character in brackets |
| `[a-z]` | Any character in range |
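
The `?` and bracket forms are useful for narrowing a match beyond what `*` allows; a short illustration (the file names here are hypothetical):

```bash
# Single-digit shards only: shard_1.sql ... shard_9.sql, but not shard_10.sql
sql-splitter analyze "shard_?.sql"

# Only the 2023 and 2024 backups
sql-splitter validate "backup_202[34].sql"
```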
## Examples
```bash
# All .sql files in dumps/
sql-splitter analyze "dumps/*.sql"

# All .sql and .sql.gz files recursively
sql-splitter validate "backups/**/*.sql*"

# Files starting with "prod_"
sql-splitter analyze "prod_*.sql"

# Files from 2024
sql-splitter validate "backup_2024*.sql"
```

## `--fail-fast` Flag
By default, glob processing continues even if one file fails. Use `--fail-fast` to stop on the first error:
```bash
# Stop on first error
sql-splitter validate "*.sql" --fail-fast

# Continue despite errors (default)
sql-splitter validate "*.sql"
```

## Output Behavior
### analyze
Outputs aggregate statistics across all files:
```bash
sql-splitter analyze "dumps/*.sql" --json
```

```json
{
  "files_processed": 5,
  "total_tables": 42,
  "total_rows": 150000,
  "results": [
    { "file": "dumps/users.sql", "tables": 1, "rows": 1000 },
    { "file": "dumps/orders.sql", "tables": 1, "rows": 50000 }
  ]
}
```
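
If a script only needs one of the aggregate numbers, the JSON above can be filtered with standard tools; a small sketch using jq (field names taken from the sample output):

```bash
# Print just the total row count across all matched files
sql-splitter analyze "dumps/*.sql" --json | jq '.total_rows'
```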
### validate

Reports validation status for each file:
```bash
sql-splitter validate "*.sql" --json
```

```json
{
  "total_files": 3,
  "passed": 2,
  "failed": 1,
  "results": [
    { "file": "a.sql", "summary": { "errors": 0, "warnings": 0 } },
    { "file": "b.sql", "summary": { "errors": 0, "warnings": 1 } },
    { "file": "c.sql", "summary": { "errors": 2, "warnings": 0 } }
  ]
}
```

### split

Creates a subdirectory for each input file:
```bash
sql-splitter split "dumps/*.sql" -o output/
```

```
output/
├── dump1/
│   ├── users.sql
│   └── orders.sql
├── dump2/
│   ├── users.sql
│   └── products.sql
```

### convert
Requires an output directory (not a file):
```bash
# Correct: output directory
sql-splitter convert "*.sql" --to postgres -o converted/

# Creates:
# converted/
# ├── file1.sql
# ├── file2.sql
```

## Progress with Globs
The `--progress` flag shows per-file and overall progress:
```bash
sql-splitter validate "backups/**/*.sql.gz" --progress
```

```
[1/5] backups/2024/jan.sql.gz ✓
[2/5] backups/2024/feb.sql.gz ✓
[3/5] backups/2024/mar.sql.gz ⚠ 2 warnings
[4/5] backups/2024/apr.sql.gz ✓
[5/5] backups/2024/may.sql.gz ✗ 1 error

Summary: 4 passed, 1 failed
```

## Parallel Processing
For parallel processing of many files, use external tools:
```bash
# Process 4 files in parallel
find dumps -name '*.sql.gz' -print0 | \
  xargs -0 -n1 -P4 sql-splitter validate --strict
```
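
GNU parallel, where installed, works the same way; a sketch assuming four concurrent jobs, mirroring the xargs example above:

```bash
# Alternative using GNU parallel: one sql-splitter process per file, 4 at a time
find dumps -name '*.sql.gz' -print0 | \
  parallel -0 -j4 sql-splitter validate --strict {}
```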
## CI/CD Usage

```yaml
# GitHub Actions example
- name: Validate all SQL dumps
  run: |
    sql-splitter validate "migrations/*.sql" --strict --json > validation.json
    if jq -e '.failed > 0' validation.json; then
      echo "Validation failed"
      exit 1
    fi
```
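
If the JSON report is not needed as an artifact, a shorter variant can rely on the exit code instead, assuming validation failures exit non-zero as described on the Exit Codes page:

```bash
# CI runners normally fail a step on a non-zero exit, so the jq check can be skipped
sql-splitter validate "migrations/*.sql" --strict --fail-fast
```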
Section titled “See Also”- Exit Codes - Understanding exit codes for scripting
- Unix Piping - Composing commands