# Data Management Commands
Data management commands allow you to export, import, and share repository indexes, enabling efficient collaboration and CI/CD integration.
## export
Export repository index to a file for sharing or backup.
### Synopsis

```bash
supersmall export --output <file> [options]
```
### Description

The `export` command saves your repository index to a file, allowing you to:
- Share indexes with team members
- Cache indexes in CI/CD pipelines
- Create backups before major changes
- Transfer indexes between environments
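For example, transferring an index between environments with standard tools (a sketch; the `staging` host and `/srv/app` path are hypothetical, and `supersmall` must already be installed on the remote machine):

```bash
# Export locally, copy the file to the remote environment, and import it there
supersmall export -o project-index.json.gz --compress
scp project-index.json.gz staging:/srv/app/
ssh staging 'cd /srv/app && supersmall import -i project-index.json.gz'
```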
### Options

| Option | Required | Description | Default |
|--------|----------|-------------|---------|
| `--output`, `-o` | Yes | Output file path | |
| `--format` | No | Export format: `json`, `binary` | `json` |
| `--compress` | No | Compress output with zlib | `false` |
### Export Formats
#### JSON Format
- Human-readable
- Larger file size
- Easy to inspect and debug
- Compatible with standard tools (see the `jq` sketch below)
#### Binary Format
- Compact representation
- Smaller file size
- Faster to load
- Not human-readable
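Because the JSON format works with standard tools, an exported index can be inspected directly. A minimal sketch with `jq`, assuming the top-level fields documented under Format Specifications below:

```bash
# Count the indexed files and show the languages recorded in a JSON export
jq '.files | length' project-index.json
jq '.languages' project-index.json
```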
### Examples

#### Basic export

```bash
supersmall export --output project-index.json
```

#### Binary format

```bash
supersmall export --output project.index --format binary
```

#### Compressed export

```bash
supersmall export --output project-index.json.gz --compress
```

#### Export with path

```bash
supersmall export -o ~/backups/project-$(date +%Y%m%d).index
```
### File Sizes
Typical file sizes for a medium project (10k files):
- JSON: ~5-10 MB
- JSON compressed: ~1-2 MB
- Binary: ~3-5 MB
- Binary compressed: ~0.5-1 MB
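To see how your own exports compare, check the file size after exporting, for example with `du`:

```bash
# Report the on-disk size of each exported index file
du -h project-index.json*
```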
### Output

```
📤 Exporting index...
✅ Index exported to: project-index.json
📊 Size: 4.2 MB
```
---
## import
Import repository index from a file.
### Synopsis

```bash
supersmall import --input <file> [options]
```
### Description

The `import` command loads a previously exported index, useful for:
- Restoring indexes from backups
- Sharing pre-built indexes
- Speeding up CI/CD workflows
- Avoiding re-indexing large repositories
### Options

| Option | Required | Description | Default |
|--------|----------|-------------|---------|
| `--input`, `-i` | Yes | Input file path | |
| `--verify` | No | Verify index after import | `false` |
### Import Process
1. Loads index from file
2. Detects format automatically (JSON/binary)
3. Decompresses if needed
4. Validates index structure
5. Saves to `.supersmall/index.json`
6. Optionally verifies file integrity
### Examples

#### Basic import

```bash
supersmall import --input project-index.json
```

#### Import with verification

```bash
supersmall import --input shared-index.json --verify
```

#### Import compressed file

```bash
supersmall import -i project-index.json.gz
```
### Verification

The `--verify` option checks:

- All indexed files still exist
- File hashes match (no modifications)
- Repository structure unchanged

Verification output:

```
🔍 Verifying index...
✅ Index verification passed
```

Or, if changes are detected:

```
🔍 Verifying index...
⚠️ Index verification failed - some files may have changed
```
### Output

```
📥 Importing index from: project-index.json
✅ Index imported successfully!
📈 Stats: 156 files, 2,341 symbols
```
## Use Cases

### Team Collaboration

Share indexes with team members to avoid redundant indexing:

```bash
# Developer A: export after indexing
supersmall init
supersmall export -o team-index.json --compress

# Developer B: import instead of indexing
supersmall import -i team-index.json
```
### CI/CD Integration

Cache indexes to speed up builds:

```yaml
# GitHub Actions example
name: Build with SuperSmall
on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Cache SuperSmall index
        uses: actions/cache@v3
        with:
          path: .supersmall/index.json
          key: supersmall-${{ hashFiles('**/*.ts', '**/*.js') }}

      - name: Initialize or import index
        run: |
          if [ -f .supersmall/index.json ]; then
            echo "Using cached index"
          else
            supersmall init
            supersmall export -o index-backup.json
          fi

      - name: Generate context
        run: |
          supersmall query "${{ github.event.pull_request.title }}"
```
### Backup Strategy

Create regular backups of your index:

```bash
#!/bin/bash
# backup-index.sh

DATE=$(date +%Y%m%d-%H%M%S)
BACKUP_DIR="$HOME/.supersmall-backups"

mkdir -p "$BACKUP_DIR"
supersmall export -o "$BACKUP_DIR/index-$DATE.json" --compress

# Keep only the last 10 backups
ls -t "$BACKUP_DIR"/index-*.json | tail -n +11 | xargs rm -f
```
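To run the script on a schedule, you could register it with cron (the script path below is hypothetical; adjust it to wherever you keep the script):

```bash
# Append a nightly 02:00 run of the backup script to the current user's crontab
( crontab -l 2>/dev/null; echo "0 2 * * * $HOME/bin/backup-index.sh" ) | crontab -
```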
### Docker Integration

Include index in Docker images:

```dockerfile
# Dockerfile
FROM swift:5.10

# Copy source code
COPY . /app
WORKDIR /app

# Copy pre-built index
COPY .supersmall/index.json /app/.supersmall/

# Install SuperSmall
RUN swift build -c release && \
    cp .build/release/supersmall /usr/local/bin/

# Verify index
RUN supersmall import -i /app/.supersmall/index.json --verify
```
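Building the image then bundles the pre-built index alongside the binary (the image tag below is a placeholder):

```bash
# Build the image with the pre-built index included
docker build -t my-service .
```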
### Multi-Repository Management

Manage indexes for multiple repositories:

```bash
#!/bin/bash
# export-all-indexes.sh

REPOS=("auth-service" "api-gateway" "frontend-app")
# Use an absolute path so exports land in one place after cd'ing into each repo
EXPORT_DIR="$(pwd)/indexes"

mkdir -p "$EXPORT_DIR"

for repo in "${REPOS[@]}"; do
  echo "Exporting $repo..."
  cd "/path/to/$repo"
  supersmall export -o "$EXPORT_DIR/$repo.index" --compress
done
```
## Best Practices

### Exporting
1. Use compression for network transfers
2. Include timestamps in filenames
3. Use binary format for production
4. Keep JSON format for debugging
### Importing
1. Always verify after importing shared indexes
2. Check index age before importing
3. Update index after import if needed
4. Handle import failures gracefully
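A minimal sketch of graceful failure handling (this assumes `supersmall import` exits non-zero when the import fails; adjust if the CLI signals errors differently):

```bash
# Fall back to a fresh indexing run if the shared index cannot be imported
if ! supersmall import -i team-index.json --verify; then
  echo "Import failed or index is stale; re-indexing from scratch"
  supersmall init
fi
```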
### Storage
1. Store indexes in version control for small projects
2. Use artifact storage for large indexes
3. Implement retention policies
4. Document index creation parameters
### Security
1. Don't include sensitive data in indexes
2. Verify index sources before importing
3. Use checksums for integrity (see the example below)
4. Limit index access appropriately
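Standard checksum tools cover the integrity check; for example, publishing a SHA-256 digest alongside a compressed export (file names follow the earlier compressed-export example):

```bash
# Producer: record a checksum next to the exported index
sha256sum project-index.json.gz > project-index.json.gz.sha256

# Consumer: verify the file before importing it
sha256sum -c project-index.json.gz.sha256 && supersmall import -i project-index.json.gz
```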
## Format Specifications

### JSON Format Structure

```json
{
  "version": "1.0",
  "repository": "/path/to/repo",
  "indexedAt": "2023-10-15T14:30:00Z",
  "languages": ["typescript", "javascript"],
  "files": [...],
  "symbols": [...],
  "dependencies": [...]
}
```
### Compression Details

- Algorithm: zlib
- Compression level: Default (6)
- File extension: `.gz` recommended
- Automatic detection on import