S3-Compatible CLI
Command-line tool that allows you to manage your files and directories using familiar S3 commands, making it easy for users experienced with AWS S3 to work with Koneksi storage.
Complete How-To Guide
Features
S3-Compatible Commands: Use familiar commands like `ls`, `cp`, `mv`, `rm`, `mb`, and `rb`
Cross-Platform Support: Works on Linux, macOS, and Windows
Recursive Operations: Support for recursive directory uploads/downloads
Progress Indication: Visual feedback during file transfers
Retry Logic: Automatic retry for failed operations
Flexible Configuration: Support for config files and environment variables
How It Works
The CLI translates S3 concepts to Koneksi's storage model:
Buckets → Koneksi Directories
Objects → Koneksi Files
S3 URIs → Koneksi paths (e.g., `s3://bucket/key` → directory/file)
Installation
Prerequisites
Go 1.23 or later (for building from source)
Valid Koneksi API credentials (client ID and client secret)
Option 1: Download Pre-built Binary
Visit the releases page here.
Download the appropriate binary for your platform:
`koneksi-s3-linux-amd64` for Linux
`koneksi-s3-darwin-amd64` for macOS (Intel)
`koneksi-s3-darwin-arm64` for macOS (Apple Silicon)
`koneksi-s3-windows-amd64.exe` for Windows
Make the binary executable (Linux/macOS):
chmod +x koneksi-s3-*
Move to a location in your PATH:
sudo mv koneksi-s3-* /usr/local/bin/koneksi-s3
Option 2: Build from Source
Clone the repository:
git clone https://github.com/your-org/koneksi-s3-cli-standalone.git
cd koneksi-s3-cli-standalone
Build the binary:
make build
Install system-wide:
sudo make install
Option 3: Platform-Specific Builds
Build for specific platforms:
# Linux
make build-linux
# macOS (Intel)
make build-darwin-amd64
# macOS (Apple Silicon)
make build-darwin-arm64
# Windows
make build-windows
# All platforms
make build-all
Configuration
Configuration Methods
The CLI supports three configuration methods (in order of precedence):
Environment Variables (highest priority)
Configuration File
Command-line Flags (for specific options)
Setting Up Configuration
Method 1: Configuration File
Create a configuration file at `~/.koneksi-s3.yaml`:
# Required credentials
client_id: "your-client-id"
client_secret: "your-client-secret"
# Optional settings
directory_id: "default-directory-id" # Default directory for operations
timeout: 30 # Request timeout in seconds
retry_count: 3 # Number of retries for failed requests
retry_wait_time: 1 # Wait time between retries in seconds
retry_max_wait_time: 30 # Maximum wait time between retries
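Since this file holds credentials, it is worth creating it with owner-only permissions so other local users cannot read it. A minimal sketch (the credential values are placeholders):

```shell
# Sketch: write the config with owner-only permissions so other local
# users cannot read the credentials (values are placeholders).
CONF="$HOME/.koneksi-s3.yaml"
umask 077                        # files created below are owner-only
cat > "$CONF" << 'EOF'
client_id: "your-client-id"
client_secret: "your-client-secret"
EOF
chmod 600 "$CONF"                # belt and braces
```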
Method 2: Environment Variables
Set environment variables with the `KONEKSI_` prefix:
export KONEKSI_CLIENT_ID="your-client-id"
export KONEKSI_CLIENT_SECRET="your-client-secret"
export KONEKSI_DIRECTORY_ID="default-directory-id"
export KONEKSI_TIMEOUT=30
export KONEKSI_RETRY_COUNT=3
Method 3: Mixed Configuration
You can mix configuration methods. For example, store credentials in the config file but override the directory ID via environment:
# ~/.koneksi-s3.yaml contains client_id and client_secret
export KONEKSI_DIRECTORY_ID="project-specific-directory"
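To keep an override from leaking into the rest of the session, a shell variable prefix scopes it to a single command. A minimal sketch (here `sh -c` just prints the value and stands in for a real `koneksi-s3` call):

```shell
# Sketch: a one-shot override, visible only to the prefixed command.
# (sh -c stands in here for the actual koneksi-s3 invocation.)
KONEKSI_DIRECTORY_ID="project-specific-directory" \
    sh -c 'echo "directory in effect: $KONEKSI_DIRECTORY_ID"'

# The override does not persist in the surrounding shell:
echo "after: ${KONEKSI_DIRECTORY_ID:-unset}"
```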
Verifying Configuration
Test your configuration by listing buckets:
koneksi-s3 ls
If configured correctly, you should see a list of your directories.
Basic Usage
Understanding S3 URI Format
The CLI uses S3-style URIs to reference remote locations:
`s3://bucket-name/` - References a bucket (directory)
`s3://bucket-name/path/to/object` - References an object (file)
Essential Commands
List Buckets (Directories)
# List all buckets
koneksi-s3 ls
# Example output:
# 2024-01-15 10:30:00 my-documents
# 2024-01-15 11:45:00 project-files
# 2024-01-15 12:00:00 backups
List Objects in a Bucket
# List objects in a bucket
koneksi-s3 ls s3://my-documents/
# List objects with a prefix
koneksi-s3 ls s3://my-documents/reports/
# Example output:
# 2024-01-15 10:30:00 1024 file1.txt
# 2024-01-15 11:00:00 2048 file2.pdf
# 2024-01-15 11:30:00 512 reports/monthly.xlsx
Upload Files
# Upload a single file
koneksi-s3 cp local-file.txt s3://my-documents/remote-file.txt
# Upload with progress indication
koneksi-s3 cp large-file.zip s3://backups/
# Upload to a specific path
koneksi-s3 cp report.pdf s3://my-documents/reports/2024/january.pdf
Download Files
# Download a file
koneksi-s3 cp s3://my-documents/remote-file.txt local-file.txt
# Download to current directory (keep original filename)
koneksi-s3 cp s3://my-documents/report.pdf .
# Download with a new name
koneksi-s3 cp s3://my-documents/old-name.txt new-name.txt
Create and Remove Buckets
# Create a new bucket
koneksi-s3 mb s3://new-bucket/
# Remove an empty bucket
koneksi-s3 rb s3://old-bucket/
# Force remove a non-empty bucket (use with caution!)
koneksi-s3 rb --force s3://old-bucket/
Command Reference
`ls` - List Command
List buckets or objects in a bucket.
Syntax:
koneksi-s3 ls [s3://bucket[/prefix]]
Examples:
# List all buckets
koneksi-s3 ls
# List root level of a bucket
koneksi-s3 ls s3://my-bucket/
# List with prefix
koneksi-s3 ls s3://my-bucket/documents/2024/
# Recursive listing (if supported)
koneksi-s3 ls -r s3://my-bucket/
`cp` - Copy Command
Copy files between the local filesystem and S3, or between S3 locations.
Syntax:
koneksi-s3 cp [options] <source> <destination>
Options:
`-r, --recursive`: Copy directories recursively
`--dry-run`: Show what would be copied without actually copying
Examples:
# Upload a file
koneksi-s3 cp file.txt s3://bucket/file.txt
# Download a file
koneksi-s3 cp s3://bucket/file.txt file.txt
# Copy between S3 locations
koneksi-s3 cp s3://bucket1/file.txt s3://bucket2/file.txt
# Recursive upload
koneksi-s3 cp -r local-folder/ s3://bucket/remote-folder/
# Recursive download
koneksi-s3 cp -r s3://bucket/remote-folder/ local-folder/
# Dry run to preview operations
koneksi-s3 cp -r --dry-run local-folder/ s3://bucket/
`mv` - Move Command
Move or rename files between the local filesystem and S3, or within S3.
Syntax:
koneksi-s3 mv <source> <destination>
Examples:
# Move local file to S3
koneksi-s3 mv file.txt s3://bucket/file.txt
# Move from S3 to local
koneksi-s3 mv s3://bucket/file.txt file.txt
# Rename within S3
koneksi-s3 mv s3://bucket/old-name.txt s3://bucket/new-name.txt
# Move between buckets
koneksi-s3 mv s3://bucket1/file.txt s3://bucket2/file.txt
`rm` - Remove Command
Remove objects from S3.
Syntax:
koneksi-s3 rm [options] s3://bucket/key
Options:
`-r, --recursive`: Remove objects recursively
`--dry-run`: Show what would be removed without actually removing
Examples:
# Remove a single object
koneksi-s3 rm s3://bucket/file.txt
# Remove with confirmation
koneksi-s3 rm s3://bucket/important-file.txt
# Remove recursively
koneksi-s3 rm -r s3://bucket/folder/
# Dry run
koneksi-s3 rm -r --dry-run s3://bucket/folder/
`mb` - Make Bucket Command
Create a new bucket (directory).
Syntax:
koneksi-s3 mb s3://bucket-name/
Examples:
# Create a bucket
koneksi-s3 mb s3://my-new-bucket/
# Create with specific naming
koneksi-s3 mb s3://project-2024-data/
`rb` - Remove Bucket Command
Remove an empty bucket.
Syntax:
koneksi-s3 rb [options] s3://bucket-name/
Options:
`--force`: Remove a non-empty bucket
Examples:
# Remove empty bucket
koneksi-s3 rb s3://old-bucket/
# Force remove non-empty bucket
koneksi-s3 rb --force s3://old-bucket/
Advanced Usage
Working with Large Files
For large file uploads, the CLI automatically handles:
Progress indication during transfer
Automatic retry on failure
Efficient streaming to minimize memory usage
# Upload large file with progress
koneksi-s3 cp large-dataset.zip s3://backups/datasets/
# The CLI will show:
# Uploading large-dataset.zip to s3://backups/datasets/
# Progress: [##########----------] 50% (512MB/1GB)
Batch Operations
Uploading Multiple Files
# Upload all .txt files
for file in *.txt; do
koneksi-s3 cp "$file" "s3://bucket/texts/$file"
done
# Upload directory structure
koneksi-s3 cp -r ./project/ s3://bucket/projects/2024/
Downloading Multiple Files
# Download all files with prefix
koneksi-s3 cp -r s3://bucket/reports/2024/ ./local-reports/
# Download specific pattern (using shell)
for month in jan feb mar; do
koneksi-s3 cp "s3://bucket/reports/2024/$month-report.pdf" .
done
Scripting and Automation
Backup Script Example
#!/bin/bash
# backup-to-koneksi.sh
BACKUP_DATE=$(date +%Y%m%d)
BACKUP_BUCKET="s3://backups"
# Create daily backup bucket
koneksi-s3 mb "$BACKUP_BUCKET/daily-$BACKUP_DATE/"
# Upload important directories
for dir in documents projects config; do
echo "Backing up $dir..."
koneksi-s3 cp -r "$HOME/$dir/" "$BACKUP_BUCKET/daily-$BACKUP_DATE/$dir/"
done
echo "Backup completed for $BACKUP_DATE"
Sync Script Example
#!/bin/bash
# sync-projects.sh
# Upload new or modified files
find ./projects -type f -mtime -1 | while IFS= read -r file; do
s3_path="s3://project-bucket/${file#./}"
echo "Uploading $file to $s3_path"
koneksi-s3 cp "$file" "$s3_path"
done
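A marker file makes the sync incremental across runs instead of relying on a fixed `-mtime` window. A sketch using only standard `find`/`touch` (pipe the resulting file list into `koneksi-s3 cp` as in the loop above):

```shell
# Sketch: incremental sync using a marker file (plain find/touch only;
# feed the printed paths into "koneksi-s3 cp" as in the loop above).
mkdir -p ./projects              # ensure the tree exists for the demo
MARKER=.last-sync

# First run: backdate the marker so every existing file qualifies
[ -f "$MARKER" ] || touch -t 200001010000 "$MARKER"

# Files changed since the previous run
find ./projects -type f -newer "$MARKER" -print

# Record this run's time for the next invocation
touch "$MARKER"
```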
Working with Different Directory IDs
Override the default directory ID for specific operations:
# Set directory ID for session
export KONEKSI_DIRECTORY_ID="project-specific-id"
# Upload to specific directory
koneksi-s3 cp file.txt s3://bucket/file.txt
# Reset to default
unset KONEKSI_DIRECTORY_ID
Performance Optimization
Parallel Uploads
# Upload files in parallel using GNU parallel ({/} expands to the file's basename)
find . -name "*.jpg" | parallel -j 4 koneksi-s3 cp {} s3://images/{/}
# Or using xargs
find . -name "*.pdf" | xargs -P 4 -I {} koneksi-s3 cp {} s3://documents/{}
Efficient Recursive Operations
# Use recursive flag for better performance
koneksi-s3 cp -r large-directory/ s3://bucket/
# Instead of individual file uploads
# DON'T DO THIS - It's slower:
find large-directory -type f -exec koneksi-s3 cp {} s3://bucket/{} \;
Troubleshooting
Common Issues and Solutions
Authentication Errors
Problem: "Authentication failed" or "Invalid credentials"
Solutions:
Verify credentials are correctly set:
# Check environment variables
echo $KONEKSI_CLIENT_ID
echo $KONEKSI_CLIENT_SECRET

# Check config file
cat ~/.koneksi-s3.yaml
Ensure there are no extra spaces in the credentials
Verify the credentials are active in the Koneksi system
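The checks above can be scripted. A minimal sketch (`check_cred` is a hypothetical helper, not part of the CLI) that flags unset values and stray whitespace, a common copy-paste cause of "Invalid credentials" errors:

```shell
# Sketch: flag credentials that are unset or contain stray whitespace.
# check_cred is a hypothetical helper, not part of koneksi-s3.
check_cred() {
    name=$1
    val=$2
    if [ -z "$val" ]; then
        echo "$name: not set"
        return 1
    fi
    case $val in
        *[[:space:]]*)
            echo "$name: contains whitespace"
            return 1
            ;;
    esac
    echo "$name: ok"
}

check_cred KONEKSI_CLIENT_ID "${KONEKSI_CLIENT_ID:-}" || true
check_cred KONEKSI_CLIENT_SECRET "${KONEKSI_CLIENT_SECRET:-}" || true
```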
Connection Timeouts
Problem: "Request timeout" errors
Solutions:
Increase timeout setting:
export KONEKSI_TIMEOUT=60 # 60 seconds
Check network connectivity:
# Test API endpoint
curl -I https://api.koneksi.com/health
Use retry configuration for unstable connections
File Upload Failures
Problem: Large file uploads fail
Solutions:
Increase timeout for large files:
export KONEKSI_TIMEOUT=300 # 5 minutes
Check available disk space for temporary files
Verify file permissions
Directory/Bucket Not Found
Problem: "Bucket not found" errors
Solutions:
List available buckets:
koneksi-s3 ls
Check bucket name spelling
Verify directory ID is correct:
echo $KONEKSI_DIRECTORY_ID
Debug Mode
Enable verbose output for troubleshooting:
# Set debug environment variable (if supported)
export KONEKSI_DEBUG=true
# Run command with verbose output
koneksi-s3 -v cp file.txt s3://bucket/
Getting Help
Built-in Help:
# General help
koneksi-s3 --help

# Command-specific help
koneksi-s3 cp --help
koneksi-s3 ls --help
Version Information:
koneksi-s3 --version
Quick Reference
Essential Commands Cheatsheet
# List all buckets
koneksi-s3 ls
# List bucket contents
koneksi-s3 ls s3://bucket/
# Upload file
koneksi-s3 cp file.txt s3://bucket/file.txt
# Download file
koneksi-s3 cp s3://bucket/file.txt file.txt
# Upload directory
koneksi-s3 cp -r folder/ s3://bucket/folder/
# Move/rename file
koneksi-s3 mv s3://bucket/old.txt s3://bucket/new.txt
# Delete file
koneksi-s3 rm s3://bucket/file.txt
# Create bucket
koneksi-s3 mb s3://new-bucket/
# Remove bucket
koneksi-s3 rb s3://bucket/
Configuration Quick Setup
# Create config file
cat > ~/.koneksi-s3.yaml << EOF
client_id: "your-client-id"
client_secret: "your-client-secret"
directory_id: "your-directory-id"
timeout: 30
retry_count: 3
EOF
# Or use environment variables
export KONEKSI_CLIENT_ID="your-client-id"
export KONEKSI_CLIENT_SECRET="your-client-secret"
Common Patterns
# Backup pattern
koneksi-s3 cp -r /important/data/ s3://backups/$(date +%Y%m%d)/
# Archive pattern
tar czf - /path/to/files | koneksi-s3 cp - s3://archives/backup.tar.gz
# Batch upload pattern
find . -name "*.log" -mtime -1 | while IFS= read -r f; do
koneksi-s3 cp "$f" "s3://logs/$(basename "$f")"
done
# Download and extract pattern
koneksi-s3 cp s3://archives/data.tar.gz - | tar xzf -
Tips and Best Practices
Use Recursive Flags: For directory operations, always use the `-r` flag for better performance
Organize with Prefixes: Use S3-style prefixes to organize files (e.g., `s3://bucket/year/month/day/`)
Test with Dry Run: Use the `--dry-run` flag to preview operations before executing
Script Repetitive Tasks: Create shell scripts for common operations
Monitor Transfers: For large transfers, run in screen/tmux session
Regular Backups: Schedule regular backups using cron jobs
Secure Credentials: Never commit credentials to version control
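The "Regular Backups" tip can be scheduled with cron. A sketch of a crontab entry (the script path and log location are assumptions; `backup-to-koneksi.sh` is the example script from Scripting and Automation):

```
# m  h  dom mon dow  command -- run the backup script every night at 02:30
30 2 * * * /usr/local/bin/backup-to-koneksi.sh >> /var/log/koneksi-backup.log 2>&1
```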
Additional Resources
GitHub Repository: https://github.com/koneksi-tech/s3-cli
Issue Tracker: https://github.com/koneksi-tech/s3-cli/issues