Axellero.io

File Manipulation

Perform file manipulation operations within the sandbox environment, including copying, moving, renaming, and batch transformations, with safety controls and validation.

🔧 File Operation Capabilities

File manipulation includes safe copying, moving, renaming operations with collision handling, permission preservation, and comprehensive validation to ensure data integrity during file operations.

Overview

The File Manipulation tool supports copy, move, and rename operations within the sandbox environment, with advanced features such as batch processing, collision handling, and operation validation.

Key Features

  • File Operations - Copy, move, rename files and directories safely
  • Batch Processing - Perform operations on multiple files simultaneously
  • Collision Handling - Smart handling of naming conflicts and overwrites
  • Validation - Pre-operation validation and post-operation verification
  • Operation Logging - Comprehensive logging of all file operations

Methods

fileManipulation

Perform file manipulation operations in the sandbox environment.

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| operation | String | Yes | Operation type: 'copy', 'move', 'rename', or 'batch' |
| sourcePath | String | Yes | Source file or directory path |
| destinationPath | String | Yes | Destination file or directory path |
| options | Object | No | Operation-specific options |
| preserveMetadata | Boolean | No | Preserve file metadata (default: true) |
| overwrite | Boolean | No | Allow overwriting existing files (default: false) |
| createDirectories | Boolean | No | Create destination directories as needed (default: true) |
| validation | Boolean | No | Validate the operation before execution (default: true) |

Example request:
{
  "operation": "copy",
  "sourcePath": "/sandbox/data/source.csv",
  "destinationPath": "/sandbox/backup/source_backup.csv",
  "options": {
    "preserveTimestamps": true,
    "verifyIntegrity": true
  },
  "preserveMetadata": true,
  "createDirectories": true
}

Output:

  • success (Boolean) - Operation success status
  • operation (String) - Type of operation performed
  • sourcePath (String) - Source file/directory path
  • destinationPath (String) - Destination file/directory path
  • filesProcessed (Number) - Number of files affected by operation
  • bytesTransferred (Number) - Total bytes copied/moved
  • operationTime (Number) - Operation duration in milliseconds
  • validation (Object) - Pre and post-operation validation results
  • conflicts (Array) - List of naming conflicts encountered
  • skippedFiles (Array) - List of files skipped during operation
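
An illustrative response for the copy request above; all field values are hypothetical:

```json
{
  "success": true,
  "operation": "copy",
  "sourcePath": "/sandbox/data/source.csv",
  "destinationPath": "/sandbox/backup/source_backup.csv",
  "filesProcessed": 1,
  "bytesTransferred": 524288,
  "operationTime": 42,
  "validation": { "preCheck": "passed", "postCheck": "passed" },
  "conflicts": [],
  "skippedFiles": []
}
```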

File Copy Operations

Single File Copy
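
A minimal sketch for this section: it builds a single-file copy request using the parameters from the Methods table and leaves the actual fileManipulation call commented out. build_copy_request is a hypothetical convenience helper, not part of the API:

```python
def build_copy_request(source_path, destination_path, overwrite=False):
    """Build a single-file copy request for fileManipulation.

    Hypothetical helper; the field names follow the parameter
    table in the Methods section.
    """
    return {
        "operation": "copy",
        "sourcePath": source_path,
        "destinationPath": destination_path,
        "preserveMetadata": True,
        "overwrite": overwrite,
        "createDirectories": True,
        "validation": True,
    }

# Copy a single file, preserving metadata and refusing to overwrite:
request = build_copy_request(
    "/sandbox/data/report.csv",
    "/sandbox/backup/report.csv",
)
# result = fileManipulation(request)
# if result["success"]:
#     print(f"Copied {result['bytesTransferred']} bytes")
```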

File Move and Rename Operations

File Moving
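
A sketch for moving and renaming, under the same assumptions as the copy example; build_move_request is a hypothetical helper wrapping the parameters above:

```python
def build_move_request(source_path, destination_path, rename_only=False):
    """Build a fileManipulation request for a move or an in-place rename.

    Hypothetical helper; 'move' relocates a file, while 'rename'
    changes its name within the same directory.
    """
    return {
        "operation": "rename" if rename_only else "move",
        "sourcePath": source_path,
        "destinationPath": destination_path,
        "preserveMetadata": True,
        "createDirectories": True,
        "validation": True,
    }

# Move a file into an archive directory:
move_request = build_move_request(
    "/sandbox/data/old_report.csv",
    "/sandbox/archive/old_report.csv",
)

# Rename a file in place:
rename_request = build_move_request(
    "/sandbox/data/draft.csv",
    "/sandbox/data/final.csv",
    rename_only=True,
)
# result = fileManipulation(move_request)
```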

Batch File Operations

Advanced Batch Processing

def batch_file_operations(operations_list):
    """Perform multiple file operations in batch."""
    
    batch_results = {
        "total_operations": len(operations_list),
        "successful": [],
        "failed": [],
        "summary": {
            "copy": 0,
            "move": 0,
            "rename": 0,
            "total_bytes": 0
        }
    }
    
    for i, operation in enumerate(operations_list):
        print(f"🔧 Operation {i+1}/{len(operations_list)}: {operation['operation']}")
        
        try:
            result = fileManipulation(operation)
            
            if result['success']:
                batch_results["successful"].append({
                    "operation": operation,
                    "result": result
                })
                
                # Update the per-operation summary (unknown types count from zero)
                op_type = operation['operation']
                batch_results["summary"][op_type] = batch_results["summary"].get(op_type, 0) + 1
                batch_results["summary"]["total_bytes"] += result.get('bytesTransferred', 0)
                
                print(f"   ✅ {operation['operation']} completed")
            else:
                batch_results["failed"].append({
                    "operation": operation,
                    "error": result.get('error')
                })
                print(f"   ❌ {operation['operation']} failed: {result.get('error')}")
        
        except Exception as e:
            batch_results["failed"].append({
                "operation": operation,
                "error": str(e)
            })
            print(f"   💥 {operation['operation']} exception: {str(e)}")
    
    # Generate summary report (guard against an empty operations list)
    success_rate = len(batch_results["successful"]) / max(len(operations_list), 1) * 100
    total_mb = batch_results["summary"]["total_bytes"] / (1024 * 1024)
    
    print(f"\n📊 Batch operation summary:")
    print(f"   Success rate: {success_rate:.1f}% ({len(batch_results['successful'])}/{len(operations_list)})")
    print(f"   Total data transferred: {total_mb:.2f} MB")
    print(f"   Copy operations: {batch_results['summary']['copy']}")
    print(f"   Move operations: {batch_results['summary']['move']}")
    print(f"   Rename operations: {batch_results['summary']['rename']}")
    
    return batch_results

def create_backup_workflow(source_directories, backup_base):
    """Create comprehensive backup workflow with multiple operations."""
    
    import datetime
    import os
    
    timestamp = datetime.datetime.now().strftime('%Y%m%d_%H%M%S')
    operations = []
    
    for source_dir in source_directories:
        # Get directory name for backup
        dir_name = os.path.basename(source_dir.rstrip('/'))
        backup_dir = f"{backup_base}/{dir_name}_backup_{timestamp}"
        
        # Add copy operation for each directory
        operations.append({
            "operation": "copy",
            "sourcePath": source_dir,
            "destinationPath": backup_dir,
            "preserveMetadata": True,
            "createDirectories": True,
            "validation": True
        })
    
    # Execute backup operations
    backup_results = batch_file_operations(operations)
    
    # Create backup manifest
    manifest = {
        "backup_timestamp": timestamp,
        "source_directories": source_directories,
        "backup_base": backup_base,
        "operations_performed": len(operations),
        "successful_backups": len(backup_results["successful"]),
        "failed_backups": len(backup_results["failed"]),
        "total_data_mb": backup_results["summary"]["total_bytes"] / (1024*1024)
    }
    
    # Write manifest file
    manifest_path = f"{backup_base}/backup_manifest_{timestamp}.json"
    writeFile({
        "filePath": manifest_path,
        "content": manifest,
        "format": "json",
        "createDirectories": True
    })
    
    return {
        "backup_results": backup_results,
        "manifest": manifest,
        "manifest_path": manifest_path
    }

# Usage
backup_dirs = [
    "/sandbox/projects/important",
    "/sandbox/data/analysis",
    "/sandbox/config"
]

backup_workflow = create_backup_workflow(backup_dirs, "/sandbox/backups")
print(f"Backup completed: {backup_workflow['manifest']['successful_backups']} of {backup_workflow['manifest']['operations_performed']} directories")

Error Handling and Recovery

Common Manipulation Issues

| Error Type | Cause | Resolution |
| --- | --- | --- |
| File Locked | File in use by another process | Wait for release or force the operation |
| Permission Denied | Insufficient operation permissions | Check file/directory permissions |
| Destination Exists | Target path already occupied | Use the overwrite option or rename |
| Disk Space Full | Insufficient storage for the operation | Free up space or use a different location |
| Path Too Long | File path exceeds system limits | Use shorter paths or restructure |
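
The retry logic below branches on substrings of the error message; the table's mapping can be sketched as a small classifier. This is a hypothetical helper, and substring matching is an assumption; the real API may return structured error codes instead:

```python
def classify_operation_error(error_message):
    """Map an error message to a suggested resolution, per the table above.

    First matching substring wins; unknown errors fall through to a
    generic suggestion.
    """
    rules = [
        ("locked", "Wait for the file to be released, or force the operation"),
        ("permission denied", "Check file/directory permissions"),
        ("exists", "Use the overwrite option or choose another name"),
        ("disk space", "Free up space or use a different location"),
        ("path too long", "Use shorter paths or restructure the tree"),
    ]
    message = error_message.lower()
    for needle, resolution in rules:
        if needle in message:
            return resolution
    return "Inspect the operation log for details"

print(classify_operation_error("Permission denied: /sandbox/config"))
```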

Robust Operation Handling

def robust_file_operation_with_retry(operation_spec, max_retries=3):
    """Perform file operation with retry logic and error recovery."""
    
    import time
    
    for attempt in range(max_retries):
        try:
            result = fileManipulation(operation_spec)
            
            if result['success']:
                print(f"✅ Operation successful on attempt {attempt + 1}")
                return result
            else:
                error_msg = result.get('error', 'Unknown error')
                print(f"Attempt {attempt + 1} failed: {error_msg}")
                
                # Handle specific error types
                if 'permission denied' in error_msg.lower():
                    print("   🔧 Attempting permission fix...")
                    # Could attempt to fix permissions here
                
                elif 'file exists' in error_msg.lower() and not operation_spec.get('overwrite'):
                    print("   🔧 Attempting conflict resolution...")
                    # Modify operation to handle conflict
                    operation_spec['overwrite'] = True
                
                elif 'disk space' in error_msg.lower():
                    print("   💾 Disk space issue detected - aborting retries")
                    break
                
                if attempt < max_retries - 1:
                    wait_time = 2 ** attempt
                    print(f"   ⏳ Waiting {wait_time} seconds before retry...")
                    time.sleep(wait_time)
        
        except Exception as e:
            print(f"Attempt {attempt + 1} exception: {str(e)}")
            if attempt < max_retries - 1:
                time.sleep(2 ** attempt)
    
    return {
        "success": False,
        "error": f"Operation failed after {max_retries} attempts",
        "operation": operation_spec
    }

# Usage with retry logic
critical_operation = {
    "operation": "copy",
    "sourcePath": "/sandbox/critical/important_data.db",
    "destinationPath": "/sandbox/backup/important_data_backup.db",
    "preserveMetadata": True,
    "validation": True
}

robust_result = robust_file_operation_with_retry(critical_operation, max_retries=5)

Integration Patterns

With Other File Tools

# Integrate with file search and processing
import os

def process_and_move_files(search_criteria, processing_function, destination_base):
    """Search for files, process them, and move to organized structure."""
    
    # Search for files to process
    search_result = fileSearch(search_criteria)
    
    if not search_result['success']:
        return {"error": "File search failed"}
    
    processing_results = {
        "processed": [],
        "failed": [],
        "moved": []
    }
    
    for file_result in search_result['results']:
        file_path = file_result['filePath']
        
        try:
            # Process the file
            processed = processing_function(file_path)
            
            if processed['success']:
                processing_results["processed"].append(file_path)
                
                # Move processed file to appropriate location
                file_name = os.path.basename(file_path)
                dest_path = f"{destination_base}/processed/{file_name}"
                
                move_result = fileManipulation({
                    "operation": "move",
                    "sourcePath": file_path,
                    "destinationPath": dest_path,
                    "createDirectories": True
                })
                
                if move_result['success']:
                    processing_results["moved"].append({
                        "source": file_path,
                        "destination": dest_path
                    })
                else:
                    print(f"⚠️ Processed but failed to move: {file_path}")
            else:
                processing_results["failed"].append({
                    "file": file_path,
                    "error": processed.get('error')
                })
        
        except Exception as e:
            processing_results["failed"].append({
                "file": file_path,
                "error": str(e)
            })
    
    return processing_results

# Example processing function
def simple_csv_processor(csv_path):
    """Simple CSV processing example."""
    try:
        # Read and validate CSV
        content = readFile({
            "filePath": csv_path,
            "readMode": "csv"
        })
        
        if content['success']:
            # Add processing timestamp
            processed_content = content['content']
            processed_content['metadata'] = {
                "processed_at": datetime.datetime.now().isoformat(),
                "row_count": len(processed_content.get('rows', []))
            }
            
            # Write processed version
            writeFile({
                "filePath": csv_path.replace('.csv', '_processed.csv'),
                "content": processed_content,
                "format": "csv"
            })
            
            return {"success": True}
        else:
            return {"success": False, "error": "Cannot read CSV"}
    
    except Exception as e:
        return {"success": False, "error": str(e)}

# Usage
search_and_process_result = process_and_move_files(
    {
        "filePattern": "*.csv",
        "searchPath": "/sandbox/input",
        "recursive": True
    },
    simple_csv_processor,
    "/sandbox/workflow"
)

Next Steps: Combine with File Search to locate files for manipulation, or use List Files to identify manipulation targets.