
Delete File

Securely delete files and directories within the sandbox environment with comprehensive safety checks, backup options, and recovery mechanisms to prevent accidental data loss.

🗑️ Safe Deletion Practices

File deletion operations include safety validation, backup creation, and recovery options. Deleted files can be recovered from their backup archive if backup creation was enabled before deletion.

Overview

The Delete File tool provides secure file and directory deletion capabilities within the sandbox environment, featuring safety checks, selective deletion, backup creation, and comprehensive validation to prevent accidental data loss.

Key Features

  • Safety Validation - Multiple confirmation checks before deletion
  • Backup Creation - Automatic backup before deletion with recovery options
  • Selective Deletion - Delete specific files, patterns, or directory contents
  • Recovery Mechanism - Restore deleted files from backup archives
  • Audit Logging - Comprehensive logging of all deletion operations

Methods

deleteFile

Safely delete files and directories in the sandbox environment.

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| filePath | String | Yes | Path to the file or directory to delete |
| recursive | Boolean | No | Delete directories recursively (default: false) |
| createBackup | Boolean | No | Create a backup before deletion (default: true) |
| confirmDeletion | Boolean | No | Require explicit confirmation (default: true) |
| pattern | String | No | Delete only files matching this pattern (wildcards supported) |
| excludePattern | String | No | Exclude files matching this pattern from deletion |
| maxSize | Number | No | Maximum total size to delete, in bytes |
| dryRun | Boolean | No | Preview the deletion without executing it (default: false) |
Example request:

{
  "filePath": "/sandbox/temp",
  "recursive": true,
  "createBackup": true,
  "pattern": "*.tmp",
  "excludePattern": "important_*.tmp"
}

Output:

  • success (Boolean) - Deletion operation success status
  • deletedItems (Array) - List of successfully deleted items
    • path (String) - Path of deleted item
    • type (String) - Item type: 'file' or 'directory'
    • size (Number) - Size of deleted item in bytes
    • deletedAt (String) - Deletion timestamp
  • skippedItems (Array) - List of skipped items (excluded patterns)
  • failedItems (Array) - List of items that failed to delete
  • backupInfo (Object) - Backup information (if backup was created)
    • backupPath (String) - Path to backup archive
    • backupSize (Number) - Backup archive size
    • itemCount (Number) - Number of items in backup
  • totalDeleted (Number) - Total number of items deleted
  • totalSize (Number) - Total size of deleted data in bytes

Single File Deletion

Basic File Removal
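
A minimal sketch of a basic single-file removal, assuming the deleteFile call documented above; the example path is hypothetical, and the result fields follow the output schema listed in this section.

# Minimal single-file removal; result fields follow the documented output schema
delete_result = deleteFile({
    "filePath": "/sandbox/temp/old_report.txt",  # hypothetical example path
    "createBackup": True,      # keep a backup so the file can be recovered
    "confirmDeletion": True    # require explicit confirmation (the default)
})

if delete_result['success']:
    for item in delete_result['deletedItems']:
        print(f"Deleted {item['type']} {item['path']} ({item['size']} bytes) at {item['deletedAt']}")
    if 'backupInfo' in delete_result:
        print(f"Backup archive: {delete_result['backupInfo']['backupPath']}")
else:
    print("Deletion failed:", delete_result.get('failedItems', []))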

Bulk Deletion Operations

Pattern-Based Deletion
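
A sketch of a pattern-based bulk deletion combining the pattern and excludePattern parameters documented above; the directory and patterns are illustrative assumptions.

delete_result = deleteFile({
    "filePath": "/sandbox/cache",      # hypothetical cache directory
    "recursive": True,
    "pattern": "*.cache",              # delete only files matching this pattern
    "excludePattern": "session_*",     # keep anything matching the exclude pattern
    "createBackup": False              # cache files can be regenerated
})

if delete_result['success']:
    print(f"Deleted {delete_result['totalDeleted']} items, freed {delete_result['totalSize']} bytes")
    print(f"Skipped {len(delete_result['skippedItems'])} excluded items")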

Safe Deletion Workflows
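
A cautious preview-then-delete workflow built on the dryRun parameter documented above, assuming the dry-run response reports the same totals a real deletion would; the 50 MB safety cap is an illustrative assumption.

# Step 1: preview what would be deleted, without executing anything
preview = deleteFile({
    "filePath": "/sandbox/working",
    "recursive": True,
    "pattern": "*.tmp",
    "dryRun": True
})

# Step 2: execute only if the preview stays under an illustrative 50 MB cap
if preview['success'] and preview['totalSize'] < 50 * 1024 * 1024:
    result = deleteFile({
        "filePath": "/sandbox/working",
        "recursive": True,
        "pattern": "*.tmp",
        "createBackup": True
    })
    print(f"Deleted {result['totalDeleted']} items")
else:
    print("Preview failed or exceeded the size cap; skipping deletion")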

Backup and Recovery
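
No dedicated restore API is documented on this page, so the sketch below only captures the backupInfo fields returned by deleteFile and persists them to a recovery manifest (at a hypothetical location) using the writeFile call seen elsewhere in this document.

import json

delete_result = deleteFile({
    "filePath": "/sandbox/reports/draft.md",  # hypothetical example path
    "createBackup": True
})

if delete_result['success'] and 'backupInfo' in delete_result:
    backup = delete_result['backupInfo']
    # Persist the backup details so the archive can be located for recovery later
    writeFile({
        "filePath": "/sandbox/trash/recovery_manifest.json",  # hypothetical location
        "content": json.dumps({
            "deleted": [item['path'] for item in delete_result['deletedItems']],
            "backupPath": backup['backupPath'],
            "backupSize": backup['backupSize'],
            "itemCount": backup['itemCount']
        }, indent=2),
        "createDirectories": True
    })
    print("Backup archive recorded:", backup['backupPath'])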

Specialized Deletion Operations

Log File Management

def cleanup_log_files(log_directory, retention_days=30, max_size_mb=100):
    """Clean up log files based on age and size criteria."""
    
    import datetime
    
    # Get all log files
    log_files = listFiles({
        "path": log_directory,
        "recursive": True,
        "pattern": "*.log",
        "includeMetadata": True
    })
    
    if not log_files['success']:
        return {"error": "Cannot access log directory"}
    
    files_to_delete = []
    total_size_saved = 0
    # Use an aware UTC timestamp so it can be compared with the timezone-aware mod_time below
    cutoff_date = datetime.datetime.now(datetime.timezone.utc) - datetime.timedelta(days=retention_days)
    
    for log_file in log_files['items']:
        if log_file['type'] == 'file':
            # Check age
            mod_time = datetime.datetime.fromisoformat(log_file['modified'].replace('Z', '+00:00'))
            
            # Check size (convert MB to bytes)
            size_mb = log_file['size'] / (1024 * 1024)
            
            should_delete = False
            reason = ""
            
            if mod_time < cutoff_date:
                should_delete = True
                reason = f"older than {retention_days} days"
            elif size_mb > max_size_mb:
                should_delete = True
                reason = f"larger than {max_size_mb}MB"
            
            if should_delete:
                files_to_delete.append({
                    "path": log_file['path'],
                    "size": log_file['size'],
                    "reason": reason
                })
                total_size_saved += log_file['size']
    
    # Delete qualifying log files
    deletion_results = []
    for file_info in files_to_delete:
        delete_result = deleteFile({
            "filePath": file_info['path'],
            "createBackup": True  # Keep backup for log files
        })
        
        deletion_results.append({
            "path": file_info['path'],
            "success": delete_result['success'],
            "reason": file_info['reason'],
            "size": file_info['size']
        })
    
    successful_deletions = [r for r in deletion_results if r['success']]
    
    return {
        "analyzed_files": len(log_files['items']),
        "files_deleted": len(successful_deletions),
        "size_freed_mb": sum(r['size'] for r in successful_deletions) / (1024 * 1024),
        "retention_days": retention_days,
        "max_size_mb": max_size_mb,
        "deleted_files": successful_deletions
    }

# Usage
log_cleanup_result = cleanup_log_files("/sandbox/logs", retention_days=14, max_size_mb=50)
print(f"Cleaned up {log_cleanup_result['files_deleted']} log files")
print(f"Freed {log_cleanup_result['size_freed_mb']:.2f} MB of storage")

Error Handling and Recovery

Common Deletion Issues

| Error Type | Cause | Resolution |
| --- | --- | --- |
| Permission Denied | Insufficient delete permissions | Check file/directory permissions |
| File In Use | File locked by a running process | Stop the process or wait for the lock to release |
| Directory Not Empty | Non-recursive deletion of a non-empty directory | Use recursive=true or delete the contents first |
| Backup Failed | Cannot create a backup before deletion | Check backup location permissions |
| Path Not Found | File/directory doesn't exist | Verify the path and check that the file exists |

Robust Deletion with Error Recovery

def resilient_delete_operation(file_paths, retry_attempts=3):
    """Perform deletion with comprehensive error handling and retry logic."""
    
    import os
    import time
    
    results = {
        "successful": [],
        "failed": [],
        "retried": [],
        "permanently_failed": []
    }
    
    for file_path in file_paths:
        success = False
        attempt = 0
        last_error = None
        
        while attempt < retry_attempts and not success:
            try:
                delete_result = deleteFile({
                    "filePath": file_path,
                    "createBackup": True,
                    "confirmDeletion": False
                })
                
                if delete_result['success']:
                    results["successful"].append({
                        "path": file_path,
                        "attempt": attempt + 1,
                        "backup": delete_result.get('backupInfo', {}).get('backupPath')
                    })
                    success = True
                else:
                    last_error = delete_result.get('error', 'Unknown error')
                    attempt += 1
                    if attempt < retry_attempts:
                        print(f"Retry {attempt} for {file_path}: {last_error}")
                        time.sleep(2 ** attempt)  # Exponential backoff
                        results["retried"].append({
                            "path": file_path,
                            "attempt": attempt,
                            "error": last_error
                        })
                
            except Exception as e:
                last_error = str(e)
                attempt += 1
                if attempt < retry_attempts:
                    time.sleep(2 ** attempt)
        
        if not success:
            results["failed"].append({
                "path": file_path,
                "final_error": last_error,
                "attempts": retry_attempts
            })
            
            # Fall back to preserving the data: copy the file contents into a
            # trash area, since the original could not be deleted in place
            try:
                trash_path = f"/sandbox/trash/{os.path.basename(file_path)}"
                move_result = writeFile({
                    "filePath": trash_path,
                    "content": readFile({"filePath": file_path})['content'],
                    "createDirectories": True
                })
                
                if move_result['success']:
                    results["permanently_failed"].append({
                        "path": file_path,
                        "moved_to_trash": trash_path,
                        "original_error": last_error
                    })
            except Exception:
                results["permanently_failed"].append({
                    "path": file_path,
                    "error": last_error,
                    "recovery_failed": True
                })
    
    return results

# Usage
files_to_delete = [
    "/sandbox/temp/file1.txt",
    "/sandbox/cache/cache1.json",
    "/sandbox/working/temp_data.csv"
]

deletion_results = resilient_delete_operation(files_to_delete, retry_attempts=3)

print(f"✅ Successfully deleted: {len(deletion_results['successful'])}")
print(f"❌ Failed to delete: {len(deletion_results['failed'])}")
print(f"🔄 Required retries: {len(deletion_results['retried'])}")

Integration Patterns

With File Management Workflows

# Integrate deletion with data processing pipelines
def cleanup_after_processing(processing_results):
    """Clean up temporary files after data processing."""
    
    # Identify temporary files created during processing
    temp_files = processing_results.get('temp_files', [])
    intermediate_files = processing_results.get('intermediate_files', [])
    
    # Keep only final output files
    files_to_delete = temp_files + intermediate_files
    
    cleanup_results = {
        "deleted_temp": [],
        "deleted_intermediate": [],
        "kept_outputs": processing_results.get('output_files', []),
        "errors": []
    }
    
    for file_path in files_to_delete:
        delete_result = deleteFile({
            "filePath": file_path,
            "createBackup": False  # Temporary files don't need backup
        })
        
        if delete_result['success']:
            if file_path in temp_files:
                cleanup_results["deleted_temp"].append(file_path)
            else:
                cleanup_results["deleted_intermediate"].append(file_path)
        else:
            cleanup_results["errors"].append({
                "path": file_path,
                "error": delete_result.get('error')
            })
    
    return cleanup_results

# Usage in data processing pipeline
def data_processing_with_cleanup():
    """Complete data processing workflow with automatic cleanup."""
    
    processing_results = {
        "temp_files": ["/sandbox/temp/raw_data.csv", "/sandbox/temp/cleaned_data.csv"],
        "intermediate_files": ["/sandbox/working/processed_data.json"],
        "output_files": ["/sandbox/output/final_report.pdf"],
        "success": True
    }
    
    cleanup_result = None  # ensure cleanup_result is defined even if processing failed
    if processing_results['success']:
        cleanup_result = cleanup_after_processing(processing_results)
        print(f"Deleted {len(cleanup_result['deleted_temp'])} temp files")
        print(f"Deleted {len(cleanup_result['deleted_intermediate'])} intermediate files")
        print(f"Kept {len(cleanup_result['kept_outputs'])} output files")
    
    return processing_results, cleanup_result

# Execute processing with cleanup
processing_result, cleanup_result = data_processing_with_cleanup()

Next Steps: Use with List Files to identify deletion candidates, or File Metadata for deletion criteria analysis.