Write File

Create and write files within the sandbox environment with comprehensive format support, encoding options, and atomic write operations for data persistence and file generation.

✍️ File Writing Capabilities

File writing supports text, binary, and structured data formats with encoding control, atomic operations, and validation to ensure data integrity and safe file creation.

Overview

The Write File tool enables secure file creation and content writing within the sandbox environment, supporting multiple file formats, encoding options, and write modes for efficient data persistence and file generation.

Key Features

  • Multi-Format Support - Write text, binary, JSON, CSV, XML, and structured data
  • Encoding Control - Specify character encodings and format options
  • Atomic Operations - Safe file writing with rollback on failure
  • Append/Overwrite Modes - Flexible write modes for different use cases
  • Validation - Content validation and format checking before writing

Methods

writeFile

Write content to files in the sandbox environment.

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| filePath | String | Yes | Path where the file should be created or written |
| content | String/Object/Array | Yes | Content to write to the file |
| writeMode | String | No | Write mode: 'write', 'append', 'create' (default: 'write') |
| format | String | No | Content format: 'text', 'json', 'csv', 'xml', 'binary' (default: 'auto') |
| encoding | String | No | Character encoding (default: 'utf-8') |
| createDirectories | Boolean | No | Create parent directories if they don't exist (default: false) |
| backup | Boolean | No | Create a backup of the existing file (default: false) |
| validation | Object | No | Content validation options |

Example input:
{
  "filePath": "/sandbox/output/results.json",
  "content": {
    "analysis": "completed",
    "results": [1, 2, 3]
  },
  "format": "json",
  "createDirectories": true,
  "backup": true
}

Output:

  • success (Boolean) - Write operation success status
  • filePath (String) - Path to the written file
  • bytesWritten (Number) - Number of bytes written
  • fileInfo (Object) - Information about the written file
    • size (Number) - Final file size in bytes
    • created (String) - File creation timestamp
    • encoding (String) - Character encoding used
    • format (String) - Content format written
  • backupPath (String) - Path to backup file (if backup was created)
  • writeTime (Number) - Write operation duration in milliseconds
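
For a quick check of these fields, here is a minimal sketch reusing the request example above (the error field consulted on failure follows the retry example later on this page):

result = writeFile({
    "filePath": "/sandbox/output/results.json",
    "content": {"analysis": "completed", "results": [1, 2, 3]},
    "format": "json",
    "createDirectories": True
})

if result['success']:
    # bytesWritten, filePath, and writeTime are the output fields listed above
    print(f"Wrote {result['bytesWritten']} bytes to {result['filePath']} in {result['writeTime']} ms")
else:
    print(f"Write failed: {result.get('error', 'unknown error')}")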

Text File Writing

Plain Text Files
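
A minimal sketch of plain-text writing using the parameters documented above (the paths and content are illustrative):

# Write a plain text file with explicit encoding
text_result = writeFile({
    "filePath": "/sandbox/output/notes.txt",
    "content": "Analysis notes - run 1\n",
    "format": "text",
    "encoding": "utf-8"
})

# Append a line to the same file using writeMode 'append'
append_result = writeFile({
    "filePath": "/sandbox/output/notes.txt",
    "content": "Analysis notes - run 2\n",
    "writeMode": "append",
    "format": "text"
})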

Structured Data Writing

JSON Data Files
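
As in the request example above, a content object is passed directly and serialized when format is 'json'; a sketch with illustrative path and fields:

# Write a Python dict as a JSON file, keeping a backup of any existing file
json_result = writeFile({
    "filePath": "/sandbox/output/report.json",
    "content": {
        "report": "daily",
        "records": [{"id": 1, "ok": True}, {"id": 2, "ok": False}],
        "generated_by": "pipeline_001"
    },
    "format": "json",
    "backup": True
})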

Binary and Media Files

Binary File Writing

🔒 Binary File Security

Binary file writing is subject to size limits and format validation for security. Validate content before writing.

def write_binary_file(file_path, binary_data, max_size=10485760):  # 10MB limit
    """Write binary data to file with size validation."""
    
    if len(binary_data) > max_size:
        return {
            "success": False,
            "error": f"Binary data too large: {len(binary_data)} bytes (max: {max_size})"
        }
    
    result = writeFile({
        "filePath": file_path,
        "content": binary_data,
        "format": "binary",
        "createDirectories": True
    })
    
    return result
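
# Usage (raw_bytes is assumed to hold an already-validated binary payload)
# binary_result = write_binary_file("/sandbox/data/output.bin", raw_bytes)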

def convert_and_write_image(input_data, output_path, target_format="PNG"):
    """Convert and write image data."""
    
    try:
        from PIL import Image
        import io
        
        # Decode base64 input if necessary
        if isinstance(input_data, str):  # assume a base64-encoded string
            import base64
            image_bytes = base64.b64decode(input_data)
        else:
            image_bytes = input_data
        
        # Process with PIL
        image = Image.open(io.BytesIO(image_bytes))
        
        # Convert to target format
        output_buffer = io.BytesIO()
        image.save(output_buffer, format=target_format)
        
        # Write to file
        result = writeFile({
            "filePath": output_path,
            "content": output_buffer.getvalue(),
            "format": "binary"
        })
        
        return result
        
    except Exception as e:
        return {"success": False, "error": f"Image conversion failed: {str(e)}"}

# Usage
# image_result = convert_and_write_image(base64_image_data, "/sandbox/images/converted.png")

Advanced Writing Operations

Atomic File Writing

def atomic_write_file(file_path, content, **options):
    """Perform atomic file writing with rollback on failure."""
    
    import os
    
    # Create the temporary file alongside the target so the final rename stays on one filesystem
    temp_file_path = f"{file_path}.tmp.{os.getpid()}"
    
    try:
        # Write to temporary file first
        temp_result = writeFile({
            "filePath": temp_file_path,
            "content": content,
            **options
        })
        
        if not temp_result['success']:
            return temp_result
        
        # Verify written content
        verify_result = readFile({
            "filePath": temp_file_path,
            "readMode": options.get('format', 'auto')
        })
        
        if not verify_result['success']:
            # Cleanup and return error
            try:
                os.unlink(temp_file_path)
            except OSError:
                pass
            return {"success": False, "error": "Content verification failed"}
        
        # Move temp file to final location
        os.rename(temp_file_path, file_path)
        
        return {
            "success": True,
            "filePath": file_path,
            "bytesWritten": temp_result['bytesWritten'],
            "atomic": True
        }
        
    except Exception as e:
        # Cleanup temp file on error
        try:
            if os.path.exists(temp_file_path):
                os.unlink(temp_file_path)
        except OSError:
            pass
        
        return {"success": False, "error": f"Atomic write failed: {str(e)}"}

# Usage
atomic_result = atomic_write_file("/sandbox/critical/important_data.json", {"key": "value"}, format="json")
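
The final os.rename is what makes this atomic: on POSIX systems a rename replaces the destination in a single step when source and destination are on the same filesystem, which is why the temporary file is created next to the target, so readers never observe a half-written file.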

Batch File Writing

def write_multiple_files(file_operations):
    """Write multiple files in a single operation."""
    
    results = []
    successful_files = []
    
    for operation in file_operations:
        try:
            result = writeFile(operation)
            results.append({
                "file": operation['filePath'],
                "success": result['success'],
                "result": result
            })
            
            if result['success']:
                successful_files.append(operation['filePath'])
                
        except Exception as e:
            results.append({
                "file": operation['filePath'],
                "success": False,
                "error": str(e)
            })
    
    return {
        "total_operations": len(file_operations),
        "successful": len(successful_files),
        "failed": len(file_operations) - len(successful_files),
        "results": results,
        "successful_files": successful_files
    }

# Usage
batch_operations = [
    {
        "filePath": "/sandbox/output/file1.txt",
        "content": "Content for file 1",
        "format": "text"
    },
    {
        "filePath": "/sandbox/output/file2.json",
        "content": {"data": "value2"},
        "format": "json"
    },
    {
        "filePath": "/sandbox/output/file3.csv",
        "content": {"headers": ["A", "B"], "rows": [["1", "2"]]},
        "format": "csv"
    }
]

batch_result = write_multiple_files(batch_operations)
print(f"Successfully wrote {batch_result['successful']} out of {batch_result['total_operations']} files")

Data Processing Workflows

Data Pipeline Output

def save_processing_results(pipeline_results, output_base_path):
    """Save data processing pipeline results to multiple files."""
    
    import datetime
    
    timestamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    results = {}
    
    # Save main results as JSON
    main_result = writeFile({
        "filePath": f"{output_base_path}/results_{timestamp}.json",
        "content": {
            "pipeline_id": pipeline_results.get('id'),
            "timestamp": datetime.datetime.now().isoformat(),
            "summary": pipeline_results.get('summary', {}),
            "metrics": pipeline_results.get('metrics', {})
        },
        "format": "json",
        "createDirectories": True
    })
    results['main'] = main_result
    
    # Save detailed data as CSV
    if 'data' in pipeline_results:
        csv_result = writeFile({
            "filePath": f"{output_base_path}/data_{timestamp}.csv",
            "content": {
                "headers": list(pipeline_results['data'][0].keys()) if pipeline_results['data'] else [],
                "rows": [list(row.values()) for row in pipeline_results['data']]
            },
            "format": "csv"
        })
        results['data'] = csv_result
    
    # Save error log
    if 'errors' in pipeline_results and pipeline_results['errors']:
        error_log = []
        for error in pipeline_results['errors']:
            error_log.append(f"{error.get('timestamp', '')} - {error.get('message', '')}")
        
        error_result = writeFile({
            "filePath": f"{output_base_path}/errors_{timestamp}.log",
            "content": '\n'.join(error_log),
            "format": "text"
        })
        results['errors'] = error_result
    
    # Create summary report
    summary_content = f"""
Data Processing Pipeline Report
Generated: {datetime.datetime.now().isoformat()}
Pipeline ID: {pipeline_results.get('id', 'Unknown')}

Summary:
- Total Records: {pipeline_results.get('summary', {}).get('total_records', 0)}
- Processed: {pipeline_results.get('summary', {}).get('processed', 0)}
- Errors: {len(pipeline_results.get('errors', []))}

Files Generated:
- Results: results_{timestamp}.json
- Data: data_{timestamp}.csv
- Errors: errors_{timestamp}.log (if errors occurred)
"""
    
    summary_result = writeFile({
        "filePath": f"{output_base_path}/summary_{timestamp}.txt",
        "content": summary_content.strip(),
        "format": "text"
    })
    results['summary'] = summary_result
    
    return results

# Usage
processing_output = {
    "id": "pipeline_001",
    "summary": {
        "total_records": 10000,
        "processed": 9950,
        "skipped": 50
    },
    "data": [
        {"id": 1, "value": "A", "status": "processed"},
        {"id": 2, "value": "B", "status": "processed"}
    ],
    "errors": [
        {"timestamp": "2023-12-01T10:00:00", "message": "Record 101 validation failed"}
    ],
    "metrics": {
        "processing_time": 120.5,
        "memory_usage": "256MB"
    }
}

save_results = save_processing_results(processing_output, "/sandbox/pipeline_output")

Error Handling

Common Write Issues

| Error Type | Cause | Resolution |
| --- | --- | --- |
| Permission Denied | Insufficient write permissions | Check directory permissions |
| Disk Space Full | Insufficient storage space | Free up space or write smaller files |
| Invalid Path | Directory doesn't exist | Use the createDirectories option |
| Format Error | Content doesn't match the specified format | Validate content before writing |
| Encoding Error | Character encoding issues | Specify the correct encoding |

Robust Write Operations

def safe_write_with_retry(file_path, content, max_retries=3, **options):
    """Write file with retry logic and error handling."""
    
    import time
    
    for attempt in range(max_retries):
        try:
            result = writeFile({
                "filePath": file_path,
                "content": content,
                **options
            })
            
            if result['success']:
                return result
            else:
                print(f"Attempt {attempt + 1} failed: {result.get('error', 'Unknown error')}")
                if attempt < max_retries - 1:
                    time.sleep(2 ** attempt)  # Exponential backoff
        
        except Exception as e:
            print(f"Attempt {attempt + 1} exception: {str(e)}")
            if attempt < max_retries - 1:
                time.sleep(2 ** attempt)
    
    return {"success": False, "error": f"Failed after {max_retries} attempts"}

# Usage with retry logic
retry_result = safe_write_with_retry(
    "/sandbox/output/important_file.json",
    {"critical": "data"},
    max_retries=3,
    format="json",
    createDirectories=True
)

Next Steps: Combine with Read File for data processing workflows, or use Create Directory for file organization.