Upload File
Securely transfer files from your local system to the sandbox environment with comprehensive validation, progress tracking, and multi-format support.
📤 Secure File Transfer
File uploads include automatic validation, virus scanning, and format detection to ensure safe transfer of files into the sandbox environment.
Overview
The Upload File tool securely transfers files from your local system into the sandbox environment, with built-in validation, progress monitoring, and directory organization for file management workflows.
Key Features
- Multi-format Support - Text, binary, images, documents, data files, archives
- Batch Uploads - Upload multiple files in single operation
- Progress Tracking - Real-time upload progress monitoring
- Automatic Validation - File type, size, and content validation
- Virus Scanning - Real-time malware detection and quarantine
- Resume Capability - Resume interrupted large file uploads
Methods
uploadFile
Transfer files from the local system to the sandbox environment.
| Parameter | Type | Required | Description |
|---|---|---|---|
| files | Array | Yes | Array of files to upload |
| destination | String | No | Target directory in sandbox (default: '/sandbox/uploads') |
| options | Object | No | Upload options and settings |
| options.overwrite | Boolean | No | Overwrite existing files (default: false) |
| options.createDirectories | Boolean | No | Create destination directories (default: true) |
| options.validateContent | Boolean | No | Enable deep content validation (default: true) |
| options.preserveStructure | Boolean | No | Maintain directory structure (default: false) |
| options.compression | String | No | Upload compression: 'auto', 'gzip', 'none' (default: 'auto') |
| validation | Object | No | File validation settings |
| validation.maxSize | Number | No | Maximum file size in bytes (default: 100MB) |
| validation.allowedTypes | Array | No | Allowed file extensions (e.g., ['.pdf', '.txt']) |
| validation.blockedTypes | Array | No | Blocked file extensions (e.g., ['.exe', '.bat']) |
{
"files": [
{
"name": "document.pdf",
"content": "base64_encoded_content_here",
"mimeType": "application/pdf"
},
{
"name": "data.csv",
"path": "/local/path/to/data.csv"
}
],
"destination": "/sandbox/uploads/documents",
"options": {
"overwrite": false,
"createDirectories": true,
"validateContent": true
},
"validation": {
"maxSize": 52428800,
"allowedTypes": [".pdf", ".csv", ".txt", ".xlsx"]
}
}
Output:
- success (Boolean) - Upload operation success status
- uploadedFiles (Array) - List of successfully uploaded files
  - name (String) - Uploaded filename
  - path (String) - Sandbox file path
  - size (Number) - File size in bytes
  - checksum (String) - File integrity checksum
  - mimeType (String) - Detected MIME type
  - uploadTime (Number) - Upload duration in milliseconds
- errors (Array) - Upload errors for failed files
  - filename (String) - File with error
  - error (String) - Error description
  - code (String) - Error code
- validation (Object) - Validation results
  - passed (Boolean) - Overall validation success
  - warnings (Array) - Validation warnings
  - virusScan (Object) - Virus scan results
- totalSize (Number) - Total uploaded size in bytes
- uploadDuration (Number) - Total upload time in milliseconds
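The snippet below is a minimal sketch of consuming this response shape, using only the fields documented above; the summarizeUpload helper name is illustrative.
// Sketch: inspect an uploadFile result (assumes `result` holds a completed response)
function summarizeUpload(result) {
  if (!result.success) {
    // Each entry in `errors` carries the failing filename and a reason
    for (const err of result.errors) {
      console.error(`Failed: ${err.filename} (${err.code}): ${err.error}`);
    }
    return;
  }
  for (const file of result.uploadedFiles) {
    console.log(`${file.name} -> ${file.path}`);
    console.log(`  size: ${file.size} bytes, checksum: ${file.checksum}`);
  }
  console.log(`Total: ${result.totalSize} bytes in ${result.uploadDuration} ms`);
}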
Supported File Types
Text and Document Files
Plain text and document formats such as .txt, .pdf, and .xlsx.
Data Files
Structured data formats such as .csv and .json, plus images, archives, and other binary files.
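Files can be supplied either by local path or inline as base64-encoded content, matching the content field shown in the request example above. A minimal sketch of building an inline upload (the uploadInline helper is illustrative; a mimeType may also be supplied alongside content):
const fs = require('fs').promises;
const path = require('path');

// Sketch: upload a file inline as base64 content rather than by path
async function uploadInline(localPath, destination) {
  const buffer = await fs.readFile(localPath);
  return uploadFile({
    files: [{
      name: path.basename(localPath),
      content: buffer.toString('base64') // matches the `content` field above
    }],
    destination
  });
}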
Upload Workflows
Single File Upload
// Simple single file upload
const path = require('path');

async function uploadSingleFile(localPath, sandboxDestination) {
try {
const result = await uploadFile({
files: [
{
name: path.basename(localPath),
path: localPath
}
],
destination: sandboxDestination,
options: {
createDirectories: true,
validateContent: true
}
});
if (result.success) {
console.log(`✅ File uploaded successfully: ${result.uploadedFiles[0].path}`);
console.log(`📏 Size: ${(result.uploadedFiles[0].size / 1024).toFixed(2)} KB`);
console.log(`🔒 Checksum: ${result.uploadedFiles[0].checksum}`);
return result.uploadedFiles[0];
} else {
console.error(`❌ Upload failed: ${result.errors[0]?.error}`);
return null;
}
} catch (error) {
console.error(`💥 Upload error: ${error.message}`);
return null;
}
}
// Usage
const uploadedFile = await uploadSingleFile('/local/data.csv', '/sandbox/input/');
Batch File Upload
// Batch upload with progress monitoring
class BatchUploader {
constructor() {
this.uploadQueue = [];
this.concurrentUploads = 3;
this.uploadProgress = new Map();
}
async addFiles(fileList, destination) {
const uploads = fileList.map(file => ({
file,
destination,
status: 'pending'
}));
this.uploadQueue.push(...uploads);
return this.processQueue();
}
async processQueue() {
const results = [];
const chunks = this.chunkArray(this.uploadQueue, this.concurrentUploads);
for (const chunk of chunks) {
const chunkPromises = chunk.map(upload => this.uploadSingle(upload));
const chunkResults = await Promise.all(chunkPromises);
results.push(...chunkResults);
// Report progress
this.reportProgress(results.length, this.uploadQueue.length);
}
return results;
}
async uploadSingle(upload) {
try {
upload.status = 'uploading';
const result = await uploadFile({
files: [upload.file],
destination: upload.destination,
options: {
createDirectories: true,
overwrite: false
}
});
upload.status = result.success ? 'completed' : 'failed';
upload.result = result;
return upload;
} catch (error) {
upload.status = 'error';
upload.error = error.message;
return upload;
}
}
reportProgress(completed, total) {
const percentage = Math.round((completed / total) * 100);
console.log(`📤 Upload Progress: ${completed}/${total} (${percentage}%)`);
}
chunkArray(array, size) {
const chunks = [];
for (let i = 0; i < array.length; i += size) {
chunks.push(array.slice(i, i + size));
}
return chunks;
}
}
// Usage example
const uploader = new BatchUploader();
const fileList = [
{ name: 'data1.csv', path: '/local/data1.csv' },
{ name: 'data2.csv', path: '/local/data2.csv' },
{ name: 'config.json', path: '/local/config.json' }
];
const batchResults = await uploader.addFiles(fileList, '/sandbox/batch_data/');
Directory Structure Upload
// Upload entire directory structure
async function uploadDirectoryStructure(localRoot, sandboxRoot) {
const fs = require('fs').promises;
const path = require('path');
async function scanDirectory(dirPath, relativePath = '') {
const files = [];
const entries = await fs.readdir(dirPath, { withFileTypes: true });
for (const entry of entries) {
const fullPath = path.join(dirPath, entry.name);
const relPath = path.join(relativePath, entry.name);
if (entry.isDirectory()) {
// Recursively scan subdirectories
const subFiles = await scanDirectory(fullPath, relPath);
files.push(...subFiles);
} else if (entry.isFile()) {
// Add file to upload list
files.push({
name: entry.name,
path: fullPath,
relativePath: relPath,
metadata: {
originalPath: fullPath,
directory: path.dirname(relPath)
}
});
}
}
return files;
}
try {
console.log(`🔍 Scanning directory: ${localRoot}`);
const allFiles = await scanDirectory(localRoot);
console.log(`📁 Found ${allFiles.length} files`);
// Group files by directory for organized upload
const filesByDirectory = new Map();
for (const file of allFiles) {
const targetDir = path.join(sandboxRoot, file.metadata.directory);
if (!filesByDirectory.has(targetDir)) {
filesByDirectory.set(targetDir, []);
}
filesByDirectory.get(targetDir).push(file);
}
// Upload files directory by directory
const uploadResults = [];
for (const [directory, files] of filesByDirectory) {
console.log(`📤 Uploading to: ${directory}`);
const result = await uploadFile({
files: files.map(f => ({
name: f.name,
path: f.path,
metadata: f.metadata
})),
destination: directory,
options: {
createDirectories: true,
preserveStructure: true
}
});
uploadResults.push({
directory,
result
});
}
return uploadResults;
} catch (error) {
console.error(`💥 Directory upload error: ${error.message}`);
throw error;
}
}
// Usage
const structureUpload = await uploadDirectoryStructure(
'/local/project/',
'/sandbox/projects/my_project/'
);
Security and Validation
Upload Security Pipeline
Every upload passes through the checks below in sequence before files are written to the sandbox: size limits, type verification, content scanning, virus scanning, and path validation.
Validation Rules
🔒 Upload Security
Automatic Validations:
- File Size Limits - Maximum 500MB per file, 2GB total upload
- File Type Verification - MIME type and extension validation
- Content Scanning - Deep content analysis for malicious patterns
- Virus Scanning - Real-time malware detection
- Path Validation - Prevention of directory traversal attacks
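When a file violates one of these rules, the failure surfaces in the documented errors array of the response. A brief sketch of detecting a blocked type (the exact code values are assumptions, not documented constants):
// Sketch: detect a blocked file type in the response.
// The exact `code` strings are assumptions; inspect actual responses.
const blockedResult = await uploadFile({
  files: [{ name: 'tool.exe', path: '/local/tool.exe' }],
  destination: '/sandbox/uploads/',
  validation: { blockedTypes: ['.exe', '.bat'] }
});

for (const err of blockedResult.errors ?? []) {
  // e.g. err.code might be something like 'BLOCKED_TYPE' (hypothetical)
  console.warn(`Rejected ${err.filename}: ${err.error} [${err.code}]`);
}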
Custom Validation Rules:
const customValidation = {
maxSize: 100 * 1024 * 1024, // 100MB limit
allowedTypes: ['.csv', '.json', '.xlsx', '.txt'],
blockedTypes: ['.exe', '.bat', '.sh', '.com'],
contentRules: [
{
type: 'csv',
maxRows: 1000000,
maxColumns: 1000,
requiredColumns: ['id'],
allowedDelimiters: [',', ';', '\t']
},
{
type: 'json',
maxDepth: 10,
maxObjectSize: 10000,
allowedTypes: ['object', 'array']
}
]
};
const result = await uploadFile({
files: dataFiles,
validation: customValidation
});
Error Handling
// Comprehensive upload error handling
async function safeUpload(files, destination, retries = 3) {
let attempt = 0;
while (attempt < retries) {
try {
const result = await uploadFile({
files,
destination,
options: {
validateContent: true,
createDirectories: true
}
});
// Check for partial failures
if (result.errors.length > 0) {
console.warn('⚠️ Partial upload failure:');
result.errors.forEach(error => {
console.warn(` - ${error.filename}: ${error.error}`);
});
// Retry failed files only
const failedFiles = files.filter(file =>
result.errors.some(error => error.filename === file.name)
);
if (failedFiles.length > 0 && attempt < retries - 1) {
console.log(`🔄 Retrying ${failedFiles.length} failed files...`);
files = failedFiles;
attempt++;
continue;
}
}
// Handle security warnings
if (result.validation.warnings.length > 0) {
console.warn('🛡️ Security warnings:');
result.validation.warnings.forEach(warning => {
console.warn(` - ${warning}`);
});
}
// Check virus scan results
if (!result.validation.virusScan.clean) {
console.error('🦠 Virus scan failed:');
result.validation.virusScan.threats.forEach(threat => {
console.error(` - ${threat}`);
});
throw new Error('Upload blocked due to security threats');
}
return result;
} catch (error) {
attempt++;
console.error(`💥 Upload attempt ${attempt} failed: ${error.message}`);
if (attempt >= retries) {
throw new Error(`Upload failed after ${retries} attempts: ${error.message}`);
}
// Exponential backoff
const delay = Math.pow(2, attempt) * 1000;
await new Promise(resolve => setTimeout(resolve, delay));
}
}
}
// Usage with error handling
try {
const result = await safeUpload(fileList, '/sandbox/data/', 3);
console.log('✅ Upload completed successfully');
} catch (error) {
console.error('❌ Upload completely failed:', error.message);
}
Performance Optimization
Large File Upload
// Optimized large file upload with chunking
class LargeFileUploader {
constructor() {
this.chunkSize = 5 * 1024 * 1024; // 5MB chunks
this.maxRetries = 3;
}
async uploadLargeFile(filePath, destination) {
const fs = require('fs').promises;
const fileStats = await fs.stat(filePath);
const fileSize = fileStats.size;
if (fileSize <= this.chunkSize) {
// Use regular upload for small files
return this.regularUpload(filePath, destination);
}
console.log(`📁 Large file detected: ${(fileSize / 1024 / 1024).toFixed(2)} MB`);
console.log('🔧 Using chunked upload...');
return this.chunkedUpload(filePath, destination, fileSize);
}
async chunkedUpload(filePath, destination, fileSize) {
const chunks = Math.ceil(fileSize / this.chunkSize);
const uploadPromises = [];
for (let i = 0; i < chunks; i++) {
const start = i * this.chunkSize;
const end = Math.min(start + this.chunkSize, fileSize);
// Pass the total chunk count so each chunk's metadata is accurate
const chunkPromise = this.uploadChunk(filePath, start, end, i, chunks, destination);
uploadPromises.push(chunkPromise);
}
// Upload chunks in parallel
const chunkResults = await Promise.all(uploadPromises);
// Combine chunks on server side
return this.combineChunks(chunkResults, destination);
}
async uploadChunk(filePath, start, end, chunkIndex, totalChunks, destination) {
const fs = require('fs').promises;
const path = require('path');
const fileHandle = await fs.open(filePath, 'r');
try {
const chunkSize = end - start;
const buffer = Buffer.alloc(chunkSize);
await fileHandle.read(buffer, 0, chunkSize, start);
const chunkName = `${path.basename(filePath)}.chunk.${chunkIndex}`;
const result = await uploadFile({
files: [{
name: chunkName,
content: buffer,
metadata: {
chunkIndex,
chunkStart: start,
chunkEnd: end,
totalChunks
}
}],
destination: `${destination}/.chunks/`
});
console.log(`✅ Chunk ${chunkIndex + 1}/${totalChunks} uploaded`);
return result;
} finally {
await fileHandle.close();
}
}
async combineChunks(chunkResults, destination) {
// This would typically be handled by the server
// For now, return the chunk information
return {
success: true,
message: 'Chunked upload completed',
chunks: chunkResults.length,
totalSize: chunkResults.reduce((sum, chunk) =>
sum + (chunk.uploadedFiles[0]?.size || 0), 0
)
};
}
async regularUpload(filePath, destination) {
const path = require('path');
return uploadFile({
files: [{
name: path.basename(filePath),
path: filePath
}],
destination
});
}
}
// Usage
const largeUploader = new LargeFileUploader();
const result = await largeUploader.uploadLargeFile('/local/large_dataset.csv', '/sandbox/data/');
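For large but highly compressible files such as CSV or log data, the documented options.compression setting can reduce transfer time. A brief sketch using the 'gzip' value (actual savings depend on the file contents):
// Sketch: request gzip compression for a large text upload.
// 'gzip' is one of the documented compression values ('auto', 'gzip', 'none').
const compressed = await uploadFile({
  files: [{ name: 'events.log', path: '/local/events.log' }],
  destination: '/sandbox/data/',
  options: { compression: 'gzip', createDirectories: true }
});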
Integration Examples
With Code Execution
# Upload and process workflow
async def upload_and_process_data():
"""Upload data files and process with Python."""
# 1. Upload data files
upload_result = await uploadFile({
'files': [
{'name': 'sales.csv', 'path': '/local/data/sales.csv'},
{'name': 'customers.json', 'path': '/local/data/customers.json'}
],
'destination': '/sandbox/input/',
'validation': {
'allowedTypes': ['.csv', '.json'],
'maxSize': 100 * 1024 * 1024 # 100MB
}
})
if not upload_result['success']:
raise Exception(f"Upload failed: {upload_result['errors']}")
# 2. Process uploaded files with code execution
processing_code = """
import pandas as pd
import json
# Read uploaded data
sales_df = pd.read_csv('/sandbox/input/sales.csv')
with open('/sandbox/input/customers.json', 'r') as f:
customers = json.load(f)
# Process data
sales_summary = sales_df.groupby('product').agg({
'quantity': 'sum',
'revenue': 'sum'
}).reset_index()
# Save processed results
sales_summary.to_csv('/sandbox/output/sales_summary.csv', index=False)
print(f"Processed {len(sales_df)} sales records")
print(f"Generated summary with {len(sales_summary)} products")
"""
exec_result = await codeExecution({
'language': 'python',
'code': processing_code
})
return {
'upload': upload_result,
'processing': exec_result
}
# Execute workflow
result = await upload_and_process_data()
Related Tools
Download File
Download processed files from sandbox back to your local system
File Metadata
Access detailed information about uploaded files including checksums
Code Execution
Process uploaded files with Python and JavaScript code execution
Data Analysis Tools
Analyze uploaded Excel and CSV files with advanced data tools
Next Steps: After uploading files, explore Read File to access file contents, or Code Execution to process uploaded data.