
Workflow Builder

Visual interface for creating automated processes using a drag-and-drop canvas.

The Workflow Builder is Axellero Studio's visual interface for creating automated processes. Build complex workflows using a drag-and-drop canvas with pre-built nodes and components.

Overview

The Workflow Builder provides comprehensive automation capabilities:

| Capability | Description | Key Features |
| --- | --- | --- |
| Visual Design | Drag-and-drop workflow creation | Node-based canvas, connection lines, visual data flow |
| System Integration | Connect to external data sources and services | Database connectors, API integrations, file operations |
| Process Automation | End-to-end business process automation | Multi-step workflows, conditional routing, error handling |
| Data Operations | Transform, validate, and route data | Data mapping, validation rules, format conversion |
| Logic Implementation | Conditional branching and decision making | If/then/else logic, loops, switch statements |

Interface Components

Canvas

The main workspace where you build workflows:

  • Grid Layout: Organize nodes in a structured manner
  • Zoom Controls: Scale the canvas for detailed work or overview
  • Pan Navigation: Move around large workflows easily
  • Selection Tools: Select single or multiple nodes
  • Connection Lines: Visual data flow indicators

Node Library

Comprehensive collection of workflow building blocks:

| Node Category | Purpose | Available Nodes | Primary Use Cases |
| --- | --- | --- | --- |
| Data Nodes | Input/output and data handling | Input, Output, Database Query, File Operations, API Request/Response | Data collection, storage, retrieval, formatting |
| Logic Nodes | Control flow and decision making | Branch (if/then/else), Loop, Switch, Merge | Conditional logic, iteration, flow control |
| Transform Nodes | Data processing and conversion | Data Mapper, Format Converter, Validator, Calculator | Data transformation, validation, calculations |
| Integration Nodes | External system connectivity | Connectors, HTTP Client, Email, Notifications, File System | System integration, communication, external operations |
| AI Nodes | Intelligent processing | AI Agent, Text Analysis, Content Generation, Decision Support | AI-powered automation, intelligent data processing |

Properties Panel

Configure selected nodes and connections:

  • Node Settings: Customize node behavior and parameters
  • Data Mapping: Define input/output relationships
  • Validation Rules: Set data validation criteria
  • Error Handling: Configure failure responses

Toolbar

Quick access to common actions:

  • Save/Load: Store and retrieve workflow configurations
  • Run/Debug: Execute workflows with debugging capabilities
  • Undo/Redo: Reverse and replay changes
  • Copy/Paste: Duplicate workflow sections

Building Workflows

Creating a New Workflow

  1. Open Workflow Builder: Navigate to Workflows in the main menu
  2. Create New: Click the "+" button to start a new workflow
  3. Name Your Workflow: Give the workflow a descriptive name and an optional description
  4. Choose Template: Start from scratch or use a template
  5. Begin Building: Drag nodes onto the canvas

Basic Workflow Structure

Start Node:

  • Every workflow begins with a start node
  • Defines workflow triggers and initial parameters
  • Can be triggered manually, on schedule, or by events

Processing Nodes:

  • Add nodes to handle data and perform operations
  • Connect nodes to define the flow of execution
  • Configure each node's specific parameters

End Node:

  • Workflows conclude with end nodes
  • Define final outputs and completion actions
  • Handle success and error scenarios
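
Taken together, a workflow is essentially a set of typed nodes plus the connections between them. The sketch below is a minimal, hypothetical Python representation for illustration only; it is not Axellero Studio's actual workflow format, and the node types, field names, and trigger values are assumptions.

```python
# Hypothetical, simplified workflow definition (illustrative only; not the
# actual Axellero Studio export format).
workflow = {
    "name": "daily-order-sync",
    "nodes": [
        # Start node: defines the trigger and initial parameters.
        {"id": "start", "type": "start", "trigger": "schedule", "cron": "0 6 * * *"},
        # Processing node: performs an operation on the incoming data.
        {"id": "fetch", "type": "api_request", "url": "https://example.com/orders"},
        # End node: defines the final output and completion behavior.
        {"id": "end", "type": "end", "on_success": "store_result", "on_error": "notify"},
    ],
    "connections": [
        {"from": "start", "to": "fetch"},
        {"from": "fetch", "to": "end"},
    ],
}

# A well-formed workflow has exactly one start node and at least one end node.
assert sum(n["type"] == "start" for n in workflow["nodes"]) == 1
assert any(n["type"] == "end" for n in workflow["nodes"])
```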

Connecting Nodes

Creating Connections:

  1. Click and drag from an output port
  2. Connect to an input port on another node
  3. Configure data mapping between nodes
  4. Validate connection compatibility

Connection Types:

  • Data Connections: Pass data between nodes
  • Control Connections: Manage the order in which nodes execute
  • Error Connections: Handle failure scenarios
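
A minimal sketch of the compatibility check from step 4 above, assuming a hypothetical port-and-type model; the port names, type labels, and connection kinds are illustrative, not Axellero Studio's API:

```python
# Hypothetical sketch of connection compatibility checking between node ports.
PORT_TYPES = {
    ("fetch", "out"): "json",   # output port of an API Request node
    ("mapper", "in"): "json",   # input port of a Data Mapper node
    ("mailer", "in"): "text",   # input port of an Email node
}

def can_connect(source: tuple[str, str], target: tuple[str, str]) -> bool:
    """A data connection is valid when the output and input port types match."""
    return PORT_TYPES[source] == PORT_TYPES[target]

connections = [
    {"kind": "data", "from": ("fetch", "out"), "to": ("mapper", "in")},
    {"kind": "error", "from": ("fetch", "out"), "to": ("mailer", "in")},  # failure path
]

for c in connections:
    if c["kind"] == "data" and not can_connect(c["from"], c["to"]):
        raise ValueError(f"Incompatible connection: {c}")
```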

Data Flow Management

| Operation | Description | Configuration Options | Use Cases |
| --- | --- | --- | --- |
| Data Mapping | Connect data between nodes | Field mapping, type conversion, default values | Transform API responses, prepare database inputs |
| Data Transformation | Convert data formats and structures | JSON/XML conversion, array operations, object restructuring | Normalize data from different sources |
| Calculations | Apply formulas and computations | Mathematical operations, string manipulation, date functions | Calculate totals, format outputs, derive values |
| Data Validation | Ensure data integrity and quality | Required fields, data types, custom validation rules | Validate user inputs, ensure data consistency |
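
The sketch below walks one hypothetical record through the four operations in the table (mapping, transformation, calculation, validation) in plain Python; the field names and rules are assumptions for illustration:

```python
# Illustrative sketch: map, calculate, and validate a record as it passes
# between two hypothetical nodes.
from datetime import date

raw = {"customer": "ACME", "qty": "3", "unit_price": "19.90"}

# Data mapping: rename fields and apply type conversion with defaults.
mapped = {
    "customer_name": raw.get("customer", "unknown"),
    "quantity": int(raw.get("qty", 0)),
    "unit_price": float(raw.get("unit_price", 0.0)),
}

# Calculation: derive new values from the mapped fields.
mapped["total"] = round(mapped["quantity"] * mapped["unit_price"], 2)
mapped["order_date"] = date.today().isoformat()

# Validation: enforce required fields and simple rules before routing onward.
errors = []
if not mapped["customer_name"]:
    errors.append("customer_name is required")
if mapped["quantity"] <= 0:
    errors.append("quantity must be positive")

print(mapped if not errors else errors)
```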

Variable Management

| Variable Type | Scope | Persistence | Common Applications |
| --- | --- | --- | --- |
| Workflow Variables | Current workflow execution | Execution duration | Temporary calculations, intermediate results |
| Global Variables | Application-wide | Until modified | Configuration values, shared constants |
| Context Variables | Node execution context | Node execution | Loop counters, conditional values |
| Environment Variables | Workspace/application | Persistent | API keys, database connections, environment-specific settings |
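
As a rough analogy for these scopes, the sketch below models them with ordinary Python scoping and os.environ; the variable names are hypothetical, and this is not how Axellero Studio stores variables internally:

```python
# Illustrative mapping of variable scopes to ordinary Python constructs.
import os

GLOBAL_VARIABLES = {"default_currency": "EUR"}   # application-wide, until modified

def run_workflow(items):
    workflow_vars = {"processed": 0}             # lives only for this execution
    for index, item in enumerate(items):
        context_vars = {"loop_index": index, "item": item}   # node/iteration scope
        # Environment variables persist outside the workflow (e.g. API keys).
        api_key = os.environ.get("EXAMPLE_API_KEY", "<not set>")
        workflow_vars["processed"] += 1
    return workflow_vars

print(run_workflow(["a", "b", "c"]))
```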

Advanced Features

Conditional Logic

Implement decision-making in workflows:

If/Then/Else Nodes:

  • Evaluate conditions and branch workflow paths
  • Support multiple condition types
  • Handle complex logical expressions
  • Provide alternative execution paths

Switch Nodes:

  • Route workflow based on multiple conditions
  • Support case-based routing
  • Handle default cases
  • Enable complex decision trees
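
Expressed as plain Python, an If/Then/Else node corresponds to an if/else statement and a Switch node to case-based routing with a default. The order fields and routing targets in this sketch are hypothetical:

```python
# Sketch of conditional branching as plain Python.
def route_order(order: dict) -> str:
    # If/Then/Else node: evaluate a condition and pick one of two paths.
    if order["total"] >= 1000:
        path = "manual_approval"
    else:
        path = "auto_approval"

    # Switch node: route on multiple cases and fall back to a default branch.
    region_routes = {"EU": "eu_fulfillment", "US": "us_fulfillment"}
    fulfillment = region_routes.get(order.get("region"), "global_fulfillment")

    return f"{path} -> {fulfillment}"

print(route_order({"total": 1500, "region": "EU"}))   # manual_approval -> eu_fulfillment
print(route_order({"total": 200, "region": "APAC"}))  # auto_approval -> global_fulfillment
```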

Loop Processing

Handle repetitive operations:

For Each Loops:

  • Iterate over arrays and collections
  • Process each item individually
  • Accumulate results across iterations
  • Handle large datasets efficiently

While Loops:

  • Continue processing while conditions are met
  • Support complex termination criteria
  • Prevent infinite loops with safeguards
  • Handle dynamic iteration counts
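
A short Python analogue of both loop styles, with a hypothetical dataset and an iteration cap standing in for the infinite-loop safeguard:

```python
# Sketch of the two loop styles; the data and limits are illustrative.
invoices = [120.0, 80.5, 310.25]

# For Each loop: iterate over a collection and accumulate a result.
total = 0.0
for amount in invoices:
    total += amount

# While loop: continue until a condition is met, with a safeguard that caps
# the number of iterations to prevent an infinite loop.
pending, attempts, max_attempts = len(invoices), 0, 10
while pending > 0 and attempts < max_attempts:
    pending -= 1       # e.g. one item processed per iteration
    attempts += 1

print(total, attempts)
```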

Error Handling

Manage workflow failures gracefully:

Try/Catch Blocks:

  • Wrap risky operations in error handling
  • Provide fallback actions for failures
  • Log errors for debugging
  • Continue workflow execution after errors

Retry Logic:

  • Automatically retry failed operations
  • Configure retry counts and delays
  • Handle transient failures
  • Implement exponential backoff
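
The following sketch combines both ideas: a try/except block around a hypothetical flaky operation, a logged error, a fallback action, and retries with exponential backoff (delays are kept short for illustration):

```python
# Sketch of try/catch plus retry with exponential backoff.
import random
import time

def flaky_operation() -> str:
    if random.random() < 0.5:                       # simulate a transient failure
        raise ConnectionError("temporary outage")
    return "ok"

def run_with_retry(max_attempts: int = 4, base_delay: float = 0.1) -> str:
    for attempt in range(1, max_attempts + 1):
        try:
            return flaky_operation()
        except ConnectionError as exc:
            print(f"attempt {attempt} failed: {exc}")    # log the error
            if attempt == max_attempts:
                return "fallback"                        # fallback action
            time.sleep(base_delay * 2 ** (attempt - 1))  # exponential backoff

print(run_with_retry())
```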

Parallel Processing

Execute multiple operations simultaneously:

Parallel Branches:

  • Split workflow into concurrent paths
  • Execute independent operations simultaneously
  • Synchronize results at merge points
  • Improve overall workflow performance

Batch Processing:

  • Handle multiple items in parallel
  • Optimize throughput for large datasets
  • Balance load across system resources
  • Monitor parallel execution progress
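
As a rough analogue, the sketch below runs independent, I/O-bound operations concurrently with a thread pool and merges the results; the work function is a hypothetical stand-in for parallel branches:

```python
# Sketch of parallel branches and batch processing using a thread pool.
from concurrent.futures import ThreadPoolExecutor
import time

def process_item(item: int) -> int:
    time.sleep(0.1)          # simulate an independent, I/O-bound operation
    return item * 2

items = list(range(8))

# Parallel branches: run independent operations concurrently, then merge.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process_item, items))   # merge point: collect results

print(results)
```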

Workflow Design Patterns

| Pattern | Description | When to Use | Implementation |
| --- | --- | --- | --- |
| Linear Processing | Sequential execution of tasks | Simple data processing, step-by-step operations | Chain nodes in sequence with data flow |
| Conditional Branching | Different paths based on conditions | Business rule implementation, error handling | Use Branch nodes with conditional logic |
| Parallel Processing | Simultaneous execution of independent tasks | Performance optimization, concurrent operations | Split workflow into parallel branches |
| Data Aggregation | Collect and combine data from multiple sources | Reporting, data consolidation | Use Merge nodes to combine multiple inputs |
| Loop Processing | Iterate over collections or repeat operations | Batch processing, array operations | Use Loop nodes with collection data |

Testing and Debugging

| Testing Feature | Description | Configuration | Best Practices |
| --- | --- | --- | --- |
| Test Mode | Safe execution environment with sample data | Test data sets, mock responses | Use realistic test data, test edge cases |
| Debug Mode | Step-by-step execution with breakpoints | Breakpoint placement, variable inspection | Set breakpoints at decision points, inspect data flow |
| Performance Monitoring | Track execution metrics and bottlenecks | Execution time tracking, resource monitoring | Monitor frequently executed workflows, optimize slow nodes |
| Error Tracking | Log and analyze workflow failures | Error logging, failure patterns | Implement comprehensive error handling, monitor error trends |

Debugging Tools

| Tool | Purpose | Usage | Information Provided |
| --- | --- | --- | --- |
| Execution Log | Track workflow execution history | View past executions, analyze patterns | Execution time, success/failure status, data flow |
| Variable Inspector | Examine data at each step | Debug mode, breakpoint inspection | Variable values, data types, transformation results |
| Performance Profiler | Identify slow operations | Monitor execution times | Node execution duration, bottleneck identification |
| Error Console | View and analyze errors | Error details, stack traces | Error messages, failure points, recovery suggestions |
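
In the same spirit as Test Mode and the Performance Profiler, a single step can be exercised outside the canvas with mock data and a timer. The step function, mock lookup, and sample record below are hypothetical:

```python
# Sketch of testing one workflow step with mock data and timing it.
import time

def enrich_customer(record: dict, lookup) -> dict:
    return {**record, "segment": lookup(record["customer_id"])}

def mock_lookup(customer_id: str) -> str:
    return {"C-1": "enterprise"}.get(customer_id, "smb")   # mock response

sample = {"customer_id": "C-1", "total": 250}

start = time.perf_counter()
result = enrich_customer(sample, mock_lookup)
elapsed_ms = (time.perf_counter() - start) * 1000

assert result["segment"] == "enterprise"   # check the expected outcome
print(f"result={result} elapsed={elapsed_ms:.2f}ms")
```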

Best Practices

Workflow Design

Create maintainable and efficient workflows:

Modular Design:

  • Break complex workflows into smaller modules
  • Reuse common workflow patterns
  • Create template workflows for frequent use cases
  • Document workflow purposes and logic

Data Management:

  • Validate input data at workflow entry points
  • Transform data consistently throughout the workflow
  • Handle edge cases and unexpected data formats
  • Implement proper error handling for data issues

Performance Optimization:

  • Minimize unnecessary data transformations
  • Use parallel processing where appropriate
  • Cache frequently accessed data
  • Monitor and optimize slow-running operations
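
For the caching point above, a minimal sketch using functools.lru_cache, assuming the lookup is deterministic for a given key; the function and data are hypothetical:

```python
# Minimal sketch of caching frequently accessed data.
from functools import lru_cache

@lru_cache(maxsize=256)
def fetch_exchange_rate(currency: str) -> float:
    print(f"fetching rate for {currency}")        # only runs on a cache miss
    return {"EUR": 1.0, "USD": 1.08}.get(currency, 1.0)

fetch_exchange_rate("USD")   # miss: performs the lookup
fetch_exchange_rate("USD")   # hit: served from the cache
```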

Maintenance

Keep workflows running smoothly:

Regular Testing:

  • Test workflows with current data
  • Validate integrations and connections
  • Check for deprecated features or APIs
  • Update workflows as requirements change

Documentation:

  • Document workflow purposes and business logic
  • Maintain up-to-date process documentation
  • Include troubleshooting guides
  • Document integration dependencies

Version Control:

  • Save workflow versions before major changes
  • Tag stable workflow releases
  • Maintain rollback capabilities
  • Track workflow change history

Common Use Cases

Data Integration

  • Synchronize data between systems
  • Import/export data from various sources
  • Transform data formats and structures
  • Validate and clean data

Process Automation

  • Automate repetitive business tasks
  • Orchestrate multi-system processes
  • Handle document processing workflows
  • Manage approval and notification processes

Real-time Processing

  • Process streaming data in real-time
  • React to system events and triggers
  • Implement real-time dashboards and alerts
  • Handle time-sensitive operations

Next Steps

  1. Create Your First Workflow: Start with a simple data processing workflow
  2. Explore Node Types: Learn about different node capabilities
  3. Practice Data Flow: Understand how data moves through workflows
  4. Implement Error Handling: Build robust, failure-resistant workflows
  5. Optimize Performance: Learn workflow optimization techniques