Operation Node Guide

Overview

The Operation Node is a core component of NINA workflows that executes security tools and operations on your input data. It connects to the NINA tool registry to run specialized security tools against the data flowing through your workflow.

Use Cases

  • Subdomain enumeration and discovery
  • Port scanning and service detection
  • Vulnerability scanning
  • Web application crawling and analysis
  • DNS resolution and analysis
  • Fuzzing and brute forcing
  • Network mapping and reconnaissance

Creating an Operation Node (also called a Tool)

Basic Setup

  1. Drag a Tool from the node palette onto your workflow canvas
  2. Connect it to an input source (typically an Input Node or a Script Node). Any node type works as long as its output matches the format the tool expects
  3. Configure tool-specific parameters

Operation Node being added to a workflow with tool selection

Configuration Options

Node Properties

Property     Description
Name         A descriptive name for the node
Tool         The security tool to execute
Parameters   Tool-specific configuration options

Tool Parameters

Each tool has its own set of parameters that control its behavior. Common parameters include:

  • Threads: Number of concurrent threads for parallel processing
  • Timeout: Maximum execution time
  • Rate Limit: Requests per second to limit scanning speed
  • Word Lists: Paths to word lists for fuzzing or brute forcing
  • Output Format: Format of the tool's output (JSON, CSV, etc.)
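To make the common parameters above concrete, here is a hypothetical parameter set expressed as a Python dictionary. The key names and values are illustrative assumptions only; the actual keys and accepted values depend on the tool selected in the node.

```python
# Hypothetical Operation Node parameter set, based on the common
# parameters listed above. Key names are illustrative assumptions --
# each tool defines its own parameter schema.
params = {
    "threads": 10,                        # concurrent threads for parallel processing
    "timeout": 30,                        # maximum execution time, in seconds
    "rate_limit": 50,                     # requests per second, to limit scanning speed
    "wordlist": "/wordlists/common.txt",  # path to a word list for fuzzing / brute forcing
    "output_format": "json",              # format of the tool's output (JSON, CSV, etc.)
}
```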

Node outputs are stored as files at /workflow/nodeName. The exception is Input Nodes with populated Files, which store each file at /workflow/nodeName/nameOfFile.

Tool parameters configuration interface

How Operation Nodes Work

When a workflow is executed:

  1. The Operation Node receives input data from the previous node
  2. The input is automatically split into chunks for parallel processing
  3. Each chunk is executed independently by a worker running the selected tool
  4. Once all chunks finish, the results are merged back into a single output
  5. The merged output is passed to the next node in the workflow

Parallel Execution and Chunking

Operation Nodes automatically split large inputs to speed up execution. How the splitting works depends on the type of input:

  • Multiple input files (e.g. from branching edges or file-based Input Nodes): Each file is processed independently as its own chunk — no splitting needed.
  • Single text input (e.g. a list of domains from a previous node): The lines are distributed evenly across multiple chunks. The number of chunks is controlled by the tool's concurrency setting (defaults to 3). Setting concurrency to 1 disables splitting entirely.

For example, if your input has 100 domains and the tool concurrency is 3, the engine creates 3 chunks of ~33 domains each and runs them in parallel.
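The splitting behavior described above can be sketched in a few lines of Python. This is a sketch of the documented behavior, not the engine's actual code; the engine may distribute lines differently (e.g. contiguous blocks instead of round-robin), but the chunk counts match the description.

```python
def split_into_chunks(lines, concurrency=3):
    """Distribute input lines evenly across chunks.

    Mirrors the documented behavior: concurrency controls the number
    of chunks (default 3), and concurrency=1 disables splitting.
    Round-robin distribution is an assumption for illustration.
    """
    if concurrency <= 1:
        return [lines]  # splitting disabled
    chunks = [[] for _ in range(concurrency)]
    for i, line in enumerate(lines):
        chunks[i % concurrency].append(line)  # round-robin distribution
    return [c for c in chunks if c]  # drop empty chunks for tiny inputs

# 100 domains with concurrency 3 -> 3 chunks of ~33 domains each
domains = [f"sub{i}.example.com" for i in range(100)]
chunks = split_into_chunks(domains, concurrency=3)
```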

Merge Step

After all chunks complete, the engine automatically merges the partial results into a single output file. Downstream nodes always receive the merged result — they never see individual chunk files. This merge step happens transparently and requires no configuration.
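A minimal sketch of this merge step, assuming plain-text chunk outputs that are simply concatenated. The real engine may deduplicate lines or merge structured formats (JSON, CSV) differently; file names here are hypothetical.

```python
from pathlib import Path
import tempfile

def merge_chunks(chunk_files, merged_path):
    """Concatenate partial chunk outputs into one merged output file.

    Sketch of the documented merge step for plain-text results; the
    engine's actual merge logic may be format-aware.
    """
    with open(merged_path, "w") as out:
        for chunk in chunk_files:
            out.write(Path(chunk).read_text())

# Usage: merge three simulated chunk outputs into a single result file.
tmp = Path(tempfile.mkdtemp())
for i in range(3):
    (tmp / f"chunk-{i}.txt").write_text(f"result-{i}\n")
merged = tmp / "merged.txt"
merge_chunks(sorted(tmp.glob("chunk-*.txt")), merged)
# merged.txt now contains result-0, result-1, result-2 on separate lines
```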

VPN / Network Configuration

Some tools require VPN connectivity (e.g. login checkers, Google Dorks). If you attach a Network Config to the node, the worker establishes a VPN tunnel before running the tool. When VPN is active, proxy parameters are automatically excluded to avoid conflicts.

Best Practices

  • Connect Appropriate Inputs: Ensure the input data format matches what the tool expects
  • Parameter Tuning: Adjust tool parameters based on your specific needs and target scope
  • Resource Consideration: Some tools can be resource-intensive; adjust thread counts accordingly
  • Output Compatibility: Verify that the tool's output format is compatible with downstream nodes

Example Configurations

Example 1: Subfinder Configuration

Parameter   Value                              Description
threads     10                                 Number of concurrent threads
timeout     30                                 Maximum execution time in seconds
recursive   true                               Enable recursive subdomain discovery
sources     "google,virustotal,passivetotal"   Data sources to query

Example 2: HTTPX Configuration

Parameter              Value   Description
threads                50      Number of concurrent threads
follow-redirects       true    Follow HTTP redirects
status-code            true    Include status codes in output
title                  true    Include page titles in output
technology-detection   true    Identify technologies in use

Troubleshooting

Issue                    Resolution
Tool execution failure   Check tool parameters and input format
Timeout errors           Increase the timeout parameter or reduce thread count
Empty results            Verify input data is valid and within the tool's scope
Format incompatibility   Add a Script Node to transform data between incompatible nodes

Advanced Usage

Tool Chaining

Create powerful reconnaissance pipelines by chaining multiple Operation Nodes:

  1. Subdomain Discovery → DNS Resolution → Port Scanning → Vulnerability Scanning
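The chaining pattern above can be sketched as a loop where each node's output becomes the next node's input. The tool names and transformations below are simulated placeholders; in NINA the execution happens inside workers, not in your own code.

```python
def run_tool(tool, data):
    """Simulate executing an Operation Node's tool on input data.

    Hypothetical stand-in for worker execution: each tool is modeled
    as a simple list transformation purely for illustration.
    """
    transforms = {
        "subfinder": lambda d: [f"www.{x}" for x in d],              # subdomain discovery
        "dnsx": lambda d: [f"{x} -> 203.0.113.1" for x in d],        # DNS resolution
    }
    return transforms[tool](data)

# Chain tools so each node's output feeds the next node's input.
pipeline = ["subfinder", "dnsx"]
data = ["example.com"]
for tool in pipeline:
    data = run_tool(tool, data)
# data == ["www.example.com -> 203.0.113.1"]
```

The key design point is that every node in the chain reads the merged output of the previous node, so tools can be composed freely as long as their formats are compatible.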

Chain of Operation Nodes in a workflow

Next Steps

After configuring your Operation Node, consider connecting it to:

  • Script Nodes: To process or transform the tool output

Operation Node connected to downstream nodes

Updated: 2025-12-02