Operation Node Guide

Overview

The Operation Node is a core component of NINA workflows that executes security tools and operations on your input data. Operation Nodes connect to the NINA tool registry to run specialized tools against your workflow data.

Use Cases

  • Subdomain enumeration and discovery
  • Port scanning and service detection
  • Vulnerability scanning
  • Web application crawling and analysis
  • DNS resolution and analysis
  • Fuzzing and brute forcing
  • Network mapping and reconnaissance

Creating an Operation Node (Tool)

Basic Setup

  1. Drag a Tool from the node palette onto your workflow canvas
  2. Connect it to an input source; any node type works as long as its output matches the required format (typically an Input Node or a Script Node)
  3. Configure tool-specific parameters

Operation Node being added to a workflow with tool selection

Configuration Options

Node Properties

| Property | Description |
| --- | --- |
| Name | A descriptive name for the node |
| Tool | The security tool to execute |
| Parameters | Tool-specific configuration options |

Tool Parameters

Each tool has its own set of parameters that control its behavior. Common parameters include:

  • Threads: Number of concurrent threads for parallel processing
  • Timeout: Maximum execution time
  • Rate Limit: Requests per second to limit scanning speed
  • Word Lists: Paths to word lists for fuzzing or brute forcing
  • Output Format: Format of the tool's output (JSON, CSV, etc.)
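As a concrete illustration, the sketch below shows how a node's properties and common parameters might fit together. The structure, the tool choice, and the parameter names are hypothetical (NINA configures nodes through the canvas UI, and each tool defines its own parameter set); the values simply mirror the common parameters listed above.

```python
# Hypothetical sketch of an Operation Node's configuration.
# The real node is configured through the NINA canvas UI; this dict only
# illustrates the Name / Tool / Parameters properties described above.
operation_node = {
    "name": "Fuzz API Endpoints",             # Name: descriptive label for the node
    "tool": "ffuf",                           # Tool: example choice from the tool registry
    "parameters": {                           # Parameters: tool-specific options
        "threads": 40,                        # concurrent threads
        "timeout": 10,                        # maximum execution time (seconds)
        "rate_limit": 100,                    # requests per second
        "wordlist": "/wordlists/common.txt",  # path to a word list (hypothetical path)
        "output_format": "json",              # output format for downstream nodes
    },
}
```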

Node outputs are stored as files at /workflow/nodeName. The exception is Input Nodes with populated Files, which store their files at /workflow/nodeName/nameOfFile.
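A downstream Script Node can read a predecessor's results directly from that path. The snippet below is a minimal sketch, assuming the predecessor Operation Node is named "subfinder" and wrote newline-separated results to a single file; adjust the path and parsing to match your tool's actual output.

```python
from pathlib import Path

# Minimal sketch for a downstream Script Node: read a predecessor
# Operation Node's output using the /workflow/<nodeName> convention.
# Assumes a node named "subfinder" that produced newline-separated results.
output_path = Path("/workflow/subfinder")

results = [
    line.strip()
    for line in output_path.read_text().splitlines()
    if line.strip()
]

print(f"{len(results)} results read from {output_path}")
```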

Tool parameters configuration interface

How Operation Nodes Work

When a workflow is executed:

  1. The Operation Node receives input from a predecessor node
  2. The input is processed by the selected security tool
  3. The tool performs its operations based on the configured parameters
  4. Results are collected and made available to subsequent nodes
  5. The node status changes to "Complete" when processing finishes
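Conceptually, each execution follows a read → run → write → complete cycle. The Python sketch below only illustrates that lifecycle; it is not NINA's actual execution engine, and the function name, argument handling, and status value are assumptions.

```python
import subprocess
from pathlib import Path

def run_operation_node(node_name: str, command: list[str], input_node: str) -> str:
    """Conceptual sketch of an Operation Node's lifecycle (not NINA's real engine)."""
    # 1. Receive input from the predecessor node's output location.
    input_path = Path(f"/workflow/{input_node}")

    # 2-3. Run the selected tool with its configured parameters against the input.
    result = subprocess.run(
        command + [str(input_path)],
        capture_output=True, text=True, check=True,
    )

    # 4. Collect results and make them available to subsequent nodes.
    output_path = Path(f"/workflow/{node_name}")
    output_path.write_text(result.stdout)

    # 5. Report completion so downstream nodes can start.
    return "Complete"
```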

Performance Optimization

Operation Nodes automatically optimize tool execution for better performance:

  • Input processing is parallelized when possible
  • Resource usage is balanced for optimal throughput
  • Results are automatically formatted for downstream nodes

Best Practices

  • Connect Appropriate Inputs: Ensure the input data format matches what the tool expects
  • Parameter Tuning: Adjust tool parameters based on your specific needs and target scope
  • Resource Consideration: Some tools can be resource-intensive; adjust thread counts accordingly
  • Output Compatibility: Verify that the tool's output format is compatible with downstream nodes

Example Configurations

Example 1: Subfinder Configuration

| Parameter | Value | Description |
| --- | --- | --- |
| threads | 10 | Number of concurrent threads |
| timeout | 30 | Maximum execution time in seconds |
| recursive | true | Enable recursive subdomain discovery |
| sources | "google,virustotal,passivetotal" | Data sources to query |
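Under the hood, a configuration like this roughly corresponds to a subfinder command line. The sketch below is an approximation: the flag names follow the public subfinder CLI and may not match NINA's internal parameter-to-flag mapping exactly, and the target domain is an example input.

```python
import subprocess

# Approximate subfinder invocation for the configuration above.
# Flag names follow the public subfinder CLI; NINA's internal mapping may differ.
cmd = [
    "subfinder",
    "-d", "example.com",                           # target domain (example input)
    "-t", "10",                                    # threads
    "-timeout", "30",                              # timeout in seconds
    "-recursive",                                  # recursive subdomain discovery
    "-sources", "google,virustotal,passivetotal",  # data sources to query
    "-silent",                                     # plain, pipe-friendly output
]

result = subprocess.run(cmd, capture_output=True, text=True)
subdomains = result.stdout.splitlines()
```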

Example 2: HTTPX Configuration

| Parameter | Value | Description |
| --- | --- | --- |
| threads | 50 | Number of concurrent threads |
| follow-redirects | true | Follow HTTP redirects |
| status-code | true | Include status codes in output |
| title | true | Include page titles in output |
| technology-detection | true | Identify technologies in use |
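When HTTPX is set to emit JSON, a downstream Script Node can filter its results. The sketch below assumes the HTTPX node is named "httpx", wrote JSON lines to /workflow/httpx, and uses the "url", "status_code", and "tech" keys found in recent httpx releases; field names may differ between versions.

```python
import json
from pathlib import Path

# Sketch of a downstream Script Node that keeps only live (HTTP 200) hosts.
# Node name, path, and JSON field names are assumptions; verify against
# the actual HTTPX output produced by your workflow.
live_hosts = []
for line in Path("/workflow/httpx").read_text().splitlines():
    if not line.strip():
        continue
    record = json.loads(line)
    if record.get("status_code") == 200:
        live_hosts.append((record.get("url"), record.get("tech") or []))

for url, tech in live_hosts:
    print(url, ",".join(tech))
```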

Troubleshooting

| Issue | Resolution |
| --- | --- |
| Tool execution failure | Check tool parameters and input format |
| Timeout errors | Increase the timeout parameter or reduce the thread count |
| Empty results | Verify the input data is valid and within the tool's scope |
| Format incompatibility | Add a Script Node to transform data between incompatible nodes |
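For the format-incompatibility case, a small Script Node between the two tools is usually enough. The sketch below converts JSON-lines output from a previous node into the plain newline-separated host list many tools expect; the node name ("previousNode") and the "host" field are illustrative assumptions.

```python
import json
from pathlib import Path

# Sketch of a Script Node bridging two incompatible formats:
# JSON lines from the previous node -> plain newline-separated hosts.
# Adjust the node name and field name to match the actual upstream output.
source = Path("/workflow/previousNode")

hosts = []
for line in source.read_text().splitlines():
    if line.strip():
        value = json.loads(line).get("host")
        if value:
            hosts.append(value)

print("\n".join(hosts))
```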

Advanced Usage

Tool Chaining

Create powerful reconnaissance pipelines by chaining multiple Operation Nodes:

  1. Subdomain Discovery → DNS Resolution → Port Scanning → Vulnerability Scanning

Chain of Operation Nodes in a workflow

Next Steps

After configuring your Operation Node, consider connecting it to:

  • Script Nodes: To process or transform the tool output

Operation Node connected to downstream nodes