ONNX
The ONNX node allows you to run AI inference directly in Node-RED using ONNX models. It supports a wide range of pre-trained or custom models for tasks such as image classification, object detection, and numeric prediction.
ONNX (Open Neural Network Exchange) is an open standard for representing machine learning models. With this node, you can load an ONNX model and run predictions locally or on the edge without requiring a separate AI service.
Inputs
General
- Property: input
- Type: object, buffer, or tensor
- Description: The input data to process. It can be an image, array, or tensor. See the Input Formats section below for supported structures.
Model Selection
- Property: model
- Type: string
- Description: Path to the ONNX model file. It can be a direct file path (for example, /data/models/model.onnx) or an environment variable (for example, ${MODEL_PATH}).
Outputs
- Property: payload
- Type: object or array
- Description: Contains the model’s output after inference. Depending on the model, this may include predictions, probabilities, or other structured results.
Input Formats
The input format depends on what your ONNX model expects. You can check the model’s input names, types, and shapes by clicking the Model Info button in the node configuration panel.
1. Tensor Format
Use this format when the model expects a single tensor input.
{
  "data": [0.0, 0.1, 0.2, ...],
  "type": "float32",
  "dim": [1, 3, 224, 224]
}
- data: Flat array of numerical values (for example, pixel data).
- type: Data type of the tensor (for example, float32, int8).
- dim: Tensor dimensions in [batch_size, channels, height, width] format.
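As a minimal sketch, a Function node placed in front of the ONNX node could wrap a flat array of normalized pixel values into this format. The 1×3×224×224 shape and the use of msg.payload as the carrier property are assumptions here; match them to what Model Info reports for your model.

// Function node: wrap a flat numeric array into the single-tensor format above.
// Assumption: msg.payload arrives as a flat array of length 1 * 3 * 224 * 224
// and the ONNX node reads its input from msg.payload.
const pixels = msg.payload;

msg.payload = {
    data: pixels,          // flat numeric array (for example, pixel data)
    type: "float32",       // must match the model's declared input type
    dim: [1, 3, 224, 224]  // [batch_size, channels, height, width]
};

return msg;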
2. Array of Tensors
Used when the model expects multiple input tensors.
[
  {
    "data": [0.0, 0.1, ...],
    "type": "float32",
    "dim": [1, 3, 224, 224]
  },
  {
    "data": [0, 1, 2, ...],
    "type": "int8",
    "dim": [1, 10]
  }
]
3. Named Tensor Properties
Used when the model defines multiple named input tensors.
{
  "input_1": {
    "data": [0.0, 0.1, 0.2, ...],
    "type": "float32",
    "dim": [1, 3, 224, 224]
  },
  "input_2": {
    "data": [0.0, 0.1, 0.2, ...],
    "type": "float32",
    "dim": [1, 10]
  }
}
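As an illustrative sketch only, a Function node can assemble this object using the input names reported by Model Info. The names input_1 and input_2, the shapes, and the source message properties below are placeholders; substitute the ones your model actually declares.

// Function node: build a named-tensor payload for a model with two inputs.
// The names and shapes are taken from the example above; use the names
// shown by the Model Info button for your own model.
msg.payload = {
    input_1: {
        data: msg.imageData,     // hypothetical source property
        type: "float32",
        dim: [1, 3, 224, 224]
    },
    input_2: {
        data: msg.extraFeatures, // hypothetical source property
        type: "float32",
        dim: [1, 10]
    }
};

return msg;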
4. Array-like Input
If the model expects a single flat array, you can provide it directly:
[0.0, 0.1, 0.2, ...]
For batch inputs, use an array of arrays:
[
  [0.0, 0.1, 0.2, ...],
  [0.0, 0.1, 0.2, ...]
]
Configuration
- The model must be in the ONNX (.onnx) format.
- Ensure your input format matches the model’s expected input definition.
- Use the Model Info button in the configuration panel to inspect model input and output specifications before wiring it into your flow.
- The result of the inference is available in msg.payload for further processing or visualization, as in the sketch below.
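For example, if the model is a classifier that returns a flat array of class probabilities, a Function node wired after the ONNX node could pick the most likely class. The exact output structure varies by model, so treat this as a sketch and inspect the real payload with a Debug node first.

// Function node: pick the highest-scoring class from a flat array of scores.
// Assumption: msg.payload is a plain numeric array; adapt the property access
// if your model's output is structured differently.
const scores = Array.from(msg.payload);

let best = 0;
for (let i = 1; i < scores.length; i++) {
    if (scores[i] > scores[best]) {
        best = i;
    }
}

msg.payload = {
    classIndex: best,        // index of the top prediction
    confidence: scores[best] // its score or probability
};

return msg;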