Strange behavior with input-output self, or was I wrong?

In a node with an input-output self pair, it seems logical that in the following lines some time passes between reading “Set” and deciding:

if (!isInputDirty<input_Init>(ctx)) {
    emitValue<output_OUT>(ctx, getValue<input_IN>(ctx));
}

Now, if I use the following code instead and wire a jumper to the output, should a delay also occur when the value passes through the node?

if (!isInputDirty<input_Init>(ctx))


In both cases there is a delay, which changes the behavior of the code: every input DEV of the library must be passed directly to the output DEV. I modified all my libraries; I had not noticed this before. Maybe I was wrong all along: I thought an internal bridge was the same as an external one, but the update sequence takes another path.

When you have a not-implemented-in-xod node on a patch, the patch is considered a C++ patch: only its C++ code is taken into account. Any links and nodes (besides the terminals) are thrown away when transpiling.

What are you trying to achieve? A daisy-chain of nodes acting on the same resource/hardware/object?

Hi, I see that in the “hid” library an output DEV' was added. I tried to do the same in the SSD1306 OLED library, but if that node is not activated it works like a defer, and the image on the display flashes.

In the previous image it works as a defer, not as a jumper, so I stopped using the pins and made a direct connection to DEV.

The chaining pattern is useful enough to be supported at a more basic level. We are going to introduce chainable marker nodes that can be applied to a custom type, so that the DEV input will get its DEV' counterpart output linked internally out of the box.

The DEV / DEV' pattern is something I have successfully exploited in a few projects to achieve chaining to the extent it was necessary there, and, to be honest, I can’t understand why it doesn’t work in your case. Passing a value to the DEV input alone is enough for the node to be evaluated, so it should not act as a defer… Would you elaborate?

Doing some tests, I found the problem: I had misused the chaining.

Here, in nkrkv/hid, I found the variant that works correctly:


void evaluate(Context ctx) {
    // DEV' is emitted unconditionally, before checking UPD
    emitValue<output_OUT>(ctx, getValue<input_DEV>(ctx));
    if (!isInputDirty<input_UPD>(ctx))
        return;
    // ... the rest of the node acts on the device ...
}

I had used this instead, and the problem arose because the node evaluates the condition first:

void evaluate(Context ctx) {
    // DEV' is emitted only when INIT is not dirty,
    // so on the INIT transaction the old value lingers
    if (!isInputDirty<input_INIT>(ctx)) {
        emitValue<output_OUT>(ctx, getValue<input_DEV>(ctx));
    }
}

This is equivalent to having a defer node in the chain, and that is why the “defer” behavior occurs.

I think this is the solution I was looking for. Thanks, the hid library helped me.
