Potential Bug - Object masks out other objects

This is going to be quite an edge case.
I have an image with a trampoline and a semi-transparent net.
First it identified it as a boat. OK, not really worried about that… But when it does, the detection process does not seem to scan any further into the image within the detected area.


So here is an example from seconds later where it didn’t detect “boat” and did detect persons behind the net.

I think this is a bug in DeepStack… or maybe a setting? If it finds something in a region, it seems to exclude that region from further identification.
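For what it’s worth, the failure mode described above is exactly what a greedy, class-agnostic suppression step would produce if it measured overlap relative to the smaller box: one large, high-confidence false positive swallows every smaller box inside it, whatever its label. This is only a sketch of that hypothesis — the function names, numbers, and threshold are made up, not DeepStack’s actual code:

```python
def overlap(a, b):
    # Boxes are (x_min, y_min, x_max, y_max). Intersection area divided by
    # the SMALLER box's area, so a box fully inside a bigger one scores 1.0.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / float(min(area_a, area_b))

def class_agnostic_nms(dets, thresh=0.45):
    # dets: list of (label, confidence, box). Greedy suppression that
    # ignores labels: keep the highest-confidence box, drop anything
    # overlapping it too much, repeat.
    dets = sorted(dets, key=lambda d: d[1], reverse=True)
    kept = []
    for d in dets:
        if all(overlap(d[2], k[2]) <= thresh for k in kept):
            kept.append(d)
    return kept

# Hypothetical numbers: a big "boat" false positive covering the trampoline,
# and a person behind the net, entirely inside the boat box.
dets = [
    ("boat",   0.90, (0, 0, 100, 100)),
    ("person", 0.80, (20, 20, 50, 90)),
]
print([d[0] for d in class_agnostic_nms(dets)])  # → ['boat']; the person is suppressed
```

If the suppression were per-class instead (only boats suppress boats), both detections would survive — which is why seeing the raw response matters.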

It does not ever seem to show me boat and person together when the region is clipped like that by a larger object. (I compared with Azure Vision: the same frames consistently show people — but Azure ignores the “boat” and does not identify the trampoline.)

If somebody knew how, they could stage something in the frame that causes the majority of it to be obscured by a false positive, leaving people in the frame undetected.

Are you sure this is not how Blue Iris selects objects? It would be a reasonable algorithm to select the highest-scoring object in a region and ignore the lower-scoring ones. To debug, save both images locally and use curl to check what DeepStack actually returns.
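To run that comparison yourself: DeepStack’s detection endpoint takes a multipart POST with an `image` form field and returns a JSON body with a `predictions` array (`label`, `confidence`, `x_min`/`y_min`/`x_max`/`y_max`). Here is a stdlib-only Python sketch — the host, port, and frame filename are assumptions, adjust them to your install:

```python
import json
import uuid
import urllib.request

DEEPSTACK_URL = "http://localhost:80/v1/vision/detection"  # adjust host/port

def detect(image_path):
    # Build a minimal multipart/form-data body with one "image" field,
    # POST it to DeepStack, and return the predictions list.
    boundary = uuid.uuid4().hex
    with open(image_path, "rb") as f:
        img = f.read()
    body = (
        ("--%s\r\n" % boundary).encode()
        + b'Content-Disposition: form-data; name="image"; filename="frame.jpg"\r\n'
        + b"Content-Type: application/octet-stream\r\n\r\n"
        + img
        + ("\r\n--%s--\r\n" % boundary).encode()
    )
    req = urllib.request.Request(
        DEEPSTACK_URL,
        data=body,
        headers={"Content-Type": "multipart/form-data; boundary=%s" % boundary},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp).get("predictions", [])

def summarize(predictions):
    # Flatten prediction objects into readable one-liners for comparison.
    return [
        "%s %.2f (%d,%d)-(%d,%d)"
        % (p["label"], p["confidence"], p["x_min"], p["y_min"], p["x_max"], p["y_max"])
        for p in predictions
    ]

if __name__ == "__main__":
    # Hypothetical filenames for the two saved frames being compared.
    for name in ("frame_boat.jpg", "frame_persons.jpg"):
        print(name)
        for line in summarize(detect(name)):
            print("  " + line)
```

Running it on both saved frames shows whether DeepStack itself returns only the “boat” on the first frame, or whether the persons are in the raw response and dropped later by Blue Iris.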

Sometimes it detects the trampoline as a boat and nothing else… the objects behind the “boat”, as I mentioned, are completely missed. Nothing else is detected.