Including default models with custom models

Hi,

I have successfully trained and deployed a custom model for a DeepStack container running on a Jetson Nano, scanning camera images sent to it by Home Assistant running on a Raspberry Pi.

sudo docker run --runtime nvidia -v /path/to-my/custom/model:/modelstore/detection -p 80:5000 deepquestai/deepstack:jetpack

Through testing it is clear the custom model is running well, but despite setting targets, none of the default objects (person, cat, bird, etc.) are being detected. Is it possible to run the default models alongside custom models at the same time, or will I need to create a second instance of DeepStack and run multiple passes?


I’ve got no experience with the Jetson Nano, but I think you could just run a second instance of the default DeepStack vision detection alongside your custom model. The custom model doesn’t include any of the default models, so just run the second instance on a different port number.
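For instance, a second instance with only the built-in detection models enabled might look something like this. This is just a sketch assuming the same Jetson image as above; `VISION-DETECTION=True` is the standard DeepStack flag for enabling the default detection API, and port 81 is an arbitrary choice to avoid clashing with the custom-model instance:

```shell
# Sketch: a second DeepStack instance exposing only the built-in
# detection models on port 81 (the custom-model instance stays on 80).
sudo docker run --runtime nvidia -e VISION-DETECTION=True \
    -p 81:5000 deepquestai/deepstack:jetpack
```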

It’s also useful to make sure your custom model uses different tags from the default models. For example, I use Kitty instead of the default Cat tag to avoid confusion and clashes in post-processing.

I’m using VorlonCD’s branch of AITool in a similar way. It’s Windows-based, so probably not useful to you, but the concept is similar: it runs the default DeepStack model server instance alongside one DeepStack custom model server instance (it doesn’t currently support multiple custom models). The MQTT feature then sends trigger info to Home Assistant for actioning.

If you want to check it out, you’ll have to look in the GitHub issues pages to find the latest feature releases and bug fixes: https://github.com/VorlonCD/bi-aidetection/issues/201


The custom model has a different URL, so there’s no need for another instance, or even for different tags from the default models. Just put your custom model in the correct folder, then call your custom model, the default model, or both, and decide from the results which one to “trust”.


@Gadget and @antb, you can run any number of custom models alongside all the built-in APIs on a single running instance of DeepStack. See details in this FAQ.

At the moment, to detect custom objects and default objects in the same image, you will need to send the image twice to DeepStack:

  • once to the custom model endpoint
  • once to the default detection endpoint
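The two passes above can be sketched with curl. This is a hedged example: `localhost:80`, `snapshot.jpg`, and the model name `my-model` are placeholders for your own instance address, image, and custom model file name, while the endpoint paths follow the standard DeepStack detection API:

```shell
# Two passes over the same snapshot, one request per endpoint.
# "my-model" is a placeholder for your custom model's name.
curl -s -X POST -F image=@snapshot.jpg \
    http://localhost:80/v1/vision/custom/my-model

curl -s -X POST -F image=@snapshot.jpg \
    http://localhost:80/v1/vision/detection
```

Each call returns a JSON body with a `predictions` array, so the two result sets can be merged or compared in your post-processing (e.g. in Home Assistant or AITool).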

We don’t yet have a single endpoint that does both custom model and default model detection, but it is something we are considering.

cc: @john
