DeepStack is Back

Hello everyone. I trust you have all been well.
We have been away for a while sorting out a couple of things, and we appreciate your patience.

We are now fully back, both on community engagement and on active development of DeepStack. We have a lot of exciting updates to the DeepStack ecosystem coming soon and will share more on our plans shortly.

As of today, we have released the newest CPU version, which is significantly faster, running near real time on CPU. An update to the GPU version is coming next week.
Run `deepquestai/deepstack:latest` or `deepquestai/deepstack:cpu-x4-beta`
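For anyone new to Docker, a minimal way to start one of these images might look like the sketch below. The `VISION-DETECTION` flag and the container’s internal port 5000 follow the DeepStack documentation; the host port 80 is an arbitrary choice for illustration.

```shell
# Pull and start the latest CPU image (tags from the announcement above).
# -e VISION-DETECTION=True enables the object-detection endpoint;
# -p 80:5000 maps the container's port 5000 to port 80 on the host.
docker pull deepquestai/deepstack:latest
docker run -e VISION-DETECTION=True -p 80:5000 deepquestai/deepstack:latest
```

With the container running, the detection API is served at `http://localhost:80/v1/vision/detection`.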

Updates to the Pi version and a new NVIDIA Jetson version are coming before the end of the year.

We have a lot planned for the open sourcing of DeepStack and are looking forward to making DeepStack the most awesome AI platform for you all with your support.

Questions and feedback are most welcome.

Thank you all.

Happy you’re back!

Any word on cpu-noavx and gpu-noavx? I am really interested in the Windows builds and possibly supporting gpu-noavx there as well. My home server is an older Xeon build without AVX, but it does have a GPU; sadly, it runs Windows Server, so I can’t utilize the GPU for any of the Docker builds.

Hello, note that the new DeepStack supports both AVX and no-AVX systems out of the box. The GPU version will as well.
For GPU support on Windows, our plan is to enable this via WSL (Windows Subsystem for Linux).

It is important to note that the current CPU version is about as fast as the previous GPU version for object detection.
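For reference, a typical client call to the object-detection endpoint looks roughly like the sketch below. The `/v1/vision/detection` path and the `predictions` response field follow the DeepStack API documentation; the host, port, and helper names are illustrative assumptions.

```python
# Sketch of a DeepStack object-detection client (helper names are illustrative).
DETECT_URL = "http://localhost:80/v1/vision/detection"  # assumed host/port

def detect_objects(image_path, url=DETECT_URL):
    """POST an image to DeepStack and return the parsed JSON response."""
    import requests  # third-party: pip install requests
    with open(image_path, "rb") as f:
        resp = requests.post(url, files={"image": f})
    resp.raise_for_status()
    return resp.json()

def summarize(response, min_confidence=0.5):
    """Keep (label, confidence) pairs at or above a confidence threshold."""
    return [
        (p["label"], round(p["confidence"], 2))
        for p in response.get("predictions", [])
        if p["confidence"] >= min_confidence
    ]
```

For example, a response containing a `person` at confidence 0.91 and a `dog` at 0.30 would be summarized to just `[("person", 0.91)]` with the default threshold.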

Update:

Due to a bug that caused no detections to be returned, we have updated the release.

Run

`deepquestai/deepstack:latest`

or

`deepquestai/deepstack:cpu-x5-beta`

This is real nice, thank you! I already have the x5 Beta up and running in an Ubuntu VM.

I don’t actually feel like this is a great option. WSL v1 does not support a GPU, and WSL v2 is currently only available in the Insider builds of Windows 10; even then it is Hyper-V VM based, which only supports select systems that can utilize SR-IOV. If I wanted to go the WSL v2 route, I might as well just run an Ubuntu VM and do the same thing, but I can’t; I’ve tried, and my hardware just doesn’t support it.
A big part of the reason I want to go with GPU acceleration is that my system does not support AVX, and the upgrade cost for the very slight benefit far outweighs just utilizing a GPU. In all cases a GPU is faster at this than a CPU anyway, so I’d much rather go that route.
Sadly, my server’s host OS is Windows Server 2019, because a lot of what I run at the host level requires Windows. I’d move to a Linux host if I could, just to be able to run services such as this at the host level and utilize the GPU, but that would require a complete rework of my server, and that just isn’t possible for me. Beyond this one task, I’ve been able to get away with either running at the host level when I need a GPU or running everything else through a VM.

I don’t want to sound like, “Why don’t you just do the thing I want?!” I get it isn’t that simple… But what would be the limit to porting the GPU version over to a Windows build? I am genuinely curious and honestly would be happy to help in any manner I can. Seeing it was done once before makes me more optimistic that it is possible.

Thanks a lot for the insights @charredchar. I totally understand the complications with WSL. Note that object detection in the latest CPU update is nearly as fast as in the prior GPU version. That said, the GPU version coming next week is going to be tremendously faster than the CPU version.

We can build a native GPU version for Windows; it will not be Docker based. The reason behind the idea of WSL was to keep everything on Docker; however, as you said, there is a need for a native Windows version.

We shall add this and should have a build available in November. We would need your help in testing it.

I must say you should be proud of this, and I am actually quite happy to see it; you are awesome. I’ve already done some testing in a VM, and this should easily hold me over until a Windows+GPU option becomes available.

I have been enjoying the use of Docker more and more when building services inside of Linux; sadly, “Docker for Windows” just isn’t a thing in my eyes, as it really is just a Linux VM at that point. That being the case, you might as well just run a Linux VM with Docker in it. I wish this weren’t so, but on the Windows side I don’t see a point to keeping it in Docker.

I will be happy to do anything you need to test this out on a Windows environment. My gaming PC does run the latest versions of Windows 10 as well (and a much beefier GPU!) if you need to do a little testing on that front, it should support SR-IOV though I haven’t tested it.

Thanks a lot @charredchar :grinning: . That is good to hear. Docker is really a Linux thing, and yeah, bringing it to Windows and Mac always involves some form of Linux abstraction.
The WSL backend made things easier, but again, it is always a bit trickier.
We will reach out once we have builds available for the Windows version.
As the source code of the new DeepStack is built to be easy to bring to new platforms, this will not take long.

I would love to see the Windows version run as a service and remember my port settings as well, so I don’t have to start it manually each time the computer reboots. I love the product; I use it with Blue Iris.

Thanks.

Thanks for the feedback. We shall definitely do so.

Hello! Sorry, but where do I run that `deepquestai/deepstack:latest` command? I’m using Windows and would love to update.
Thanks

That command only works with Docker; it is a Docker image tag. The Windows version is simply an .exe. Hopefully that’ll get an update soon!
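For reference, later DeepStack releases for Windows document starting the server from a command prompt roughly like this; the flag names are taken from the DeepStack CLI documentation and may not apply to older .exe builds.

```shell
REM Start DeepStack on Windows with object detection enabled on port 80
REM (flag names per the DeepStack CLI docs; older .exe builds may differ).
deepstack --VISION-DETECTION True --PORT 80
```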

Oh thanks for the clarification. I’ll wait then.

Hi @wilddoktor @mrpie, an update for the Windows version will be available soon as a standalone .exe, and we are planning to support NVIDIA GPU acceleration as well.

@OlafenwaMoses I invested in an Intel NCS after seeing DeepStack’s support for it.
Any new release for the NCS?

We will have an update on this by December.

Hello everyone,
DeepStack is now available on the Jetson.
See the post below for more details.
