NOKUBI Takatsugu: Virtual Background using webcam
I made a webpage that produces a virtual background from a webcam.
https://knok.github.io/virtbg/
Source code:
https://github.com/knok/knok.github.io/tree/master/virtbg

Some online meeting software (Zoom, Microsoft Teams) supports virtual backgrounds, but I want to use other software like Jitsi (or Google Meet), so I made my own. To build it, I referred to the article Open Source Virtual Background.

The following figure is the diagram. That approach depends on Docker, a GPU, and v4l2loopback (which only works on Linux), so I wanted a more generic solution. By implementing it as a webpage and using OBS Studio with plugins (obs-v4l2sink, OBS-VirtualCam, or OBS (macOS) Virtual Camera), you can use the solution on more platforms. Implementing it as a single webpage also removes the overhead of inter-process communication over HTTP via Docker.

This is an example animation:

Using a Jitsi snapshot:

Unfortunately, BodyPix releases only pretrained models, no training data.

I need more improvements:
- Accept any background images
- Support choosing the camera device
- A more useful UI
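The heart of such a page is a per-pixel compositing step: BodyPix produces a person-segmentation mask for each webcam frame, and the output takes camera pixels where the mask marks a person and background-image pixels everywhere else. A minimal sketch of that step, with plain arrays standing in for canvas ImageData (the function and variable names here are illustrative, not taken from the actual source):

```javascript
// Composite a webcam frame over a background image using a person mask.
// frame, background: RGBA pixel arrays (4 bytes per pixel), same dimensions.
// mask: one 0/1 entry per pixel, as a BodyPix segmentation would provide.
function compositeFrame(frame, background, mask) {
  const out = new Uint8ClampedArray(frame.length);
  for (let i = 0; i < mask.length; i++) {
    const src = mask[i] ? frame : background; // person pixels come from the camera
    for (let c = 0; c < 4; c++) {
      out[4 * i + c] = src[4 * i + c];
    }
  }
  return out;
}

// Tiny 2x1-pixel example: left pixel is "person", right pixel is background.
const frame = Uint8ClampedArray.from([255, 0, 0, 255, 255, 0, 0, 255]); // red camera frame
const bg = Uint8ClampedArray.from([0, 0, 255, 255, 0, 0, 255, 255]);    // blue background
const mask = [1, 0];
const result = compositeFrame(frame, bg, mask);
// result: [255, 0, 0, 255, 0, 0, 255, 255]
```

In the real page the mask would come from BodyPix's segmentation of the video element, and the composited pixels would be written back to a canvas with putImageData.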