OBS Live-streaming, Unity+Tensorflow Experiments, & “Y-Lab presents: Extended Reality”

Recently I have discovered the many uses of OBS (Open Broadcaster Software), a free, open-source program for all your live video needs!

When I first thought of doing some livestream recordings as a way to present new digital works, I didn’t realise it meant I actually needed to up my studio game with lights and proper mics – and now it seems I might need a better web camera further down the line. The first thing I shot involved a webcam image of myself that was way too dark. On the third livestream, my parents invited me to come over and shoot at theirs when the hacking noises in my block were too much to bear, so I lugged over my rickety portable setup… which looks like this behind the scenes:

“AMATEUR HOUR” Streaming Setup

I must confess it is still a bit like amateur hour here. Somehow the live event recording I made was still not at its best – there were a few bitrate issues I haven’t quite sorted out, and I don’t know why I feel that when things are more chaotic my “performance” seems… better? Since I don’t have a fixed workspace at home (in the daytime I work from the living room; at night, whilst Beano sleeps, I work in the bedroom watching over her), my setup has to remain quite mobile and portable. It feels like if I’m serious about doing livestreams as a format, I actually have to set up a more dedicated space in the house for all my ‘streaming’ shenanigans…

The Bread & Butter Stream

This week I also participated in my first professionally managed livestream, and yes – until this point I thought everyone just had to live through the awfulness of juggling countless screens; it hadn’t occurred to me that you could outsource it to someone else. I do not have multiple monitors, so when I attempt to stream it means managing a half-dozen tiny windows and constantly switching between fullscreen modes (with the niggling feeling that whenever windows are not maximised, or are arbitrarily resized, I must be totally messing up the screen proportions / resolutions / bitrates / framerates – and yes, all of those are very specific, important things which can indeed screw up the livestream output).

The Extended Reality Panel on 16 July 2021

Oh, so did you think you could watch a few videos on how to become a Twitch streamer and how to use OBS, and then WING IT ALL? Did you ever consider setting aside a budget for your livestreamed productions? Well then, get thee a stream manager to juggle all your windows! (*ok, unless you have no budget, in which case… you have no choice but to shuffle on back and handle all your window management duties by yourself…)

Chroma Key in OBS with a green IKEA blanket and a table lamp pointed at my face? Either my teeth disappear or my blouse goes…

So… I have been slowly figuring out how to use OBS. For example: don’t want to take over everyone’s screens with a screenshare during a Zoom call? Why not screenshare within your camera feed like I do here – except that my chroma key filter setting was slightly borked (lighting? greenscreen issue?) and everyone asked me why I was looking “a little transparent” that day. -_-

Another fun thing I realised about OBS is that anything at all on my screen can now be my camera. Yes – I can just put a YouTube video on and feed it into whatever nonsensical build I am experimenting with at the moment.

This week I was playing around with Unity Barracuda + Tensorflow, which resulted in these test images where I ended up doing the classic ridiculous “let’s do a tour of all the random things on my table/floor”:

Image Classification with Tensorflow in Unity

But nobody really wants to continuously hold stupid objects up to a camera to test whether their image classification app is working… when you could just play videos in the background to do the virtual equivalent of holding things up to the camera! So I simply connected Unity to my OBS virtual camera output. Basically, I can now grab a window capture from any web browser (or even YouTube) and feed it into any app that takes webcam/camera inputs!

Tensorflow Image Classification Experiment x Music Video to “Sigrid’s Mirror”:

Tensorflow Image Classification Experiment x Music Video to “The Weeknd’s Blinding Lights”:

I am having fun reading the (mis-)classifications, but in any case this is just some ONNX computer vision models used right out of the box, and… IF YOU FEED IN PEANUT PICS, YOU GET PEANUTS!
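For the curious, reading classifications out of an off-the-shelf model like this usually boils down to a softmax over the raw output scores plus a top-k lookup into a label list. A minimal sketch in plain Python (the logits and labels below are made-up stand-ins, not the actual Barracuda/ONNX output):

```python
import math

def softmax(logits):
    """Convert raw model scores into probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def top_k(logits, labels, k=3):
    """Return the k most likely (label, probability) pairs."""
    probs = softmax(logits)
    ranked = sorted(zip(labels, probs), key=lambda lp: lp[1], reverse=True)
    return ranked[:k]

# Hypothetical scores for four classes, as a classifier might emit them.
labels = ["peanut", "banana", "table lamp", "remote control"]
logits = [4.1, 1.2, 0.3, 2.0]

for label, p in top_k(logits, labels, k=3):
    print(f"{label}: {p:.2f}")
```

The (mis-)classifications you see on screen are just whichever label lands at the top of this ranking, frame after frame.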

Other Interesting Things to do with OBS

Easy Scrolling Text Marquee: Add a new Text (FreeType 2) source > right-click it and select Filters > add a Scroll filter > set Horizontal Speed to about 10–30.

Fast Colour Grading: Search for free cinematic LUTs online. Right-click the video capture device > select Filters > add an “Apply LUT” filter.

Lookup tables (LUTs) are just sets of numbers that provide a shortcut for how your colour input will be transformed into the final colour output. The LUT file is either an image like the one below or a .CUBE file (a human-readable file containing a grid of numbers for colour conversion – so it’s basically saving you from having to endlessly tweak curves by hand).

(Tip: if you want to export a specific effect from some other app, you can take the original neutral LUT png, apply any filter to it – whether from instagram or vsco or photoshop – and use the filtered image as your LUT)
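To make the .CUBE idea concrete, here is a minimal sketch (plain Python, nothing OBS-specific) that parses a tiny .CUBE file and looks a colour up with nearest-neighbour indexing. Real colour graders interpolate between the surrounding grid points, but the data layout is the same – in a .CUBE table the red index varies fastest, then green, then blue:

```python
def parse_cube(text):
    """Parse a minimal .CUBE file into (size, list of RGB triples)."""
    size, table = None, []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or line.startswith("TITLE"):
            continue
        if line.startswith("LUT_3D_SIZE"):
            size = int(line.split()[1])
        elif line[0].isdigit() or line[0] == "-":
            table.append(tuple(float(v) for v in line.split()))
    return size, table

def apply_lut(rgb, size, table):
    """Nearest-neighbour lookup; red varies fastest in the table."""
    r, g, b = (min(size - 1, round(c * (size - 1))) for c in rgb)
    return table[r + g * size + b * size * size]

# A 2x2x2 identity LUT: every colour maps to itself, i.e. "no grade".
IDENTITY_CUBE = """\
LUT_3D_SIZE 2
0 0 0
1 0 0
0 1 0
1 1 0
0 0 1
1 0 1
0 1 1
1 1 1
"""

size, table = parse_cube(IDENTITY_CUBE)
print(apply_lut((1.0, 0.0, 1.0), size, table))
```

A “look” is then just a table whose entries have been nudged away from this identity mapping – which is exactly what happens when you run the neutral LUT image through a filter as in the tip above.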

Virtual Classroom Panopticon: A (former) colleague of mine once shared that he had all his students use OBS to stream themselves working on their computers as their camera feeds, so he could see everyone’s work at once without having to get each person to share screen – easily creating a Zoom panopticon in the virtual classroom…


A replay of Y-Lab Presents: Extended Reality is online here:

Y-Lab Presents: Extended Reality
@ Y-Lab Online | 16 July 2021 2pm-5pm
AR. VR. MR. XR. These technologies are everywhere, as what used to be physical is now going virtual or phygital. What are the opportunities and limitations when it comes to working with these technologies as individuals and institutions? Whether you are an artist, a curator, or simply curious – join our diverse panel of speakers to explore the state and the future of XR together. Come find out about the role that you can play in this space. 😉

Agenda
  • Intro to Y-Lab by Kevin, National Gallery of Singapore
  • An Insider’s Look at XR by Lionel, iMMERSiVELY
  • The Evolution and State of XR (Technology Landscape Scan) by Yi Ren, Unity Technologies
  • Reimagining Arts and Culture with XR (Use Cases) by Jervais, National Heritage Board
  • init Virtual Worlds by Debbie, Visual Artist & Technologist
  • Intersections Between Art, Tech and Start-ups by Eugene, Dude Studios
  • Storyliving with Embodied Performances by Toph, Tinker Studio
  • Volumetric Captures and the Afterlife by Shengen, Cult Tech
  • Q&A with Panel
