Explaining what art and interaction design are to 2-5 year olds

It was “Occupations” day at Bean’s childcare, where they had asked if parents could come in and share about their jobs. You know, maybe wear their daily uniforms and bring down tools or equipment to show the children. I noticed they had prepared a wall full of common occupations: firefighter, nurse, truck driver, teacher…

Someone’s gotta represent the hybrid-techy-hyphenated jobs of the future, so I prepared a little introduction to myself and tried to explain that you can make art on computers. The setup I brought down (which seems suitable for all ages, including adults, actually) involved Bare Conductive’s Touch Board, a mini speaker, crocodile clips, and a bunch of bananas and spoons. This was my deck:

Some Observations:
  • Upon seeing bananas, children often want to eat the bananas.
  • Some children examine the connection points and want to disconnect and reconnect the bananas.
  • The word “touch” is quite ambiguous and does not define how you touch a banana. Do you whack the banana? Do you squeeze the banana? Do you tap the banana lightly? Do you rest your entire hand on the banana?

OBS Live-streaming, Unity+Tensorflow Experiments, & “Y-Lab presents: Extended Reality”

Recently I have discovered the many uses of OBS (Open Broadcaster Software), free, open-source software for all your live video needs!

When I first thought of doing some livestream recordings as a way to present new digital works, I didn’t realise it meant I actually needed to up my studio game with lights and proper mics, and it now seems I might need a better webcam further down the line. The first thing I shot involved a webcam image of myself that was way too dark. The third time I did a livestream, the dingparents invited me to come over and shoot my things at theirs when the hacking noises in my block were too much to bear, so I lugged over my rickety portable setup… which looks like this behind the scenes:

“AMATEUR HOUR” Streaming Setup

I must confess it is still a bit like amateur hour here. Somehow the live event recording I made was still not at its best: there were a few bitrate issues I haven’t quite sorted out, and I don’t know why I felt that when things are more chaotic my “performance” seems… better? Since I don’t have a fixed workspace at home (in the daytime I work from the living room; at night, whilst Beano sleeps, I work in the bedroom watching over her), my setup has to remain quite mobile and portable. It feels like if I’m serious about doing livestreams as a format, then I actually have to set up a more dedicated space in the house for all my ‘streaming’ shenanigans…

The Bread & Butter Stream

This week I also participated in my first professionally managed livestream, and yes, until this point I thought that everyone just had to live through the awfulness of juggling countless screens: it hadn’t occurred to me to outsource it to someone else. I do not have multiple monitors, so when I attempt to stream it means managing a half-dozen tiny windows on screen and constantly switching between fullscreen modes (and having a niggling feeling that when windows are not maximised or are arbitrarily resized, I must be totally messing up all the screen proportions / resolutions / bitrates / framerates, and yes, all of those are very specific, important things which can indeed screw up the livestream output).

The Extended Reality Panel on 16 July 2021

Oh, so did you think you would watch a few videos on how to become a Twitch streamer and how to use OBS, and then WING IT ALL? Did you ever consider setting aside budget for your livestreamed productions? Well then, get thee a stream manager to juggle all your windows! (* ok, unless you have no budget, in which case… you have no choice but to shuffle on back and handle all your window management duties by yourself…)

Chroma Key in OBS with a green IKEA blanket and a table lamp pointed at my face? Either my teeth disappear or my blouse goes…

So…. I have been slowly figuring out how to use OBS. For example, don’t want to take over everyone’s screens with a screenshare during a Zoom call? Why not screenshare within your camera feed like I do here, except that my chroma key filter setting was slightly borked (lighting? greenscreen issue?) and everyone asks me why I am looking “a little transparent” today. -_-

Another fun thing I realised about OBS is that now anything at all on my screen can be my camera. Yes, so I can just put a YouTube video on and feed it into whatever nonsensical build I am experimenting with at the moment.

This week I was playing around with Unity Barracuda + Tensorflow, which resulted in these test images where I ended up doing the classic ridiculous “let’s do a tour of all the random things on my table/floor”:

Image Classification with Tensorflow in Unity

But actually, nobody really wants to have to continuously hold stupid objects up to a camera to test whether their image classification app is working… when you could just play videos in the background to do the virtual equivalent of holding things up to the camera! So I simply connected Unity to my OBS virtual camera output. Basically, now I can just grab a window capture from any web browser (or even YouTube) and feed it into any app that takes webcam/camera inputs!

Tensorflow Image Classification Experiment x Music Video to “Sigrid’s Mirror”:

Tensorflow Image Classification Experiment x Music Video to “The Weeknd’s Blinding Lights”:

I am having fun reading the (mis-)classifications but in any case this is just using some ONNX computer vision models right out of the box and… IF YOU FEED IN PEANUT PICS, YOU GET PEANUTS!
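If you want to try the same trick outside of Unity, here is a minimal Python sketch of the idea: read frames from the OBS Virtual Camera like any other webcam and run them through an off-the-shelf ONNX classifier. To be clear, this is not my Barracuda code, just the same concept in Python; the device index, “model.onnx” and “labels.txt” are placeholders, and some models also expect extra mean/std normalisation.

    # Minimal sketch: treat the OBS Virtual Camera as a webcam and classify each frame
    # with an off-the-shelf ONNX image classification model (placeholders: model.onnx, labels.txt).
    import cv2
    import numpy as np
    import onnxruntime as ort

    session = ort.InferenceSession("model.onnx")      # e.g. an ImageNet classifier exported to ONNX
    labels = open("labels.txt").read().splitlines()   # one class name per line
    input_name = session.get_inputs()[0].name

    cap = cv2.VideoCapture(1)  # the OBS Virtual Camera usually shows up as an extra device index

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Typical ImageNet-style preprocessing: resize, BGR->RGB, scale to [0,1], NCHW layout
        img = cv2.resize(frame, (224, 224))
        img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB).astype(np.float32) / 255.0
        img = np.transpose(img, (2, 0, 1))[np.newaxis, :]

        scores = session.run(None, {input_name: img})[0][0]
        top = int(np.argmax(scores))
        cv2.putText(frame, labels[top], (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
        cv2.imshow("what does the model see?", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()

Point a window capture at a music video, start the virtual camera, and you get the same endless stream of (mis-)classifications without having to hold a single peanut up to the lens.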

Other Interesting Things to do with OBS

Easy Scrolling Text Marquee: Add a Text (FreeType 2) source > right-click it and select Filters > add a Scroll filter > set Horizontal Speed to about 10-30.
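If you would rather script that recipe than click through the Filters dialog, here is a hedged obspython sketch that does the same thing from Tools > Scripts inside OBS. “Ticker” is an assumed source name, and the internal “scroll_filter” id and “speed_x” setting are my best recollection of the obs-filters module, so double-check them against your OBS version.

    # Hedged sketch: attach a Scroll filter to an existing Text (FreeType 2) source from a script.
    # Assumptions: a text source named "Ticker" exists; the filter id "scroll_filter" and the
    # "speed_x" setting match your OBS version.
    import obspython as obs

    source = obs.obs_get_source_by_name("Ticker")
    if source is not None:
        settings = obs.obs_data_create()
        obs.obs_data_set_double(settings, "speed_x", 20.0)   # roughly the 10-30 range from the tip above
        scroll = obs.obs_source_create_private("scroll_filter", "Scroll", settings)
        obs.obs_source_filter_add(source, scroll)
        obs.obs_data_release(settings)
        obs.obs_source_release(scroll)
        obs.obs_source_release(source)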

Fast Colour Grading: Search for free cinematic LUTs online. Right-click the video capture device > select Filters > add an Apply LUT filter.

Lookup tables, or LUTs, are just a set of numbers that provide a shortcut for how your colour input will be transformed into the final colour output. The LUT file comes either as an image like the one below or as a .CUBE file (a human-readable file containing the set of numbers used for the colour conversion, so it’s basically saving you from having to endlessly tweak curves yourself).

(Tip: if you want to export a specific effect from some other app, you can take the original default PNG, apply any filter to it – whether from Instagram, VSCO, or Photoshop – and use the new image as your LUT.)
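To make the “just a set of numbers” point concrete, here is a tiny Python sketch of what applying a 3D LUT boils down to: the input colour picks a position in a cube of stored output colours. The random cube and nearest-neighbour lookup are deliberate simplifications; a real .CUBE file lists its values in a defined order, and tools like OBS interpolate between neighbouring entries.

    # Toy illustration of a 3D LUT: a (size x size x size x 3) cube of output colours,
    # indexed by the input colour. Real LUTs are parsed from .CUBE files and interpolated.
    import numpy as np

    lut_size = 33                                            # .CUBE files often use 17, 33 or 65 points per axis
    lut = np.random.rand(lut_size, lut_size, lut_size, 3)    # stand-in for values read from a .CUBE file

    def apply_lut(pixel_rgb, lut):
        """pixel_rgb: floats in [0, 1]; returns the LUT's output colour for that input."""
        n = lut.shape[0]
        idx = np.clip(np.round(np.array(pixel_rgb) * (n - 1)), 0, n - 1).astype(int)
        return lut[idx[0], idx[1], idx[2]]

    print(apply_lut((0.5, 0.2, 0.8), lut))   # one mid-purple pixel in, one graded colour out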

Virtual Classroom Panopticon: A (former) colleague of mine once shared that he got all his students to use OBS to stream themselves working on their computers as their camera feeds, so he could see all their work at once without having to get everyone to share their screens, easily creating a Zoom panopticon in the virtual classroom…


A replay of Y-Lab Presents: Extended Reality is online here:

Y-Lab Presents: Extended Reality
@ Y-Lab Online | 16 July 2021 2pm-5pm
AR. VR. MR. XR. These technologies are everywhere, as what used to be physical is now going virtual or phygital. What are the opportunities and limitations when it comes to working with these technologies as individuals and institutions? Whether you are an artist, a curator, or simply curious – join our diverse panel of speakers to explore the state and the future of XR together. Come find out about the role that you can play in this space. 😉

Agenda
  • Intro to Y-Lab by Kevin, National Gallery of Singapore
  • An Insider’s Look at XR by Lionel, iMMERSiVELY
  • The Evolution and State of XR (Technology Landscape Scan) by Yi Ren, Unity Technologies
  • Reimagining Arts and Culture with XR (Use Cases) by Jervais, National Heritage Board
  • init Virtual Worlds by Debbie, Visual Artist & Technologist
  • Intersections Between Art, Tech and Start-ups by Eugene, Dude Studios
  • Storyliving with Embodied Performances by Toph, Tinker Studio
  • Volumetric Captures and the Afterlife by Shengen, Cult Tech
  • Q&A with Panel

An Apocalyptic City in Blender: Lazy Speedrun

I was watching another lazy tutorial and had the impulse to try it out for myself. So here is a speedrun of an apocalyptic city: one basic building multiplied many times. No need for elaborate post-processing stacks or piles of artfully arranged rubble; this is the MVP (minimum viable product), shipped to you in 20 minutes (or less, in the case of this slightly sped-up video…)
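If you wanted the “one basic building multiplied many times” trick as a script rather than a viewport session, a rough bpy sketch might look like this. It is not the exact scene from the video (the counts and offsets here are made-up numbers, and in the video the layout is done by hand):

    # Rough sketch, run from Blender's Scripting workspace: one stretched cube becomes a city grid
    # by stacking two Array modifiers (X and Y). Counts and offsets are arbitrary placeholder values.
    import bpy

    bpy.ops.mesh.primitive_cube_add(size=2, location=(0, 0, 4))
    building = bpy.context.active_object
    building.scale = (2, 2, 4)   # one basic "building"

    for name, axis, count in (("ArrayX", 0, 12), ("ArrayY", 1, 12)):
        mod = building.modifiers.new(name, 'ARRAY')
        mod.count = count
        mod.use_relative_offset = True
        offset = [0.0, 0.0, 0.0]
        offset[axis] = 1.8       # leave a gap between buildings
        mod.relative_offset_displace = offset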

I think that me making these sorts of videos is the present-day equivalent of the digressions I embark on whenever I have an exam or important project to complete; instead of doing the work, I suddenly get all sorts of ideas for ridiculous things, like making more screencasts of myself doing something in Blender.

For years I’ve watched countless online tutorials on YouTube, many of which were set to some generic, vaguely inspirational electronic music (I confess that I have playlists full of things like the YouTube-tutorial classic, Tobu’s Candyland, and other NCS Releases), so I took great joy in choosing the royalty-free background sounds for this one.

People, the keyword for this type of tutorial background music is “CORPORATE TECHNOLOGY”. Don’t go for CINEMATIC or DRAMATIC or INSPIRATIONAL, as there is a chance it might end up too teenage-over-the-top self-aggrandising. As it turns out “CORPORATE” plus “TECHNOLOGY” usually results in something blandly aspirational and futuristic.

A Shopfront in Blender and Unity: Lazy Speedrun using UV Project from View

After encountering Ian Hubert’s World Building video (how did I not see this until now?), I had an epiphany about a different way of making in Blender, besides photogrammatising things or modelling everything from scratch. For many years I had actively avoided trying to understand UV mapping because I considered it too time-consuming, and, like he mentions, the classic example is this 3D person whose unwrapped face is the stuff of nightmares:

HA, I have surely attempted to create and unwrap a face like this countless times, only to horribly botch it and create something unintentionally horrific (and not even horrific in an interesting way).

Every time this happened, I simply accepted it to mean that I was not likely to make it as a character designer or a UV mapping specialist in this life… I mean, you gotta pick your battles. But every time I saw this map, it was like a symbol of all the UV mapping I would never learn to do, because I AIN’T GOT THE TIME TO DO IT…

So UV Project from View is an absolute game changer. I actually used it previously to make some pieces of bread (for the titular BREAD in the BREAD AND BUTTER game I am working on), but I hadn’t connected the dots to the possibilities… till I saw this…

As a trial run, I did a screen recording of myself speedrunning a shopfront in Blender and importing it into Unity, which took 14 minutes in real time (including a lot of hemming and hawing and undoing). In fact, editing the video you see here in iMovie took way longer, at 40 minutes (according to RescueTime), including video exporting and uploading time.
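For the curious, the one step that makes this lazy technique work can also be scripted. Below is a hedged bpy sketch of UV Project from View; in the video I just do it by hand (Edit Mode > U > Project from View), and the script assumes the shopfront mesh is the active object, a 3D Viewport is open, and you are on Blender 3.2+ for temp_override.

    # Hedged sketch: line the 3D view up with your reference photo first, then flatten the UVs
    # straight from that viewpoint. Assumes the mesh is the active object and a 3D Viewport exists.
    import bpy

    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.select_all(action='SELECT')

    for area in bpy.context.window.screen.areas:
        if area.type == 'VIEW_3D':
            region = next(r for r in area.regions if r.type == 'WINDOW')
            with bpy.context.temp_override(area=area, region=region):
                # Project the selected faces' UVs from the current view
                bpy.ops.uv.project_from_view(camera_bounds=False, correct_aspect=True,
                                             scale_to_bounds=True)
            break

    bpy.ops.object.mode_set(mode='OBJECT')

After that, the reference photo just goes into an Image Texture node on the material, and the projected UVs line everything up with the picture.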

The image I am using is a picture of Hiep Phat on Walworth Road. Yes, I know it is not even in Stoke Newington, just another shop found via the keyword “Church Street Stoke Newington”, but sometimes you just need a little hook to get you started. The image can be found on Flickr, from the user Emily Webber, and is shared under a CC BY-NC-SA 2.0 licence.

Ironically, yes, I have titled this a SPEED RUN using a LAZY technique, because the point is that I ain’t got time to do it the complicated unwrapping way! I’m not sorry that I didn’t even fully unwrap the ground (the pavement in front of the shop), because even without the ground properly unwrapped it kinda passes muster!

The resulting shopfront is very acceptable, and definitely usable as a game asset that you might only glance at from a distance.