Green Screen Studio Glitches

A few weeks ago I was involved in a shoot at a green screen studio and I ended up 3D scanning the green screen studio itself for the fun of it. I like the speed with which I can cobble something together and the glitchiness it brings, like a digital version of an abstraction or a ‘painterly stroke’.

I bought a Logitech C920 recently so I could separate my webcam from my Mac’s casing and make funnier shots like these. Feels like what’s been missing in a lot of the images I make is people. Inserting myself into the image is what makes me sit up.

Bread and Butter in a Field of Dreams (Coming July 2021)

This July, I’ll be releasing a free-to-play interactive experience titled “Bread & Butter in a Field of Dreams” for Mac/Win desktop. But actually, you could say that this project originated under a different name – “The Legend of Debbie”…


Do you want to get a reminder when “Bread & Butter in a Field of Dreams” is released for download, or to hear first about Debbie’s upcoming projects? Join Debbie’s newsletter for all DBBD updates!

“The Legend of Debbie” was originally made as a commission for Asian Film Archive’s State of Motion in January 2021, and it was my way of trying to use the archive of my own artwork as the source material for a sprawling game: exploring the different works as strange portals transporting you to weird spatialised versions of the works, and splicing my works with a partially fictionalised narrative (approximately 25% fiction, 75% reality).

The titular “legend” of the work was this directory which sorted my works into many different categories – a map legend. If I’d had more time, I was going to put more symbols all over the place, maybe have a little radar map overhead as well. I also had a lot of fun designing different rooms to represent different works.

I originally wanted to design a LIVE VR experience for “The Legend of Debbie”. Rather than releasing the game (which would have required so much more testing on the development side than running it as a moderated tour), I would run it as a live event (a workshop) where participants could come down in different timeslots to experience the VR game, facilitated by myself…

Imagine how fun it would be rolling through these odd spaces…

But then the Phase 2 Heightened Measures kicked in again, so we couldn’t have live events like this anymore. So… I did not make a VR version of “The Legend of Debbie”. And in any case, there was something that disturbed me about the final presentation of Legend.

IT JUST WASN’T FICTIONAL ENOUGH!!!

I have come to the conclusion that there is no room for nuance. Or maybe I am not very good at nuance (it is something I am working on, but I suspect that nuance does not come easily to me mainly because my real life personality is too excitable and shouty and maybe a bit childlike and overly earnest at heart).

Instead of developing The Legend further, I somehow ended up making a completely new game from scratch – one in which, very deliberately, NONE of the works are shown in the game world in their original form, besides the first room, which replicates the Wikicliki exhibition by the Singapore Art Museum, currently in the Ngee Ann Kongsi Concourse Gallery (Basement) of National Gallery Singapore. The show runs until 11 July 2021.

Since we couldn’t be in the gallery itself for the talk, I re-created the gallery for a talk on 29th May (a conversation between myself and curator Mustafa, with whom I have worked closely over the last few months). Instead of boring slides, I brought the items Mustafa was interested in discussing into the gallery space as various 3D modelled props on a table, including a few laptops handily scrolling through my actual Wikicliki and a spreadsheet of the Here the River Lies cards (many credits to George for painstakingly digitizing them).

From this totally realistic representation of a real exhibition, you eventually get teleported to another world full of objects directly representative of the projects I’ve worked on over the last 10 years – but nothing appears in the original form in which it was made.

In the world of the Field of Dreams, every single artwork I have made in the last 10 years is turned into a transmogrified version of itself – a pop translation of the work which could comfortably exist within a commercially lucrative museum retail shop (a la MOMA shop or NAIISE or any one of those shiny design shops)… or in a dusty basement reading room within an alternative community-based establishment for which there is no lack of heart but financial viability is always a question (such as The Substation’s Random Room).

Somehow, making art is an act of translation for me. I don’t really seem to start by drawing or sketching, but by writing, and then I have to translate that into sketches, and from sketches into whatever digital medium I am working in. And this act of translation seems so arbitrary at times. Many ideas could have turned out differently had I chosen to make them in a different medium. Perhaps this expresses the tension I feel between making work as an artist and work as a designer/design educator (which earns me my living). The art can be poetic and ruminative and open-ended, whereas the design has to fulfill the brief requirements and ultimately has to be functional (and most likely measurable).

So I thought that instead of a Space Geode rendered all in white, I would have a mildly tacky Space Geode Citrus Squeezer; instead of The Library of Pulau Saigon, its various components would be turned into functional items such as a tic-tac-toe set featuring the Chinese Spoon as the noughts and the Political Party Badge as the crosses (something with the potential to be a slightly tacky coffee table centerpiece). My pulsed laser holography work, “War Fronts”, would be rendered instead as a jigsaw set. And instead of my print of 100 of my dreams from my Dream Syntax book, I turned it into a scratch-off chart of the 100 dreams. Because scratch-off maps are all the rage now on everyone’s internet shopping list, aren’t they?

Along the way I er… got a bit too excited, because who needs to write a book when you can just make the cover for the book? I was churning out dozens and dozens of PDF book cover textures to populate the DBBD SHOP.

So, perhaps we can’t quite call this work “The Legend of Debbie 2.0” anymore. Maybe it should be called by the name that seems more appropriate for it now: Bread & Butter in a Field of Dreams.

The work takes its name from a 2013 ACES study by the NAC – apparently the first survey of its kind done on arts and cultural workers to examine how on earth they make their living. I do not know which unnamed arts/cultural worker gave the survey such an evocative name, but here I have made the breads and butters literal, to be collected up before you can gain entry to the next scene.

Special mention also goes to another big survey I participated in not too long ago, which asked artists some very sobering questions about what we thought had advanced or inhibited our artistic careers, with the dropdown list of items that could potentially limit our careers being twice as long as the advancing list. (In an earlier iteration of the study, it suggested that we dig up our past 10 years of tax returns to examine the difference between our art income and non-art income. Me, I almost thought this was like some cruel form of “formative assessment” – “Alright, you got me, I’ve NOT been solely living off my earnings as an artist, and in fact, at times this whole “art” thing is frequently a complete loss-leader operation!”) I have many ambivalent feels about this. On one hand, my desire to make art isn’t about the money, but on the other hand I also do want to fix the current state of affairs…

There’s a maze and some other weird shizz coming up…

The world is still very much a work-in-progress and I look forward to fleshing it out for July’s “workshop” and to releasing it as a free game for download! My goal is a release by July 2021! Methinks I might even do it as a gameplay video – I quite enjoyed this live stream (ticketed as a workshop, but really more like a Twitch stream, with me having set up OBS and all the ridiculous animated overlays and chats).

I also did another detailed breakdown of the time I spent on this last week using Rescuetime. Rescuetime tracks the time I spend in each app, and it is handy in that it splits my time into working hours (defined as 9am-6pm) and non-working hours (6pm-9am), so I can sift out the time I spend on personal projects versus time on my day job. My secret to eking out the time is usually to work for 1-2 hrs after Beano sleeps at night and to wake at about 4-5am to work.

It goes to show that despite working full time and having a time-consuming baby bean (with the help of dingparents dutifully caring for her whilst I work), it is still possible to eke out the time to maintain an active artistic practice if one has the will to do so (and the discipline to wake up early).

It does feel like a culmination of 3D skills I have taken years to acquire:
2014: when I realised how non-existent my 3D design skills were
2016: when I made myself try to make one blender render a day
2017: intentionally producing new works using 3D
2019: intentionally producing new works in Unity (very basic at that stage)
2020: taking the Unity Developer cert at my workplace, supervising more Unity-based projects
2021: being able to build things like this in a week (on top of a separate full-time job)

I’ve seen several GDC talks and devlog videos on YouTube detailing how every successful game dev probably has dozens of “failed games” before they finally make the one game they are happy with – that one breakthrough game. Likewise, I don’t expect Field of Dreams to be perfect on its July 2021 release, but I hope to collect lots and lots of feedback after releasing it so I can improve the experience!


Sucked into the vacuum of a black hole: Unity Shader Graph

One of the effects I’ve liked in games is the “everything being sucked into a black hole vacuum” look. What are the correct words to describe it? Does this effect already have a proper name or keyword? In Virtual Virtual Reality, the conceit is that you’re the human employee pruning and vacuuming some machine’s lovely domestic abode, when suddenly, without warning, it is as if you have done a terrible thing; you’ve accidentally sucked the first layer of the world away!…

Today, I was reminded of it again whilst watching the (ever inspiring) devlog for the indie game Farewell North, so I wanted to figure out how it was made. In Farewell North, it seems like it is being used in the playback scene for memories; the visual effect of being sucked back into the projector is exactly what makes them feel like ethereal memories being played back.

This evening I spent a while trying to figure it out. The answer seems to be Unity’s Shader Graph (which I’ve actually never properly used before, but it reminds me of Blender’s shader nodes, so I guess I roughly get the node-based system). I looked around for examples and explanations of how it was created, and I am glad to say that with all the power of the internets and online resources, it was indeed possible for me to understand how one can recreate the “sucked into a vacuum” effect. Lerp refers to linear interpolation: the value changes from a to b over t. There’s a Vector3 property to set the origin point of the “BlackHole”, i.e. where everything will be sucked into / spat out of. And then there is a slider property for the “Effect” (a bit like “time” in this case), which can be helpfully tweaked in the Inspector for testing purposes. “Range” is a fixed value. There’s obviously a lot more I can experiment with in Shader Graph. But for now… a working example of the “Sucked-into-a-blackhole-vacuum” Shader Graph looks like this:

My basic version of the “Sucked-into-a-blackhole-vacuum” look…
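
If node graphs don’t stick in your head (they don’t always in mine), the same maths is easy to read as code. Here’s a minimal CPU-side sketch in C# of the identical Lerp logic – the script name and structure are my own invention, not what Shader Graph generates, and the real effect runs per-vertex on the GPU:

```csharp
using UnityEngine;

// A CPU-side sketch of the "sucked into a black hole" idea, assuming a
// MeshFilter on the same GameObject. Mirrors the Shader Graph's Lerp node
// logic: every vertex is interpolated towards the black hole point.
[RequireComponent(typeof(MeshFilter))]
public class BlackHoleSuck : MonoBehaviour
{
    public Vector3 blackHole = Vector3.zero; // where everything gets sucked into
    [Range(0f, 1f)] public float effect;     // 0 = untouched, 1 = fully collapsed

    private Mesh mesh;
    private Vector3[] original;

    void Start()
    {
        // Work on an instance so we don't permanently deform the shared mesh asset.
        mesh = GetComponent<MeshFilter>().mesh;
        original = mesh.vertices;
    }

    void Update()
    {
        var verts = new Vector3[original.Length];
        for (int i = 0; i < original.Length; i++)
        {
            // Same maths as the graph: lerp each vertex from its original
            // position (a) towards the black hole point (b) over "effect" (t).
            Vector3 worldPos = transform.TransformPoint(original[i]);
            Vector3 sucked = Vector3.Lerp(worldPos, blackHole, effect);
            verts[i] = transform.InverseTransformPoint(sucked);
        }
        mesh.vertices = verts;
        mesh.RecalculateBounds();
    }
}
```

Leaving “effect” at 0 keeps the mesh untouched, and sliding it towards 1 collapses every vertex into the black hole point – exactly like dragging the Effect slider in the Inspector.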

Imagine my Spheres and Cubes teleporting endlessly from this world to another world and then back again – oh wait now you don’t even have to imagine it, here’s an actual visual representation of it!

Unity + Oculus Integration on Mac


How long does it take to create a test project for the Oculus Quest on Mac with Unity3D? Well, fortunately it does not take very much time at all, although it will take a whole lot more time to make something decent and of interest. But if one just wants to connect up all the equipment, it is pretty fast!

According to Rescuetime I spent 33 minutes in total in Unity in order to complete these steps on my 15″ Macbook Pro 2019, including all the downloading and importing. The writing of this documentation is probably taking far longer.

The Chinese New Year weekend is too short! I want to spend maximum fun time with the Bean, and get some catching-up-with-sleep time, but I also want to learn how to make something for Oculus using my Mac alone?? I found several posts online claiming to be able to teach you to set up your Oculus device in 10 minutes. HA, I suppose they definitely didn’t use a Mac for these speed runs (my Mac has now decided that its new calling is to mimic the hideous sound of an airplane taking off). Still, I persist in valuing the Retina display + portability over practicality and doing everything on my Mac. Will I be forced to retreat back to using a PC again after much frustration? Let’s find out!

The Oculus is a type of Android device, so you have to check that Android Build Support is installed for the version of Unity you’re using. I just created a 3D project in whatever version of Unity I happened to be using.

The Unity Asset Store has this default “Oculus Integration” package. Whilst waiting for it to download, I saw there were so many different integration packages out there for VR and more. I actually got lost browsing the rather interesting-sounding “Tools/Integrations” category on the Asset Store. Which ones do the most interesting things? NO CLUE. I guess I will just try Oculus Integration first before actually trying the others.

There are several updates and Unity will need to restart, after which there will be some new menu items for Oculus, like this:

Under Edit > Project Settings > Player > XR Settings, “Virtual Reality Supported” should be checked.

Another step I would add is to preemptively remove the Vulkan Graphics API, because if you don’t, it will throw up an error about XR being incompatible with Vulkan. (Alternatively, I suppose one could go into the OVR scripts which are stopping the build, find the lines where they check for Vulkan, and comment out the checks?)
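
For the record, the same removal can be scripted instead of clicked through Player Settings. This is a sketch of an editor menu item – the menu path and class name are hypothetical, but PlayerSettings.SetGraphicsAPIs is the real API behind that graphics API list:

```csharp
using UnityEditor;
using UnityEngine.Rendering;

// Hypothetical editor helper: strips Vulkan from the Android graphics API list,
// equivalent to unticking "Auto Graphics API" and deleting Vulkan by hand
// in Player Settings.
public static class RemoveVulkan
{
    [MenuItem("Tools/Remove Vulkan From Android APIs")]
    public static void Strip()
    {
        // Disable automatic API selection first, otherwise our list is ignored.
        PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.Android, false);
        // Keep only OpenGLES3 (fine for the Quest on this Unity version).
        PlayerSettings.SetGraphicsAPIs(BuildTarget.Android,
            new[] { GraphicsDeviceType.OpenGLES3 });
    }
}
```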

So I also went to read up on Vulkan Graphics API and what it does – the internet says: “Until now, the mobile graphical interface has been using the OpenGL platform. While the platform was suitable for intense mobile applications like gaming and photography five years ago, the old platform isn’t enough to handle today’s AR/VR intensive applications. It is also not possible to pack in massive hardware in a restrained form factor for running intensive mobile applications. The Vulkan API was developed by Khronos to ensure improved graphical performance with lesser resource usage. The new API has been built from scratch for rendering console quality graphics on existing mobile hardware. What that means is you will be able to enjoy the PC-like graphics on your high-end smartphone”.

ALRIGHT, BUT WE WON’T USE IT.

OH WAIT, AS OF 1 FEB 2021 the internets say that Unity now supports Vulkan for Oculus Quest? ¯\_(ツ)_/¯

Ok whatevers. At this point I just removed Vulkan for the time being so I can continue.

Next is to create a developer account and app ID. Now, I definitely have mixed feelings about the Facebook integration, which means I have to take several precautions regarding privacy. If I were buying a VR headset only as a casual user, the issue of users being forced to log in with a Facebook account would make me reconsider getting this device. However, the reason I’ve gotten a Quest 2 is portability in VR development. Consider all the factors on your own before getting a VR headset device!

Go to http://dashboard.oculus.com/ in a different browser and set up the Developer account.

Connect the Quest to the Mac with the USB cable. Under Build Settings > Android, the Quest should now be available as a device.

Build & Run, and when it’s done, you can put on the headset and it will start to load your scene. I probably could have used the prefabs to make a scene, but there were some demo scenes that came with the package, so I just loaded one of those first.
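
Incidentally, Build & Run can also be scripted if you end up doing it over and over. A sketch with a placeholder scene path and output name – BuildOptions.AutoRunPlayer is what makes it deploy straight to the connected device:

```csharp
using UnityEditor;

// Hypothetical one-click build script, equivalent to File > Build & Run with
// Android as the target: builds an APK and installs/runs it on the connected
// Quest. Scene path and output location are placeholders for illustration.
public static class QuestBuild
{
    [MenuItem("Tools/Build And Run On Quest")]
    public static void BuildAndRun()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/SampleScene.unity" }, // placeholder scene
            locationPathName = "Builds/quest-test.apk",           // placeholder output
            target = BuildTarget.Android,
            options = BuildOptions.AutoRunPlayer // deploy to the connected device
        };
        BuildPipeline.BuildPlayer(options);
    }
}
```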

Wahooey a demo scene!


NEXT STEPS?

Tilt Brush? Building Tilt Brush, which has gone open-source?

How does I workflow???: Workflows: The process flows you should follow

How to screenshot on Oculus Quest 2? Press the Oculus Button + any trigger button. The app needs to have permission to save to storage beforehand.

Where do the screenshots on Oculus Quest 2 go to? Turns out that the Quest is a kind of Android device so on a Mac you have to download Android File Transfer and find screenshots under Oculus > Screenshots.

Blender & Unity: Manually Rigging Blender Humanoid Characters for use with Unity Mecanim


I’m definitely no character animator by trade, but there comes a time when you end up with a Unity project that somehow requires it. There are obviously many automatic rigging methods available (Blender does actually have an auto-rigging system called Rigify for biped humanoids), and you could even try to download rigs made by other people and plonk them into your scene, but I found that so many of the rigs, including the Rigify one, seem to involve so many complicated bones you don’t need, so you end up having to sift through the bones, deleting so many unwanted bones, renaming bones, perhaps even coming away with the impression of the impossibility of rigging up them bones.

Although it may seem terrifying at the beginning (I’m not an animator or rigging specialist!), I found that surprisingly, it is not that difficult to manually rig up all your bones if what you have is a very simple humanoid character. You just need to be orderly and to stick with the admittedly tedious bone naming process. (Although our character is blobby, we’re sticking with a humanoid as we’re going to use it with the Kinect to sync it with the movement of the human user, and our human user is going to return a humanoid set of values that we’ll need to rig up our character to…)

According to the Unity Blog’s post on Mecanim Humanoid:

“The skeleton rig must respect a standard hierarchy to be compatible with our Humanoid Rig. The skeleton may have any number of in-between bones between humanoid bones, but it must respect the following pattern:”
Hips – Upper Leg – Lower Leg – Foot – Toes
Hips – Spine – Chest – Neck – Head
Chest – Shoulder – Arm – Forearm – Hand
Hand – Proximal – Intermediate – Distal

This is the list of all the bones you need (I found it useful to copy and paste these names in directly):

head
neck
collarbone.L
collarbone.R
upperArm.L
upperArm.R
lowerArm.L
lowerArm.R
hand.L
hand.R
chest
abdomen
hips
upperLeg.L
upperLeg.R
lowerLeg.L
lowerLeg.R
foot.L
foot.R
toes.L
toes.R

Optional: eye.L and eye.R
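
Why be so fussy about the names and hierarchy? Because once Unity maps the rig as Humanoid, every bone becomes addressable through the HumanBodyBones enum – which is also what a Kinect-driven script would query to puppet the character. Here’s a little sketch of my own (the script name and the logging idea are mine, not from any official sample) for checking which humanoid bones actually got mapped after import:

```csharp
using UnityEngine;

// Logs every Mecanim humanoid bone and the rig transform it mapped to.
// A null mapping usually means a mis-named or missing bone back in Blender.
[RequireComponent(typeof(Animator))]
public class HumanoidBoneCheck : MonoBehaviour
{
    void Start()
    {
        var animator = GetComponent<Animator>();
        foreach (HumanBodyBones bone in System.Enum.GetValues(typeof(HumanBodyBones)))
        {
            if (bone == HumanBodyBones.LastBone) continue; // sentinel, not a real bone
            Transform t = animator.GetBoneTransform(bone);
            Debug.Log(bone + (t ? " -> " + t.name : " (not mapped)"));
        }
    }
}
```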

For starters: Ensure that your character model is positioned at origin and that its pivot point is also at origin (0,0,0). Make sure you reset the scale to 1 just in case (Ctrl+A, Select Scale). The hip bone is the key bone in all this, so start by creating one big bone going from the bottom of the hip to the top of the chest. Hit Space, start typing “Subdivide Multi” (Armature), and give it 2 cuts so you get 3 bones. These will form the hips, abdomen and chest bones.

After you’ve done the main spine bones, you can turn on x-axis mirror.

– Select the ball on top of the bottom bone (the hips bone). Make sure Options > Armature Options > X-Axis Mirror is selected, then press Shift-E to extrude mirrored bones. When you’re in mirror mode, every time you create a new bone you’ll get a second one mirrored on the other side of the X-axis. Remember that you’ll have to rename BOTH bones later on – if you are facing your model face-on, also remember that L is actually to the right and R is to the left, and name them accordingly.

– Arrange the leg bone into position (you may need to uncheck “Connected” in order to let the leg bone go into the right position). Reposition the leg bones away from the hip. Subdivide Multi (1 cut) this leg bone into two bones, forming upperLeg and lowerLeg.

– Shift-E to extrude two more foot and toe bones, and also add in the collarbone, arm and neck+head bones. Do make sure you keep it all in a standing T-pose (as if the character is standing in the shape of the letter T).

– Ensure that all of your bones are renamed correctly as per the list. If there is an L bone there must always be an R bone.

– Go into Object Mode, select first the character and then Shift-select the armature. Press Ctrl+P and select Set Parent To – Armature Deform – With Automatic Weights. Your computer might lag for a second before it’s all connected up.

From there, you’re in the home stretch. Export your Blender model in FBX format and import it into Unity; in Unity, set the rig to Humanoid (instead of Generic) and hit Apply at the bottom.
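
If you end up re-exporting the FBX many times, that rig-to-Humanoid click can also be automated with an AssetPostprocessor. A sketch, assuming (hypothetically) that your character FBXs live somewhere with “Characters” in the path:

```csharp
using UnityEditor;

// Optional automation sketch: flips imported models to the Humanoid rig so you
// don't have to set Rig > Animation Type > Humanoid > Apply on every re-export.
// The "Characters" path filter is a placeholder.
public class HumanoidImportSettings : AssetPostprocessor
{
    void OnPreprocessModel()
    {
        if (!assetPath.Contains("Characters")) return; // only our character FBXs
        var importer = (ModelImporter)assetImporter;
        importer.animationType = ModelImporterAnimationType.Human;
    }
}
```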

Let the wild rigging begin!

See also:
Animate Anything with Mecanim