Explaining what art and interaction design are to 2-5 year olds

It was “Occupations” day at Bean’s childcare where they had asked if parents could come in and share about their jobs. You know, maybe wear their daily uniform and maybe bring down tools or equipment to show the children. I noticed they had prepared a wall full of common occupations, like being a firefighter, a nurse, a truck driver, a teacher….

Someone’s gotta represent the hybrid-techy-hyphenated jobs of the future, so I prepared a little introduction to myself… brought down a touchboard, many bananas and spoons, and tried to explain that you could make art on computers. The setup (which seems suitable for all ages, including adults actually) involved Bare Conductive’s touchboard, a mini speaker, crocodile clips, and a bunch of bananas and spoons. This was my deck:

Some Observations:
  • Upon seeing bananas, children often want to eat the bananas.
  • Some children examine the connection points and want to disconnect and reconnect the bananas.
  • The word “touch” is quite ambiguous and does not define how you touch a banana. Do you whack the banana? Do you squeeze the banana? Do you tap the banana lightly? Do you rest your entire hand on the banana?

OBS Live-streaming, Unity+Tensorflow Experiments, & “Y-Lab presents: Extended Reality”

Recently I have discovered the many uses of OBS (Open Broadcaster Software), a free, open-source program for all your live video needs!

When I first thought of doing some livestream recordings as a way to present new digital works, I didn’t realise it meant I actually needed to up my studio game with lights and proper mics – and now it seems I might potentially need to get a better web camera further down the line. The first thing I shot involved a webcam image of myself that was way too dark. On the 3rd occasion when I did a livestream, the dingparents invited me to come over and shoot my things at theirs when the hacking noises in my block were too much to bear, so I lugged over my rickety portable setup… which looks like this behind the scenes:

“AMATEUR HOUR” Streaming Setup

I must confess it is still a bit like amateur hour here. Somehow the live event recording I made was still not at its best – there were a few bitrate issues I haven’t quite sorted out, and I don’t know why I feel that when things are more chaotic my “performance” seems… better? Since I don’t have a fixed workspace at home (in the daytime I work from the living room; at night, whilst Beano sleeps, I work in the bedroom watching over her), my setup has to remain quite mobile and portable. It feels like if I’m serious about doing livestreams as a format, then I actually have to set up a more dedicated space in the house for all my ‘streaming’ shenanigans…

The Bread & Butter Stream

This week I also participated in my first professionally managed livestream, and yes – until this point I thought that everyone just had to live through the awfulness of juggling countless screens; it hadn’t occurred to me that you could outsource it to someone else. I do not have multiple monitors, so when I attempt to stream it means managing a half-dozen tiny windows on screen and constantly switching between fullscreen modes (and having a niggling feeling that when windows are not maximised or are arbitrarily resized, I must be totally messing up all the screen proportions / resolutions / bitrates / framerates – and yes, all those are very specific, important things which can indeed screw up the livestream output).

The Extended Reality Panel on 16 July 2021

Oh, so did you think you would watch a few videos on how to become a twitch streamer and how to use OBS and would WING IT ALL? Did you ever consider setting aside budget for your livestreamed productions? Well then, get thee a stream manager to juggle all your windows! (* ok unless you have no budget in which case… you have no choice but to shuffle on back and handle all your window management duties by yourself…)

Chroma Key in OBS with a green ikea blanket and table lamp pointed at my face? Either my teeth disappear or my blouse goes…

So…. I have been slowly figuring out how to use OBS. For example: don’t want to take over everyone’s screens with a screenshare during a Zoom call? Why not screenshare within your camera feed like I do here – except that my chroma key filter setting was slightly borked (lighting? greenscreen issue?) and everyone asked me why I was looking “a little transparent” that day. -_-

Another fun thing I realised about OBS is that now anything at all on my screen can be my camera. Yes, so I can just put a youtube video on and feed it into whatever nonsensical build I am experimenting with at the moment.

In this case, this week I was playing around with Unity Barracuda + Tensorflow, which resulted in these test images where I ended up with the classic ridiculous “let’s do a tour of all the random things on my table/floor”:

Image Classification with Tensorflow in Unity

But actually, nobody really wants to continuously hold stupid objects up to a camera to test whether their image classification app is working… when you could just play videos in the background to do the virtual equivalent of holding things up to the camera! So I simply connected my OBS virtual camera output to Unity. Basically now I can just grab a window capture from any web browser (or even youtube) and feed it into any app that takes webcam/camera inputs!
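
(Side note: since the OBS Virtual Camera shows up as just another camera device, even a few lines of Python can grab it. Here is a minimal sketch using OpenCV – the device index is an assumption, as it varies from machine to machine:)

import cv2

# The OBS Virtual Camera appears as an ordinary camera device.
# Index 0 is usually the built-in webcam, so 1 is only a guess - adjust as needed.
CAM_INDEX = 1

cap = cv2.VideoCapture(CAM_INDEX)
if not cap.isOpened():
    raise RuntimeError("Could not open the camera - is the OBS Virtual Camera started?")

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # 'frame' now contains whatever OBS is outputting (e.g. a window
    # capture of a youtube video), ready for any vision pipeline.
    cv2.imshow("OBS Virtual Camera feed", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()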

Tensorflow Image Classification Experiment x Music Video to “Sigrid’s Mirror”:

Tensorflow Image Classification Experiment x Music Video to “The Weeknd’s Blinding Lights”:

I am having fun reading the (mis-)classifications but in any case this is just using some ONNX computer vision models right out of the box and… IF YOU FEED IN PEANUT PICS, YOU GET PEANUTS!

Other Interesting Things to do with OBS

Easy Scrolling Text Marquee: Add New Text (Freetype 2) > Right-click and select Filters > Add Scroll > Set Horizontal Scroll to about 10-30.

Fast Colour Grading: Search for free cinematic LUTs online. Right-click the video capture device > select Filters > Add LUT.

Lookup Tables, or LUTs, are just a set of numbers that provide a shortcut for how your colour input will be transformed into the final colour output. The LUT file comes either in the form of an image like the one below or as a .CUBE file (a human-readable file with a set of numbers for colour conversion use, so it’s basically saving you from having to endlessly tweak curves).

(Tip: if you want to export a specific effect from some other app, you can take this original default png and apply any filter to this image – whether from instagram or vsco or photoshop – and use the new image as your LUT)
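
(And if you are wondering what “just a set of numbers” means in practice, here is a toy Python sketch of the core idea – a 1D version with made-up values; a real .CUBE file does the same thing with one RGB entry per point of a 3D lattice over the colour cube:)

# A toy 1D LUT: five entries covering inputs from 0.0 to 1.0.
lut = [0.0, 0.1, 0.35, 0.7, 1.0]

def apply_lut(value, table):
    """Look up an input in [0, 1], blending between the two nearest entries."""
    pos = value * (len(table) - 1)   # position in table space
    lo = int(pos)
    hi = min(lo + 1, len(table) - 1)
    t = pos - lo                     # fractional part, used for blending
    return table[lo] * (1 - t) + table[hi] * t

print(apply_lut(0.5, lut))  # 0.35 - this particular LUT crushes the midtones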

Virtual Classroom Panopticon: A (former) colleague of mine once shared that he got all his students to use OBS to stream themselves working on their computers as their camera feed, so he could see all their work at once without having to get everyone to share screen, easily creating a Zoom-panopticon in the virtual classroom…


A replay of Y-Lab Presents: Extended Reality is online here:

Y-Lab Presents: Extended Reality
@ Y-Lab Online | 16 July 2021 2pm-5pm
AR. VR. MR. XR. These technologies are everywhere, as what used to be physical is now going virtual or phygital. What are the opportunities and limitations when it comes to working with these technologies as individuals and institutions? Whether you are an artist, a curator, or simply curious – join our diverse panel of speakers to explore the state and the future of XR together. Come find out about the role that you can play in this space. 😉

Agenda
  • Intro to Y-Lab by Kevin, National Gallery of Singapore
  • An Insider’s Look at XR by Lionel, iMMERSiVELY
  • The Evolution and State of XR (Technology Landscape Scan) by Yi Ren, Unity Technologies
  • Reimagining Arts and Culture with XR (Use Cases) by Jervais, National Heritage Board
  • init Virtual Worlds by Debbie, Visual Artist & Technologist
  • Intersections Between Art, Tech and Start-ups by Eugene, Dude Studios
  • Storyliving with Embodied Performances by Toph, Tinker Studio
  • Volumetric Captures and the Afterlife by Shengen, Cult Tech
  • Q&A with Panel

An Apocalyptic City in Blender: Lazy Speedrun

I was watching another lazy tutorial and had the impulse to try it out for myself. So here is a speedrun of an apocalyptic city: one basic building multiplied many times. No need for elaborate post-processing stacks or piles of artfully arranged rubble – this is the MVP (minimum viable product), shipped to you in 20 minutes (or less, in the case of this slightly sped-up video…)
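
(For the curious, “one basic building multiplied many times” boils down to something like this in Blender’s Python API – the tutorial does it by hand in the UI, and the counts and offsets below are just made-up values:)

import bpy

# Start from one basic "building": a stretched cube.
bpy.ops.mesh.primitive_cube_add(size=2)
building = bpy.context.active_object
building.scale = (1, 1, 4)  # a tall box = instant architecture

# Two Array modifiers turn the one building into a whole city grid.
row = building.modifiers.new(name="Row", type='ARRAY')
row.count = 10
row.relative_offset_displace = (1.5, 0, 0)  # gaps between buildings along X

grid = building.modifiers.new(name="Grid", type='ARRAY')
grid.count = 10
grid.relative_offset_displace = (0, 1.5, 0)  # repeat the whole row along Y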

I think that me making these sorts of videos is the present-day equivalent of embarking on digressions whenever I have an exam or important project to complete; instead I suddenly get all sorts of ideas to do ridiculous things like make more screencasts of myself doing something in Blender.

For years I’ve watched countless online tutorials on YouTube, many of which were set to some generic, vaguely-inspirational electronic music (I confess that I have playlists full of things like youtube tutorial classic Tobu’s Candyland and other NCS Releases), and I took great joy in choosing royalty-free background sounds for this.

People, the keyword for this type of tutorial background music is “CORPORATE TECHNOLOGY”. Don’t go for CINEMATIC or DRAMATIC or INSPIRATIONAL, as there is a chance it might end up too teenage-over-the-top self-aggrandising. As it turns out “CORPORATE” plus “TECHNOLOGY” usually results in something blandly aspirational and futuristic.

Bread and Butter in a Field of Dreams (Coming July 2021)

This July, I’ll be releasing a free-to-play interactive experience titled “Bread & Butter In a Field of Dreams” for Mac/Win desktop. But actually, you could say that this project originated under a different name – “The Legend of Debbie”…


Do you want to get a reminder when “Bread & Butter in a Field of Dreams” is released for download, or to hear first about Debbie’s upcoming projects? Join Debbie’s newsletter for all DBBD updates!

“The Legend of Debbie” was originally made as a commission for Asian Film Archive’s State of Motion in January 2021, and it was my way of trying to use the archive of my own artwork as the source material for a sprawling game – exploring the different works as strange portals transporting you to weird spatialised versions of the works, and splicing my works with a partially fictionalised narrative (approximately 25% fiction, 75% reality).

The titular “legend” of the work was this directory, which sorted my works into many different categories. A map legend. When I had time I was going to put more symbols all over the place, maybe have a little radar map overhead as well. I also had a lot of fun designing different rooms to represent different works.

I originally wanted to design a LIVE VR experience for “The Legend of Debbie”: rather than release the game (which would have required much more testing on the development side than running it as a moderated tour), I would run it as a live event (workshop) where participants could come down in different timeslots to experience this VR game, facilitated by myself…

Imagine how fun it would be rolling through these odd spaces…

But then the Phase 2 Heightened Measures kicked in again, so we couldn’t have live events like this anymore. So… I did not make a VR version for “The Legend of Debbie”. And in any case, there was something that disturbed me about the final presentation of Legend.

IT JUST WASN’T FICTIONAL ENOUGH!!!

I have come to the conclusion that there is no room for nuance. Or maybe I am not very good at nuance (it is something I am working on, but I suspect that nuance does not come easily to me mainly because my real life personality is too excitable and shouty and maybe a bit childlike and overly earnest at heart).

Instead of developing The Legend further, I somehow ended up making a completely new game from scratch. One in which very deliberately NONE of the works were shown in the game world in their original form, besides the first room which replicates the Wikicliki exhibition by the Singapore Art Museum, currently in the Ngee Ann Kongsi Concourse Gallery (Basement) of National Gallery Singapore. The show runs until 11 July 2021.

Since we couldn’t be in the gallery itself for the talk, I re-created the gallery for a talk on 29th May (a conversation between myself and curator Mustafa, whom I have worked closely with during the last few months). Instead of boring slides, I brought the items that Mustafa was interested in discussing into the gallery space through various 3D modelled props on a table, including a few laptops handily scrolling through my actual Wikicliki and a spreadsheet of the Here the River Lies cards (many credits to George for painstakingly digitizing them).

From this totally realistic representation of a real exhibition you eventually get teleported to another world where there are lots of objects which are directly representative of the projects I’ve worked on over the last 10 years, but nothing is represented in the original form that it was made.

In the world of the Field of Dreams, every single artwork I have made in the last 10 years is turned into a transmogrified version of itself – a pop translation of the work which could comfortably exist within a commercially lucrative museum retail shop (a la MOMA shop or NAIISE or any one of those shiny design shops)… or in a dusty basement reading room within an alternative community-based establishment for which there is no lack of heart but financial viability is always a question (such as The Substation’s Random Room).

Somehow making art is an act of translation for me. I don’t really seem to start by drawing or sketching, but by writing; then I have to translate that into sketches, and from sketches into whatever digital medium I am working in. And this act of translation seems so arbitrary at times. Many ideas could have turned out differently had I chosen to make them in a different medium. Perhaps this expresses the tension I feel between making work as an artist and work as a designer/design educator (which earns me my living). The art can be poetic and ruminative and open-ended, whereas the design has to fulfill the brief’s requirements and ultimately has to be functional (and most likely measurable).

So I thought that instead of a Space Geode rendered all in white, I would have a mildly tacky Space Geode Citrus Squeezer; instead of The Library of Pulau Saigon, its various components would be turned into functional items such as a tic-tac-toe set featuring the Chinese Spoon as the noughts and the Political Party Badge as the crosses (something with the potential to be a slightly tacky coffee table centerpiece). My pulsed laser holography work, “War Fronts”, would be rendered instead as a jigsaw set. And instead of my print of 100 of my dreams from my Dream Syntax book, I turned it into a scratch-off chart of the 100 dreams. Because scratch-off maps are all the rage now on everyone’s internet shopping list, aren’t they?

Along the way I er…. got a bit too excited because who needs to write a book when you can just make the cover for the book? I was churning out dozens and dozens of pdf book cover textures to populate the DBBD SHOP.

So, perhaps we can’t quite call this work “The Legend of Debbie 2.0” anymore. Maybe this should be called by the name that seems more appropriate for it now: Bread & Butter in The Field of Dreams.

The work takes its name from a 2013 ACES study by the NAC – apparently the first survey of its kind done on arts and cultural workers, to examine how on earth they make their living. I do not know which unnamed arts/cultural worker gave the survey such an evocative name, but here I have made the breads and butters literal, to be collected up before you can gain entry to the next scene.

Special mention also goes to another big survey I participated in not too long ago, which asked artists some very sobering questions about what we thought had advanced our artistic careers or had inhibited our careers, with a dropdown list of items that could potentially limit our careers being twice as long as the advancing list. (In an earlier iteration of the study, it suggested that we dig up our past 10 years of tax returns to examine the difference between our art-income and non-art income. Me, I almost thought this was like some cruel form of “formative assessment” – “Alright, you got me, I’ve NOT been solely living off my earnings as an artist, and in fact, at times this whole “art” thing is frequently a complete loss leader operation!”) I have many ambivalent feels about this. On one hand, my desire to make art isn’t about the money, but on the other hand I also do want to fix the current state of affairs…

There’s a maze and some other weird shizz coming up…

The world is still very much a work-in-progress and I look forward to fleshing it out for July’s “workshop”, and to being able to release it as a free game for download! My goal is a release by July 2021! Methinks I might even do it as a gameplay video – I quite enjoyed this live stream (ticketed as a workshop, but really more like a twitch stream, with me having set up OBS and all the ridiculous animated overlays and chats)

I also did another detailed breakdown of the time I spent on this last week using Rescuetime. Rescuetime tracks the time I spend in each app, and it is handy in that it splits the time into working hours (defined as 9am-6pm) and non-working hours (6pm-9am), so I can sift out the time I spend on personal projects versus time on my day job. My secret to eking out the time is usually to work for 1-2 hrs after Beano sleeps at night and wake at about 4-5am to work.

It goes to show that despite working full time and having a time-consuming baby bean (with the help of dingparents dutifully caring for her whilst I work), it is still possible to eke out the time to maintain an active artistic practice if one has the will to do so (and the discipline to wake up early).

It does feel like a culmination of 3D skills I have taken years to acquire:
2014: when I realised how non-existent my 3D design skills were
2016: when I made myself try to make one Blender render a day
2017: intentionally producing new works using 3D
2019: intentionally producing new works in Unity (very basic at that stage)
2020: taking the Unity Developer cert at my workplace, supervising more Unity-based projects
2021: being able to build things like this in a week (on top of a separate full-time job)

I’ve seen several GDC talks and devlog videos on youtube detailing how every successful game dev probably has dozens of “failed games” before they finally make the one game they are happy with, that one breakthrough game. Likewise I don’t expect Field of Dreams to be perfect on its July 2021 release but I hope to collect lots and lots of feedback after releasing it so I can improve the experience!



Sucked into the vacuum of a black hole: Unity Shader Graph

One of the effects I’ve liked in games is the “everything being sucked into a black hole vacuum” look. What are the correct words to describe it? Does this effect already have a proper name or keyword? In Virtual Virtual Reality, the conceit is that you’re the human employee pruning and vacuuming some machine’s lovely domestic abode, and suddenly, without warning, it is as if you have done a terrible thing: you’ve accidentally sucked the first layer of the world away!…

Today, I was reminded of it again whilst watching the (ever inspiring) devlog for the indie game Farewell North, so I wanted to figure out how it was made. In Farewell North, it seems like it is being used in the playback scene for memories; the visual effect of being sucked back into the projector is exactly what makes them feel like ethereal memories being played back.

This evening I spent a while trying to figure it out. The answer seems to be Unity’s Shader Graph (which I’ve actually never properly used before, but it reminds me of Blender’s shader nodes, so I guess I roughly get the node-based system). I looked around for examples and explanations of how it was created, and I am glad to say that with all the power of the internets and online resources, it was indeed possible for me to understand how one can recreate the “sucked into a vacuum” effect.

Lerp refers to linear interpolation: the value changes from a to b over t. There’s a Vector3 to set the origin point of the “BlackHole”, i.e. where everything will be sucked into / spat out of. Then there is a slider property for the “Effect” (a bit like “time” in this case) which can be helpfully tweaked in the Inspector for testing purposes. “Range” is a fixed value. There’s obviously a lot more I can experiment with in Shader Graph, but for now… a working example of the “Sucked-into-a-blackhole-vacuum” Shader Graph looks like this:

My basic version of the “Sucked-into-a-blackhole-vacuum” look…

Imagine my Spheres and Cubes teleporting endlessly from this world to another world and then back again – oh wait now you don’t even have to imagine it, here’s an actual visual representation of it!
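
Stripped of all the nodes, the core of the effect is really just that one Lerp applied to every vertex position. Here is a little Python sketch of the math (not the actual shader, and ignoring the Range falloff – the function names are just mine):

def lerp(a, b, t):
    """Linear interpolation: returns a at t=0 and b at t=1."""
    return a + (b - a) * t

# The "BlackHole" Vector3 property: where everything gets sucked into
black_hole = (0.0, 5.0, 0.0)

# The "Effect" slider property: 0 = mesh untouched, 1 = fully sucked in
effect = 0.7

def suck(vertex, origin, t):
    # Every vertex slides along a straight line towards the origin.
    return tuple(lerp(v, o, t) for v, o in zip(vertex, origin))

print(suck((2.0, 0.0, 1.0), black_hole, effect))
# -> (0.6, 3.5, 0.3), i.e. 70% of the way into the black hole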

Breastfeeding and Coding

Breastfeeding and coding – were there ever two more unlikely words to be smushed together?

My breastfeeding journey is still going strong and baby is almost 2 now. It definitely occupies a huge part of my day and night, and even when I go into the office I am still pumping milk, and there is so much logistical hassle that comes with it. I’ve done so much research into it that maybe one day I should write a post just about breastfeeding and breast pumps… (wait, who is the audience of my blog anyway? Any breastfeeding mums here?)

I encountered a paper in which the researchers ran an experiment testing the hypothesis that breastfeeding women are the victims of bias. The results pointed to a negative societal perception of breastfeeding, with breastfeeding mothers rated as significantly less competent in maths. (Source: Smith, J. L., Hawkinson, K., & Paull, K. (2011). Spoiled Milk: An Experimental Examination of Bias Against Mothers Who Breastfeed. Personality and Social Psychology Bulletin, 37(7), 867–878. https://doi.org/10.1177/0146167211401629)

If someone is a working breastfeeding mother, it often means they have had to rearrange much of their lives in order to accommodate this, and the experience will probably make them even more awesome at time management. I look back on the years before the Bean came along and realise that I squandered so much time in the past on frivolities. Breastfeeding and wanting to do it all (career-wise) really does mean you have to be extremely careful with how you manage the finite resource of time – I’ve never been more focused and productive at work, because I only have one shot at things now! I don’t have time to faff about!

It is true that the logistics of breastfeeding (when done directly) are such that it can be difficult to do work with your hands. But I was resolved to become a better coder despite also having to spend an inordinate amount of time breastfeeding my baby. I like to follow various Python MOOCs whilst lying down breastfeeding baby, with just my phone (for coding) and tablet (for the MOOC). All this pointed to the importance of having a cloud-based service where you could code, and of having the right coding-oriented keyboard on mobile.

The solution? Google Colab + Codeboard (a coding-oriented keyboard for Android) to play around with Python on the cloud and on your mobile!

Seems fitting that this post is being posted on Mother’s day. Big up to all the hardworking mums out there, it sure is a lot of work to raise a child!

The Near Impossibility of Compatibility between a Mac and the Samsung Galaxy Tab 10.1 and Samsung Pico Projector

I’ve been playing around with two devices recently, for work reasons. Both of them are incidentally Samsung devices – namely the Samsung Pico Projector SP-H03 (SGD399 Promo at EpiLife) and Samsung Galaxy Tab 10.1 (Retail Price SGD848). One thing I noticed is that although they are kinda decent products on their own, unfortunately they won’t work right out of the box with a Mac. In fact they pretty much don’t respond at all when connected to a Mac! It’s times like these that I think that Apple does it the right way when I don’t have to install a driver to print on most common printers. In this day and age, we expect the devices to be smart enough to connect up. At the very least I would expect some distant flicker of recognition or reaction from it!

These are my workarounds for connecting them to a Mac, I hope it helps someone else connect their devices to their Mac since it took me some time to figure these out. For the record, I’m using a Macbook Pro 15″ running on Mac OSX 10.6.8.


My issues with both Samsung products:

1. When you plug a mobile device into a computer, you would expect it to mount and charge, like an iPad would on any device whether Mac or Windows. Anything that requires me to install a driver/application and go through a very complicated method to access files shouldn’t have been shipped out to unwitting consumers.

2. When you plug a device such as a projector into a computer, you would expect it to be detectable without having to completely stop what you’re doing and reboot your computer.



How to Connect Samsung Galaxy Tab 10.1 to a Mac

I tried using Kies, Android File Transfer, and DDMS in the Android SDK. DDMS stands for Dalvik Debug Monitor Server and it’s Android’s debugging tool. Kies and Android File Transfer did not work AT ALL with the Mac. The Android SDK was a bit more of a hassle to install but essentially straightforward, and most importantly, at least it works!

Hopeless Kies: Infinitely spinning “Connecting…” Dialog

Method 1: Kies – FAILED

Will not connect the Galaxy Tab to my Macbook Pro via USB or Wifi. In any case, it could only transfer photos, videos, and movies, and would not have allowed any access to the file structure. It perpetually says “Connecting…” but never does.

Method 2: Android File Transfer – FAILED

Very disappointing, as this was the “official” method listed on the Samsung website. For an explanation, read this: “Google provide Android File Transfer and there is also a third party product called XNJB. Neither currently work and the reason for this is that they both use libmtp which is an open source implementation of an MTP Initiator. To understand why this is we need to go back to MTP. In MTP each device has a manufacturer code and device code, a list of which is maintained in the libmtp source code (take a look at music-players.h). Building the library hard-wires the device list inside it, which, at the last build of both Android File Transfer and XNJB did not include the requisite codes for the Samsung Galaxy Tab.” – Who made the decision to direct consumers to download AFT if it was never going to work with the Galaxy Tab in the first place? Also, why did they not ship Kies together with the product if it’s their software?

Also hopeless: Android File Transfer does not work in this case

Method 3: Via Android SDK – WORKS!

This was the last-ditch attempt, but this method was the only thing that worked for me. You’ll want to make sure that you have Eclipse Classic (Mac OS X, 64-bit) + the ADT Plugin for Eclipse + the Android SDK. I followed this guide on how to install the Android SDK in Eclipse and it was a breeze.

Eclipse


Installing Android SDK


Installing ADT Plugin

While the Galaxy Tab is not connected to the computer, go to Applications > USB Debugging and make sure it is ticked so that Debugging mode will launch when the USB is next connected.

Go to the Android SDK toolkit window in Eclipse. Go to Window > Open Perspective > Other… > DDMS.

SUCCESS!!! Here’s looking at the files inside the Galaxy Tab 10.1

Now you’re in. But should we have to go to so much trouble to access the device??? I am doubtful that most people will take the trouble to do this. Plus it’s not very pretty, is it? This method is sure to scare off most casual users, I’d imagine.



How to Connect Samsung Pico Projector to a Mac

Samsung Pico Projector
A friend described this device as “an obstinate pony” and I honestly could not agree more with that description. I enjoy having a larger projection of my screen, but I must admit that it doesn’t come with much in the way of instructions, and it is hard to make it behave. For example, if you simply plug it into your Mac, the Mac won’t recognise the device. After some hair-pulling and experimenting, I have found the (almost) fool-proof method of connecting it to a Mac. I have no explanation for why this works or what is going on. BUT TRUST ME IT WORKS.

FOLLOW THESE STEPS EXACTLY IN THIS ORDER:

  1. Turn off the projector and shut down your Mac. Turn it off. Turn off all the power. Disconnect everything!
  2. Start by connecting the projector to battery and power, and then use the D-Sub gender cable to connect the projector to the Mac.
  3. Turn on the projector first, before you start up the Mac.
  4. Turn on your Mac next. Wait for your Mac to have completely finished starting up before going to the projector and using its touch buttons to change the selected Input to “PC”.
  5. The projector should now be showing the output from your Mac. Note that you might want to switch to Mirroring if that makes it easier for you to see what is going on…

as3isolib, regular expressions

Today’s roundup of experiments and snippets:

as3isolib

I downloaded as3isolib some time ago but never had time to play around with it. Today I went back to their site after a long while and realised that all the example swfs on their documentation site were broken. Most of the examples were originally written for Flex, but I sometimes like to work in the Flash IDE itself, so this is an example of how to use as3isolib there. Credit goes to the documentation wiki as well as the various commenters correcting each other on the wiki.

“As3isolib is an open-source ActionScript 3.0 Isometric Library developed to assist in creating isometrically projected content (such as games and graphics) targeted for the Flash player platform.”

This is a simple example of an IsoScene with an IsoBox. It uses a ClassFactory to create the shadow below the box, and TweenMax to handle the animation.

import as3isolib.core.ClassFactory;
import as3isolib.core.IFactory;
import as3isolib.core.IIsoDisplayObject;
import as3isolib.display.IsoView;
import as3isolib.display.primitive.IsoBox;
import as3isolib.display.renderers.DefaultShadowRenderer;
import as3isolib.display.scene.IsoGrid;
import as3isolib.display.scene.IsoScene;
import as3isolib.geom.IsoMath;
import as3isolib.geom.Pt;

import eDpLib.events.ProxyEvent;
import com.greensock.TweenMax;

var box:IIsoDisplayObject;
var scene:IsoScene;
var tm:TweenMax;

scene = new IsoScene();

var g:IsoGrid = new IsoGrid();
g.showOrigin = false;
g.addEventListener(MouseEvent.CLICK, grid_mouseHandler);
scene.addChild(g);

box = new IsoBox();
box.setSize(25, 25, 25);
box.moveBy(0, 0, 20);

scene.addChild(box);

var factory:as3isolib.core.ClassFactory = new as3isolib.core.ClassFactory(DefaultShadowRenderer);
factory.properties = {shadowColor:0x000000,shadowAlpha:0.15,drawAll:false};
scene.styleRenderers = [factory];

var view:IsoView = new IsoView();
view.clipContent = true;
view.setSize(stage.stageWidth, stage.stageHeight);
view.addScene(scene);
addChild(view);

scene.render();

function grid_mouseHandler(evt:ProxyEvent):void
{
    var mEvt:MouseEvent = MouseEvent(evt.targetEvent);
    var pt:Pt = new Pt(mEvt.localX, mEvt.localY);
    IsoMath.screenToIso(pt);
    var squareSize:int = 50; // snap movement to a 50px grid
    var gridX:int = Math.floor(pt.x / squareSize);
    var gridY:int = Math.floor(pt.y / squareSize);
    TweenMax.to(box, 0.5, {x:gridX * squareSize, y:gridY * squareSize, onComplete:completeCallback});

    if (! hasEventListener(Event.ENTER_FRAME))
    {
        this.addEventListener(Event.ENTER_FRAME, enterFrameHandler);
    }
}

function completeCallback():void
{
    this.removeEventListener(Event.ENTER_FRAME, enterFrameHandler);
}

function enterFrameHandler(evt:Event):void
{
    // re-render the scene every frame while the box tween is running
    scene.render();
}

Using regular expressions to parse loaderInfo url name:

I have a folder full of swfs which have an external file / xml entry where they must get their information from. Each swf file is named in the format xxx_0.swf, where xxx is a 3-letter prefix and 0 is any number (of any number of digits). The issue is that when you try to return the url name, you actually get the full directory path. I don’t know if this is the most efficient way, but the solution I eventually went with involves splitting the URL at each slash, taking the last segment, and then slicing off the first 4 characters (xxx_) and the last 4 characters (.swf):

// split the full URL at each slash and pop off the last segment,
// decoding the escaped characters along the way
var myFileName:String = this.loaderInfo.url.split("/").pop().replace(/%5F/g, "_").replace(/%2D/gi, "-");
// myFileName is now just the file name, e.g. "abc_12.swf"
var myFileNumber:String = myFileName.slice(4, -4); // drop the "xxx_" prefix and the ".swf" suffix
var myFileNum:Number = Number(myFileNumber); // e.g. 12

Traversing XML structures – Loops

I am working on an AS3 project with an XML data sheet, and this week I figured out a better solution for traversing the structure…

<?xml version="1.0" encoding="utf-8"?>
<TIMELINE>
 <YEAR NAME="2000">
  <ARTICLE>
   <MONTH>January</MONTH>
   <NAME>Lorem Ipsum</NAME>
   <DESCRIPTION>Lorem Ipsum is simply dummy text.</DESCRIPTION>
   <IMAGE>image1.jpg</IMAGE>
  </ARTICLE>
  <ARTICLE>
   <MONTH>December</MONTH>
   <NAME>Ipsum Lorem</NAME>
   <DESCRIPTION>Lorem Ipsum is simply dummy text.</DESCRIPTION>
   <IMAGE>image2.jpg</IMAGE>
  </ARTICLE>  
 </YEAR>
 <YEAR NAME="2003">
  <ARTICLE>
   <MONTH>April</MONTH>
   <NAME>Ipsum Lorem Ipsum</NAME>
   <DESCRIPTION>Lorem Ipsum is simply dummy text.</DESCRIPTION>
   <IMAGE>image3.jpg</IMAGE>
  </ARTICLE>  
 </YEAR>
</TIMELINE>

Basically, in my XML sheet, there were years, and within those years there could be more than one article. Some years had numerous articles. Some years had only one article.

myXML.*.*.length();

When I traced something like this, I would get the total number of articles. I could also traverse all the articles as if the year was not of consequence and all the articles were just being counted and assigned numbers in order from 0 onwards.

myXML.*.ARTICLE[articleNum].DESCRIPTION.toString();

The quandary was that I needed to be able to scroll through all the articles chronologically, but I also needed to be able to tell which year an article was from, according to the article number (which was now effectively independent of the year), and to find out the article number within that year alone, rather than as a whole.

The solution I came up with: use a loop to check the number of articles in all previous years and add them up, so I can subtract that from the overall article number. This gives you the article number within that year alone, instead of the article number within the entire “timeline” sheet.

var g:Number = 0;
// myXML.*.ARTICLE[articleNum].parent().childIndex() is the year index for that article
while (g < myXML.*.ARTICLE[articleNum].parent().childIndex())
{
    // myXML.YEAR[g].*.length() is the number of articles in year g
    articleNumContainer += myXML.YEAR[g].*.length();
    g += 1;
}
// one is added to articleNum cos it actually starts at zero
articleNumberWithinYear = (articleNum + 1) - articleNumContainer;

Flash AS3 and SoThink SWF Decompiler

While saving my work, Flash threw an error and I lost the FLA – fortunately, having first published a perfect SWF. If Flash crashes before it finishes saving, it often causes your FLA to completely disappear, so always back up your work periodically. Thus I was forced to decompile my own code with SoThink SWF Decompiler to recover my work into an FLA file (for speed), but I also realised that it had decompiled the code in a different way from how I had originally written it. It was easy to figure out, and I recovered my entire file within an hour to almost the exact way I had originally coded it. Here are some notes on what I noticed about how it changed my code in the process of decompiling from swf to fla.

  1. All variables that are declared haphazardly all over the code are collected and put at the top of the code.
  2. New variables are given names like “_loc_2” or “_loc_3”. Anything with many brackets is also broken up, with temporary variables assigned to different parts of the equation, which are then combined.
  3. All for loops are turned into while loops.
  4. “this.” is added to any objects referred to on the main timeline.
  5. It also makes use of the rather uncommon addFrameScript, which adds script to movieclips in AS3. If you leave bits of code outside functions by mistake (eg stage display items), it might group them together in a function called frame1.
  6. It adds an “end function” comment at the end of functions.

Example of typical while loop as decompiled in SoThink:

while (t < max_T_Number)
{
    // do this
    t = (t + 1);
}

Example of typical addFrameScript function as decompiled in SoThink:

public function MainTimeline()
{
    addFrameScript(0, this.frame1);
    return;
}// end function

function frame1()
{
    stage.scaleMode = StageScaleMode.SHOW_ALL;
    stage.align = StageAlign.TOP_LEFT;
    stage.displayState = StageDisplayState.FULL_SCREEN;
}// end function

Although a decompiler is perhaps not the most ethical way of accessing other people’s source code, I think it still makes an excellent learning tool for Flash. When I have more time I will download more swfs or analyse more of my old swfs to see if the behaviours listed above are consistently true.

Mediawiki Lockdown – How to make your wiki private


For some years now I have been using a couple of private wikis for storing and sorting short notes on topics and interests.

  • Wikicliki – general wiki
  • Design Patterns – design and scripting wiki (for my students)
  • Disukowiki – language wiki

Lately there have been some issues with spam bots, which seem to be getting smarter and, even more bizarrely, create generic account names of the same type and generate or overwrite my pages with what looks a lot like actual paragraphs of useful information about random irrelevant topics (examples being: homeopathy, ugg boots, weather, self-confidence mastery, etc.), sometimes with no outbound links. Why the spambots do this is still a complete mystery to me. It serves no purpose (not even SEO-wise for them) except to be a real pain in the ass to the wikisysop.

What makes me sad is the abuse of a system that actually has the potential of providing an open space for some really interesting things, so I almost find it a pity to close it up. Some years back, someone posted on my wiki saying they were doing an experiment in which they stored chunks of (gibberish/encoded) data across a number of open wikis.

They set up anonymous accounts on people’s open wikis (like mine), and posted these encrypted chunks of data in text form. In posting these fragments of the file online, their experiment/goal was to create a system in which people could transmit a really large and encrypted file to someone else. The recipient would have to locate and visit all these wikis to find the numerous segments of encrypted data, and then recompile it together with their prearranged encryption key to get the original huge file. A brilliant idea, although perhaps it would be a bit tedious in practice.
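
(As a rough sketch, the splitting-and-recombining part of that scheme might look something like this in Python – purely my reconstruction of the idea, with the actual encryption under the prearranged key assumed to happen before the split:)

import base64

def split_for_wikis(data: bytes, n_chunks: int):
    """Split an (already encrypted) blob into text chunks to post across n wikis."""
    size = -(-len(data) // n_chunks)  # ceiling division
    return [base64.b64encode(data[i:i + size]).decode()
            for i in range(0, len(data), size)]

def reassemble(chunks):
    """The recipient collects the chunks in order and recombines them."""
    return b"".join(base64.b64decode(c) for c in chunks)

secret = b"pretend this is a large file, already encrypted with a shared key"
chunks = split_for_wikis(secret, 4)
assert reassemble(chunks) == secret  # the round trip is lossless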

On that bittersweet note:
If you haven’t edited your wiki’s settings yet, do this now. Add this to LocalSettings.php:

# Disable anonymous editing
$wgGroupPermissions['*']['edit'] = false;

# Disable editing by ALL users
$wgGroupPermissions['user']['edit'] = false;

# Enable editing by ONLY sysops
$wgGroupPermissions['sysop']['edit'] = true;

# Prevent new user registrations except by sysops
$wgGroupPermissions['*']['createaccount'] = false;

After doing this, make sure you open up your FTP browser and chmod LocalSettings.php to 600. Also duplicate LocalSettings.php and rename the copy to LocalSettings_date.php, so that if something unexpected happens you still have a backup of LocalSettings.php from when it was still alright…

You can look at the Mediawiki guide to Preventing Access for more options on how to tailor the privacy settings (for example, you could set it so that only users with accounts of a certain age – say, a few weeks – can create new pages).