To Tech or Not to Tech… That is Not the Question.

Photo by Alexander Dummer on Unsplash

A discussion recently ensued amongst some of the parents at my daughter's school regarding access to technology, acceptable allowances and boundaries. Like us, many parents noted that increased access to iPads brought with it an increase in poor behaviour, tantrums and a general metamorphosis of their children into foreign evil spawn they did not recognise.

The allowance of the iPad is always a tricky thing in our house. Almost all of the work my husband and I do is digitally based, so interaction with a screen at some point during the day is somewhat of a norm. My daughter also has an iPad of her own, which is used as an educational tool and has helped her significantly in surpassing reading expectations for her age. Occasionally, too, she is allowed to watch her favourite shows or play games for a set period of time. However, regardless of the length of time spent watching, or the repeated setting of boundaries and expectations at the beginning of the access period, we still inevitably end up with complaining, bad attitudes and sometimes out-of-control anger.

Fuelled by the school debate, my husband and I began trying to make sense of some of the influencing factors surrounding these issues. Why was it that, as children, we could watch hours of television or play the entire weekend away on the game console without turning into tiny demonic creatures when our parents called 'time up'? What was so fundamentally different about our experience, compared to the experience of kids today, that is causing this dramatic shift in behaviour after interaction with technology?

After a lengthy discussion that possibly should have been fuelled by much more gin, these are some of the factors that made the headlines.

Media Watching: Historical vs Current

We used to consume media largely as a collective. Even if you were the only one watching your cartoons on a Saturday morning, the television was still placed centrally in the household and everyone could hear what was going on. While not everyone engaged, there was a mutual awareness of everyone's presence and activities. Also, if your parents were anything like ours, chances are you were forbidden from channel surfing or even touching the remote. The television was put on what you asked for and stayed there until you were done or requested a change. When it was time for the news or your parents' favourite shows, it didn't matter if you were halfway through an episode; you understood that having the opportunity to get some personalised screen time depended on sharing the privilege. Today, this is a very different story.

Ownership and Control
Today, it is more the norm for every individual in the household to have access to his or her very own screen. Be it through a television in the bedroom (I've designed homes where clients have requested them in the bathroom!), a laptop, tablet or smartphone, access and ownership are unprecedented. Every individual is the master of their own viewing pleasure, and there is no longer a shared awareness or a requirement for shared access. The experience itself has become focused on control.

Photo by Victoria Heath on Unsplash

Immediacy
Integration of media streaming services and social media platforms into our daily lives has brought with it an uncontrollable desire for immediacy. With each access, we hunger for more content. We expect this content to constantly improve, becoming more tailored to our specific tastes. We also demand no interruption to our engagement, and when there is one, we expect control over the choice to accept or deny it. Let's be honest: in those 5 seconds of forced advertisement break on YouTube clips, you've probably hit 'skip' 10 times in anger. I can't remember the last time I viewed an advertisement as an opportunity to stretch my legs and grab a snack or a toilet break, rather than a dire inconvenience. This, over time, has created the phenomenon of binge-watching, where our need to experience something from start to finish has come to overshadow the pleasure of the experience as a whole.

Photo by freestocks.org on Unsplash

Remember how excited you used to be that it was Saturday because it meant 4 hours of uninterrupted cartoons on the morning Disney session? As we are no longer at the whim of pre-defined broadcast schedules, we no longer harbour this excitement. As a result, we become disengaged at the slightest plateau in a storyline and switch to a different 'show' in search of a better 'high.' We've developed what I like to call the 'Insta Mindset': unable to engage in anything for longer than 20 seconds if we are not immediately rewarded with stimulus or climax. We are all guilty! I myself have refrained from watching any of the Game of Thrones episodes until the last one airs, in order to watch them all at the same time.

This mindset has extraordinary repercussions in other parts of our lives, which is a deeper discussion best left for another article. But one area of note is the resulting negative impact on our depth of experience as a whole. You see, historically, when we watched the latest episode of our favourite show, we spent all week talking about it with our friends. We shared thoughts and collectively analysed the character depth, storylines and unexpected tangents. We made predictions about what could possibly come next. We experienced that episode, effectively, for an entire week, and once the next episode came around, we were that much more engaged in and connected to the outcomes. Today, we are having to rewatch the previous season prior to the start of the next, because we simply don't recall what happened. We are simply consuming, not engaging or absorbing. Immediacy has perforated our minds.

Gaming: Historical vs Current

I remember how excited we used to be to visit my cousins for gaming weekends. How fun it was to pile everyone into the lounge and spend the weekend challenging each other's skills. Everyone brought along their collection of games, and votes were taken as to the order of games and players. Our favourites included Mario Kart, Donkey Kong, Mortal Kombat, Sonic the Hedgehog and various other multi-game units featuring everything from the Olympic Games to Battleships to Pac-Man. It was a shared experience and a democratic process; players floated in and out, engaging with other things, or with non-players, when it wasn't their turn. There was laughing, heckling and fighting, and the experience provided a genuinely good grounding for strong relationship skills.

Today, however, the experience is much more solitary. While there are many online player groups, and relationships do exist between regular players, it is not the same as having to face your opponent and navigate the emotions of the experience person-to-person. You only have to look at the lack of multi-player, single-device titles to see evidence of this. We all know it's easy to be brave when you are protected by the screen! Ultimately, if it all gets too hard, you can simply turn it off and walk away, totally avoiding having to deal with any of the fundamentals that form part of healthy relationship skills.

But aside from this, in all of the aforementioned games and others of their era, there are key differences from the type of gameplay experienced by players today.

Player Perspectives
Most early video games used external or observational perspectives during gameplay, employing 'side-scrolling' and 'parallax' as techniques for simulating 3-dimensional space. Later developments into 3-dimensional environments again predominantly used 3rd-person perspectives, where the player observed the character from an external viewpoint, maintaining a certain level of detachment from the character and its activities.

3rd person perspective (Image Source)
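As a brief aside for the technically curious: parallax works by scrolling each background layer at a fraction of the camera's speed, so distant layers appear to drift slowly past. Here is a minimal sketch of the idea in Python; the layer names and scroll factors are invented for illustration, not taken from any particular game.

```python
# Toy illustration of parallax scrolling: layers that are meant to look far
# away scroll at a smaller fraction of the camera's speed, which the eye
# reads as depth. Names and factors are made up for this example.

LAYERS = [
    ("distant mountains", 0.2),  # 20% of camera speed: reads as far away
    ("trees", 0.5),
    ("foreground path", 1.0),    # 1:1 with the camera: reads as close
]

def layer_offsets(camera_x: float) -> dict:
    """Horizontal draw offset for each layer at a given camera position."""
    return {name: camera_x * factor for name, factor in LAYERS}

for cam_x in (0.0, 100.0, 200.0):
    print(f"camera at {cam_x}: {layer_offsets(cam_x)}")
```

Even this crude version shows why the technique keeps the player at a distance: depth is painted onto the scene and observed, rather than inhabited.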

Today, most gaming experiences operate in 1st-person perspective, providing increased accuracy in gameplay and improved engagement and immersion in the gaming experience. The player essentially 'becomes' the character and the master of their experience.

1st person perspective (Image Source)

Customisable Identity vs Pre-defined Identity
Coupled with this newfound ability to control your experience in gameplay came the ability to define your persona. Today, there is a plethora of character trait, clothing and accessory choices that enable players to set themselves apart from others in the game environment. Each individual becomes his or her very own virtual brand, and more play allows for more access and even further personalisation, not to mention the ability to buy yourself into the cool crowd with the right amount of credits.

Where historically the identity of your character was pre-defined, and the extent of personalisation stretched only as far as being able to choose your character…or fight over the options with your opponent, today there is little limit to what you are able to do. This brings with it one very important point: players today 'become' their characters. Each choice is strategically made based on the external perception of their character; a manifestation of their alter ego, and so much more than a simple choice based on who has the best technical ability or the coolest kit.

Immersion: Graphics Quality & Layers of Gaming
Alongside both these points is the fact that it is becoming increasingly difficult to distinguish simulated environments from 'real' environments, due to exponential improvements in technology. Total immersion continues to be the ultimate goal of every game developer, and with Virtual Reality and Artificial Intelligence capabilities in constant focus, this is no longer an unreachable universe.

We are already seeing an extraordinary depth to the layers of gaming contained within most products, with players rarely experiencing the journey the same way twice, and no two players experiencing the journey the same way. Aside from traditional game dynamics and narrative, the entire gaming experience is now overlaid with social opportunities, purchasable content, ambition preaching and accessible gambling…even in the most rudimentary sense of 'spinning the wheel for today's free gift,' as is common in some of our 5-year-old daughter's games (Dragonvale). It is no longer simply story-based action and reaction.

Dragonvale Game (Image Source)

With more focus on customisable, open-ended play, the experience within these virtual worlds is entirely dependent, on any given day, on the participants present. Much like the real world. Which, when you think about it, offers the perfect opportunity for losing one's perception of reality.

A Word on Healthy Watching and Gaming
Now, in no way do any of the previous points intend to devalue the world of television or gaming media. Sometimes we all need an escape from our reality. There are perfectly healthy habits for watching and gaming, and in fact many professionals exist in both these spaces who engage with this media for hours on end, most days of the week, without losing their grip on reality, acceptable behaviour or personal identity. So technology, and the opportunities that come with it, is not the problem. Access is not the question. But knowing when to provide access, and to what, is.

Problems with Classification
Put simply, the classification system is inadequate for providing us with a roadmap to navigate these decisions in an educated way. Global classification of media looks only at the content involved: for example, violence, language, adult themes and so on. It does not consider, at all, the deeper psychological and emotional concepts that are impacted by the content and by the process of engaging with it. It does not provide guidance on, or understanding of, the impact on brain pathways.

What is even more concerning is that research has shown us that most popular media becomes so because it has the power to overwrite weak pathways and reach the pleasure centre of the brain directly. When the pleasure is removed, the pathways dictating an appropriate response are weak from lack of use, and thus the only reaction remaining is anger. Enter tiny demonic human.

Understanding Cognitive Pathway Development, Appropriateness and Impact
We recently watched an excellent TED talk by cognitive neuroscientist Sarah-Jayne Blakemore, 'The Mysterious Workings of the Adolescent Brain,' which really brought a lot of the concepts from this discussion together for me. In her talk, she explains that recent discoveries in neuroscience have helped us to understand the developmental stages of the human brain at a depth never previously possible. In somewhat of a collective pooling of research, scientists have discovered that the brain continues to develop through adolescence into the twenties and thirties, and in fact does not cease major development by early adolescence as previously thought. Part of this discovery shows that the prefrontal cortex [the part of the brain heavily responsible for high-level cognitive functions such as decision making, planning, inhibiting inappropriate behaviour, social interaction and self-awareness] undergoes dramatic development during adolescence, suggesting that this is a critical period of impact for external influences.

Without getting too deep into the theory, essentially what takes place in the brain until late adolescence or early adulthood is several cycles of synaptic pruning. During these pruning phases, the brain rids itself of unused pathways or 'branches' to make room for commonly used pathways to be strengthened. This has huge implications for everything from education to the larger future of our experience-based economy. The key takeaway, however, in the context of our discussion, is that a settled ability to take external perspectives into consideration does not reach an 'established' stage until early adulthood.
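To make the 'use it or lose it' mechanism concrete, here is a deliberately toy model in Python; the pathway names, usage counts and threshold are invented purely for illustration and carry no scientific weight.

```python
# Toy model of "use it or lose it" synaptic pruning: rarely exercised
# pathways are dropped, while frequently exercised ones are reinforced.
# All names and numbers below are invented for illustration only.

usage_counts = {
    "impulse control": 2,
    "perspective taking": 1,
    "instant reward seeking": 50,
}

PRUNE_THRESHOLD = 5  # pathways used fewer times than this get pruned

def prune(usage: dict, threshold: int) -> dict:
    """Keep only well-used pathways, reinforcing (doubling) the survivors."""
    return {name: count * 2 for name, count in usage.items() if count >= threshold}

print(prune(usage_counts, PRUNE_THRESHOLD))
# {'instant reward seeking': 100}: heavy use of one pathway crowds out the rest
```

The point of the toy model is simply this: whatever gets exercised most is what survives the prune, and whatever sits idle quietly disappears.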

What this means in terms of the consumption of media is that the ability to distinguish between different perspectives is limited, and the lines between the virtual world and the real world are hard to establish, for anyone younger than late adolescent or early adult age. In this realisation lies a golden 'easter egg' of sorts.

It is this: ownership, control, immediacy, perspective, personalisation, identity and immersion [all recent phenomena in media engagement] dramatically impact which pathways are being regularly used by our pre-adult children, and thus determine which pathways will be pruned during the adolescent development of the prefrontal cortex. This pruning will help determine whether they become capable, well-functioning adults or socially inept individuals with potentially psychopathic or sociopathic tendencies.

Ultimately then, To Tech or Not to Tech is not the question. The question is how what we are providing access to is directing the regular use of pathways. In this context, your 5-year-old watching other kids playing with toys on YouTube is probably actually better for them than an hour's play of Dragonvale.

As parents, we need to understand which pathways are being reinforced or superseded by media access. This means more awareness on our part of what is being watched and played, and the removal of these activities as solitary experiences. In lieu of better infrastructure to provide this understanding, bringing media consumption back into the centre of the home is a good first step in monitoring content and context. Unfortunately for parents, this does mean using your own brain pathways to consider what your child is consuming and its potential impact, instead of turning something on in order to switch off.

So that’s exactly what we are going to do. For the next month, we are going to put away the tablets, bring out the television and old game console and see whether there is any change. We would love for you to do the same and let us know your findings so we can include them in the follow-up article.

Designer. Photographer. Writer. Creator. Educator
