What is it that makes us human? Many of the characteristics that appear unique to our species also exist in nature. Higher-order animals can demonstrate self-awareness; tools are widely used, from crows to beavers; and creativity is often on display, from birdsong to pufferfish that make highly decorative patterns in the sand. The true definition of humanness is something that scientists and philosophers have struggled with for eons. Cambridge psychology professor Simon Baron-Cohen believes he has the answer – it is our ability to invent that sets us apart from the rest of the animal world.
In a new book, The Pattern-Seekers: A New Theory of Human Invention, Baron-Cohen suggests that genetic development led to a distinct cognitive leap around 100,000 years ago, producing a ‘systemising mechanism’ that made human invention possible. Whilst a bird may use a rock as a tool to access food, this is achieved through simple cause and effect and lacks the foresight of true invention. Humans possess a unique form of causal reasoning that sets us apart from other creatures. It allows us to think beyond immediate consequences and create inventions, whether computer algorithms, smartphones or DNA sequencing.
I have long been a strong believer that good ideas can come from anyone, thanks to our causal reasoning. The challenge, however, is the limitations that are put in place by work or education. There’s a perception that a scientist is not a creative thinker and an artist is not an engineer. The current emphasis on STEM (science, technology, engineering and maths) is not helpful either: it suggests that there is greater value in these areas and less in creative subjects. Yet the process of invention is a creative endeavour. New ideas result from making different and unusual connections, which is exactly what creativity is about. Not only that, but innovation is largely a collaborative process that brings together a range of skills and ways of thinking.
Not everyone is bound by the limited definitions of scientist and artist, which is why we are still able to invent. In education, this interdisciplinary approach is being formalised through trends such as STEAM – STEM with the addition of the Arts. A creative approach is the way in which we will solve many of our current and future challenges. We may not all become the next Ada Lovelace, Hedy Lamarr or Rosalind Franklin, but there’s no question that the ability to invent or innovate exists within all of us.
When I saw the news that Apple would be releasing 217 new emojis into the world, I did what I always do: I asked my undergraduates what it meant to them. “We barely use them any more,” they scoffed. Apparently, emojis are now only used by ‘middle-aged people’ like their parents. “And they use them all wrong anyway,” my cohort from generation Z added earnestly.
My work focuses on how people use technology, and I’ve been following the rise of the emoji for a decade. With 3,353 characters available and 5 billion sent each day, emojis are now a significant language system. When the emoji database is updated, it usually reflects the needs of the time. This latest update, for instance, features a new vaccine syringe and more same-sex couples.
But if my undergraduates are anything to go by, emojis are also a generational battleground. Like skinny jeans and side partings, the ‘laughing crying emoji’, better known as 😂, fell into disrepute among the young in 2020 – just five years after being picked as the Oxford Dictionaries’ 2015 Word of the Year. For gen Z TikTok users, it’s millennials who are responsible for rendering many emojis utterly unusable – to the point that some in gen Z barely use emojis at all.
Research can help explain these spats over emojis. Because their meaning is interpreted by users, not dictated from above, emojis have a rich history of creative use and coded messaging. Apple’s 217 new emojis will be subjected to the same process of creative interpretation: accepted, rejected or repurposed by different generations based on pop culture currents and digital trends.
Face the facts
When emojis were first designed by Shigetaka Kurita in 1999, they were intended specifically for the Japanese market. But just over a decade later, the Unicode Consortium, sometimes described as “the UN for tech”, unveiled these icons to the whole world.
In 2011, Instagram tracked the uptake of emojis through user messages, watching emojis rapidly eclipse their text-based emoticon predecessors in just a few years. The Unicode Consortium now meets each year to consider new types of emoji, including emojis that support inclusivity. In 2015, a range of skin tones was added to existing emojis. In 2021, the Apple operating system update will include mixed-race and same-sex couples, as well as men and women with beards.
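Under the hood, these inclusive variants are not thousands of separate pictures but compositions of existing Unicode code points. As a minimal sketch (Python is used here purely for illustration), skin tones are appended as Fitzpatrick modifiers, and couple emojis are stitched together with Zero Width Joiner characters:

```python
# A thumbs-up is a single code point; a skin tone is added by
# appending a Fitzpatrick modifier (U+1F3FB to U+1F3FF).
thumbs_up = "\U0001F44D"
medium_tone = "\U0001F3FD"
print(thumbs_up + medium_tone)  # renders as one glyph: 👍🏽

# Couple emojis are built from individual emojis joined with
# Zero Width Joiner (U+200D) characters.
woman = "\U0001F469"
heart = "\u2764\uFE0F"  # heavy heart + emoji variation selector
zwj = "\u200D"
couple = woman + zwj + heart + zwj + woman
print(couple)       # displayed as a single emoji: 👩‍❤️‍👩
print(len(couple))  # 6 -- six code points under the hood
```

This is why one Unicode update can yield hundreds of “new” emojis: many are simply new approved combinations of characters that already exist.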
The End of English?
Not everyone has been thrilled by the rise of the emoji. In 2018, a Daily Mail headline lamented that “Emojis are ruining the English language”, citing research by Google in which 94% of those surveyed felt that English was deteriorating, in part because of emoji use.
But such criticisms, which are sometimes levelled by older generations, tend to misinterpret emojis, which are after all informal and conversational, not formal and oratorical. Studies have found no evidence that emojis have reduced overall literacy.
On the contrary, it appears that emojis actually enhance our communicative capabilities, including in language acquisition. Studies have shown how emojis are an effective substitute for gestures in non-verbal communication, bringing a new dimension to text. A 2013 study, meanwhile, suggested that emojis connect to the area of the brain associated with recognising facial expressions, making a 😀 as nourishing as a human smile. Given these findings, it’s likely that those who reject emojis actually impoverish their language capabilities.
Questions about the impact of emerging forms of media communication are not new. When SMS became popular amongst teenagers, there were suggestions that highly abbreviated words, or ‘text speak’, might be harmful to literacy. Research did not find any significant negative impact; in children, the use of abbreviated or phonetic spelling generally supported literacy and language development. In particular, the research found that children who initiated new forms of text speak demonstrated a strong grasp of the structure of language. As with jazz, it seems that knowing the rules allows them to be broken.
Although current research has considered the emotional and sociological effects, we can see from their usage that emojis also enable a level of creativity. There are stories, novels, celebrity biographies and even The Bible written in emoji. Emojis have also been adopted by artists, both as a medium of expression and as a tool to bring meaning to existing visual art.
It’s not only about which emojis are used; the same emoji can carry different, confusing meanings for different generations. Although the Unicode Consortium has a definition for each icon, including the 217 that Apple is due to release, out in the wild they often take on new meanings. Many emojis have more than one meaning: a literal meaning and a suggested one, for instance. Subversive, rebellious meanings are often created by the young: today’s gen Z.
The aubergine 🍆 is a classic example of how an innocent vegetable has had its meaning creatively repurposed by young people. The brain 🧠 is an emerging example of the innocent-turned-dirty emoji canon, which already boasts a large corpus.
And it doesn’t stop there. With gen Z now at the forefront of emerging digital culture, the emoji encyclopaedia is developing new ironic and sarcastic double meanings. It’s no wonder that older generations can’t keep up, and keep provoking outrage from younger people who consider themselves to be highly emoji-literate.
Emojis remain a powerful means of emotional and creative expression, even if some in gen Z claim they’ve been made redundant by misuse. This new batch of 217 emojis will be adopted across generations and communities, with each staking its claim to different meanings and combinations. The stage is set for a new round of intergenerational mockery.
A version of this article first appeared in The Conversation on 25.2.21.
What happened last year might be best summed up by a quote often attributed to Lenin: ‘there are decades when nothing happens, and there are weeks when decades happen’. In the initial few months of the pandemic, McKinsey found that video calling for work more than doubled, in what they described as a five-year technology leap. Zoom, the clear winner among video platforms, reported a jump in daily meeting participants from 10m in December 2019 to 200m by March 2020. Teams and Google Meet also saw large user increases, and Slack’s paid customer base doubled in 2020.
More Tools for Digital Work and Personal Lives
With the significant increase in online working, there has been much discussion as to whether this shift represents a permanent change. Previously, companies treated office attendance as a form of compliance, with working from home often regarded as a less productive option. The experience of 2020 has demonstrated that the compliance argument is largely false: you don’t need to be sitting at an office desk to send emails or join meetings. Observers suggest that post-pandemic, up to 20% of face-to-face work will permanently move online, and many other jobs will be done through a hybrid online/office model.

Even with some movement back into physical offices, 2021 will continue to see growth in online work-related tools. One trend is the use of plug-ins that enhance the presenter experience in video conferences and meetings. A good example is mmhmm (chosen because you can say the brand name with your mouth full). The software replaces the camera feed in Zoom or Google Meet with a single, integrated screen in which the presenter, slides and videos are managed together. It avoids the ‘can anyone see my screen?’ situation, offering a much slicker presenter experience.
Changes in 2020 weren’t just about work. Ofcom data revealed that UK personal video calling jumped from 34% of users to over 70%, with the majority of calls made on Facebook’s platforms, WhatsApp and Messenger. The app downloads reported by Apple and Google are also telling. In 2020, Zoom, TikTok and Disney+ were the most downloaded apps, highlighting our need to stay connected and be entertained. Amongst the stores’ most recommended apps were Endel (stress reduction) and Loona (sleep management), indicating an unsurprising trend for digitally-based well-being.
Most of us faced a rapid learning curve with online working: getting to grips with the technology, but also finding the most effective ways to use it. That was often a process of individual discovery, supported by a greater sharing of life-hack solutions. It meant that in 2020 we all became innovators. Accenture’s futures consultancy, Fjord, identified this trend as Do It Yourself Innovation, highlighting bicycle repair pop-ups and online work-out platforms as examples. The same trend has also led to burgeoning hyperlocal businesses, exemplified by artisanal food production such as micro-bakeries – an area likely to buck the downward trend in physical retail. Although initially driven by necessity, DIY innovation offers considerable potential in the coming year. Digital platforms are providing opportunities to monetize innovative or creative endeavours; one example is TikTok’s partnership with Shopify, a significant development now that the platform has matured beyond lip-sync and dance challenges.
More working from home has led to a shift in commuting patterns, alongside a desire to avoid infection risks on public transport (not to mention the urgent need to reduce carbon footprints). The drop in physical retail has also raised significant questions about the role of city centres. In many places there was an explosion of cycling, not just for commuting but also for pleasure, along with increased sales of electric bikes and scooters. Even in the UK, where electric scooters are largely not yet street legal, the retailer Halfords reported a three-fold increase in sales, and the country trialled electric scooter hire schemes in some cities. The evidence from both public transport use and housing purchases shows a move away from busy city centres towards more localised living. Somewhat prophetically, in early 2020 the mayor of Paris proposed a concept called the ‘ville du quart d’heure’ – the 15-minute city – in which all the main amenities are available within a 15-minute walk or cycle ride. Many of these behaviour shifts appear permanent, so expect to see a rise in sales of electric personal vehicles alongside more localised, specialised retail in 2021.
Inevitably, the high level of video calling in 2020 brought its own specific problems. Broadly referred to as ‘Zoom anxiety’, staring at a screen for hours has considerable negative consequences: with fewer non-verbal signals there is much greater anxiety, and studies also found high levels of stress associated with the technology problems most of us experienced. 2020 also highlighted a digital divide. Moving meetings or education online reveals a host of inequalities in terms of connectivity, technology and even working/living spaces. Ofcom, for example, reported that over 50% of those aged 75+ do not use the internet. That is a worrying sign in an ageing population, where isolation leads to poor mental health and increased mortality rates. In 2021, governments, technology providers, businesses and educators will need to take significant steps if they want to prevent the digital divide from deepening inequalities.
AI – The Good, The Bad and The Ugly
2020 also saw the inevitable march of artificial intelligence. Although we are currently at the machine learning, or weak AI, stage, the last year saw many new examples of the technology’s potential. DeepMind’s AlphaFold found the solution to a long-standing conundrum in protein folding. Whilst that is a specialised application, a good demonstration of the possibilities for broader use was Adobe Photoshop’s Neural Filters. Although face swapping and ageing have been available in social media platforms for a while, Photoshop’s filters applied them with greater sophistication to high-resolution images. Inevitably, AI also had its fair share of blame. Its use to predict A-level results led to a debacle which the UK Prime Minister blamed on a ‘rogue algorithm’. And herein lies the challenge for AI: there was nothing rogue about the algorithm – it functioned exactly as it was programmed to, based on the parameters and the data it was provided. Technology will often be blamed for human problems, but as we move into the next year, there needs to be a broader understanding of the bias that is built into all algorithms and data*. Further challenges of AI were highlighted by Channel 4 with their alternative Christmas message for 2020: a deepfake version of The Queen delivering a manipulated Christmas speech, demonstrating some of the dangers associated with the spread of these technologies.
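The A-level episode shows how an algorithm can be biased without being ‘rogue’. As a toy sketch (an illustration only, not the actual Ofqual model), imagine a moderation step that pulls each teacher-assessed grade towards the school’s historical average: a strong student at a historically low-scoring school is automatically marked down, exactly as the parameters dictate.

```python
def moderated_grade(teacher_score, school_history_avg, weight=0.6):
    """Blend a teacher-assessed score with the school's historical
    average; the higher the weight, the more history dominates."""
    return round(weight * school_history_avg + (1 - weight) * teacher_score)

# A student assessed at 80 (an A) in a school historically averaging
# 60 (a C) is pulled down to 68 -- through no fault of their own.
print(moderated_grade(80, 60))  # 68
```

Nothing here malfunctions; the unfairness is a direct, predictable consequence of the design choice to weight school history over individual performance.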
The Dopamine Effect
Whilst the challenge of digital device addiction has been recognised for some years, the Netflix documentary The Social Dilemma brought it to the fore. That was especially pertinent in a year when we spent more time online than ever before. There are many facets to the digital addiction challenge, but it can be summed up by the Like. These social affirmations generate small hits of dopamine that build addiction in much the same way as recreational drugs, putting users in a constant state of low-level anxiety in which they continually seek more dopamine hits – more likes. This addictive behaviour is monetized by the social media platforms, who package it for advertisers to the point where the user becomes the product. Understandably concerned, Facebook wrote a rebuttal to some of the points made in the documentary, accusing the programme of lacking nuance and of scapegoating social media for wider societal problems. Arguably, if Facebook was that concerned about these issues it might simply remove or limit the Like button. Author and academic Shoshana Zuboff, who was interviewed in The Social Dilemma, offers a more detailed argument on the challenges that social media create in her book The Age of Surveillance Capitalism. 2021 will undoubtedly see the debate continue on the responsibility of the social media platforms, the need for greater moderation and continued calls to break up Facebook’s properties. It looks like it will be a year in which our relationship with technology will move forward and be questioned in equal measure.
* I would thoroughly recommend reading Hannah Fry’s excellent book Hello World, which gives a good understanding of how algorithms are made. And if you want to know more about data bias, have a look at Caroline Criado Perez’s Invisible Women and Safiya Umoja Noble’s Algorithms of Oppression.