18 Ways of Thinking
We’ve come a long way from the dot-com boom in the year 2000. Back then the internet had a comparatively modest 361,000,000 global users, as compared to the staggering 4,383,810,342 of today (yes, you read that right). Now, anyone who can help it spends hours per day on sites such as YouTube, Reddit, Instagram… the list is endless. The shiny, light-emitting rectangles we recognise as phones have taken over the world. Wherever you turn, you’ll find the relentless hammering of plastic keyboards or the furious tapping of glass surfaces – all with the purpose of interacting with the technological phenomenon we all know as the great internet.
Our genes, however, aren't as quick to keep up with this lifestyle adjustment.
Our current way of interacting with technology certainly has its efficiencies. However, it does not by any means represent all the ways in which we are humanly capable of working and thinking. By not thinking about technology in different ways, we're missing out on endless possibilities.
To describe such ways of thinking, here are three experts who made human psychology their life's work: Jerome Bruner, Howard Gardner and Kieran Egan.
Jerome Bruner categorized our ways of thinking into three modes: the enactive, the iconic and the symbolic. Enactive thinking is action-based, iconic thinking is image-based, and symbolic thinking is language-based.
Howard Gardner categorized our ways of thinking as visual, verbal, logical, musical, bodily, interpersonal, intrapersonal and natural.
Kieran Egan categorized our ways of thinking as somatic, mythic, romantic, philosophic and ironic.
The current state of intellectual work heavily overprioritizes a combination of just two of these:
Now, you might be thinking something along the lines of: "If it ain't broke, don't fix it." Well, how about this: designer Bret Victor compares the monotony of how humans currently think in relation to technology to maxing out only a single core on an 8-core CPU.
Would you only max out one of your 8 cores?
The border of our thinking
Considered by some to be the greatest philosopher of the 20th century, Ludwig Wittgenstein played a central, if controversial, role in 20th-century analytic philosophy. He wrote about the border of our thinking: the point at which many important things transcend the capabilities of language.
So while language may be the #1 CPU in our computation rack, thanks to its specificity and sharability, each additional form of perceptual language could act like another layer that adds more pieces towards finishing the puzzle:
Every additional language of perception reduces the unknown
The base of 100% is only an artificial border used to visualize the example – in reality, we live with infinite borders of knowledge.
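The "additional layers" idea can be made concrete with a toy model (our illustration, not from the text): assume each language of perception independently covers some fraction of the unknown, so adding channels shrinks what remains uncovered multiplicatively.

```python
# Toy model: each perceptual channel independently covers a fraction
# of the unknown; combined coverage is 1 minus what all channels miss.
def combined_coverage(channel_coverages):
    """Fraction of the unknown covered, assuming independent channels."""
    remaining = 1.0
    for c in channel_coverages:
        remaining *= (1.0 - c)  # each channel misses (1 - c) of what's left
    return 1.0 - remaining

# One channel covering 50% leaves half unknown; a second independent
# 50% channel shrinks the unknown to a quarter.
print(combined_coverage([0.5]))       # 0.5
print(combined_coverage([0.5, 0.5]))  # 0.75
```

The numbers and the independence assumption are of course hypothetical – the point is only that each added channel reduces the unknown, with diminishing but never-zero returns.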
Channelling multiple ways of thinking
Not having all ways of thinking properly represented in human-technology interaction prevents us from efficiently leveraging talent.
If we were able to integrate more ways of thinking, our minds would develop a more robust and informed perception of what we can make sense of. The stronger thoughts produced by mankind would have a multiplying effect on the world’s progress, efficiency and sustainability. Designing for all ways of thinking in technological interaction is a severely underfunded investment of human time and energy towards creating a better… well… everything. It’s a huge strategic failure.
How would one design technological interaction including less-prominent ways of thinking?
Extended Realities through spatial computing and display
Augmented Reality, as viewable through transparent displays, has the revolutionary potential to enable interaction systems with a broader representation of how humans think.
Assuming we'd want to leverage visual, aural, tactile, kinaesthetic and spatial thinking, some of these channels could be addressed with state-of-the-art AR headsets like the Magic Leap One, the HoloLens 2, or the roughly 30 others on the market.
Augmented Reality especially leverages kinaesthetic and spatial thinking.
You could potentially interact with technology anywhere, anytime, in a physical place while doing physical work. The same cannot be said of today's most used digital devices: desktop computers, tablets and mobile phones.
With the given spatial context and gesture-interaction systems, knowledge work for non-desk professions would become a seamless experience.
The effects of AR-altered thinking
It's possible to say: "For people doing physical work, rather than stationary office work, there is a plethora of needs which have not yet been addressed or even identified." Here we can draw on current findings about what works really well in established areas, even though those findings cannot be applied to these new areas in the same way.
Some challenges include:
→ Connecting people with the wealth of digital knowledge while they perform work, to allow them to make better decisions
→ Connecting people performing activities in a physical environment with other people somewhere else on the planet
→ Creating and updating knowledge in real-time while performing activities in a physical environment
→ Connecting people who want to learn a physical activity with the specific knowledge they need – all whilst performing it in real-time
→ Laying out knowledge relevant to the task at hand on the visible environment, in a way that’s spatially optimized to increase understanding
To figure out the full potential of AR, rigorous experimentation and measurement of improvement over the status quo is required. Only then can real improvements be proven and built upon.
Howard Gardner, 2006. Multiple Intelligences: New Horizons. Basic Books.
Kieran Egan, 1997. The Educated Mind: How Cognitive Tools Shape Our Understanding.
Jerome Bruner, 1987. Actual Minds, Possible Worlds.
Ludwig Wittgenstein, 1961. Tractatus Logico-Philosophicus.
Last edited on April 20, 20:04. Published by Daniel Seiler, edited by Camilla Burchill.