When upskill becomes overkill
Are you digital-ready? Are you confident that you have the necessary skills to make it in the digital-first future? Yes? No? Maybe?
You are not alone. The latest Global Digital Skills Index from Salesforce found that 34% of Irish workers felt ‘overwhelmed’ by the rate of technological change in the workplace and 25% were ‘fearful’.
Just under a third claimed to be very prepared for the workplace digital skills needed today and around a quarter believed they were prepared for the skills that will be needed in five years’ time.
The report argues that many in-demand workplace skills, such as collaboration tech and cyber security, are not typically found in the school curriculum, so recruiters should “focus less on established education programmes and more on the ‘real world’ digital skills”.
It claims that building training programmes “based on what workers believe will make them most successful in the workplace” will enable companies to “create a flexible working culture that empowers all employees to connect, learn and progress from anywhere”.
What does that mean, exactly? What do workers believe will make them more successful in the workplace? And is there a difference between what they believe and the things that will actually make them more successful? What are ‘real world’ digital skills?
Let’s consider the low number of people who felt very prepared for the workplace digital skills needed today. Does that seem shocking to anybody in any way? Is anyone especially surprised that so many people are overwhelmed or fearful about the rate of technological change in the workplace?
Whose responsibility is it to do something about that? For instance, why do workers find technological change so threatening? What does that say about technology, about how workers perceive the motivation of the people who make it and what they believe the underlying rationale is for deploying it?
The public currency of technology is innovation, improvement, making things easier, faster, more efficient. So why should that translate into something that’s viewed as daunting, difficult, challenging, overwhelming?
Do we blame people for that? Or should the IT industry ask itself if the problem is that technology is a bit more complicated and difficult than it should be?
What is the acceptable level of training that ought to be required before people have the right workplace skills? How much training should people need on the different IT products and services deployed in their workplace before they can do their jobs effectively with that technology?
Are those skills transferable? Can they be replicated easily across different technologies in the digital-first future? How much further training should people expect to have to do when new products and services are adopted?
In other words, can they ever be digital-ready? How will they know?
Is it worth looking at things from a different perspective? What if the burden of workplace skills was assumed by the technology rather than the people? What if the digital-first future placed less of a skills burden on employees than the not-quite-digital-first present?
What if the digital-first future was more intuitive and there was less need for people to learn how to get the best out of it? What if the focus for training and skills shifted to the creators of the products and services underpinning the digital-first future? What if, instead of teaching employees to interact with the technology better, those creators ‘taught’ the technology to get the best out of people on their terms?
It’s worth bearing in mind that digital might be synonymous with technology in today’s world, but it started with a finger. IT still does.