“Yeah, well I’m going to turn myself into a cyborg…” proclaimed a disheveled, UDF-coffee-chugging classmate in my labor economics class. Our professor had just given the classic talk about technological change; think of LexisNexis decimating paralegal pools or computer kiosks replacing airline ticket agents. After a few seconds of holding back laughter, I realized there was a serious question our generation needs to ask itself: how will we fit into the economy of the future?
That question raises another: what will the economy of the future look like? A recent 60 Minutes segment with MIT researchers Andrew McAfee and Erik Brynjolfsson painted a pretty good picture of the risks associated with technological progress. They were primarily concerned with the rate at which structured, routine jobs are being replaced by cheaper, more efficient automated software and hardware. Fields like artificial intelligence and deep computing are unlikely to go away anytime soon. Software and algorithms will take on more advanced tasks: categorizing corporate expenses for tax filing, running countless stress-test scenarios, and identifying optimal job candidates. Business analytics will ultimately lead to a better understanding of an institution’s standing in both its internal and external environments.
But what does this mean for the role of the human worker? Reducing the human capital spent on structured tasks and basic analysis will certainly displace jobs in the short term, but it also opens a new role for the employee. Instead of human talent doing the legwork on preliminary tasks like background research and financial planning, employees could focus on higher-order, value-added work: customizing software applications to specific market trends, calibrating statistical analyses, finding system-level synergies, and advancing product development. Freeing up capacity within an organization to perform these advanced functions will create tremendous value over the long term.
Short of turning ourselves into robots or software applications, how can today’s youth and young professionals adapt to this momentous shift and add value to an economy whose structure is changing at a breakneck pace? A large part of the answer might be in education.
McKinsey released a report in late 2012 detailing the “Education to Employment Gap.” The findings were not terribly surprising, but it was good to see a sober declaration of the major misalignments among human capital stakeholders: employers, educators, and students. According to the study, “less than half of employers and young people believe that graduates are prepared for the workforce,” compared to the nearly 75% of educators who believe students are being prepared.
Educators are likely too complacent and optimistic about their students’ prospects in the workforce, and the data support this view. By some estimates, over 3 million jobs are going unfilled for lack of candidates with the requisite skills, while the unemployment rate among 20-to-24-year-olds sits at a staggering 14.5% (the second-highest peak in twenty years). The current siloed model of education will not provide the interdisciplinary skills needed to succeed in the coming era of business and organization. Ideal employees will need to understand the enterprise-wide impact of their unit, realize physical and human capital efficiencies, and adopt a systems-level approach to analysis.
But the skills gap is a two-way street. Technology requirements can scare off potential applicants. Many students picture lines of green code flashing across a screen when they hear “technology” and immediately feel insecure about their lack of math or programming skills; they don’t even bother applying. The reality, however, is that “hard tech” skills are only part of the picture. Some of the most in-demand technology jobs revolve around implementation, training, and organizational transformation, which complement many students’ backgrounds well. Young professionals and students across the country, indeed the world, need to realize that technology is not a foreign language but a way of solving problems by leveraging capital and labor.
Does the onus fall on educators, employers, or students to build this understanding? It falls on all parties: firms have to share their needs, educators must train students in these new skills, and students must be open to the new model. IBM is one firm that seems to be excelling at this stakeholder triangle. It has just partnered with Ohio State University to build a center focused on researching advanced computing and training a new generation of students. This type of collaboration should be the framework for education moving forward. (Note: this does not apply only to business functions – research, the humanities, the arts, and other fields can all benefit from this model.)
In the end, “tech” as industry refers to it is really just capital investment. If we take a classical view, we can see that technology isn’t just fancy algorithms and software but the interaction between those programs and the people who create, integrate, monitor, and exploit them.