The skills shortage facing businesses has been widely reported. Some frame it in the language of war, discussing the battle for talent in terms that would make the writers of "Game of Thrones" blush. So how are institutions of higher education responding to this increased demand?
We reached out to officials at Massachusetts Institute of Technology (MIT) and the Georgia Institute of Technology (Georgia Tech) to gain their perspective.
Representatives from both universities said they keep tabs on the changing face of technology and have adapted their curricula accordingly. However, they will not fall into the trap of reworking courses on the fly to chase every shifting digital and technology trend of the day, according to Markus J. Buehler, McAfee professor of engineering and department head of civil and environmental engineering at MIT. His university wants to balance real-world technology trends with the recognition that disciplines like engineering will always require their fundamental skills.
Keep Up With Trends, But Teach the Fundamentals
“We in academia say, ‘Wait a minute, we’re not going to change everything,’” Buehler said. “Maybe something is a hot thing today, but maybe it changes…. We consider ourselves thought leaders, of course, but in some ways the industries are ahead of us because they’re solving a real problem. And the way we learn about it is we have the industry come to MIT through our classes, and through our internships.”
The university remains very “in tune” with trends in artificial intelligence (AI), for instance, and tries to be as responsive as possible. However, Buehler said, MIT keeps its education "grounded in a way that we're still teaching the fundamentals because we want to educate students that are going to be high-caliber employees 20 to 30 years from now, and not just two years from now. That’s our mission.”
Related Article: Poor Digital Skills Hinder Digital Workplace Progress
Skills Can Be Short-Term
About 1,100 miles south in Atlanta, curriculum officials at Georgia Tech face similar challenges, contending with the “ups and downs” and “faddishness” of technology, according to Dr. Colin Potts, Georgia Tech's vice provost for undergraduate education. He cited the 2000 dot-com crash as an example.
In some cases, reported “skills shortages” can represent “ephemeral skills which graduates would only be able to use for a handful of years, and then they will become obsolete,” Potts said. “That's a real concern, and I think that we have to bear that in mind.”
HR leaders increasingly say needed skills have a short shelf life. A significant share of employers (40%) estimate a skill is usable for four years or less, according to a study by Wiley Education Services and Future Workplace (registration required). “Fast-paced obsolescence escalates the need for employers to hire or upskill workers when gaps form,” researchers found.
How do we teach vital cybersecurity topics like policy, law, international affairs, and management in the traditional curriculum? @peterswire offers a framework for @CACMmag. https://t.co/DMwDB7XHNN pic.twitter.com/1JPltRVkoO
— IISP at Georgia Tech (@GaTechCyber) October 12, 2018
It’s important to resist the urge to rewrite curricula constantly, Potts added. “I used to hear, and I don't hear very much anymore, concerns from employers that students didn't know the latest programming language. With the proliferation of web-based platforms, there are so many customizable platforms, APIs and languages and frameworks now that you can't keep up with all of them."
Related Article: Key Skills Every Digital Workplace Practitioner Needs
Taking a Big Picture Approach to Education
This doesn’t mean colleges aren’t adapting curricula to emerging technologies and the societal ramifications surrounding them.
The MIT Professional Education program — which offers courses to anyone who wants to enroll, not just students who apply and qualify — works with organizations such as Accenture to create training and development programs. It connects professors and students with professionals on issues and trends emerging outside academia. Professors can then infuse what they learn into their undergraduate and graduate teaching.
Some recently-added courses include:
- MIT Masterclass: Technology Strategy & Innovation: Lessons in digital technologies that lead to products and innovation.
- Ethics of AI: Safeguarding Humanity: Navigating ethical challenges associated with AI development and implementation.
- Applied Deep Learning Boot Camp: Using deep learning tools to process data in different modalities, including text, images and graphs.
Buehler called AI and machine learning the most disruptive change in technology today. Advancements in these fields give engineers new access to data, allowing industries across sectors to identify patterns in that data and make predictions. MIT also recently launched a College of Computing, a $1 billion commitment to “address the global opportunities and challenges presented by the ubiquity of computing — across industries and academic disciplines — and by the rise of artificial intelligence.”
We're meeting tomorrow at 5pm in 32-144! Topic: if, when, and how innovative algorithms with potential malicious applications should be released via an interactive activity @MIT_Quest @MIT_CSAIL @TppMit @MITPSTS
— mitaiethics (@mitaiethics) May 1, 2019
When discussing curriculum, MIT officials also consider “grand challenges” such as energy, sustainability, climate, recycling and transportation, to name a few. These factors, Buehler said, can also have a “very strong social component, societal component, political component and regulation component.” Engineers can no longer be trained in silos; to be successful today, they “really need to think broader.”
Related Article: Higher Education's Unique Digital Workplace Challenges
Cybersecurity Offerings Respond to Privacy Concerns
Changing consumer expectations around privacy are also shaping curricula, according to Georgia Tech’s Potts. The university added an online master's program in cybersecurity this year in response to growing concerns over secure firewalls, algorithms and hardware, as well as policy issues around privacy, data governance and stewardship. Fewer than a third of global organizations have what they consider to be the right amount of cybersecurity staffing.
When AI Meets Ethics
Ethics, specifically in relation to AI but also within the broader framework of technology, is also top of mind. Computer science majors at Georgia Tech are required to take ethics courses as part of the major. Potts said the university wants its students to think about the values that computer technology promotes. With social networks, how does information find its way into your feed? What does this technology change? And how does it alter the way you relate to facts and political opinions?
“If our students don't embrace that as their problem, and if they think that they're basically solving a technical puzzle and these are other people's problems to worry about, then I don't think we're really educating them fully,” Potts said. “That's something we've taken very seriously.”
MIT also ties ethics lessons in with engineering studies. AI and similar intelligent technologies can be “potentially very destructive,” according to Buehler. It’s why MIT has a “strong focus here in ethics, and especially bringing ethics into the engineering education."
Related Article: Digital Transformation Demands Evolving Workplace Skills
Why One Engineering Student Is Learning to Code
One engineering student has adapted to the workplace's growing skills needs by adding something she feels is important to her repertoire: coding.
Sydney Packard, 21, a senior and chemical engineering major at Worcester Polytechnic Institute in Massachusetts, said she's interested in metabolic engineering, particularly transforming microorganisms like bacteria to produce different compounds.
Coding, however, has entered the picture as well, even though she had no interest in it before college.
"But now," Packard said, "I think it's becoming more and more popular for people to have at least a basic knowledge of coding, and that has influenced me to try it out. I've taken a few classes related to coding/biotechnology that I wouldn't have taken otherwise. I've noticed that having computer-related skills like coding makes you much more competitive in industry, even though my major, chemical engineering, doesn't traditionally require those skillsets."