Something has happened in the tech hardware space that no one saw coming. PC shipments have increased for the first time in six years, according to recent figures released by Gartner. Despite all the predictions about the decline of the desktop, it seems reports of its demise have been greatly exaggerated.
Gartner’s figures show that worldwide PC shipments totaled 62.1 million units in the second quarter of 2018, a 1.4 percent increase from the second quarter of 2017. This is the first quarter of year-over-year global PC shipment growth since the first quarter of 2012.
The figures also show the growth is happening worldwide, with all regions experiencing some kind of growth over the past 12 months — growth that was principally driven by the business community. "PC shipment growth in the second quarter of 2018 was driven by demand in the business market, which was offset by declining shipments in the consumer segment," said Mikako Kitagawa, principal analyst at Gartner, in a statement.
“[However] In the consumer space, the fundamental market structure, due to changes on PC user behavior, still remains, and continues to impact market growth. Consumers are using their smartphones for even more daily tasks, such as checking social media, calendaring, banking and shopping, which is reducing the need for a consumer PC.”
The growth, according to Kitagawa, is being driven by replacement cycles following a period of weak investment during the economic downturn. She predicted this growth will be short-lived, weakening within two years once Windows 10 upgrades have peaked, and so encouraged PC vendors to find other ways to stimulate growth.
Without reading too much into the figures themselves, it is worth noting that Gartner attributes desk-based PC growth in the US to continued high usage in the US public sector. Mobile PCs also grew in the US, although strong Chromebook demand in the education market (Gartner does not count Chromebooks as PCs) dampened PC growth.
New Computing Requires New Computers
While the natural business replacement cycle explains some of the growth, it’s only part of the story. Over the past few years artificial intelligence, business intelligence, big data and big data sets have all become standard features in the enterprise. Intelligence is also moving to the edge, which means enterprise hardware is badly in need of a revamp.
Edge computing is a method of optimizing cloud computing systems by moving the control of computing applications, data and services away from central nodes (the "core") toward the other logical extreme of the internet (the "edge"), where it makes contact with the physical world. "Edge" is not a new concept, but several trends, including the internet of things, the industrial internet of things and now blockchain, have come together to create an opportunity to help industrial organizations turn massive amounts of machine-based data into actionable intelligence closer to the source of the data.
Embedding cognitive intelligence and situational awareness in edge devices brings the ability to read sensor data and analyze it in the context of historical data, human expertise and overall system performance goals, solving problems on the spot, in real time.
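As a rough illustration of that pattern — and purely a sketch, with the class name, window size and threshold all invented for the example — an edge node might score each new sensor reading against a rolling window of historical data and act locally, escalating to the core only when something unusual appears:

```python
from collections import deque
from statistics import mean, stdev

class EdgeNode:
    """Toy edge node: analyzes readings locally against recent history."""

    def __init__(self, window=50, z_threshold=3.0):
        self.history = deque(maxlen=window)   # rolling historical context
        self.z_threshold = z_threshold        # how unusual before we escalate

    def handle(self, reading):
        """Return an action decided at the edge, with no round trip to the core."""
        if len(self.history) >= 10:
            mu, sigma = mean(self.history), stdev(self.history)
            z = abs(reading - mu) / sigma if sigma else 0.0
            if z > self.z_threshold:
                self.history.append(reading)
                return "escalate-to-core"     # only anomalies leave the edge
        self.history.append(reading)
        return "handle-locally"

node = EdgeNode()
for r in [20.1, 19.8, 20.3, 20.0, 19.9, 20.2, 20.1, 19.7, 20.0, 20.2]:
    node.handle(r)
print(node.handle(20.1))   # in-range reading stays local: handle-locally
print(node.handle(95.0))   # outlier triggers escalation: escalate-to-core
```

The point of the sketch is the shape of the decision, not the statistics: routine readings never leave the device, which is what keeps bandwidth and latency costs down.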
Related Article: Edge Computing: What it Is and Why It's a Game Changer
Why Powerful Computing Is Necessary
The result is it will be easier to deliver the benefits of AI to settings as diverse as clinical patient care in healthcare, industrial process control in remote or dangerous locations, and networks that need human expertise at every node, no matter how geographically dispersed. It also requires more powerful computing for everyone.
“As the number of commercial and industrial IoT devices proliferate, connecting them and getting them to behave intelligently are among the biggest challenges to IoT realizing its full potential,” AJ Abdallat, CEO of Glendale, Calif.-based industrial cognitive AI solution provider Beyond Limits said. “One important strategy for obtaining timely actionable intelligence is embedding intelligence at the source of the sensing. This development enables decisions to be made at the sensor rather than 'phoning home' to headquarters or a cloud service for 'what to do next.'"
Abdallat explained that quick decisions matter because many IoT applications have operational control. Unfortunately, the latency inherent in data processing and decision support far from the edge is currently too slow for many applications. He added that today, 25 percent of organizations with established IoT strategies are also investing in AI. However, what commonly passes for "AI" these days, whether conventional software designed to handle very large, complex data sets or chatbots with rudimentary contextual awareness, is not sufficient.
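The latency argument can be made concrete with back-of-the-envelope numbers. The figures and function below are illustrative assumptions, not measurements, but they show why a control loop with a tight deadline cannot "phone home" for each decision:

```python
# Sketch: compare a control loop's deadline against where a decision can run.
# All timings are invented for illustration.

CONTROL_DEADLINE_MS = 10   # e.g. a valve must react within 10 ms
EDGE_INFERENCE_MS   = 2    # decision made on the device itself
CLOUD_ROUND_TRIP_MS = 80   # network latency + queueing + remote inference

def decide_placement(deadline_ms, edge_ms, cloud_ms):
    """Pick where the decision can run and still meet the deadline."""
    if edge_ms <= deadline_ms:
        return "edge"
    if cloud_ms <= deadline_ms:
        return "cloud"
    return "infeasible"

print(decide_placement(CONTROL_DEADLINE_MS, EDGE_INFERENCE_MS, CLOUD_ROUND_TRIP_MS))
# prints: edge  (2 ms fits the 10 ms budget; an 80 ms round trip does not)
```

With a relaxed deadline (say, hundreds of milliseconds) the cloud round trip becomes viable again, which is why latency-tolerant analytics can stay centralized while operational control moves to the edge.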
Some chip companies are working on incorporating AI software on their chips; others, like Abdallat’s Beyond Limits, are doing the reverse: building advanced cognitive AI that can be embedded in inexpensive off-the-shelf chips.
“When edge devices are equipped with cognitive intelligence and are able to act without moving all the data to remote data centers for analysis, the number and type of new smart IoT applications is virtually limitless,” Abdallat said.
The Role Of Processing-Intense Applications
Dimitry Fisher is chief AI officer at San Diego-based Analytics Ventures, a global venture studio fund providing end-to-end infrastructure for brand new ventures in artificial intelligence (AI). He pointed out that the rise of AI in general, and machine learning in particular, is generating interest in processing-intense applications and therefore a demand for both CPU- and GPU-intense computations. This requires:
- Far more processing power than is available on today's smartphones and tablets.
- Software that is better suited to running on traditional laptop and desktop computers.
A number of laptops on the market have good GPUs. However, their power consumption (and, as a consequence, heat dissipation) is on par with some workstations, so both laptops and traditional workstations remain desirable. This has three consequences, Fisher said:
1. Mainstream Natural Language Processing
Advances in machine learning have driven applications that use natural language processing into the mainstream, which in turn has expanded the use of computers, phones and other devices because they are now so much easier to use.
2. Growth of the AI/ML Field
The number of people entering the AI/ML field and the number of companies building AI/ML teams are also driving the need for high-end computers on both the corporate and the educational/academic fronts. Additionally, bitcoin mining caused a spike in demand for high-end GPUs some months ago, although that seems to have been a short-lived trend.
3. Desktop Revival
Many people have predicted the demise of traditional personal computers in the last few years, based on the increased use of hand-held devices that rely on the cloud for most of their processing. As often happens, when this prediction failed to materialize, people who had held off from buying a laptop or desktop began reconsidering their decisions and buying them.
Finally, it was also hypothesized that the end of Moore's Law would spell the end of the personal computer product cycle or, in some cases, even the end of the personal computer as a marketable product. Moore’s Law is the observation made by Intel co-founder Gordon Moore that the number of transistors on a chip doubles roughly every two years, while the cost per transistor falls.
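That doubling compounds quickly. As a back-of-the-envelope sketch with a round, hypothetical starting point (not real chip data):

```python
def transistors_after(years, start=1_000_000, doubling_period=2):
    """Project transistor count under one doubling per `doubling_period` years."""
    return start * 2 ** (years // doubling_period)

# Starting from a hypothetical 1M-transistor chip:
for years in (2, 10, 20):
    print(years, transistors_after(years))
# prints:
# 2 2000000
# 10 32000000
# 20 1024000000
```

Twenty years of biennial doublings is a factor of roughly a thousand, which is why the law's eventual slowdown was expected to matter so much for the PC product cycle.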
“This did not quite happen. However, the novel non-Von Neumann architectures that will eventually augment and/or replace the traditional CPUs are not yet mature, either. So in the meanwhile the traditional personal computers are not going anywhere,” Fisher said.
Related Article: Where Moore's Law Dead-Ends
How Businesses Will Access AI
We are only at the start of an evolution that is likely to change the way businesses access AI in the coming years. AI, and more importantly machine learning, will become integrated into most business operations and software within five years, according to Remy Kouffman, co-founder and CEO of Knockout AI.
The rise in PCs, as well as cloud-based applications and services, is primarily due to their suitability for aggregating, storing and running AI/machine learning models and analyses. “Data science and model generation and optimization cannot be done on a tablet or smartphone, so until certain software/middleware or applications allow it to be the case, I'd predict more PCs and an increasing amount of market share going to AWS and other cloud computing companies based on the growing trend of AI,” he said.