Experts Release Scorecard for IEEE Computer Society’s 2020 Tech Predictions

AI@Edge, Additive Manufacturing, Adversarial ML, and AI and Critical Systems Earn Top Scores
IEEE Computer Society Team
Published 12/03/2020

In December of 2019, tech experts unveiled their annual predictions for the future of tech, presenting what they believed would be the most widely adopted technology trends in 2020. Today, the scorecard for the IEEE Computer Society’s 2020 Technology Predictions has been released, with an overall grade of B-.

“Last year was the least predictable of all the years in which we have conducted technology predictions, and not surprisingly our grade was a B-,” said Dejan Milojicic, IEEE CS past president (2014) and Distinguished Technologist at Hewlett Packard Labs, Palo Alto, California. “The advancement of most technologies slowed due to the ongoing pandemic, while a few actually accelerated. Stay tuned for the Technology Predictions for 2021, to be revealed by the Computer Society in the coming weeks.”

Technology Predictions for 2021 will be released soon – sign up for complete access!

The topics leading the pack were AI@Edge, Additive manufacturing, Adversarial ML, and AI and critical systems:

  • AI@Edge (graded A-) was driven by the need to automate and filter data close to the edge by applying AI; it proved critical in collecting pandemic-related information.
  • Additive manufacturing (graded A/B) helped produce critical medical components and provided visceral evidence that distributed, local manufacturing capabilities can be essential during times of supply chain upheaval.
  • Adversarial ML (graded B+) became increasingly important as systems continued to incorporate machine learning (ML), in particular through variants of reinforcement learning and neural networks.
  • AI and critical systems (also graded B+) saw AI deployed in a growing number of systems that affect public health, safety, and welfare.

The following is the complete list of the 2020 IEEE Computer Society predictions, their report card grades, and analysis:


1. AI@Edge (A-).
Not surprisingly, the adoption of AI at the edge dominated the predictions in 2020. Accelerated by the pandemic, the need to automate and filter data close to the edge was a natural use case for applying AI. This was true for thin IoT clients with low computational power, for edge computers deployed near IoT devices, and for heavy edges in high-performance computing (HPC). The intelligent processing of data in predictions 3, 4, 7, 8, 9, and 11 below is further evidence of the need for AI@Edge. We anticipate that its adoption will only continue to grow in the coming years.
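
As a rough illustration of the filter-at-the-edge pattern described above (a hypothetical sketch, not code from any deployment the team evaluated), an edge node can score each sensor reading with a lightweight local model and forward only the anomalous readings upstream, reducing the bandwidth and latency of centralized processing. All names and thresholds below are invented for the example.

```python
from dataclasses import dataclass

# Hypothetical sketch of AI-based filtering at the edge: score readings
# locally and forward only the anomalous ones upstream. The "model" here
# is a stand-in threshold score, not a real deployed classifier.

@dataclass
class Reading:
    sensor_id: str
    value: float  # e.g., a body-temperature reading in Celsius

def anomaly_score(reading: Reading, baseline: float = 37.0) -> float:
    """Stand-in for a small on-device model (e.g., a quantized classifier)."""
    return abs(reading.value - baseline)

def edge_filter(readings: list[Reading], threshold: float = 1.5) -> list[Reading]:
    """Keep inference local; transmit only readings worth central attention."""
    return [r for r in readings if anomaly_score(r) >= threshold]

if __name__ == "__main__":
    batch = [Reading("thermal-01", 36.9), Reading("thermal-02", 39.2)]
    for r in edge_filter(batch):
        print(f"forwarding {r.sensor_id}: {r.value}")  # only the outlier goes upstream
```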


2. Additive manufacturing (A/B).
The bulk of our predictions for additive manufacturing proved valid in 2020. While the pandemic caused some companies to delay capital expenditures on items such as industrial 3D printers, the global effort to produce critical medical components provided visceral evidence that distributed, local manufacturing capabilities can be essential during times of supply chain upheaval. We saw less growth in additive manufacturing for mass customization than expected, but there were notable new efforts to transition to 3D-printing parts that are hard, expensive, or time-consuming to produce using traditional manufacturing means. We expect this trend to continue in the coming year.


3-4. Adversarial ML (B+).
We predicted that adversarial machine learning would become increasingly important in 2020 as systems continued to incorporate ML, in particular through variants of reinforcement learning and neural networks. Indeed, over the past year researchers have identified a series of new attacks against ML systems – some detectable and some undetectable to the naked eye – that highlight the importance of incorporating adversarial machine learning into both the design and testing processes for ML systems.
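
For readers unfamiliar with this attack class, the sketch below illustrates the fast gradient sign method (FGSM), one well-known way of crafting small adversarial perturbations. It is a minimal illustration of the general technique, not one of the specific attacks the team evaluated; the model, weights, and input are all invented for the example.

```python
import numpy as np

# Minimal FGSM illustration against a logistic-regression "model".
# All weights and inputs are synthetic, for demonstration only.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss_grad_wrt_input(x, w, b, y):
    """Gradient of the binary cross-entropy loss with respect to the input x."""
    p = sigmoid(np.dot(w, x) + b)
    return (p - y) * w  # chain rule: dL/dz = p - y, dz/dx = w

rng = np.random.default_rng(0)
w = rng.normal(size=8)   # stand-in for pretrained weights
b = 0.1
x = rng.normal(size=8)   # a clean input
y = 1.0                  # its true label

eps = 0.05               # perturbation budget: small enough to be hard to notice
x_adv = x + eps * np.sign(loss_grad_wrt_input(x, w, b, y))  # step that increases loss

print("clean score:      ", sigmoid(np.dot(w, x) + b))
print("adversarial score:", sigmoid(np.dot(w, x_adv) + b))
```

The point of the example is that a perturbation bounded by eps in every coordinate, often imperceptible in image data, can measurably shift the model’s output, which is why adversarial testing belongs in the ML design process.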


3-4. AI and critical systems (B+).
We predicted that artificial intelligence (AI) would be deployed in an increasing number of systems that affect public health, safety, and welfare (as opposed to, for example, entertainment systems) over the next five years. The anticipation was that AI in critical systems could help better utilize scarce resources, prevent disasters, and increase safety and reliability. Since then, there have been many reported advances in AI for use in such applications, including transportation, energy, and healthcare. There are still significant concerns over many aspects of these systems, such as trust and explainability. For these reasons, the team rated this prediction a B+.


5. Non-volatile memory products, interfaces and applications (B).
Non-volatile memory enables next-generation computing in the data center, at the edge, and embedded in industrial and consumer products. NVMe storage devices are widespread, and NVMe over Fabrics provides remote memory access and new forms of storage virtualization. 3D XPoint is displacing DRAM and, in SSDs, enabling the use of lower-cost QLC NAND flash. MRAM is showing up in embedded devices for AI inference. CXL and Gen-Z create high-performance heterogeneous memory environments. Domain-specific processors provide computational storage and are bringing in-memory processing within reach.


6. Legal-related implications to reflect security and privacy (B).
We predicted that legal and policy responses to security and privacy concerns would continue to demand the attention of engineers, the public, and policymakers. As predicted, during the last year, concerns over exploitation of security vulnerabilities and repurposed uses of information have continued to result in regulatory enforcement actions and interest. In particular, the collection, storage, and use of health-related information by COVID-19 applications heightened awareness of the limitations of current legal approaches to security and privacy. Unfortunately, not enough tools exist so far, but we are making progress in that direction.


7. Digital Twins, including Cognitive Twins (B-).
Digital Twins are now mainstream in business, particularly in manufacturing, with industrial platforms available to support them (GE and Siemens are the main players in this area). We have seen adoption in automotive, construction, healthcare, manufacturing, smart cities, and more. There are now several practitioner events dedicated to DTs. The ad hoc group set up by the Italian government to provide guidance on managing the epidemic and propose actions for economic recovery included Digital Twins as an enabling technology, urging their adoption by the public administration. Gartner included the evolution of DTs into the personal space (the personal DT) in its 2020 hype-cycle curve. This is probably the most promising and challenging evolution for the coming three years, with a specific focus on Cognitive Digital Twins supporting distributed knowledge and distributed AI.


8. Reliability and Safety Challenges for Intelligent Systems (B/C).
Intelligent systems, capable of making autonomous decisions based on AI algorithms, are becoming increasingly widespread in several application fields (e.g., autonomous robots and vehicles), thanks to their potential to replace, or collaborate with, humans in dangerous environments and difficult jobs. Such systems have proven useful in helping humans face the pandemic emergency (e.g., service robots, autonomous vehicles, and drones). Consequently, significant market growth has recently been predicted for autonomous vehicles and autonomous mobile robots by 2025. However, to increase the autonomy level of such intelligent systems, functional safety and reliability challenges have to be addressed.


9. Applying AI to Cybersecurity (B/C).
We expected that AI/ML would start being widely adopted in cybersecurity, and we envisioned broad participation from industry, government, and academia. We even predicted IEEE leadership in conducting an annual Grand Challenge to drive this effort. While all of this is still underway, the speed and breadth of adoption were impacted by the pandemic. Regulatory compliance and extensive budgets were essential but had to be delayed to the following year due to the US elections and the refocus on the pandemic. We still expect this area to grow, albeit at a slower pace.


10. Practical delivery drones (B/C).
Our team largely agrees that the promise of practical delivery drones hasn’t panned out during 2020. We saw some progress on the regulatory front (e.g., FAA’s approval for Amazon Prime Air) and more field testing (e.g., Walmart and UPS). However, we didn’t see the wide-scale adoption that we had predicted. Curiously, one of the factors that likely slowed this development and many others, the Covid-19 pandemic, may also be the catalyst for renewed attention. As people self-isolate and rely increasingly on delivery services, the promise of contact-free delivery by drone may usher in further adoption in the near future.


11. Cognitive Skills for Robots (C+).
We predicted that recent breakthroughs in large-scale simulation, deep reinforcement learning, and computer vision would collectively bring a basic level of cognitive ability to robots, leading to significant improvements in robotic applications. While the field made significant progress in the past year, we did not see the disciplines of perception, machine learning, and simulation integrate to the extent that we expected. Mastering reinforcement learning remains challenging, and transferring implementations from simulation to the real world (Sim2Real) is still a work in progress. Despite this, we remain optimistic, as we see continued investment in important enabling technologies such as Machine Common Sense (https://www.darpa.mil/program/machine-common-sense).
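
As a hedged illustration of the reinforcement learning referred to above (a toy problem, far removed from the rich observations and dynamics of real robotics), the sketch below runs tabular Q-learning on a five-state corridor; every number in it is an arbitrary choice for the example.

```python
import numpy as np

# Tabular Q-learning on a toy 5-state corridor: the agent starts at the
# left end and earns reward 1 for reaching the right end.

n_states, n_actions = 5, 2          # states 0..4; actions: 0 = left, 1 = right
goal = n_states - 1
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.5, 0.9, 0.1   # learning rate, discount, exploration rate
rng = np.random.default_rng(0)

for _ in range(500):                # episodes
    s = 0
    while s != goal:
        # epsilon-greedy action selection
        a = rng.integers(n_actions) if rng.random() < eps else int(Q[s].argmax())
        s_next = max(0, s - 1) if a == 0 else min(goal, s + 1)
        r = 1.0 if s_next == goal else 0.0
        # Q-learning update: bootstrap from the best next-state value
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next

print("greedy policy (1 = move right):", Q.argmax(axis=1))
```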


12. Quantum Computing (C+).
Quantum computing gained tremendous visibility in 2020. For example, the IEEE Computer Society held a very successful first “quantum week” conference with over 800 attendees and substantial industry participation. Our 2020 prediction about simulating chemicals on a quantum computer did come true, as reported in the news. However, the simulation was not of record-setting performance; a new quantum computer will have to be built for that. That didn’t happen in 2020, and it may not happen in the very near future either. Therefore, despite its tremendous popularity, we graded quantum computing a C+.
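
To make concrete what “simulating chemicals” means here: a quantum chemistry simulation ultimately estimates the ground-state energy of a molecular Hamiltonian written as a weighted sum of Pauli operators. The sketch below checks that idea classically by exact diagonalization of a small two-qubit Hamiltonian; the coefficients are illustrative placeholders, not data from any published experiment.

```python
import numpy as np

# Classical cross-check of what a quantum chemistry simulation computes:
# the ground-state energy of H = sum_k c_k * P_k, a sum of Pauli strings.
# The coefficients c_k below are illustrative placeholders only.

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

terms = [
    (-1.05, np.kron(I2, I2)),
    ( 0.39, np.kron(Z, I2)),
    ( 0.39, np.kron(I2, Z)),
    (-0.01, np.kron(Z, Z)),
    ( 0.18, np.kron(X, X)),
]
H = sum(c * P for c, P in terms)

# A quantum computer would estimate this variationally (e.g., with VQE);
# at two qubits we can simply diagonalize the 4x4 matrix directly.
ground_energy = np.linalg.eigvalsh(H).min()
print(f"ground-state energy: {ground_energy:.4f}")
```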


Sign up for 2021 Technology Predictions