Embedded Technology Watchlist: 2020-2030

Fireworks Celebration in Singapore (Source: Wikimedia Commons, Author: chensiyuan, CC BY-SA 4.0, edited)

Embedded systems are constantly changing the way we live and work. Now that the year 2020 is here, we’ve decided to compile a non-exhaustive list of electronic systems that should have the greatest impact on the embedded systems industry.

Many of these technologies are already well-established and will continue to scale immensely, thanks to the many talented researchers in the field. There are also a few surprises, so let’s get started!

High-Performance Analog to Digital Conversion

Embedded systems like the microcontrollers that run automotive electronics need to process real-world information. Before that can happen, the information needs to be captured by a sensor and then be digitized into 0s and 1s. This is where analog to digital converter (ADC) technology comes into play.

Block Diagram of a Successive Approximation Register (SAR) style of Analog to Digital Converter (ADC). (Source: Wikimedia Commons, Author: White Flye, License: CC BY-SA 2.5)

The technique has been around for generations and in many forms. ADCs are present in development boards such as the Arduino, and high-speed versions appear in electronic instrumentation such as the analog front-ends (AFEs) of laboratory test equipment. As of January 2020, the ADC architecture gaining the most attention in research is the Successive Approximation Register (SAR); a minimal sketch of the approach appears after the list below. SAR ADC researchers face two major design challenges:

  1. Making ADCs faster without sacrificing accuracy or precision.
  2. Making existing ADC technologies more energy-efficient. This is important to the growing field of wearable electronics.
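
To make the SAR approach concrete, here is a minimal Python sketch of the binary-search logic a SAR ADC carries out in hardware with a comparator and a DAC. The reference voltage, resolution, and input voltage are arbitrary values chosen for illustration.

```python
def sar_adc_convert(v_in, v_ref=3.3, bits=10):
    """Simulate a successive approximation conversion.

    The SAR sets each bit from MSB to LSB, keeping a bit only if the
    DAC voltage implied by the trial code stays at or below the input.
    """
    code = 0
    for bit in range(bits - 1, -1, -1):
        trial = code | (1 << bit)                  # tentatively set this bit
        v_dac = (trial / (1 << bits)) * v_ref      # DAC output for the trial code
        if v_dac <= v_in:                          # comparator decision
            code = trial                           # keep the bit
    return code

# Example: digitize 1.65 V with a 10-bit, 3.3 V reference ADC (expect ~512)
print(sar_adc_convert(1.65))
```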

In short, ADCs may not be the most glamorous technology on the list, but their many end-uses will make them indispensable for the foreseeable future.

Smart Grid: Battery Energy Storage

In the energy sector, “the grid” refers to the vast network of transmission lines, substations, transformers, and similar assets that deliver electricity to customers. In the United States, the grid has existed since the late 19th century. What separates a “grid” from a “smart grid” is the use of digital communications in managing the network. You may already be familiar with “smart meters” that monitor energy use in real-time. This is only one aspect of the smart grid strategy.

Large, Containerized Vanadium Flow Batteries located in Washington State (USA). (Source: Wikimedia Commons, Author: UniEnergy Technologies, License: CC BY-SA 4.0)

Right now, our grids operate mostly on the Just-In-Time (JIT) inventory model. This means that nearly all electricity generated by turbines and solar panels is consumed instantly. This is a problem for wind energy, which may be at its peak during overnight hours when grid energy demand is at its lowest. To compensate, power grid operators must constantly rebalance what is generated against what customers want to consume.

Bulk energy storage seeks to alleviate some of this burden. This is done by storing surplus energy and then releasing it hours or days later. One method likely to dominate involves the largest batteries in the world. These high-capacity systems use chemistries such as sodium-ion, lithium-ion, and vanadium flow, to name a few. And of course, the smart grid will need plenty of digital infrastructure to coordinate them all.
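
As a toy illustration of that balancing act, the sketch below charges a storage battery when generation exceeds demand and discharges it when demand exceeds generation. The capacity and the hourly generation and demand profiles are invented for the example.

```python
# Toy dispatch loop: charge the battery on surplus, discharge on deficit.
# All numbers (capacity, generation, demand) are made up for illustration.
capacity_mwh = 100.0
soc_mwh = 50.0  # current state of charge

generation_mw = [80, 120, 150, 90, 60, 70]   # e.g. windy overnight hours
demand_mw     = [70, 65, 60, 100, 140, 130]

for hour, (gen, load) in enumerate(zip(generation_mw, demand_mw)):
    surplus = gen - load  # MW over one hour ~ MWh
    if surplus > 0:
        stored = min(surplus, capacity_mwh - soc_mwh)
        soc_mwh += stored
        action = f"charge {stored:.0f} MWh"
    else:
        released = min(-surplus, soc_mwh)
        soc_mwh -= released
        action = f"discharge {released:.0f} MWh"
    print(f"hour {hour}: {action}, state of charge {soc_mwh:.0f} MWh")
```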

Field Programmable Gate Arrays

FPGAs can take advantage of hardware parallelism—meaning they can carry out many calculations simultaneously. While multi-threading and multi-core processors give common CPUs some of this ability, FPGAs take the idea to its full potential.

A Field Programmable Gate Array on a Printed Circuit Board Assembly. (Source: Wikimedia Commons, Author: Altera Corporation, License: CC BY 3.0)

For developers, FPGAs are like the ultimate prototyping tool, reducing engineering costs thanks to their great flexibility. The FPGA has too many applications to do justice to here, and this should become apparent in the coming decade: it is a powerful tool for everything from high-end gaming hardware to crypto mining and radar imaging.

Beamforming Antennas

Speaking of radar, when we think of it, we often imagine the huge revolving antennas that dominated after World War 2. And that’s not wrong. But the newest generation of antennas will be surprisingly stationary for what they can do.

An Array of 8 Antennas Transmitting Slightly Out-of-Phase will Steer the Beam toward Angle Theta. (Source: Wikimedia Commons, Author: Chetvorno, License: CC0)

Radio waves—such as the kind used in radar—can be a challenge to aim. Instead of traveling straight forward as one simple beam, they spread out into oddly-shaped patterns of high-powered areas (lobes) and dead zones (nulls) that make beamforming a challenge.

With a carefully calculated design, engineers can reshape and steer these beams into a more desirable shape or direction. And instead of having just one transmitting antenna, radios can combine several tiny antennas that transmit slightly out of sync. This practice is known as beamforming, and it will play a critical role in wireless communication and sensing.
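
The steering itself comes down to applying a small phase (or time) offset to each element. Below is a minimal numpy sketch of the idea; the element spacing, carrier frequency, and steering angle are arbitrary illustration values.

```python
import numpy as np

# Phase offsets that steer an 8-element linear array toward angle theta.
# Spacing, frequency, and steering angle are arbitrary illustration values.
c = 3e8                    # speed of light, m/s
f = 10e9                   # 10 GHz carrier
lam = c / f                # wavelength
d = lam / 2                # half-wavelength element spacing
n_elements = 8
theta = np.radians(30)     # desired steering angle

# Each element n is phase-shifted so its wavefront lines up in the theta direction.
element_phases = 2 * np.pi * d * np.arange(n_elements) * np.sin(theta) / lam

# Array factor: the relative response of the array in each direction.
angles = np.radians(np.linspace(-90, 90, 361))
steering = np.exp(1j * element_phases)
geometry = np.exp(-1j * 2 * np.pi * d
                  * np.outer(np.arange(n_elements), np.sin(angles)) / lam)
array_factor = np.abs(steering @ geometry)

print("peak response at", np.degrees(angles[np.argmax(array_factor)]), "degrees")
```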

Landmine Clearance

In Belgium and parts of France, combat from World War 1 has left behind an “iron harvest” of buried landmines, grenades, artillery shells, and similar unexploded ordnance (UXO). In the small nation of Laos, the situation is even worse, amounting to an ongoing humanitarian crisis.

An Engineer Tests a Neutron-Emitting Landmine Detection System. (Source: Wikimedia Commons, Author: US Department of Energy, License: Public Domain)

Anti-personnel mines pose a particularly high safety threat. Factoring in countries such as Afghanistan, Angola, Cambodia, Chad, and Iraq, these hazardous devices number in the millions globally. Experts in the field expect it could take centuries to clear regions of Belgium and France with a high degree of certainty.

Simple metal detection isn’t always effective, so researchers are investigating better ways to find buried ordnance. Some of the technological solutions use Ground Penetrating Radar (GPR), X-ray backscatter, chemical vapor detectors, and hyperspectral sensors that can see more wavelengths of light than a human can.

The process of distinguishing real hazards from random materials in the earth is a tall order. But advances in image processing are making the technology more feasible.
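
As a very rough illustration of that image-processing step, the toy sketch below flags anomalies in a simulated scan by subtracting an estimated background and thresholding the residual. The data, threshold, and target shapes are all fabricated for the example and are far simpler than real GPR or hyperspectral returns.

```python
import numpy as np

# Simulated 2-D sensor scan: smooth soil clutter plus two buried "targets".
# Everything here is synthetic and exists only to show the thresholding idea.
rng = np.random.default_rng(0)
scan = rng.normal(0.0, 0.05, size=(64, 64))       # soil clutter
scan[20:24, 30:34] += 1.0                         # target 1
scan[45:48, 10:13] += 0.8                         # target 2

background = np.median(scan)                      # crude background estimate
residual = scan - background
mask = residual > 5 * residual.std()              # flag strong anomalies

ys, xs = np.nonzero(mask)
print(f"flagged {mask.sum()} pixels, e.g. around row {ys[0]}, col {xs[0]}")
```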

Image Processing: Machine Vision

Image processing was once too computationally demanding for mobile use, yet we have almost grown accustomed to basic tools such as facial recognition and filters in smartphone apps. The 2020s will only see this field grow further thanks to more efficient algorithms and falling hardware prices.

A Collection of Common Objects Marked with Bounding Boxes by the YOLO (You Only Look Once) Version 3 Object Detection Model. (Source: Wikimedia Commons, Author: MTheiler, License: CC BY-SA 4.0)

There are numerous sub-fields of electrical engineering that stand to benefit after 2020, but two fields stand out immediately:

  • Autonomous vehicles, which are not quite available on the consumer market yet, but likely will be by the mid-2020s.
  • Medical imaging, in which machine-learning models have recently matched or surpassed trained doctors at detecting breast cancer and several eye diseases.
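
To get a feel for how little code basic machine vision now takes, here is a short sketch using OpenCV’s bundled Haar cascade face detector. The image path is a placeholder, and this classical detector is far simpler than a deep model like YOLO.

```python
import cv2

# Classical face detection with OpenCV's bundled Haar cascade.
# "photo.jpg" is a placeholder path; any image containing faces will do.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

image = cv2.imread("photo.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:                        # draw a bounding box per face
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("photo_detected.jpg", image)
print(f"found {len(faces)} face(s)")
```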

Internet of Things (IoT) Modules

Increasingly, consumers are turning to internet-capable devices. This growing trend represents a convergence toward the internet of things (IoT), where machines use the internet more than humans do. In factory automation, internet-ready devices are already being used in 2020 to improve shop-floor operations and even safety.

A Wireless Internet Module: Espressif ESP-WROOM-32 with Wi-Fi and Bluetooth Capabilities and an Integrated Antenna. (Source: Wikimedia Commons, Author: Brian Krent, License: CC BY-SA 4.0)

For electrical engineers, integrating an IoT system from scratch is costly. Doing so would require implementing complex communications stacks, meeting regulatory requirements, providing timely cybersecurity updates and so on.

Luckily, the industry has taken note. According to TechRepublic, IoT security spending is poised to hit the $3 billion mark in 2021. For embedded developers, some of this help will come in the form of pre-engineered IoT modules. These modules contain their own certified radios, plus firmware for establishing secure connections to remote servers, and they can receive firmware updates over the air for the life of the IoT product. For radio engineers who can design such modules, this presents an excellent job opportunity.
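
On a module like the ESP-WROOM-32 pictured above, getting online can take only a handful of lines. Here is a rough MicroPython sketch; the network name, password, and URL are placeholders, and we assume the urequests library is installed on the module.

```python
# MicroPython sketch for an ESP32-class module: join Wi-Fi, then fetch a
# resource over HTTPS. SSID, password, and URL are placeholders.
import network
import time
import urequests  # assumed to be installed on the module (micropython-lib)

wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect("MY_SSID", "MY_PASSWORD")

while not wlan.isconnected():      # wait for the access point to accept us
    time.sleep(0.5)

print("network config:", wlan.ifconfig())

response = urequests.get("https://example.com/api/status")
print("server replied:", response.status_code)
response.close()
```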

Of course, remote accessibility also creates security threats. In recent years, consumers have become worried about data breaches as well as private information collected by IoT devices.

Cloud Data Privacy and Security

The working definition of “cloud computing” tends to be nebulous. For our purposes, we refer to the 2011 National Institute of Standards and Technology (NIST) definition:

“Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.”

Diagram of Cloud Computing Services and Applications. (Source: Wikimedia Commons, Author: Sam Johnston, License: CC BY-SA 3.0)

In short, cloud computing goes beyond simply being “someone else’s computer.” This made it one of the dominant information technologies of the previous decade. It also reintroduced the problem of trust between cloud service providers and end customers. In particular, there is growing concern over how personal data is secured and whether or not security breaches are being reported responsibly. This has touched off a few privacy-related data laws in recent years, such as the General Data Protection Regulation (GDPR) in the European Union and the California Consumer Privacy Act (CCPA) in the United States.

The growing number of cloud-connected IoT electronics means we can expect to see more collaboration between cybersecurity experts and embedded systems engineers. New standards like UL 2900 and NIST publication NISTIR 8259 are already providing some organized guidance on cloud-related “best practices.” We expect to see this trend accelerate for the foreseeable future.

5G Communications

5G refers to the fifth generation of cellular network technology. The “G” numbering convention is really more of a marketing term than a technical one. But it’s fair to say every major wireless carrier across the globe has unveiled plans to upgrade their networks to 5G as soon as possible.

5G Cellular Presentation (Source: Wikimedia Commons, Author: Olaf Kosinsky, CC BY-SA 3.0)

5G exists because much of our conventional radio spectrum is already allocated to one use or another. 5G services will operate at much higher frequencies than common existing Wi-Fi, Bluetooth, or cellular systems. We can expect 5G services to operate in the 6 GHz to 300 GHz portion of the radio spectrum. Until now, this part of the spectrum has largely been reserved for radar and communications between satellites. These signals are excellent for carrying data at record speeds, but they are also easily blocked by common building materials, making deployment a challenge.
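
One way to see the trade-off is the free-space path loss formula, FSPL (dB) = 20 log10(d) + 20 log10(f) + 20 log10(4π/c): for the same distance and antennas, higher carrier frequencies lose more signal. A quick sketch, with the distance and frequencies chosen only as examples:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB between isotropic antennas."""
    c = 3e8  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# Compare a familiar Wi-Fi/cellular frequency with a millimeter-wave 5G band
# over the same 100 m path (example numbers only).
for freq_hz in (2.4e9, 28e9):
    print(f"{freq_hz / 1e9:.1f} GHz over 100 m: {fspl_db(100, freq_hz):.1f} dB")
```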

As such, 5G signals will depend on a network of small cell towers that are very high in number and are located closer to consumer homes and businesses than in previous generations. The towers themselves are already controversial, as critics point to health-related concerns over radio energy exposure. This makes 5G one of the most hotly debated technologies of the year.

Deep Learning on the Edge

Our number-one pick for technologies to watch is Deep Learning on the Edge. This is a combination of deep learning and edge computing.

As a sub-field of artificial intelligence, deep learning is a technique loosely inspired by the living brain and its networks of neurons. Deep learning systems use layers of artificial neurons that have been trained to excel at certain tasks, such as navigating a vehicle or understanding human languages.

A Diagram of Interactions Between Cloud Servers, Edge Servers, and Devices. (Source: Wikimedia Commons, Author: NoMore201, License: CC BY-SA 4.0)

Deep learning requires a lot of computation, and a lot of electrical power, to run in real time. So cloud-connected assets such as smartphones often depend on fast and reliable internet connections to cloud servers in order to apply deep learning in real time. This is called computation offloading.

This constant offloading of live data introduces performance and privacy problems. This is especially true of video. Edge computing attempts to at least address the performance problem: latency. It does this by performing the computation close to the location that needs it done (i.e.: near the “edge” of the cloud network). Deep learning on the edge is dependent on many other technologies on this list, so it is likely to dominate toward the latter half of the 2020s.
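
As a back-of-the-envelope sketch of that trade-off, the snippet below compares the time to offload one camera frame to a distant cloud server, to a nearby edge server, and to process it locally on the device. Every latency, bandwidth, and compute figure is an assumption chosen purely for illustration.

```python
# Back-of-the-envelope latency comparison for one camera frame.
# Every number here is an assumed example, not a measurement.
frame_bytes = 200_000          # one compressed video frame

options = {
    "cloud":  {"rtt_s": 0.080, "uplink_bps": 10e6,  "compute_s": 0.010},
    "edge":   {"rtt_s": 0.005, "uplink_bps": 100e6, "compute_s": 0.015},
    "device": {"rtt_s": 0.0,   "uplink_bps": None,  "compute_s": 0.120},
}

for name, o in options.items():
    transfer = frame_bytes * 8 / o["uplink_bps"] if o["uplink_bps"] else 0.0
    total = o["rtt_s"] + transfer + o["compute_s"]
    print(f"{name:6s}: {total * 1000:.1f} ms per frame")
```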

References

[1] Institute of Electrical and Electronics Engineers, “Browse Popular – IEEE Xplore,” [Online]. Available: https://ieeexplore.ieee.org/xpl/browsePopular.jsp. [Accessed 4 Jan. 2020].
[2] United States Department of Energy, “What is the Smart Grid?,” Office of Electricity Delivery and Energy Reliability, [Online]. Available: https://www.smartgrid.gov/the_smart_grid/smart_grid.html. [Accessed 4 July 2018].
[3] A. A. Akhil et al., “Batteries for Electrical Energy Storage Applications,” in Linden’s Handbook of Batteries, 4th ed., T. B. Reddy, Ed., McGraw-Hill Education, 2010, pp. 1-18.
[4] National Instruments, “Introduction to FPGA Technology: Top 5 Benefits,” [Online]. Available: http://www.ni.com/white-paper/6984/en/. [Accessed 6 July 2018].
[5] H. Samuel, “Somme ‘Iron Harvest’ will take 500 years to clear, say bomb disposal experts on centenary of bloody battle,” The Telegraph, 30 June 2016.
[6] A. C. Jarocki, “US Navy funds underwater drone swarms,” DefenseNews.com, 26 June 2018.
[7] E. Y. Fu, H. V. Leong, G. Ngai and S. Chan, “Automatic Fight Detection Based on Motion Analysis,” in 2015 IEEE International Symposium on Multimedia (ISM), Miami, FL, 2015.
[8] United States Department of Commerce, “Final Version of NIST Cloud Computing Definition Published,” National Institute of Standards and Technology, 8 January 2018. [Online]. Available: https://www.nist.gov/news-events/news/2011/10/final-version-nist-cloud-computing-definition-published. [Accessed 7 July 2018].
[9] R. E. Bryant, R. H. Katz and E. D. Lazowska, “Big-Data Computing: Creating revolutionary,” Computing Community Consortium, 2008.
[10] Investopedia, “Big Data,” [Online]. Available: https://www.investopedia.com/terms/b/big-data.asp. [Accessed 8 July 2018].
[11] Seeking Alpha, “Investing In The Background Of The 5G Revolution,” 12 May 2018. [Online]. Available: https://seekingalpha.com/article/4155504-investing-background-5g-revolution. [Accessed 8 July 2018].
[12] A. Nordrum, K. Clark and IEEE Spectrum Staff, “5G Bytes: Millimeter Waves Explained,” IEEE Spectrum, 6 May 2017.
[13] A. M. Antonopoulos, Mastering Bitcoin: Programming the Open Blockchain, Beijing: O’Reilly, 2017, p. 8.
[14] D. Zanoni, “Blockchain Is The Next Big Thing,” Seeking Alpha, 27 May 2018. [Online]. Available: https://seekingalpha.com/article/4177564-blockchain-next-big-thing. [Accessed 8 July 2018].
[15] S. Samantha, “Blockchain engineers are in demand,” TechCrunch, 14 February 2018.
[16] A. Jamwal, “The Industrial Internet: Six Ways Manufacturers Can Fuse Big Data, Automation and IoT for Better Operations,” Industry Week, 10 November 2016.
[17] J. Greig, “IoT security spending to hit $1.5B in 2018 as targeted cyberattacks grow rampant,” TechRepublic, 21 March 2018.
[18] X. Yu, “Venture capital investment in AI doubled to US$12 billion in 2017,” South China Morning Post, 19 January 2018.