Neil Ballinger explains how these tiny particles will make a large impact in the future of engineering
At around one billionth of a metre across, nanoparticles can be hard to comprehend. To put their size into perspective: if a nanoparticle were the size of a football, then a kiwi fruit would be the size of the Earth by comparison.
Nanoengineering involves the careful manipulation of structures on the nanoscale and has been steadily growing as an area of interest since the invention of the electron microscope in the 1930s. Because they use beams of accelerated electrons as a source of illumination, electron microscopes have a higher resolving power than conventional light microscopes and can reveal the structure of much smaller objects.
These early encounters with structures on the nanoscale were purely observational rather than direct manipulations, but they marked the beginning of our journey into nanotechnology.
Without electron microscopes, it is likely that the Scanning Tunneling Microscope (STM) wouldn't have been invented in the 1980s. The STM was the first microscope that could both image and manipulate structures on the nanoscale, and it earned its inventors, Gerd Binnig and Heinrich Rohrer, the Nobel Prize in Physics in 1986.
Now, the uses of nanotechnology are growing and look set to make a big impact in industrial settings over the next decade. From drug delivery systems to smart sensors, the potential applications of this technology in the engineering industry are vast.
Nature is full of examples of big-data-like processes being performed efficiently by nanostructures in real time, such as the photoreceptors of the eye, which turn external light signals into information for the brain.
Now, engineers are experimenting with using nanomaterials and novel manufacturing techniques to develop smart sensors that are smaller, more complex and more energy efficient than their conventional counterparts.
For example, finely tuned sensors can be printed onto flexible rolls of plastic and placed across key points of critical infrastructure to continuously monitor performance and structural integrity.
These new sensors produce large volumes of data at unprecedented rates, so new data handling techniques must be developed to process them efficiently. This will create new pattern recognition capabilities and revolutionise how we use sensors.
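As a rough illustration of that data-handling challenge, the sketch below (a hypothetical example, not drawn from any specific sensor product) summarises a high-rate stream of readings with a rolling window, so that only compact statistics, rather than every raw sample, need to be stored or transmitted:

```python
from collections import deque

def rolling_summary(stream, window=5):
    """Summarise a high-rate sensor stream with a fixed-size rolling window,
    yielding one (mean, peak) pair per sample instead of keeping raw data."""
    buf = deque(maxlen=window)  # oldest reading drops out automatically
    for reading in stream:
        buf.append(reading)
        yield (sum(buf) / len(buf), max(buf))

# Hypothetical vibration readings from a printed strain sensor
readings = [0.10, 0.12, 0.11, 0.95, 0.13, 0.12]
summaries = list(rolling_summary(readings, window=3))
```

Downstream systems would then run pattern recognition on these compact summaries rather than on the full raw stream.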
For example, traffic sensors could use nanotechnology to boost data handling rates and facilitate automatic congestion management programs, making even the busiest roads safer.
Furthermore, nanotechnology is being used to develop ultra-dense memory systems capable of storing an unprecedented wealth of data. But it’s also providing the inspiration for ultra-efficient machine learning algorithms that can process, encrypt and communicate data without compromising its reliability.
In the factory, this nano-scale machine learning will usher in the next generation of predictive maintenance that will identify faults earlier, with greater accuracy. This will lead to a further decrease in unplanned machine downtime as site managers and engineers can order replacement parts from an industrial parts supplier before a breakdown occurs, saving time and money.
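A minimal sketch of the idea behind such predictive maintenance, using a simple statistical threshold as a stand-in for the fault signatures a trained machine learning model would detect (all names and readings here are hypothetical):

```python
import statistics

def flag_anomalies(readings, threshold=2.0):
    """Return the indices of readings whose z-score exceeds the threshold --
    a simple proxy for the early fault detection a learned model provides."""
    mean = statistics.mean(readings)
    stdev = statistics.stdev(readings)
    return [i for i, r in enumerate(readings)
            if stdev > 0 and abs(r - mean) / stdev > threshold]

# Hypothetical bearing temperatures (deg C); one reading drifts upwards
temps = [61.0, 60.5, 61.2, 60.8, 61.1, 60.9, 79.5, 61.0]
anomalies = flag_anomalies(temps)
```

In practice, a flagged reading like this would prompt the site manager to order a replacement part before the bearing actually fails.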
As technology continues to evolve at an ever-faster pace, obsolescence management has never been more important.
Neil Ballinger is head of EMEA at industrial parts supplier EU Automation.