Pete Beckman, PhD, Computing, Environment and Life Sciences, Argonne National Laboratory, and Northwestern University/Argonne Institute for Science and Engineering
The number of network-connected devices (sensors, actuators, instruments, computers, and data stores) now substantially exceeds the number of humans on this planet. Billions of things that sense, think, and act are connected to a planet-spanning network of cloud and high-performance computing centers that contain more computers than the entire Internet did just a few years ago. Parallel computation and machine learning provide the basis for this new computing continuum, which analyzes data at the edge and uses HPC to model, predict, and learn. This new paradigm is giving rise to intelligent cities, smart agriculture, and advanced manufacturing. The Waggle (www.wa8.gl) platform supports parallel computing, deep learning, and computer vision for advanced intelligent edge computing. The platform is being used by the University of Chicago Array of Things (AoT) project (https://arrayofthings.github.io), which is deploying hundreds of nodes in Chicago and across the world. Leveraging deep learning frameworks such as TensorFlow and computer vision packages such as OpenCV, nodes can understand their surroundings while also measuring air quality and meteorological conditions. Edge computing and deep learning are changing how data is collected and processed, and how autonomous actuation will operate scientific instruments, from radar systems looking up at clouds to cameras looking across ecosystems. This presentation will explore the computing continuum and how artificial intelligence at the edge is now firmly connected to supercomputing.
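To make the edge-side decision making concrete, here is a minimal illustrative sketch (not actual Waggle or AoT code; the function names and threshold are hypothetical) of the kind of triggering logic an edge node might run: it compares consecutive grayscale camera frames locally and only sends data upstream for heavier HPC analysis when enough has changed.

```python
import numpy as np

def motion_score(prev, curr):
    """Mean absolute pixel difference between two grayscale frames."""
    # Cast to a signed type so the subtraction cannot wrap around.
    return float(np.mean(np.abs(curr.astype(np.int16) - prev.astype(np.int16))))

def should_upload(prev, curr, threshold=10.0):
    """Edge-side decision (hypothetical): ship frames upstream only
    when the scene has changed more than `threshold` on average."""
    return motion_score(prev, curr) > threshold

# Two synthetic 4x4 8-bit grayscale frames standing in for camera captures.
prev = np.zeros((4, 4), dtype=np.uint8)
curr = np.full((4, 4), 50, dtype=np.uint8)
print(should_upload(prev, curr))  # large change, so the node would upload
```

In a real deployment the per-frame score would come from a model (e.g., a TensorFlow classifier or an OpenCV detector) rather than raw differencing, but the pattern is the same: compute locally, transmit selectively.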