Sensor Solutions International 2019 saw a record number of delegates and sponsors attend the event, and the fourth annual Sensor Solutions International conference will look to grow this even further, building on the success of previous events.
As in previous years, the 2021 conference covered all aspects of the industry, with presentations grouped into six key topics.
Autonomous transport and delivery industries present some of the greatest opportunities for advanced sensors, including those employed in LiDAR, radar, ultrasonic and digital imaging systems. The autonomous transport/delivery market can be divided into three primary areas: cars like those on today’s roadways, which employ ever more sophisticated ADAS systems and gain more autonomous functionality with each model on the path from Level 3 to Level 5 (full autonomy); robo-taxis and shuttles; and pilotless/automated delivery platforms. In its June 2019 report, Yole Développement estimated that the ADAS imaging segment (digital cameras and vision processors) of the overall market amounted to USD $4.4 billion in 2018 and projected it to reach $8.7 billion by 2024. The radar and LiDAR markets are expected to grow in similar fashion as key applications enter production, including gesture recognition, 3D perception, 3D LiDAR, night vision and mirror replacement. We will explore key sensors and sensor-critical systems that will enable these advances while considering the role of essential related technologies such as AI and edge data processing. We will also review key analyses of market potential, dive into test, assembly and packaging (TAP) needs, and look into systematic approaches designed to improve performance, reduce latency and enhance product- and system-level reliability.
Wearable and portable sensor-enabled diagnostic and treatment tools present outsized opportunities for patients and manufacturers, since existing devices are typically non-portable, designed only for clinical use, or require advanced training to operate. Highly accurate sensors are crucial components of next-generation healthcare devices designed to monitor movement, temperature, speed, muscle activity, blood pressure, and other physical indicators that can reveal critical health details. According to a June 2019 IHS Markit analysis, a key growth driver is personalization: the ability to go beyond data logging so that medically relevant information can be used for a variety of purposes. Advanced sensors combined with AI and data analytics further enable individualization trends that can foster greater patient self-sufficiency and be life-saving in remote regions lacking healthcare infrastructure. Advanced medical devices are forecast to add nearly USD $5 billion in global revenue over the next five years, reaching more than $15.5 billion by 2023, according to IHS Markit. We will explore sensors and SoCs for healthcare, AI’s role, and the need for secure data processing to monitor, diagnose, and treat a wide range of medical conditions. We will also delve into the steps needed to increase confidence in the medical efficacy of next-generation healthcare devices.
Intelligent infrastructure systems with machine learning capabilities, working in tandem with cloud-based analytics or in-device processing, are fundamentally changing the way we think about edge computing services. The need for smarter, simpler, more self-sufficient edge-of-network computing is being driven by broad industry and consumer trends, including the Internet of Things (IoT), autonomous transport and the rollout of 5G networks, all of which can benefit from or enable more powerful, low-latency analytic processing. For many businesses the edge has become the most mission-critical part of their digital ecosystem. As requirements for data analytics and real-time processing become more urgent, the importance of the edge will grow, creating opportunities for sensor systems that can ‘self-process’ data and deliver analytics as close to the user as possible. We will explore use cases where edge-of-network computing can transform sensor designs. We will also consider cases in which cloud computing is unsuitable for manufacturers unwilling or unable to store or process data outside their facilities due to security considerations. We will dive into the growth in edge networking devices and network protocols (such as OPC UA and MQTT) and the ways these are evolving to enable entirely new analytical capabilities. We will also explore how sensing devices, as well as analytical/AI software, can be rethought to help manufacturers use their data for improved profitability and market reach.
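As a minimal sketch of what ‘self-processing’ at the edge can mean, the hypothetical node below (all names and thresholds are illustrative, not drawn from any specific product or protocol) filters and summarizes raw sensor readings locally, forwarding only compact summaries and immediate alerts rather than streaming every sample to the cloud:

```python
import statistics

class EdgeAggregator:
    """Hypothetical edge node: processes raw readings locally and uplinks
    only compact summaries and urgent alerts instead of every sample."""

    def __init__(self, alert_threshold, window=10):
        self.alert_threshold = alert_threshold  # reading that triggers an alert
        self.window = window                    # readings per uplinked summary
        self.buffer = []

    def ingest(self, reading):
        """Accept one raw reading; return a message to uplink, or None."""
        if reading > self.alert_threshold:
            # Anomalies are forwarded immediately: low latency at the edge.
            return {"type": "alert", "value": reading}
        self.buffer.append(reading)
        if len(self.buffer) == self.window:
            summary = {
                "type": "summary",
                "mean": statistics.mean(self.buffer),
                "max": max(self.buffer),
            }
            self.buffer.clear()
            return summary
        return None

# Eleven raw readings reduce to two uplink messages:
# one immediate alert (95) and one ten-sample summary.
node = EdgeAggregator(alert_threshold=80.0)
readings = [20, 21, 19, 95, 22, 20, 18, 21, 19, 23, 20]
uplinked = [msg for msg in map(node.ingest, readings) if msg is not None]
```

The same shape scales down to constrained devices: the bandwidth saved grows with the window size, while alerts still propagate on the very sample that triggers them.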
Sensor fusion is the process of using software to intelligently combine data from multiple sensors to enhance situational awareness and thereby improve application or system performance. By combining data from multiple sensors, designers and programmers can correct for the deficiencies or limitations of individual sensors to, for example, calculate accurate position and orientation information, along with a host of other tasks essential to the safe operation and optimal functionality of the overall system. Whether the aim is to steer an autonomous vehicle, guide manufacturing robots or achieve any other desirable outcome, the ultimate goal of fusion strategies is to help sensors and sensor arrays better comprehend their environments. We will explore the complementary roles of sensor fusion and data fusion as means to integrate multiple data sources and produce more consistent, accurate, and useful information than is possible from a single data source. We will further explore the variety of key sensor fusion algorithms available to designers, including those most commonly used in today’s consumer and commercial applications. We will also explore challenges to the advancement of fusion technologies and how manufacturers are using AI and machine learning to expand capabilities while reducing latency and power consumption.
There is an increasing need for sensing technologies that can operate in harsh environments: infrastructure monitoring in transportation and civil engineering, smart buildings and precision agriculture; exposure to chemicals and other adverse conditions in industries and environmental sectors such as nuclear, oil and gas, mining and subsurface operations; and the myriad terrestrial environments in which extremes of heat, cold, humidity and water incursion are common. Any situation that poses short-term hazards or long-term risk of adverse effects for people or life-sustaining infrastructure can be deemed hazardous, and conditions that adversely affect humans frequently have similar effects on microelectronics. We will explore the challenges of manufacturing sensors that withstand harsh environmental conditions, along with key issues in powering devices and securely collecting data from sensors deployed in such settings. We will also explore the expanding need for sensors and sensor fusion approaches designed specifically for harsh environments, as well as the special challenges of designing for space, aviation and aerospace applications.
Whether designing a new sensor, a subsystem or an entire autonomous vehicle, simulation and modelling play increasingly important roles in technology maturation. Industry 4.0 principles aligned with IIoT capabilities can further streamline product development from conception to distribution. Simulation tools make it possible to prove concepts, create virtual prototypes and test changes to manufacturing processes long before any idea, platform or product is built. For autonomous transport, simulation is often the most practical and cost-effective means to enhance safety and performance, ensuring that key components function as designed and operate harmoniously within a larger system. Simulation has become so important that some major vehicle-autonomy developers offer open-source access to speed work and create transparency. Simulation is also at home on the factory floor, where installation, test, troubleshooting and maintenance processes can all be enhanced through predictive data analytics, and complex repair or installation procedures can be simplified through AR-enhanced resources. We will explore model accuracy and flexibility, and delve into how simulation and automation providers are integrating new sensors and SoCs into testbeds for enhanced performance. We will illustrate how sensor and data fusion, machine learning and AI can enhance simulator performance while extending the benefits of automation into non-digitized sectors.
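As one small illustration of a virtual prototype, the sketch below uses a Monte Carlo simulation to estimate a range sensor’s detection rate before any hardware exists; the sensor parameters (range, threshold, noise) are invented for the example, not taken from any real device:

```python
import random

def simulate_detection_rate(true_range_m, threshold_m, noise_sigma_m,
                            trials=10_000, seed=42):
    """Monte Carlo virtual prototype: estimate how often a range sensor
    with Gaussian measurement noise flags an obstacle inside the
    detection threshold, without building any hardware."""
    rng = random.Random(seed)  # seeded so the experiment is repeatable
    hits = sum(
        1 for _ in range(trials)
        if rng.gauss(true_range_m, noise_sigma_m) < threshold_m
    )
    return hits / trials

# An obstacle at 9.5 m, a 10 m detection threshold, 0.3 m noise sigma:
rate = simulate_detection_rate(9.5, 10.0, 0.3)
# Analytically the detection rate should be about 95 percent, and the
# simulated estimate lands close to that value.
```

The same pattern scales from one component to a whole system: sweep the noise and threshold parameters to find the cheapest sensor specification that still meets a target detection rate, long before committing to silicon.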