Editor’s Note: At Inductive Automation, we like to engage in conversations with other thought leaders in the industrial space. Recently, our marketing content team had the opportunity to interview Johnny Chen, the Solutions Architect for OnLogic, which is a leading maker of built-to-last industrial computers and a partner in the Ignition Onboard program. See what Johnny has to say about edge computing, digital transformation, and other timely topics.
OnLogic has a diverse product line, much of which is ruggedized. What are some common use cases for rugged computing devices?
Our ruggedized computer systems and gateways are designed for environments that would typically damage or destroy traditional computer hardware. Common use cases include manufacturing environments like steel plants, carbon fiber cutting, injection molding, or any plant floor that's subject to extreme temperatures, airborne particulates, variable power, or vibration forces. In a situation where downtime can cost thousands of dollars per hour, you want a system that is designed to work 24/7, and that's where OnLogic hardware comes in. We also have a lot of customers who use our systems in remote or mobile installations where devices may not be easily accessible or need to be set up and forgotten about for long periods of time.
Why did OnLogic choose to start including Ignition Edge in their products? What needs were you trying to solve?
We wanted to give our clients access to the easiest and most powerful SCADA-in-a-box solution we could. We picked Ignition Edge by Inductive Automation because of its flexibility and ease of use in many of the applications our clients work in. When combined with full Ignition or Ignition Edge, our systems can not only go into factories that require SCADA, but also be used in non-traditional industrial equipment in the field. OnLogic hardware with Ignition is being used in everything from in-field water filtration systems to industrial compressors that fit in the back of a truck. The versatility of Ignition matches perfectly with the way we've designed our hardware to fit and operate anywhere.
Can you elaborate on what considerations are important when developing hardware for use at the edge of the network?
Where the system will be used is a key consideration, and that will dictate the level of environmental protection the hardware needs. Beyond that, the amount of compute needed for the given application is the next decision point that will drive the development process. We've tried to create a line of hardware that's easily customizable to client needs and can be configured online. Today's edge applications come in all shapes and sizes, so it's really about providing as many options as possible so users get exactly what they need and nothing they don't in a hardware platform.
How important do you think edge computing is in the overall picture of IIoT and Digital Transformation?
I personally think it's the most important part of that transformation. Data needs are only getting larger, and the requirements in terms of speed of reaction are getting faster. You want to gather data and make decisions right at the edge. This reduces downtime and latency, and increases productivity and autonomy for large organizations, particularly at scale.
When a customer approaches new IIoT or Digital Transformation projects, is hardware usually the first consideration or is software? Are those types of projects primarily driven by hardware or by software?
Increasingly, the truth is neither. In the end, our customers care about the end solution. They want to know how they can most quickly and economically get from idea to implementation. The particular hardware and software are most commonly an afterthought, so it's our job to make implementation as easy as possible. That's why it's been so great to work with Inductive Automation, because your users have the same interests, and by combining our hardware knowledge with your software expertise, we're putting clients in a position to get to rollout and realize ROI much faster than if they had to put the pieces together themselves.
What are the primary communication networks that are important for edge-of-network installations? Does that affect the type of hardware and software that is used?
The truth is that we're often looking to implement whatever will work for a particular application. As much as standardization would make things easier, the fact is that every installation is different. In some factories, due to building design or existing infrastructure, they only have Wi-Fi to work with, and that's probably the most common scenario. But we have the ability to deploy everything from wired networks and Wi-Fi to LoRa, LTE-M1, and 4G. For mobile applications we've seen a lot of success with 4G deployments, and we expect 5G to come along at some point, but we're a ways off from seeing that be a go-to technology in the industrial space.
In any industrial environment, wireless communication can be tricky with so many networks working simultaneously, so we're often engaging our Application Engineers to help clients assess their infrastructure to get a clear picture of everything that needs to work together. That information then helps guide the hardware selection process. We'll need to determine if they need a single communication method, or if simultaneous transmission is necessary for efficient data handling or failover protection. That's where having experts in-house really pays dividends because we can help clients avoid some of the roadblocks we've seen other organizations run into.
MQTT has emerged as one of the leading communication protocols for IIoT. What are your thoughts about MQTT and other IIoT protocols?
MQTT has emerged as one of the default protocols and it's the one that most often comes up in conversations with clients, partially because it's the closest to a standard that we have right now. It's simple to use, lightweight and easy to implement on top of whatever else you're building. Ultimately, like the rest of the IIoT landscape, the protocol that's going to work best comes down to what you're trying to do.
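As a rough illustration of why MQTT is considered lightweight: topics are just slash-delimited strings, and subscriptions support only two wildcards, `+` (one level) and `#` (all remaining levels). A minimal Python sketch of that matching rule follows (the function and topic names are our own illustrations, not part of any MQTT library or OnLogic/Ignition product):

```python
def topic_matches(pattern: str, topic: str) -> bool:
    """Check whether an MQTT topic matches a subscription pattern.

    Supports the two MQTT wildcards:
      '+' matches exactly one topic level,
      '#' matches this level and everything below it (last level only).
    """
    pattern_levels = pattern.split("/")
    topic_levels = topic.split("/")

    for i, p in enumerate(pattern_levels):
        if p == "#":
            return True  # multi-level wildcard matches the rest
        if i >= len(topic_levels):
            return False  # pattern is deeper than the topic
        if p != "+" and p != topic_levels[i]:
            return False  # literal level does not match
    # No wildcard consumed the tail, so the depths must agree exactly.
    return len(pattern_levels) == len(topic_levels)


# A subscriber on one press's temperature vs. a whole-plant subscription:
print(topic_matches("plant1/line2/+/temperature",
                    "plant1/line2/press3/temperature"))  # True
print(topic_matches("plant1/#", "plant1/line2/press3"))  # True
print(topic_matches("plant1/+", "plant1/line2/press3"))  # False
```

This level-by-level structure is part of what keeps MQTT brokers and clients simple enough to run comfortably on constrained edge hardware.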
Where do AI and machine learning fit in with Digital Transformation initiatives? Also, how do hardware and software play a role?
Hardware and software will work together to make this a reality, especially at the edge. To operate in real time and with autonomy, we need Artificial Intelligence (AI) models running at the edge. The next step is creating a learning loop between edge and server to inform and refine the Machine Learning (ML) model, allowing it to be self-sustaining and always improving.
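The edge-to-server learning loop described here can be sketched in broad strokes. The toy Python example below (all class names, methods, and the threshold "model" are hypothetical illustrations, not any vendor's API) shows the shape of the idea: the edge device scores readings in real time with its current model, while the server periodically retrains on pooled data and pushes an updated model back down.

```python
import statistics

class EdgeNode:
    """Hypothetical edge device: scores readings locally with the current model."""
    def __init__(self, threshold: float):
        self.threshold = threshold  # the "model" pushed down from the server
        self.buffer = []            # readings collected since the last sync

    def ingest(self, reading: float) -> bool:
        """Score a reading immediately at the edge; buffer it for the server."""
        self.buffer.append(reading)
        return reading > self.threshold  # True = anomaly, acted on locally

    def sync(self, server: "Server") -> None:
        """Send buffered data upstream and receive an updated model back."""
        self.threshold = server.retrain(self.buffer)
        self.buffer = []

class Server:
    """Hypothetical central server: retrains on data pooled from edge nodes."""
    def __init__(self):
        self.history = []

    def retrain(self, batch: list) -> float:
        # Toy "learning": new threshold = mean + 3 standard deviations
        # over all data seen so far.
        self.history.extend(batch)
        mean = statistics.fmean(self.history)
        std = statistics.pstdev(self.history)
        return mean + 3 * std

# Real-time decisions happen at the edge; the model improves after each sync.
edge = EdgeNode(threshold=100.0)
edge.ingest(50.0)   # normal reading
edge.ingest(150.0)  # flagged immediately, no round trip to the server
edge.sync(Server()) # model updated from the pooled data
```

In a real deployment the "model" would be something richer than a threshold, and the sync would run over whatever network the site supports, but the division of labor is the same: fast local inference, slower centralized learning.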