Snowflake Exhibitor Demo: Unlocking Smart Manufacturing with IT/OT Convergence on the Snowflake AI Data Cloud

27 min video / 20 min read
 

Modern manufacturing generates vast amounts of data from diverse sources, creating challenges in data integration and utilization. Traditionally, data silos have hindered the scalability of analytics across manufacturing and supply chains. The Snowflake AI Data Cloud breaks down these barriers by seamlessly converging IT and OT data, accelerating smart manufacturing initiatives. Join us to explore how Snowflake empowers manufacturers to harness the full potential of their data, driving innovation and operational excellence in the era of AI and Industry 4.0.

Transcript:

00:05
Greg Sloyer: Well, thank you for coming. Sort of during lunch, before one of the keynotes, I'd like to thank Inductive Automation for having us present again. This is our second year presenting at the conference. My name is Greg Sloyer. I'm from Snowflake. I am the Manufacturing Industry Principal, so I look at the business side of things from Snowflake. All the usual: do not buy or sell stocks based on what I'm talking about. Don't plan your 401(k)s and retirement. I've been doing data and analytics for manufacturing, supply chain, operations, logistics, all that for about 17 years now, not all of which was Snowflake. Prior to that, I had 20 years in the chemical industry, DuPont, BASF, and I ran global supply chains and logistics and all sorts of things like that in the chemical industry. So, why is Snowflake at the Inductive Automation ICC conference? I will set this up by saying, Snowflake: how many people are familiar with Snowflake today? Okay, so about half. So, Snowflake started out as a data warehouse, data lake kind of thing in the cloud.

01:17
Greg Sloyer: It's been about 12 years now, and in 2014, the big thing here is we operate across AWS, Azure, and GCP, so across all three of the major clouds. Our big thing, especially in the 2018 timeframe, when you see this disrupt collaboration and this cool-looking thing in the middle, which is maybe a little hard to see, but there's a lot of starbursts and fireworks-looking things, that is data sharing in Snowflake. This is between customers and suppliers, between partners and OEMs, between logistics groups and manufacturers, between what we call our marketplace providers, so data providers in Snowflake, providing things like weather data, commodity pricing, freight rates, logistics, things like that. There are about 2,600 data sets or so in Snowflake that are available. The really cool thing is we do this all without moving data. We're not moving data in Snowflake; it is pointers. We've gotten rid of the ETLs and FTPs and emails, and heaven forbid you put stuff in CSV files and ship them over to a friend of yours. This is all essentially permissions.

02:39
Greg Sloyer: You give permission for somebody to see a table or set of data, or they give you permission to see a table or a set of data or a set of tables. Once that permission is granted, that data shows up in your database like it's one of your tables. So now, to incorporate that data in analytics and reporting, you extend your SQL with a join statement. That's what it comes down to. That was in 2018. We've been exploiting that, and more and more we have been building applications. So you're seeing major applications like Blue Yonder for supply chain and others replatforming to Snowflake. And this has really been the progression, and we continue to add on to this. A lot of AI, Gen AI, ML types of capabilities, and I'm gonna talk about a couple of them today, are being brought to the data. So what we didn't want you to do is spend a lot of time bringing all the data, and we'll talk about IT data and OT data today, bringing all that to Snowflake just to then pull it out and have to do something else with it somewhere else.
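The "extend your SQL with a join" idea can be sketched as follows. The database, schema, and table names here are hypothetical; the assumption is that a logistics partner has granted a share that appears in your account as a read-only database:

```sql
-- Hypothetical illustration: once a partner grants a share, it shows up
-- as a database (here PARTNER_SHARE_DB) you can join like your own tables.
SELECT
    o.order_id,
    o.ship_date,
    f.carrier,
    f.transit_days
FROM my_db.supply_chain.orders AS o
JOIN partner_share_db.logistics.freight_status AS f
    ON o.order_id = f.order_id
WHERE o.ship_date >= DATEADD(day, -30, CURRENT_DATE());
```

Because the share is a pointer rather than a copy, the query always reads the provider's current data; nothing is exported, FTP'd, or emailed.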

03:45
Greg Sloyer: The idea is to leave the data there; let's bring all those capabilities to the data so you can operate all within Snowflake. We launched what's called the Manufacturing Data Cloud at Hannover Messe about a year and a half ago, April of last year. And we looked at what was needed in the industry, manufacturing in general, and what a lot of these opportunities were that people are struggling with, things like that. Hopefully, a number of these resonate. So one was IT and OT convergence. Okay. This has been a big topic now for a number of years, and Snowflake had been great at bringing in typical ERP data, especially SAP data, into Snowflake. We've been doing that for a number of years. Lots of big customers are doing that today with not just one SAP or ERP system but 10s, 20s, 30s. And all of this is published when I provide a name; Carrier, for example, consolidates data from 140 ERPs into Snowflake. Where we weren't as strong was on the OT side, bringing in the shop floor data.

05:15
Greg Sloyer: This is where we really pivoted about 18 months, two years ago, working very closely with Inductive Automation, Cirrus Link, and a number of other partners to provide different architectural ways to bring the shop floor data into Snowflake and take advantage of the time series capabilities and a number of those other capabilities we'll talk about, AI, ML, Gen AI, things like that, brought to the data, which is sort of that third point: really deploying advanced analytics at the data. The middle one is taking advantage of that data sharing. So this is broadening the visibility outside of IT, outside of OT, so the enterprise, and really extending that to the partner network, broadening the view of the supply chain, incorporating that visibility into the decision and analytics process. Really taking advantage of a lot of these different Snowflake capabilities. The difficulty, and I'm sure many of you have experienced this, is that for years, decades, the shop floor manufacturing sites have generally been an island. Different organizations, different functional reporting roles from that systems standpoint.

06:18
Greg Sloyer: OT sometimes reported into the CIO, but generally not; it reported into the VP of Manufacturing. This created a lot of separation from a systems standpoint and made it not technically difficult but more organizationally difficult to integrate and bring that data in, integrate it with the rest of the data. There are some architectural discussions that happen, things like that. So different opportunities. And for those of you who have multiple plants, what I always say is if you have 50 plants, you probably have 48 different MES, MRO, LIMS, or QM systems, because a lot of those plants came up from acquisition. As I said, it's an island. Investments weren't made, or if it wasn't broken, we're not gonna fix it. That kind of thing. So a lot of that is changing, but also the architectural patterns that you have to utilize that data, especially to bring it to the cloud, need to account for the fact that all these plants are different. They may have different protocols, different configurations, all that kind of stuff. So the solutions have to be adaptable.

07:38
Greg Sloyer: And that's really where, in partnering with Inductive Automation and all that, we've helped simplify the environment of bringing that data into the cloud, into Snowflake. So let's first take a very broad view as we look at the supply chain. As I mentioned, we have everything from marketplace providers with commodity data, pricing, availability, geopolitical kinds of things that impact supply, the logistics areas, really bringing that sort of view to the plant, as well as, when you look outside from the customer standpoint, especially if you're building connected products, products out in the field that are generating their own data after they've left manufacturing. How do I incorporate and create that visibility up and through the supply chain, through the ecosystem, to be able to make decisions more holistically, to not just help manufacturing but to help that whole enterprise environment? And then this is the thing we were really putting in place 18 months or so, two years ago, which is that ability to get much more fine-grained, granular data at speed.

09:04
Greg Sloyer: I won't call it real-time; let's call it near real-time into the cloud. This is not replacing shop floor systems. If you have a safety system, the cloud is not the best spot to put it. That's gonna be edge-driven kinds of stuff. But as we look at how do I take advantage of that data, how do I broaden access to it, how do I look across those 50 plants, how do I run much more advanced mathematics on that data to do root cause, cycle time, predictive quality, all those kinds of things? That's really why we're pulling that data into the cloud and combining it with a number of other components of the data. For example, let's say you are great at doing quality control and things like that, even looking at that from the shop floor, the isolated plant level. What we wanna do is show that vision of extending into the supply chain to say, okay, those aren't the only variables going into your manufacturing facility. Your supplier quality, that delivery variability, all of those things come into play when you start looking at quality or predictive maintenance, those aspects. And then how do returns, how does warranty quality, for those of you more on the discrete side, how does that impact, and how can I utilize that information from customer service, from field maintenance, things of that nature, to see what the potential root causes were that started in manufacturing and started in supply?

10:37
Greg Sloyer: So again, being able to broaden that view for those organizations that have started moving beyond doing really cool and fancy analytics on their shop floor data, how do I paint that vision of the future? And this is where we really see the extending of that data and incorporating more of the IT types of data into those decision processes. So Snowflake has many, many more partners than this; these were the partners that were part of our Manufacturing Data Cloud launch. Marketplace partners: there are 2,600, 2,700 different data sets in the marketplace, and as I mentioned, there's financial data, there's ESG data. If you type ESG into the Snowflake Marketplace search, you're gonna come up with 40 or 50 different data sets that are available, freight rates for kites and dart, things of that nature. From that perspective, again, to help provide that greater visibility. As I mentioned, Snowflake has really been doubling down in terms of applications and the capabilities there, building those on Snowflake.

12:00
Greg Sloyer: So Blue Yonder and a number of others are replatforming. In a lot of cases, there are ones that were built specifically by companies on top of Snowflake, taking advantage of that power of the cloud and cross-cloud. So as they build something, it's not just in AWS or in Azure; it goes across all three. And then the system integrators, the SIs, are there. And again, many, many more are partners, but these were ones that stood up and said, I have built in Snowflake a supply chain or manufacturing or operations type of solution with a customer. And the customer is raising their hand and going, "Yep, they did a great job. We did it all in Snowflake." And that's how they got on this list. And this continues to expand as we work through. The main areas I mentioned, supply chain optimization, smart manufacturing, and connected products, are the three areas where we start utilizing the manufacturing data, the supply chain data, and that sensor data, whether it's coming from the shop floor or from connected devices out in the field, to be able to really provide that visibility and take advantage of the cloud infrastructure.

13:23
Greg Sloyer: And depending on which booths you go to behind you here, you'll see slightly different versions of this. This is my extended version. And sometimes today if I get surprised by a slide, it's because somebody's legal, their legal, our legal, or somebody's marketing department got hold of them. So I always enjoy this 'cause then the slides are as much a surprise to me as they are to you. So with Ignition, and we've had this in place with them a little over a year, I wanna say closer to 18 months, that we've been working with Ignition, or Inductive Automation, and Cirrus Link. But this is the easy button for getting data into Snowflake. This is zero code. Part of the secret sauce is the IoT Bridge for Snowflake that is available via Cirrus Link. And this drops the data in from Ignition. So not just the tag data, but the metadata all around that, that structured data. So all of that lands in Snowflake. And if you have a chance to see a demo, Arlen Nipper, I don't know if Arlen's in the audience today, Arlen and others have done this many, many times. I'm probably on many, many calls per month with him with different customers and prospects.
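As a rough sketch of what querying that landed payload can look like: the query below assumes Sparkplug B messages have landed with the payload in a VARIANT column (here called MSG) in a staging table (here called SPARKPLUG_RAW). The actual tables, views, and column names the IoT Bridge creates will differ; only the semi-structured query pattern is the point:

```sql
-- Hypothetical staging table: one row per Sparkplug message, with the
-- payload (including its metrics array) stored as semi-structured JSON.
SELECT
    r.received_at,
    m.value:name::STRING                            AS tag_name,
    TO_TIMESTAMP(m.value:timestamp::NUMBER / 1000)  AS tag_ts,
    m.value:value                                   AS tag_value
FROM sparkplug_raw AS r,
     LATERAL FLATTEN(input => r.msg:metrics) AS m;
```

The point is that tag values and their surrounding metadata arrive together, so the plant-side model, not Snowflake, defines the structure you end up querying.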

14:55
Greg Sloyer: And within that demo, Snowflake goes from knowing nothing about your shop floor to knowing everything about it that's coming through Ignition. So a very, very fast, very, very easy method for getting the data into Snowflake. And one of the reasons why we're a partner on this, versus some of the other ways you can land that data in the cloud, is that we really looked at this with them from the process engineer's perspective. So the plant is driving the configuration in Snowflake. Snowflake is not defining a structure where you've got to be so many levels deep and it has to have certain kinds of attributes and all that. It is driven from the edge. So the plant defines how you look in Snowflake; Snowflake does not define that. That's one of the keys. And then there are some other nuances for those of you who get into the much more excruciating detail about data types and things like that and what you can land. But we're landing this all with MQTT. The cool part of the demo for me, in terms of the processes within Snowflake, is that MQTT is great for transmitting data and for storing data.

16:22
Greg Sloyer: Really, really small footprints for both. It allows you to go very quick, allows you to get the data up, because of that event-driven change control that they have. Not so great for BI and for analytics. It has lots of nulls. Mathematics tends not to like nulls, and BI tends not to like a flatline with a spike, then another flatline with a spike. We use views in Snowflake, so you're not gonna repopulate and have to store all that data. But from the view perspective, it's hydrating those nulls with the previous good value. So now you have an analytic data set that your data scientists tend to like, without having to code anything. This is out of the box. It's driven the moment you've set up this connection. And your BI tools like it because it's not that flatline spike, flatline spike. So, great. You got the data in Snowflake. Now what can I do with it? This is where, over the past year or two, Snowflake has been bringing a lot of AI, ML, Gen AI capabilities into Snowflake. We continue to release new stuff. This happens to be one of our, I'll call it an easy button.
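That null hydration can be sketched as a window-function view like the one below. The table and column names are hypothetical, and the views the IoT Bridge actually generates may differ; this only illustrates the carry-forward technique:

```sql
-- A minimal sketch of "hydrating" report-by-exception MQTT data:
-- carry the last non-null reading forward per tag, so the sparse
-- stream becomes a continuous series for analytics and BI,
-- without storing a second copy of the data.
CREATE OR REPLACE VIEW sensor_hydrated AS
SELECT
    tag_name,
    ts,
    LAST_VALUE(reading) IGNORE NULLS OVER (
        PARTITION BY tag_name
        ORDER BY ts
        ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
    ) AS reading
FROM sensor_raw;
```

Because it's a view, the hydrated series is computed at query time; the raw event-driven data stays as the single stored copy.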

18:01
Greg Sloyer: 'Cause different organizations have different levels of capability around data science, around analytics use, things like that. And for the data scientists in the room, Python, Java, Scala, all that can be done in Snowflake. It's not just a SQL house. So you can be writing all the cool data science stuff. I don't think I have it on here, but we're in the booth right across the hall. For those of you into optimization, you can be running mathematical optimization in Snowflake through the Python libraries. It's really cool. Back in the '90s, I watched optimization fail, and I'll talk a little bit about why I think it failed. But the capabilities that are now being brought to this data are really, to me, driving a lot of really cool stuff happening in manufacturing. But these two lines of code here with anomaly detection are using an ML function. I think of this as the TREND function in Excel. For all of you who are familiar with Excel and use the TREND function, all you had to know to generate a forecast or see the trend of data was the trend parameters: what are those two, three, four parameters I had to put in the TREND function.

19:21
Greg Sloyer: You did not have to know that the mathematics behind it was least squares at the time. You didn't have to care. You could just write a TREND function. This anomaly detection, and there's about a dozen of these what we call Cortex ML functions available, is something similar. You have to know the parameters, just like I had to know with the TREND function. But now I can run an ML-based, I think it's gradient boosting, anomaly detection on the data as it lands. I don't have to be a data scientist to apply that mathematics to the data. So there are functions like forecasting, and there's a couple of others out there that are more manufacturing-based. Like I said, there's about a dozen overall. But the ones I tend to see for manufacturing and supply chain are anomaly detection, forecasting, and some contribution factors, things like that, that really get exciting, 'cause you're applying ML techniques without having to be a data scientist. So simplifying this approach.
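Those "two lines of code" look roughly like the sketch below, using Snowflake's Cortex ML anomaly detection function; the model, view, and column names here are hypothetical:

```sql
-- One statement trains a model on historical sensor data...
CREATE OR REPLACE SNOWFLAKE.ML.ANOMALY_DETECTION line1_model(
    INPUT_DATA        => SYSTEM$REFERENCE('VIEW', 'sensor_history'),
    TIMESTAMP_COLNAME => 'ts',
    TARGET_COLNAME    => 'temperature',
    LABEL_COLNAME     => ''   -- empty: unsupervised, no labeled anomalies
);

-- ...and one statement scores new data against it.
CALL line1_model!DETECT_ANOMALIES(
    INPUT_DATA        => SYSTEM$REFERENCE('VIEW', 'sensor_new'),
    TIMESTAMP_COLNAME => 'ts',
    TARGET_COLNAME    => 'temperature'
);
```

DETECT_ANOMALIES returns a row per new observation with a forecast, bounds, and an anomaly flag, so, much like Excel's TREND, you supply parameters rather than the underlying mathematics.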

20:22
Greg Sloyer: The screens here that I'm showing are actually built in what we call Streamlit. This is a Python-based graphics package that is in Snowflake, so the data does not have to go out. It will not compete with a Tableau or a Power BI; we can also connect to those. So if you wanna do really cool and fancy dashboards, super. We operate with those. But for folks, data scientists especially, who wanna just show quick, easy visualizations of the data, of the results of their very cool mathematics, this is available for them as well. So why do I think optimization failed? And why do I get nervous about being able to use advanced analytics broadly across your organization? I'll point to this and go, there are different reasons. But the biggest one, why I think optimization failed in the '90s, for example, and what I don't wanna see with things like Gen AI and AI and ML, which are really cool tools, is that we had organizations wanting to go from here to here without doing the groundwork in between. There was too much change management.

21:33
Greg Sloyer: There was not enough data governance and data quality in those processes. Optimization is great if you've got really good data quality, especially pricing, timing, things like that. The mathematical models rely on that. That is no different for Gen AI, AI, and ML. The square root of a bad number is still a bad number. It doesn't get better because I threw cooler mathematics at it. So this is where we, working with the partners, working with you folks, it's not that we're gonna say, "No, don't ever do this." What I'm saying, as a warning, is to keep those structures in place, where there's governance. This is really where IT and OT coming back together through this process helps create the environments where AI and ML are gonna be a lot more successful. Make sense so far? All right.

22:45
Greg Sloyer: So a data foundation is necessary. We build these out. We work with the customers and the partners to deploy these things. Like I said, we've been great at IT data, and we're really excited about all the partnerships we have to bring in the OT data and take advantage of our time series and geospatial capabilities, things of that nature, so you can do all sorts of cool math with that. And then extending those with the partner data, or sharing that data with your partners, customers, suppliers, logistics, for example. So what does that mean? From the Unified Namespace, this is what we are continuing to develop: bringing IT, OT, connected products, getting all that within Snowflake, improving that visibility, and allowing you to run AI, ML, and Gen AI models at the data, not, again, separating it out, so that you can take advantage of not just the ingestion of that kind of data, but what do I do with it after I've got it somewhere. So with that, any questions before I send you across the hall to the 1:00 o'clock keynote?

24:15
Audience Member 1: For a lot of us, the issue is not just the data; it's also the application. So you showed, basically, a lot of the applications there. What does the result look like if you wanna get the data out of Snowflake and give it to an individual on the shop floor to be able to use it? Does it have to live alongside each other? And should we not think about it like it's a replacement for a data broker? It's just something that lets you work with the data at a higher level?

24:37
Greg Sloyer: Generally, the question is, is there a path to go from Snowflake, let's say, back to Ignition as well? There are organizations that have gone down that route. I would say that the Ignition group, Inductive Automation, are the best ones to talk to. There are always the security and protocols and things like that that you have to work through on that. Technically, I do not believe it's an issue. But generally, it's been a one-way path up into Snowflake, because, like I said, if you have 50 sites, you may have 50 Ignition brokers or whatever, and they're coming up into Snowflake. So you're looking more holistically at that data. I've not seen SAP data go down to Ignition or anything like that. That's usually staying up within Snowflake. Sure.

25:25
Audience Member 1: Oh, somebody else. So at the beginning of the presentation, you talked about how it's kind of a big permission space, rather than storage space. But then later on.

25:32
Greg Sloyer: For the, for data sharing.

25:41
Audience Member 1: Okay, 'cause when we saw the architecture diagram, if you define it in the namespace for Cirrus Link, it moves up. Where is the storage part in that situation?

25:50
Greg Sloyer: So it's in Snowflake. The data is coming into Snowflake. It's stored there. You have chosen, as a customer organization, AWS or Azure or both, let's say, for different reasons. And Snowflake sits on top of that. So physically, they can talk to you about where it makes most sense. But generally, it's in Snowflake. One last question real quick. Yes.

26:11
Audience Member 2: No. Yeah, that was it.

26:14
Greg Sloyer: No. Oh, okay. All right. Super. So I've already been shown the hook kind of thing 'cause they want you to get across the hall for the 1:00 o'clock. But thank you. Appreciate your time. And we are across the hall for any more detailed questions.

Posted on December 5, 2024