Player FM - Internet Radio Done Right
Content provided by Sonic Futures and The Green Software Foundation. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Sonic Futures and The Green Software Foundation or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://he.player.fm/legal.
CXO Bytes
Tech leaders, your balancing act between innovation and sustainability just got a guide with the Green Software Foundation's latest podcast series, CXO Bytes, hosted by Sanjay Podder, Chairperson of the Green Software Foundation. In each episode, we are joined by industry leaders to explore strategies for greening software and effectively reducing software's environmental impact while sustaining a drive for innovation and enterprise growth.
7 episodes
All episodes
Green Manufacturing and Supply Chains and the Role of Green IT and Responsible AI with May Yap (24:37)
CXO Bytes host Sanjay Podder is joined by May Yap, Senior Vice President and CIO of Jabil, to talk about the intersection of green IT, responsible AI, and sustainable manufacturing. May shares how Jabil integrates renewable energy, circular economy principles, and AI-driven solutions into its global operations, contributing to its recognition as one of America's Most Responsible Companies. The discussion delves into Jabil's ambitious sustainability goals, including achieving carbon neutrality by 2045, and highlights initiatives such as energy-efficient manufacturing, water conservation, and e-waste management. May also emphasizes the importance of responsible AI and green IT practices like desktop-as-a-service, no-code platforms, and energy-efficient algorithms in driving sustainable innovation across Jabil's manufacturing and supply chain ecosystems.
Learn more about our people: Sanjay Podder: LinkedIn | May Yap: LinkedIn | Website
Find out more about the GSF: The Green Software Foundation Website | Sign up to the Green Software Foundation Newsletter
Resources:
America's Most Responsible Companies 2025 - Newsweek Rankings [07:11]
Data centres & networks - IEA [11:08]
Jabil Makes Meaningful Sustainability Progress, Releases Fiscal Year 2022 Report [15:59]
Electronic waste (e-waste) | WHO [19:27]
If you enjoyed this episode then please either: Follow, rate, and review on Apple Podcasts | Follow and rate on Spotify | Watch our videos on The Green Software Foundation YouTube Channel! | Connect with us on Twitter, GitHub and LinkedIn!
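The "energy-efficient algorithms" mentioned in the show notes can be illustrated with a toy sketch. Everything here is an assumption for illustration (the duplicate-check task, the 30 W package-power figure, the time-times-power energy proxy); it is not anything measured in the episode:

```python
# Toy illustration of why algorithm choice matters for energy use:
# approximate energy as (CPU time) x (assumed package power draw).
import time

def naive_pairs(xs):
    # O(n^2) duplicate check: compare every pair of elements.
    return any(xs[i] == xs[j]
               for i in range(len(xs))
               for j in range(i + 1, len(xs)))

def hashed_pairs(xs):
    # O(n) duplicate check: a set collapses duplicates.
    return len(set(xs)) != len(xs)

def energy_joules(fn, arg, watts=30.0):
    # Crude proxy: CPU seconds consumed times an assumed 30 W draw.
    t0 = time.process_time()
    fn(arg)
    return (time.process_time() - t0) * watts

data = list(range(4000))  # no duplicates, so both scan everything
print(energy_joules(naive_pairs, data) > energy_joules(hashed_pairs, data))
```

Both functions give the same answer; only the work done differs, which is the point: the cheapest carbon is the computation you never perform.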
HBR Türkiye Business Summit: Sustainable Technology with Sanjay Podder (19:50)
In this episode of CXO Bytes, Sanjay Podder is hosted by Beliz Kudat to talk about the dual role of technology in driving sustainability while also contributing to environmental challenges. They explore how businesses can integrate sustainable strategies into their technology operations to minimize carbon footprints, optimize data center energy consumption, and leverage tools like AI and cloud solutions responsibly. Sanjay highlights actionable techniques such as carbon-aware scheduling, efficient coding practices, and emerging tools to measure the energy impact of AI. The discussion also emphasizes the business value of sustainability, including improved ESG scores, employee attraction, and outperforming competitors in shareholder returns, making sustainable technology a critical strategic imperative for organizations.
Learn more about our people: Sanjay Podder: LinkedIn | Beliz Kudat: LinkedIn
Find out more about the GSF: The Green Software Foundation Website | Sign up to the Green Software Foundation Newsletter
Resources:
Key Findings Data Centres Metered Electricity Consumption 2023 - Central Statistics Office [10:55]
Carbon Aware SDK [12:54]
CarbonCloud [15:40]
Impact Framework [15:56]
If you enjoyed this episode then please either: Follow, rate, and review on Apple Podcasts | Follow and rate on Spotify | Watch our videos on The Green Software Foundation YouTube Channel! | Connect with us on Twitter, GitHub and LinkedIn!
TRANSCRIPT BELOW:
Sanjay Podder: Hello and welcome to CXO Bytes, a podcast brought to you by the Green Software Foundation and dedicated to supporting chiefs of information, technology, sustainability, and AI as they aim to shape a sustainable future through green software. We will uncover the strategies and big green moves that have helped drive results for business and for the planet. I am your host, Sanjay Podder.
Beliz Kudat: Okay, Sanjay, welcome to our business summit.
Sanjay Podder: Thank you so much for having me today. My pleasure.
Beliz Kudat: It's a pleasure having you. So, you know, with today's rapidly developing digital technologies and this digital transformation, a significant dilemma arises, especially for sustainability. On one hand, these technologies offer substantial, huge potential to address environmental issues. And on the other hand, there is also substantial resource consumption. So first, we'd like to start by asking your perspective on this: how can technologies both solve and exacerbate environmental problems?
Sanjay Podder: Great question. And there's a duality here between technology and sustainability. You know, when you look at sustainability, and if you look at the sustainable development goals that we have, the 17 sustainable development goals, one thing that strikes you is that they are exponential in nature. The impact is huge. You know, we are not talking about small things. We are talking about scale. And you cannot do anything at scale without technology. And in this case, if we talk about information and communication technology, if we talk about artificial intelligence, for example, these are precisely the kind of tools we need today to address the sustainability challenges that we are facing, whether it is climate change, whether it is, you know, issues of building a more inclusive society, or, for example, the biodiversity destruction that is happening. In each of these areas, you need technology, you need AI, you need blockchain, you know, you need digital, right? There is no second thought about it. In fact, we did a survey of companies, and we found out that 70 percent of the companies we surveyed who were able to reduce their carbon emissions in production, in their operations, were able to do it because they used artificial intelligence. So there is absolutely no question about the role of technology in sustainability.
But what we miss out is, you know, if we are not using this technology in the right way, in the right manner, technology itself has a carbon footprint. Technology can cause a big environmental impact. For example, technology can amplify issues of bias, or of privacy. So, we have to make sure that while we use this technology, we use it in a very sustainable and responsible way. And the data points are very interesting. For example, the same AI that is going to help us so much. You know, if you take a large language model like BLOOM, which is open source, so some of the data we have, we know: a 176-billion-parameter model. When they trained it, you know, I think the carbon emission out of it was somewhere around 24.7 metric tons of CO2 equivalent. And if you look at its whole life cycle, including the embodied carbon of the hardware on which it was trained, it goes up to 50 metric tons of CO2 equivalent, for example. And if you take larger models, you know, the more popular large language models, they may go as high as 500 metric tons of CO2 equivalent. So the same technology that is helping us on one hand is also causing carbon emissions. And the impact is not just restricted there, as we know. It is also on other resources like water. You know, you do some, you know, very harmless queries to your, you know, large language models, "which cities should I visit in Turkey on my next trip?", right? You know, you ask 20, 30 questions. Behind the scenes, that's half a liter of water that was used, for cooling the data centers, for generation of electricity. And we also know about the other dimension, about energy use. So, that's the whole thing. Now, the good part is, we don't necessarily have to have such a severe impact.
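Figures like the ones Sanjay cites (roughly 24.7 tCO2e to train BLOOM, around 50 t including embodied hardware carbon) come from the kind of back-of-envelope model sketched below. Every input here (GPU count, power draw, hours, PUE, grid intensity, embodied share) is an illustrative assumption, not data for any real training run:

```python
# Back-of-envelope estimate of ML training emissions:
# operational emissions = energy drawn * grid carbon intensity,
# with data-centre overhead via PUE; embodied hardware carbon added on top.

def training_emissions_tco2e(gpu_count, gpu_power_kw, hours,
                             pue, grid_kgco2e_per_kwh,
                             embodied_tco2e=0.0):
    """Return total tonnes of CO2e for a training run (toy model)."""
    energy_kwh = gpu_count * gpu_power_kw * hours * pue
    operational_t = energy_kwh * grid_kgco2e_per_kwh / 1000.0  # kg -> tonnes
    return operational_t + embodied_tco2e

# Hypothetical run: 384 GPUs at 0.4 kW each for 1,000 hours,
# PUE of 1.2, on a low-carbon grid at 0.057 kgCO2e/kWh.
print(round(training_emissions_tco2e(384, 0.4, 1000, 1.2, 0.057), 1))  # prints 10.5
```

The same structure shows the levers Sanjay discusses later: each factor (efficient hardware, lower PUE, cleaner grid, fewer GPU-hours) multiplies through the total.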
There are tools and techniques and methods whereby we can design, develop, and deploy these systems in a way that they have a much lower impact on the environment. For example, they safeguard privacy, they give you much safer responses, so, you know, there's less bias. So overall, it is very much possible to bring in a culture such that the software you write is more sustainable and more responsible. So that's the silver lining, right? So to your first question, a big duality. If you are in business, therefore, your technology and sustainability strategies need to be integrated. And you have to look at it very holistically, not just at sustainability by technology, "how do I use tech to do sustainability," but sustainability in technology, "how do I make sure that the technology is being used in a much more sustainable and responsible way?"
Beliz Kudat: Yeah. This is the crucial question, as you said, and technology is crucial, as you mentioned, in all those sustainability efforts as well. And we also know that software is at the core of all these technologies, and companies need to adapt the way software is designed, developed, deployed, as you said, and used to minimize its carbon footprint. So how can they achieve this?
Sanjay Podder: Well, you know, there are many decarbonization levers in the software stack. When you talk about a software stack, there's obviously the code itself, which has to be written in a manner that makes less demand on the underlying hardware, for example, right? So you need to bring in the right design patterns, architectures, choice of programming languages; all of these have a bearing on the energy use and emissions. For example, you know, there is a whole study about interpreted languages and compiled languages.
You know, a language like C++: if you write some code, and you write similar code for doing the same thing in Python, it is found that the C++ code will need less energy and will emit less carbon. Now, that's not to say that people have to write in C++, but it's just a data point: are you even thinking about, you know, which language you are selecting? And then there are levers around architectures, for example. And then a very interesting decarbonization lever is the migration of your workloads to hyperscalers, for example, to the cloud. And why does that reduce emissions? Because the hyperscalers, because of their scale and investments, invest a lot in renewable energy. They have the right technology; they use AI, for example, to make sure that their data centers run with a relatively lower power usage effectiveness, what we call the PUE. They have elasticity because of economies of scale. Their utilizations are higher, so the idle time of hardware is less. And now, if you see, there is, you know, a lot of investment in what they call custom silicon chips. And that's the next big thing, where you write software with the underlying hardware in mind, optimizing for the capabilities of the underlying chips. And when you do all this, you know, the code you write, the system you build, needs less energy. And also, with these cloud centers, typically, you know, you can select where you want to put your workload. You can select a location, if your business strategy permits, where the carbon intensity of electricity is lower; in other words, where the electricity is generated more from renewable energy, for example. As a result of this, not only are you using less energy, you're also, you know, emitting less carbon. And there are similar decarbonization levers even in the field of AI. You know, you don't need to take the biggest of the large language models. You know, you don't need to use models with billions and trillions of parameters.
You have to use the model which is fit for purpose. You have to use the model which gives you the required accuracy. And there are a lot of startups coming up in this field that allow you to do, for example, dynamic routing to a large language model which has lower emissions, right? And in the field of AI there are a number of different techniques: you can do pruning, quantization. You can write your prompts in a way, you know, so that the overall emissions are lower; these are called green prompting techniques, for example. Probably that's a whole session I could take, but...
Beliz Kudat: But I really would like to come to AI and what companies can do about it, especially regarding energy consumption. But I want to dig in a little bit more into data centers, because we've been talking about data centers, and everybody knows how they impact global energy consumption. So what innovative technological solutions can be applied here in data centers? And can we specifically discuss software-based solutions here?
Sanjay Podder: Yeah, you know, a number of different things can be done when it comes to data centers, and you're right, you know, the data centers are mushrooming thanks to generative AI's widespread adoption. In fact, some data points I was looking at: for Ireland, for example, the data center share of power usage quadrupled from 2015 to 2023, from 5% to 21%. There are cities like London which are not allowing new housing because there is a challenge of power; the power is being consumed by the data centers. Now, what are the kinds of solutions one can think about? First of all, not all data centers are the same, right? And I also touched upon this in my earlier response: you know, we did a very detailed study for one of the hyperscalers to understand, you know, if you move a workload from one data center to a hyperscaler, how much emission reduction is possible?
You know, anywhere from 50 to 90%, for example. Again, there are several different factors depending on which hyperscaler, which location, and so on and so forth, but typically you will see that the PUE of hyperscalers, because they run at scale and for all the reasons I've mentioned, is far better, right? That is one. Now, from a software-based approach perspective, you know, when you design workloads for a particular system to be run in the data center, you can make them much more carbon-aware. Now, what do I mean by carbon-aware? You know, your backup jobs, for example, will run when there is renewable energy, so they are scheduled at that time of the day, or they will be run in a location where there's a lower carbon intensity of electricity, right? So, yeah, you know, I'm also the co-founder and chairman of the Green Software Foundation. One of the things that we built and defined is the Carbon Aware SDK. So you can, for example, use the Carbon Aware SDK to figure out how to make your systems, you know, run at a time when the carbon intensity of electricity is lower. That is one thing. You can build systems which are more cloud native; serverless architectures, for example. That is the other thing you can do. And there are software-based solutions that, you know, more advanced data centers use; they use AI, for example, to predict how they can lower the energy that is used for non-IT purposes, for example, cooling, right? So they are able to optimize and distribute that energy. So there's a lot of use of AI there.
Beliz Kudat: So when you just, I'm sorry I interrupted you, but when you just mentioned AI, I also want to ask my other question too. Maybe you would like to combine your answers, because we really would like to learn about the tools and methods that can be used to measure the energy consumption of AI and machine learning models, too.
So maybe you can...
Sanjay Podder: And, you know, this is, again, an evolving area, but I can tell you the state of the art, because a lot of new things are happening as we speak. But when it comes to AI, you know, you have to look at AI very holistically, across its life cycle, right? In traditional AI, people were more worried about training, whereas in generative AI, people are now more worried about inferencing, because that's where more emission is happening. Now, in each of these cases, how do you really measure the emissions happening, or the energy use, right? So when you are deploying AI, if you are deploying on your own infrastructure, the first thing you can do is use the carbon accounting tools that each of the hyperscalers gives you, right? And then you can use those to figure out, you know, how much emission is happening and how you can lower that. Because you can only reduce what you can measure. Because at the end of the day, AI is also a workload, right? So that's on the cloud side. And then, you know, there are techniques which are more on the software side that have come up, like the very recent ISO standard by the Green Software Foundation called the Software Carbon Intensity Specification; that also can be used. Then there are a host of open source tools, you know, code that can be used with Python libraries. There is, you know, Cloud Carbon Footprint, CCF, and again, the Green Software Foundation has created an impact framework. And then I'm also coming across a lot of API calls, open source, which help you to tell how much carbon was emitted for every prompt that you just made, right? And there are a lot of startup systems coming up in this space. So this is a very evolving field. But it's coming up with a lot of open source solutions, a lot of solutions from big tech players, from the startup community. That's a big opportunity for the startup community. So that's what you have.
A whole host of tools. The GSF, which I chair, is also currently focused a lot on the SCI for AI. That's the version that we are working on.
Beliz Kudat: Okay, so, of course, there's this issue of ESG goals, especially when we're talking about sustainability. So, how can promoting the sustainable use of technology contribute to companies achieving their ESG goals, and attracting talented employees at the same time?
Sanjay Podder: No, I think this is the best question, right? Why should we even do it? You know, I know there's the bigger climate change and sustainability thing, so it does appeal to, you know, all the talented, you know, youngsters who are entering the field, right? You know, but the fact is, they want to work for organizations who are serious about sustainability issues. So, that's about the talent. I'm aware of businesses which are weaving sustainability messages into their corporate communication because they want to reach their employees and stakeholders about what they are doing about it, and employees want to work for such organizations. There's a lot of research in that area. The other important research, which in fact we did at Accenture, was on the correlation between sustainable technology, ESG scores, and business performance. And what we observed in our study was that organizations which have a strategy on sustainable technology show a correlation with better ESG scores compared to their peers in the market. And the other interesting fact was that businesses which have better ESG scores outperform their competitors by 2.6 times in the total, you know, shareholder value they return, right? So, even if you are not as concerned for the planet as you should be, you still have a real, tangible reason, because your business benefits when you have a better ESG score, and your ESG score benefits when you embrace sustainable technology.
So there's a clear correlation, and obviously your employees are asking for it. So I think that is the other big driver for decision makers. These things cannot happen unless it comes from the top. It's a culture change, right? All the things that we discussed. And it is a very strategic imperative. And that's what sustainable technology is all about.
Beliz Kudat: Sanjay, thank you very much. It was a pleasure having you at our summit. And thank you for all these valuable insights.
Sanjay Podder: Thank you. The pleasure is all mine.
Sanjay Podder: Hey, everyone. Thanks for listening. Just a reminder to follow CXO Bytes on Spotify, Apple, YouTube, or wherever you get your podcasts. And please do leave a rating and review if you like what we're doing. It helps other people discover the show. And of course, we want more listeners. To find out more about the Green Software Foundation, please visit greensoftware.foundation. Thanks again, and see you in the next episode.
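The Software Carbon Intensity specification mentioned in this episode defines a rate, carbon per functional unit: SCI = ((E × I) + M) per R, where E is operational energy, I is the carbon intensity of the electricity, M is the allocated embodied emissions, and R is the functional unit. A minimal sketch, with purely illustrative numbers (not from any real system):

```python
# Software Carbon Intensity (SCI) per the GSF specification:
#   SCI = ((E * I) + M) per R
# E = energy consumed by the software over the window (kWh)
# I = carbon intensity of the electricity used (gCO2e/kWh)
# M = embodied hardware emissions allocated to the software (gCO2e)
# R = functional unit (API calls, users, transactions, ...)

def sci(energy_kwh, intensity_g_per_kwh, embodied_g, functional_units):
    """Return gCO2e per functional unit: operational plus embodied, over R."""
    return (energy_kwh * intensity_g_per_kwh + embodied_g) / functional_units

# Hypothetical service: 120 kWh consumed, grid at 450 gCO2e/kWh,
# 10,000 g of embodied carbon allocated, serving 1,000,000 API calls.
print(round(sci(120, 450, 10_000, 1_000_000), 3))  # prints 0.064
```

Because SCI is a rate rather than a total, it rewards exactly the levers discussed above: using less energy (E), running on cleaner power (I), extending hardware life (M), or serving more work per unit of carbon (R).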
Sustainable IT and Supply Chains with Niklas Sundberg (27:05)
Host Sanjay Podder is joined by a guest who embodies what it means to lead with purpose in the digital age. Niklas Sundberg is the Senior Vice President and Chief Digital Officer at Kuehne+Nagel, one of the world's leading logistics companies, with a mission to drive sustainable change across the supply chain industry. Niklas is a trailblazer in sustainable IT, author of Sustainable IT Playbook for Technology Leaders, and a respected voice on the role technology plays in building a sustainable future. His work goes beyond the logistics sector to shape the conversation on how technology leaders can achieve climate goals and navigate the challenges of data and energy efficiency. They explore how Kuehne+Nagel's Vision 2030 aligns with sustainability initiatives and the Green Software Foundation's Climate Commitments. From data storage practices to carbon-aware computing, they uncover what it takes to create a truly sustainable digital ecosystem.
Learn more about our people: Sanjay Podder: LinkedIn | Niklas Sundberg: LinkedIn | Book | Website
Find out more about the GSF: The Green Software Foundation Website | Sign up to the Green Software Foundation Newsletter
Resources:
Tackling AI's Climate Change Problem | MIT Sloan Management Review [16:00]
E-waste challenges of generative artificial intelligence | Nature [23:40]
If you enjoyed this episode then please either: Follow, rate, and review on Apple Podcasts | Follow and rate on Spotify | Watch our videos on The Green Software Foundation YouTube Channel! | Connect with us on Twitter, GitHub and LinkedIn!
TRANSCRIPT BELOW:
Sanjay Podder: Hello and welcome to CXO Bytes, a podcast brought to you by the Green Software Foundation and dedicated to supporting chiefs of information, technology, sustainability, and AI as they aim to shape a sustainable future through green software. We will uncover the strategies and big green moves that have helped drive results for business and for the planet. I am your host, Sanjay Podder.
Hello, and welcome to another episode of CXO Bytes, where we bring you unique insights into the world of sustainable software development from the view of the C-suite. I am your host, Sanjay Podder. Today, we are joined by a guest who embodies what it means to lead with purpose in the digital age. Niklas Sundberg is the Senior Vice President and Chief Digital Officer at Kuehne+Nagel, one of the world's leading logistics companies, with a mission to drive sustainable change across the supply chain industry. Niklas is a trailblazer in sustainable IT, author of Sustainable IT Playbook for Technology Leaders, and a respected voice on the role of technology in building a sustainable future. His work goes beyond the logistics sector to shape the conversation on how technology leaders can achieve climate goals and navigate the challenges of data and energy efficiency. Today we will dive into how Kuehne+Nagel's Vision 2030 aligns with sustainability initiatives and the Green Software Foundation's climate commitments. From data storage practices to carbon-aware computing, we'll uncover what it takes to create a truly sustainable digital ecosystem. Niklas, thank you for joining us on CXO Bytes. Welcome to the show. Please can you introduce yourself?
Niklas Sundberg: Thank you, Sanjay. Very happy to be here. Yes, I'm Niklas Sundberg. I'm the Senior Vice President and Chief Digital Officer at Kuehne+Nagel, and I'm very much looking forward to our conversation here today. I'm also a member of the board of SustainableIT.org, which is a sister organization to the Green Software Foundation, and we do some work together as well to advance the field of sustainability within technology. So really looking forward to our conversation, to hear and also share your journey into this.
Sanjay Podder: Thanks, Niklas. So my first question, Niklas: you have just come off the back of Kuehne+Nagel's first ever Tech Summit, Beyond Boundaries, where a lot of focus was on the role of AI and innovation in logistics.
Can you share some of the AI-driven solutions Kuehne+Nagel is implementing and how they are transforming the logistics landscape?
Niklas Sundberg: Sure, happy to. It was a great internal event where we discussed not only AI but also how we unlock data and traceability, asset tracking, real-time ETAs, and so forth. So if I could just share some examples we are working on that I can talk publicly about: one would be how we work with customer service, for example, to be able to use an agent to respond to customer inquiries, both internally and externally. Here we have scaled that out to a number of agents, but now we're looking to actually take the next step and scale it out to about 10-11,000 people of the population. So really a mass adoption at scale, which I think is quite tremendous. And this type of use case can also be scaled across other types of functions like HR, finance, and other types of business units. Another one, which is maybe not that obvious, but extremely powerful in our business, where data is extremely important: data quality is paramount when you speak to our customers, because everybody wants to automate the whole supply chain flow as much as possible. What we talk about is e-touch, where we want to make the processes as streamlined as possible, running without human intervention, and so forth. And here we actually see that we can use gen AI to clean up our data and also keep it clean. And we see that we actually get better results than a human would get: maybe we get about 70 percent quality with a human correcting and cleaning data, but with a gen AI agent we get up to north of 95 percent data quality, and it also helps us stay clean. And obviously this is very cost efficient as well; we see a cost improvement of 95 percent on this use case. Another one I think is quite exciting is estimated time of arrival.
So, together with our customers: they share data with us so that we can give them better ETAs for when things will be arriving at a port, arriving at a warehouse, and so forth. This means that customers have better staff planning, for example; they don't have to pay for excessive overtime, and they get a better flow of goods into their warehouses. So this is something that really benefits our customers, so to say. A fourth one, which is not that related to AI, would be how we do asset tracking across the world, because I think this is extremely important. Where are my goods? Have they arrived at the airport? Have they passed customs? So not only looking at it from a wide perspective of whether something has arrived at the airport, but also looking at the opportunity of geofencing, for example, so a really very precise identification of where the goods are, so to say. So, really excited about what we're doing with our digital ecosystem, and a lot of our customers are also quite excited about this. And just to put this a little bit into perspective: when I looked at the numbers last year, we were roughly trading one and a half billion messages per year with our customers and partners. So that's an extreme number, and we're actually continuously growing by 30%. So the power of the digital ecosystem is extremely powerful, where we can really integrate seamlessly with our customers and make their operations run more smoothly.
Sanjay Podder: Excellent. And I think the last point you made, about the numbers, is that you're scaling up your whole digital ecosystem, right? And it makes me wonder about the sustainability implications, because I know Kuehne+Nagel's Vision 2030 is to build a, you know, trusted and sustainable supply chain.
The question that pops up in my mind, Niklas, is: when you use all these wonderful technologies, generative AI, you spoke about customer support, accuracy of information; some of the challenges of technologies like gen AI are, for example, hallucination. How do you make it bias-free and explainable when you have to say exactly where the goods are in the supply chain? So there is a lot of risk that comes with gen AI, what we also talk about as responsible AI risk. I would like to hear a little bit more from you on how you are making this wonderful new transformation responsible, so that there is less bias and more accuracy. You spoke about the data; is the data free of bias? So yeah, can you educate the audience and me a little bit more on how you're thinking about this dimension? Niklas Sundberg: Yeah, for sure. So I think it's important, regardless of the technology that you work with, that you're not going out to find a problem for the technology, so to say. It's important to identify what the problem is that you're trying to solve. So within our responsible AI policy, we have adopted nine principles that we are really targeting and communicating widely across the company. It's obviously about data privacy, it's making sure that we put a human in the loop, it's that we build AI in a sustainable way and are conscious about energy consumption, water, and so forth. The key thing in adopting this technology is really to think about the problem. What problem are we trying to solve? And then, secondly, the people. We always need to make sure that we have a person in the loop regarding the technology when we buy, because I think we are also in a nascent state with gen AI. We talked about hallucination. We also need to make sure that we continue to build trust in this, and I think that will take some time.
So we really strongly enforce the point that we also need to have a people aspect in this. And then the third thing is that we need to be principle-driven, coming back to the nine principles that we have derived as part of our responsible AI policy. To really sum up, it's three P's, which are quite easy to remember: problem, people, and principles. Sanjay Podder: Wonderful. Easy to remember the three P's, and I will probably put an S alongside the P's, which brings me to the next question, which you are so passionate about. I love the Amazon bestseller that you have written, The Sustainable IT Playbook. The environmental impact, because very often when we talk about responsible AI, we forget the environmental impact, and that's something both SustainableIT.org and the Green Software Foundation champion a lot, right? What is the demand on energy, the demand on water resources, the emissions, which are obviously going to snowball with the widespread adoption of AI as we see it. But we all believe that there are steps one can take to bring these issues under control. So, any thoughts, going back to your book, as well as to the big problems you are trying to solve, the first of the three P's? How are you bringing in the sustainability dimension and putting it in the center, the environmental impact? Also, given you're in Europe, the EU AI Act and a lot of new regulations are coming up; I would really like to see how this is translating into practice, from the perspective of a practitioner right at the top. Niklas Sundberg: Yeah, I think when I wrote the book a couple of years back, that was sort of in the nascent state of gen AI.
At that point, I think the narrative around sustainable IT was a bit easier, because if you run your code more efficiently, use your hardware more efficiently and for longer, and have data centers powered by renewable energy and so forth, then that's also a positive case on IT cost. So if you are efficient at reducing IT cost, then you can also be quite efficient at improving the sustainability parameters, if you are aware of the different levers, so to say. I think the challenge now, if you fast forward a couple of years, is that the gen AI race is really powered by three or four powerhouses in the space. And that forces everyone to put more pressure on these larger organizations, the Microsofts, Googles, and Amazons of this world. Unfortunately, what we see is that, if you look at Google, they have increased their emissions by 48 percent. Microsoft have increased their emissions by 40 percent. So the promise that they made back in 2020, when Microsoft, for example, said that by 2030 they're going to be carbon negative, and that by 2050 they're going to give back all of the CO2 that they have emitted since the founding of the company in 1975; I think they are really struggling to meet this commitment now, which is becoming a bit of a problem. And also, if you take another example with water, we see that Microsoft, in the last two years, with the build-out of OpenAI infrastructure, have increased water usage by 14 million cubic meters. And just to put that into perspective, into something tangible: 14 million cubic meters is the same amount that Reykjavik, the capital of Iceland, with a population of 300,000 people, uses in one year. So obviously water is becoming a big issue as well. And we already see that in the US, in Iowa, for example, where Microsoft has put a lot of their data centers.
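The Reykjavik comparison above can be sanity-checked with simple arithmetic: dividing the cited 14 million cubic meters by the cited population gives a per-capita figure in the range of typical municipal water use.

```python
# Back-of-the-envelope check of the Reykjavik comparison quoted in the episode.
extra_water_m3 = 14_000_000   # added data-center water usage cited above
population = 300_000          # Reykjavik-area population cited above

per_capita_m3_year = extra_water_m3 / population
per_capita_litres_day = per_capita_m3_year * 1000 / 365

print(round(per_capita_m3_year, 1))   # m³ per person per year
print(round(per_capita_litres_day))   # litres per person per day
```

This works out to roughly 47 m³ per person per year, or about 128 litres per person per day, which is indeed a plausible city-scale consumption figure, so the comparison holds up arithmetically.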
So it's a competition between building out agriculture and farming versus building data centers. So I do think we have some major challenges ahead of us. But then, on the positive side, we've also seen some black swan events, like the energy crisis in Europe. That wasn't really planned, so to say, but it came because of the war in Ukraine, and everybody had to rethink their energy security. So within a couple of years you really saw the build-out of a lot of renewable energy sources. A lot of companies were rethinking their energy security and moving more towards renewable energy sources. And I think that needs to continue to happen, because otherwise we're really going to have a massive problem on our hands. When I wrote the book two years ago, nobody in the US was really talking about energy consumption, but now there are some numbers stipulating that if we're not careful, by 2030, 25 percent of the energy in the US is going to go to data centers. So I think we are starting to build up a massive problem, and we really need to find cross-industry solutions to start building these things out. And to find a good way to build this in harmony, because I think the genie is out of the bottle when it comes to AI; you're not going to be able to stop it. But I think we need to find more sustainable ways to build the infrastructure around it. I do think we also, unfortunately, need some legislation, and I think we need to bring some more awareness. Last year I wrote an article in MIT Sloan Management Review, where we also put out a number: that one ChatGPT call is equal to 100 Google searches, for example. So obviously it's massively consuming, not only when you train the model, but also when you're actually consuming the model. So I think we need to bring that awareness, and I think we need to
have some more legislation around how we build out sustainable infrastructure, and not only sell the promise of what we can do with AI; I think we have a very great responsibility to make sure that we build sustainable digital infrastructure. Sanjay Podder: Absolutely. It looks like a big reason for you to write the next version of your book. You could have a whole chapter dedicated to sustainable AI: sustainable AI training, inferencing, fine-tuning. And this is indeed a big challenge. Without any doubt, gen AI is going to transform our business in a very positive way, but we have to manage this risk at the same time. Going back to your book, Niklas, you spoke at that time about three pillars in the context of IT strategy: sustainability in IT, sustainability by IT, and IT for society, right? Do you want to talk a little bit more about those, especially given some of the recent challenges that you spoke about? How do chief information officers and chief digital officers look across these pillars as they craft sustainable IT strategies for their organizations? Any thoughts? Niklas Sundberg: Sure. So, you know, what I also mentioned in my book is the EU CSRD, which now comes into force. By 2025, you need to start reporting on your scope one, your scope two, and also your scope three. For a company like Kuehne+Nagel, 98 percent of our footprint is in scope three, which also means that we are reliant on our providers, our vendors, to provide us with reliable data. And I think this also needs to mature within the IT realm. I don't think that reporting will be perfect, but I think it's a good starting point to really start putting the headlights on these topics.
But what I would still recommend is that you at least establish a baseline within the context of sustainability in IT, in terms of your own footprint, and look at, okay, what are the bigger levers that you can pull to become more sustainable? Is it in your hardware? Is it how you develop your software? Is it how you leverage cloud versus data centers? Can you leverage more automation? Is there an opportunity to relocate your workload to locations where the data centers or cloud providers run on renewable energy? So I think it's important not to get overwhelmed, because you can easily find 15 different great initiatives, so to say, but pick the three to four, based on your baseline, that really make a difference. And then you will probably see that these can make a 70-80 percent impact on the total scope of your emissions. Sanjay Podder: In your present role, as well as earlier as CIO of Assa Abloy, what are the top three levers you found most helpful, where people should start? It may vary by organization, but are there any particular ones you would like to highlight, like the low-hanging fruit we often miss? Niklas Sundberg: I think, you know, it's important to understand that every company has a different starting point, so to say. But if you work with large companies like the Assa Abloys or Kuehne+Nagels of this world, you're always in a mix of being in the cloud and being on premise. So I think that's a great opportunity to look at your landscape and say, okay, how can we optimize this? For example, what makes sense to put in the cloud? Where can we put this more into containers? Where can we re-architect in terms of function as a service?
And so forth, where the code only consumes energy when it's run, rather than having a monolithic application that idles for long periods of time but is still consuming energy, so to say. So I would really recommend looking into your data centers versus your cloud, where the opportunities are. Secondly, if you have a lot of internal software development, I would look to measure your internal product teams, and bring in a little bit of gamification here: really show how efficiently your code is running and how it can be optimized, and bring that awareness to put the power in the hands of the software engineers. I think that's a very important message. And then the third thing is obviously the e-waste, because for any company of a larger size, a company of 30,000 people with 30,000 assets and a refresh cycle of three years, that laptop hardware alone is 50,000 tons of CO2 over a 10-year period. So here, obviously, there's a lot to do as well in terms of working in a more circular way: work with reputable partners that can help you refurbish and upgrade, and then either sell that hardware or actually donate it, and so also instill that third principle, IT for society, or tech for good. At the moment, for example, we are running one of those programs in Portugal, where we see a huge need among teenagers who go to school but don't have access to computers at home. So I think it's also important to think about the democratization of technology, so that ChatGPT, the technology that we are developing, is not only for the privileged few; we also need to make sure that we bring it in a democratized way.
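The "measure your product teams" idea has a concrete formula behind it in the Green Software Foundation's Software Carbon Intensity (SCI) specification: SCI = ((E × I) + M) per R, where E is energy, I is grid carbon intensity, M is embodied emissions, and R is a functional unit such as an API call. A sketch with purely illustrative dashboard numbers (not real measurements from any team):

```python
def sci_score(energy_kwh, grid_intensity_g_per_kwh, embodied_g, functional_units):
    """Software Carbon Intensity: ((E * I) + M) per R, in gCO2e per unit."""
    return (energy_kwh * grid_intensity_g_per_kwh + embodied_g) / functional_units

# Illustrative before/after numbers for one service handling 1M API calls:
before = sci_score(energy_kwh=120, grid_intensity_g_per_kwh=400,
                   embodied_g=5_000, functional_units=1_000_000)
after = sci_score(energy_kwh=80, grid_intensity_g_per_kwh=400,
                  embodied_g=5_000, functional_units=1_000_000)
print(round(before, 4), round(after, 4))  # gCO2e per call, before vs after tuning
```

Because R normalizes by work done, a team's score improves when they cut energy per call, which is exactly the kind of per-team number a gamified leaderboard could track.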
I also think the IT for society piece is important: to see if we can donate hardware, for example, because if we keep the hardware alive for another year, on a three-to-four-year life cycle, it's another 25 percent reduction in CO2 per hardware asset. So we really benefit from keeping the hardware alive, because this is also a massive challenge, with 60 million tons of electronic waste every year. That's comparable to the weight of the Great Wall of China, when we talk about 60 million tons of electronic waste, and it's the fastest growing waste stream in the world. So we really need to curb this. And since we're talking about AI, I recently saw a study in Nature suggesting that by the end of this decade, by 2030, the hardware from AI alone could contribute 5 percent of that electronic waste. So we also need to make sure that we think not only about the energy and the water, but also about circularity in terms of how we manage the hardware and its life cycle. Sanjay Podder: Excellent. And I'm glad you brought out the whole point of gamification and building a culture among developers, because this is not just about writing efficient code or using less hardware. This is about organizations embracing a culture of sustainable IT, of green software practices, and that needs to come right from the top. And that's the purpose of this podcast as well. I also like the fact that you highlighted e-waste, because very often we lose sight of embodied carbon. We are all thinking about operational carbon emissions, and the embodied carbon may in some cases be larger than even the operational emissions. Therefore, everyone needs to take care of how long they use hardware, and make their procurement practices much more sustainable, and so on.
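Niklas's two hardware numbers, 50,000 tons of CO2 for a 30,000-laptop fleet on a 3-year refresh over 10 years, and a 25 percent saving from stretching the refresh to 4 years, can be checked together. The per-laptop embodied figure below is derived from his totals, not taken from any vendor datasheet.

```python
# Worked check of the laptop fleet figures quoted in the episode.
fleet = 30_000        # assets
refresh_years = 3     # current refresh cycle
horizon_years = 10    # reporting window
total_tons = 50_000   # embodied CO2 cited over the window

laptops_bought = fleet * horizon_years / refresh_years       # devices purchased
embodied_kg_per_laptop = total_tons * 1000 / laptops_bought  # implied kg CO2e each

# Stretching the refresh cycle from 3 to 4 years cuts annualized embodied carbon:
saving = 1 - refresh_years / 4
print(round(laptops_bought), round(embodied_kg_per_laptop), f"{saving:.0%}")
```

The implied 500 kg CO2e per laptop is on the high end of published embodied-carbon estimates but the right order of magnitude, and the 3-to-4-year stretch gives exactly the 25 percent annualized reduction he cites.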
These are all great insights. So finally, thank you, Niklas. I'm sure for all practitioners this is going to be very useful for getting started on this journey, or even fine-tuning the journey. I'm looking forward to the next version of your book. I was very delighted to write a small piece in your earlier book, but would love to contribute to the sustainable AI part of the next version. And thanks for everything you're doing to accelerate the adoption of sustainable IT, with the playbook that you wrote, which for me is one of the finest books I have read on the topic. Thanks for joining us today. I hope to see more progress in this area with your thought leadership and SustainableIT.org, which is again a very fine organization that the Green Software Foundation loves to work with. Thank you so much. So, until next time, this is the end of this episode, but I look forward to meeting you all in the next episode of CXO Bytes. And, just as a reminder, everything that we discussed will be linked in the show description below the episode. So see you again in the next episode of CXO Bytes. Niklas Sundberg: Thank you. Sanjay Podder: Hey, everyone. Thanks for listening. Just a reminder to follow CXO Bytes on Spotify, Apple, YouTube, or wherever you get your podcasts. And please do leave a rating and review if you like what we're doing. It helps other people discover the show. And of course, we want more listeners. To find out more about the Green Software Foundation, please visit greensoftware.foundation. Thanks again, and see you in the next episode.
The Future of Green Payments with George Maddaloni of Mastercard
In this episode of CXO Bytes, George Maddaloni, CTO of Operations at Mastercard, joins Sanjay Podder to discuss how Mastercard is driving innovation in sustainable technology through green software practices. George shares insights on the company's approach to reducing energy consumption in software development, the role of AI and data in enhancing sustainability, and the importance of fostering a culture of green software from the top down. He also highlights Mastercard's collaboration with the Green Software Foundation and how the organization is helping to shape their ESG goals. From edge computing to responsible AI, George provides a comprehensive look at how Mastercard is balancing technological advancement with environmental responsibility. Learn more about our people: Sanjay Podder: LinkedIn George Maddaloni: LinkedIn Find out more about the GSF: The Green Software Foundation Website Sign up to the Green Software Foundation Newsletter Resources: Quantum cyber threats are likely years away. Why — and how — we're working today to stop them | Mastercard Events | Mastercard If you enjoyed this episode then please either: Follow, rate, and review on Apple Podcasts Follow and rate on Spotify Watch our videos on The Green Software Foundation YouTube Channel! Connect with us on Twitter, Github and LinkedIn! TRANSCRIPT BELOW: Sanjay Podder: Hello and welcome to CXO Bytes, a podcast brought to you by the Green Software Foundation and dedicated to supporting Chiefs of Information, Technology, Sustainability, and AI as they aim to shape a sustainable future through green software. We will uncover the strategies and the big green moves that drive results for business and for the planet. I am your host, Sanjay Podder. Hello, and welcome to another episode of CXO Bytes, where we get into the world of sustainable software development from the perspective of the C-suite. Today, I am extremely delighted to have a special guest, George Maddaloni.
George is the CTO of Operations at Mastercard. I would like to talk to George today about how Mastercard, a leading payments giant, is navigating the intersection of technology and sustainability. Mastercard has also been a member of the Green Software Foundation, so what has been the role of the foundation in helping shape Mastercard's approach to sustainable technology? And finally, we'll also talk a little bit about the future of sustainable technology in the context of financial services. George, welcome to the podcast. George Maddaloni: Thanks for having me. Really appreciate it and looking forward to it. Sanjay Podder: Absolutely. And I'm so delighted about the contribution George is making in the field of sustainable technology. It's very rare to find a CIO who is so passionate about this topic, so this is going to be a great conversation. Before we dive further, a reminder that all the things we discuss will be linked in the show notes below the episode. So George, why don't we start with a few words from you about Mastercard and about yourself. George Maddaloni: Yeah, sure. So, George Maddaloni, CTO of Operations for Mastercard. Maybe I'll start with some things that most people probably know about Mastercard and then lead into a little bit that, from a technology perspective, you don't. Mastercard connects billions of individuals and businesses to the digital economy and makes them an inclusive part of that economy, and really views its mission as empowering people as well as powering economies, so that anybody out there, whether they're paying their bills or buying their groceries, has that capability. And from a technology perspective, that means a lot in terms of what Mastercard does on a day-to-day basis. Every product we offer is a technology product.
And for my team, that goes back to the mission that we have, which is to provide reliable, scalable, secure, and sustainable technology platforms to continue to transform the payments industry. The team itself runs a vast network that connects those billions of people to hundreds of millions of acceptance points, thousands of financial institutions, and back to many data centers around the world, to make all of that happen on a global basis in a very low latency manner, because time matters in our business. I really use this phrase within Mastercard that we're running the cardiovascular system of the company, and it's why our team is one of the larger technology divisions in the company and has a front row seat to all the innovation and all of the product development that occurs. It also helps enable all of the employee technology across the globe for the company. So it's a great job; something I'm really passionate about is technology, and it's great to have the opportunity to lead such a great team. Sanjay Podder: Fantastic, George. And there cannot be a better use case for sustainability, given what you are trying to do at an enterprise scale. I have seen you as a person who deeply cares about sustainable technology. You have recently joined the board of SustainableIT.org, and Mastercard has been a very important member of the Green Software Foundation. What makes you feel that this is an area that is important, and that you are particularly passionate about? What drives that passion? George Maddaloni: Yeah, I think, first off, the impact that technology has is great, but you can absolutely see the growth that is occurring in the technology landscape; at no time has technology been more important to everybody's day-to-day life than now. And as we think about our ESG goals as a company, all across ESG, Mastercard is a place that really deeply cares about those goals.
We've actually put both executive and our own compensation goals around that. And it's important, as I said, that we're not just having an impact here for the business, but we're here for the world and the impact that we're making on the world across those goals. So as a technologist, it's kind of natural to understand what's happening from an energy perspective. And for me, this became a question of how we manage consumption in a more efficient way as things grow, and a bit of a platform to help my team understand that consumption and make sure that we're making the right choices. So I think it was both "hey, at a macro level, this is having an impact from an energy perspective," and, at a micro level, in day-to-day decision making, how can we use that lens to think about the consumption we're about to put forward for a project or a refresh or software development, and what that is going to mean in terms of our overall footprint? Sanjay Podder: Fantastic, George. It just reinforces the conviction we had in the Green Software Foundation that while we focus a lot on developers and how we enable them to write greener code, what is actually required is a culture of green software, or sustainable tech. A culture that comes right from the top, and you reinforce that, because unless it comes from the top, sustainability will never be a first-class concern in your software development process. That in some sense is the essence behind this podcast series: to articulate what leaders like you are doing to make this real. It should not be academic, it should be actionable, you know? And this is great to hear from you. Very recently, I read a nice article from Mastercard, your Technology Trends 2024, extremely insightful. And I like the three areas that you have articulated: AI, computing, and data.
You also explored the confluence of AI, computing, and data, and how that's going to reshape commerce, and that's very powerful with all the examples. Now, as a consumer, I'm also a user of Mastercard payments, and I'm thinking, "how does that translate into the kind of innovations you foresee happening in the way we do payments as consumers?" Right? And there was a very interesting statement in the report which I really liked. In the context of computing, it said that it's not about how computing will get more powerful; it's about how you distribute that power in an intelligent, trustworthy, and sustainable way. And that's again the interplay of sustainability and technology. So the question in my mind is: as a consumer, how do I see that innovation playing out in the payment process? And how do you balance that innovation with the sustainability dimension, given you're thinking about AI, data, and computing? George Maddaloni: I'll start actually where you started, back with the culture, and it does start at the top. The company was one of the first payments organizations to really put forward an aggressive net zero goal, and that, again, started the organization rallying around this particular topic. But with every piece of technology and the investments that we're thinking about, we come back to this principle of "are we empowering people?" And a lot of times we have to think, especially in the world of AI and data, where this has been reinforced: what are our principles, and how are we going to approach this specific innovation? And you can't enable AI without a key focus on data. I think everyone knows, especially in the generative AI world, that the more data you have, the more effective the model is going to be. So we established a set of data handling principles.
One is very focused on eliminating bias, another on making sure data is used from an inclusive perspective, and of course consumer protection is at the forefront of a lot of the regulation that we're subject to, but which we also help influence. So when you're swiping your Mastercard, as you were gracefully articulating, the power of the network includes AI capabilities. Those have been around for over a decade now, and we've won awards for a particular AI tool we've developed that focuses on fraud detection. And we've seen the improvements that we're able to make with new techniques in the AI space, specifically generative AI techniques that we can employ in those models, and they were already effective. I think everybody over the past 10 years has experienced some "hey, is this fraud, is this not?" moment, or actually had something detected, where you get a call saying someone is using your card. So those capabilities have been there for a long time, but using these generative AI techniques, we've been able to see, on average, a 20 percent plus improvement in fraud detection, and an even better improvement in the elimination of false positives, and we all want that time back. From our perspective, that's sending billions of dollars back into the economy, into people's pockets, and out of criminal hands, and it also discourages people from stealing identities and stealing that data. So when we're approaching these topics, sure, there's a sustainability angle, but there's also an "are you doing the right thing?" perspective. Are you holding to your principles as you're deploying this new technology? And I think that area specifically has been a core example of where that mission really bears fruit. Sanjay Podder: Fantastic.
And I did read about your solution for detecting fakes, scoring and approving billions of transactions. That caught my attention, because the moment you start using generative AI, and I think you also use recurrent neural networks, and you use it at scale, you are actually scoring billions of transactions. George Maddaloni: Yeah, 140 billion last year alone, with a growth rate on top of it. Sanjay Podder: What did catch my attention was the fact that there was a time when we were all concerned about the energy use and emissions during the training of AI, but now, post generative AI, we're also concerned about the inferencing part, because for every inference you make, there have been a lot of interesting studies; one, from Hugging Face, says that 30 to 40 inferences, for example, consume half a liter of water, and things like that, right? So there is an environmental aspect when we talk about training, refining the models, or inferencing with the models. So, clearly, in addition to bias, in addition to ensuring data privacy and various such responsible AI practices, a new concern is sustainability, which is also being highlighted in the EU AI Act. So I was wondering, how does one address that aspect, especially when you're talking about large volumes, billions of transactions? George Maddaloni: Great question, and actually, you hit on some things that are core in how we train our models. Large data models, especially language models, consume a lot very quickly. Payments language is not quite like the English language, first and foremost. So I think you have to think, from a training perspective, about what you need to focus on, and not bring in too much information. Two, inferencing is absolutely more intense, rightfully so. I'd even double down on that and say inferencing in a real-time business is an order of magnitude different.
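The inference water figure Sanjay quotes can be combined with the 140 billion transactions George mentions to see why per-inference efficiency matters at this scale. The assumption that every transaction triggers exactly one model inference is a deliberate simplification for illustration, not a claim about Mastercard's architecture.

```python
# Illustrative scaling of the per-inference water figure quoted in the episode.
litres_per_batch = 0.5           # half a liter of water...
inferences_per_batch = 35        # ...per 30-40 inferences (midpoint of the range)
transactions = 140_000_000_000   # annual scored transactions mentioned above

# Simplifying assumption: one inference per transaction.
litres = transactions / inferences_per_batch * litres_per_batch
print(round(litres / 1_000_000), "megalitres per year under these assumptions")
```

Under these assumptions the total is on the order of two million cubic meters of water a year, which is why right-sizing models for inference, as George discusses next, is not just a cost question.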
So, this comes back to a topic we talk about a lot in the practices with the Green Software Foundation and other industry partners: how are you engineering this stuff, and what decision are you making about what you're about to consume and bring into the fold? Because A, it could be very power hungry, but B, it could be very costly too. Our business wants us to make that decision at the same time, so these things usually go hand in hand. The practice we focus on first and foremost, from a data center perspective, is the traditional PUE: how we're engineering the efficiency of our data center, or where we're running that workload, is absolutely critical, and we're constantly looking at that and improving on it. But second, there are these practices around engineering for sustainability. For our technology team, we've created an ESG guide for technologists that covers the things they need to think about when considering a new technology. From a procurement perspective, what are the things we want to see from a new supplier, be it hardware or software? What kind of information can we glean about what we're about to use? Because a lot of the tools that we're referring to have some heavy software underpinning them that's making that consumption happen. And then, from a software practices perspective: the development of code, the use of data, back to the thing I was referencing before; you don't need to bring the whole model, right? You need to be focused on your actual deployment. And then think about right-sizing the workload for either training or inferencing, and making those decisions so that you can create a responsive application that's not going to over-consume CPU and storage. These are the things we care about when it comes to the actual deployment of the technology. Sanjay Podder: Absolutely.
Ah, some great ideas. And, George, I also read in the report about your views on edge computing, and you also referred right now to real time, low latency. So, how do you see edge computing playing a role in your operations process, and when you design with edge computing, how do you make sure that the processes are therefore getting more energy efficient, for example? Because with edge computing, with cloud computing at the other end, you can now distribute the workloads in a very interesting way. So, how do you bring all these ideas together, with sustainability in the center, to make the systems more energy efficient, you know, emit less carbon? George Maddaloni: It's the nature of Mastercard's technology footprint. Edge computing is a big part of it, and it's something that I realized when I first got here, four years ago. You look at those thousands of endpoints that we run, which connect the largest financial institutions and the smallest, all financial institutions around the world, and it, by and large, can be thought of as a large edge computing network. And we make some decisions there, we make some back in a data center or in a cloud, depending on the capability that we're looking for, the product that we're running. And so, I think there's two things that we think about in this landscape. One is, can we make that edge computing more efficient? And we've actually explored that very, very carefully over the past year, because that is such a critical part of the whole overall technology, that cardiovascular system that I referenced. And via the advances that you can make now in a smaller footprint of equipment, we've easily been able to run that 25 percent more efficient from a power perspective, using a new generation of processors and capabilities on those servers. You think about a small rack-mounted server, and we can do more with it.
So you can get 25 percent more capability out of something that's 25 percent more efficient, right? That is huge capability that you can now enable at the edge, to put more decisions there. And the better part of that, as you think about it, is you're not bringing all of that decisioning and forcing that CPU back home. So you're eliminating network, you're eliminating data center CPU cycles and things of that nature. So, I absolutely think getting that right, I mean, it is an engineering equation that you need to make, but getting that right is something that we focused on, and I think is a bit of the core of Mastercard technology too. Sanjay Podder: Wonderful. George, moving to a slightly different topic: Mastercard has always been known for its programs for financial inclusion, you know, empowering communities. I'm sure these initiatives are backed with intelligent digital solutions. When you build the solutions, how is sustainability factored in? Any thoughts there that you'd like to share? George Maddaloni: Yeah, sure. I think everybody globally is starting to get interested in what their own personal consumption is. So we've actually, at a payments level, begun to enrich, not only do we enrich data that we share in our network around fraud, as I mentioned earlier, but we do have a service that some financial institutions take on: "hey, what is the sustainability information for this particular purchase people are making?" And we've partnered with various data sources that are out there to provide that information. 'Cause, as you know, we're very good at providing the technology as we think about that sustainability scoping, or the CO2 emissions aspect and all that. So we're bringing that data through as best we can, and it's been a successful launch of that sustainability product, and certain customers and financial institutions are using it.
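The edge upgrade George quotes above, roughly 25 percent more capability from hardware drawing about 25 percent less power, compounds into a larger per-watt gain than either figure alone suggests. A back-of-the-envelope sketch with an assumed baseline (the 100-unit and 400 W figures are illustrative, not Mastercard's):

```python
# Assumed baseline edge server: 100 units of throughput at 400 W.
base_throughput, base_power_w = 100.0, 400.0

# New generation, per the figures quoted: +25% capability, -25% power draw.
new_throughput = base_throughput * 1.25
new_power_w = base_power_w * 0.75

# Work done per watt, new vs. old: 1.25 / 0.75, about a 1.67x gain.
perf_per_watt_gain = (new_throughput / new_power_w) / (base_throughput / base_power_w)
```

The ratio is independent of the assumed baseline, which is why the two modest percentages translate into a substantial efficiency jump at the edge.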
A lot of our capabilities continue to focus on "what are we doing for small and medium businesses to give them competitive capabilities and to grow?" Countless examples of that, probably, in small, micro businesses, giving them the ability, throughout the world, great examples going on in India as well, in terms of Community Pass programs, things of that nature. Sanjay Podder: Great. And, George, I think it has been some time, maybe a year, since you joined the foundation, the Green Software Foundation. Very curious to know, has the foundation been able to influence some of your thinking as you are trying to bring sustainability into the center of the way you do technology in Mastercard? So anything you would like to share? George Maddaloni: Yeah, the partnership with the Green Software Foundation has certainly helped us. I referenced before this ESG guide for technologists, and we point people to the training program that the Green Software Foundation created with the Linux Foundation. I've received the certificate. I was one of many people at Mastercard that went through that training, and I think it's great because it helps people understand the impact of the technology that they're deploying. There were several great examples provided of the great work other organizations are doing. And any company, I think, gets very focused on what they're doing, so it's great to have that outside-in exposure. And we get that through a lot of the events that we have, we've hosted before. We have a software engineering guild, or practice, within Mastercard that has been participating in the Green Software Foundation. They have a special interest group within Mastercard where people can ask questions, and the members that have participated in the sessions with the Green Software Foundation can help answer those questions.
So, while we have, I think, over 6,000 software engineers at Mastercard, all of those can be, you know, feeding off one another in terms of the knowledge that they're getting. And then by hosting some of the events, we have some of those players show up, hear from the other people that come in, and I think those are great examples, and I know we've got one coming up in October. Sanjay Podder: Absolutely, we are all looking forward to the summit in October. George, it has been a wonderful discussion, but there's one question I like to ask all my guests. What would be your bite-sized advice to tech leaders as they try to balance innovation and sustainability? George Maddaloni: I think, three things. I'll try to keep them bite sized. You know, it is extremely important that we continue to collaborate within communities, like we're here talking about, and others, to understand how to work on sustainability and green practices. This is a complex problem that I don't think anybody's figured out, out of the box. So that's number one: keep collaborating. Number two is get focused on your data. You know, like we just talked through, our tech footprint is a lot different than everybody else's tech footprint. You've got to dive into your own data and really get focused on that and understand what's happening. And then number three, I go back to: focus on consumption. This is a great parallel to what you're spending, what you're producing, what you're consuming. And I've seen that, when you bring that to the table, the culture in your organization can really benefit from it. Because that's something that everybody wants to understand and can start to make decisions off of. So, it's collaborate, focus on the data, and then bring it to the culture. Sanjay Podder: Thank you, George, for your contribution to this area of sustainable tech and for joining this CXOBytes podcast. So, that's all for this podcast.
But again, a reminder, everything we discussed will be linked in the show notes below the episode. I will also request you to go to podcast.greensoftware.foundation and listen to other episodes of CXOBytes. So until then, I hope to meet you in the next podcast. Bye for now. Hey, everyone. Thanks for listening. Just a reminder to follow CXOBytes on Spotify, Apple, YouTube, or wherever you get your podcasts. And please do leave a rating and review if you like what we are doing. It helps other people discover the show. And of course, we want more listeners. To find out more about the Green Software Foundation, please visit greensoftware.foundation. Thanks again, and see you in the next episode.
AWS Summit AI-Ready Infrastructure Panel with Prasad Kalyanaraman, David Isaacs & Neil Thompson 45:43
CXO Bytes host Sanjay Podder is joined by Prasad Kalyanaraman, David Isaacs and Neil Thompson at the AI-Ready Infrastructure Panel at the AWS Summit in Washington, June 2024. The discussion featured insights on the transformative potential of generative AI, the global semiconductor innovation race, and the impact of the CHIPS Act on supply chain resilience. The panel also explored the infrastructure requirements for AI, including considerations for sustainable data center locations, responsible AI usage, and innovations in water and energy efficiency. The episode offers a comprehensive look at the future of AI infrastructure and its implications for business and sustainability. Learn more about our people: Sanjay Podder: LinkedIn Prasad Kalyanaraman: LinkedIn David Isaacs: Website Neil Thompson: LinkedIn Find out more about the GSF: The Green Software Foundation Website Sign up to the Green Software Foundation Newsletter Resources: CHIPS and Science Act - Wikipedia [06:46] IMDA introduces sustainability standard for data centres operating in tropical climates [12:54] If you enjoyed this episode then please either: Follow, rate, and review on Apple Podcasts Follow and rate on Spotify Watch our videos on The Green Software Foundation YouTube Channel! Connect with us on Twitter , Github and LinkedIn ! TRANSCRIPT BELOW: Sanjay Podder: Hello and welcome to CXO Bytes, a podcast brought to you by the Green Software Foundation and dedicated to supporting Chiefs of Information, Technology, Sustainability, and AI as they aim to shape a sustainable future through green software. We will uncover the strategies and the big green moves that help drive results for business and for the planet. I am your host, Sanjay Podder. Welcome to another episode of CXO Bytes, where we bring you unique insights into the world of sustainable software development. I am your host, Sanjay Podder.
Today we are excited to bring you highlights from a captivating panel discussion at the recent AWS Summit in Washington, held in June 2024. The AI-Ready Infrastructure Panel featured industry heavyweights including Prasad Kalyanaraman, VP of Infrastructure Services at AWS, David Isaacs from the Semiconductor Industry Association, and renowned researcher Neil Thompson from MIT, and it was chaired by Axios Senior Business Reporter Hope King. During this panel, we take a look at the transformative potential of generative AI, the global race for semiconductor innovation, and the significance of the CHIPS Act in strengthening supply chain resilience. Together, we will hopefully have a better picture of the future of AI infrastructure and the innovations driving this field forward. And before we dive in here, a reminder that everything we talk about will be linked in the show notes below this episode. So without further ado, let's dive into the AI-Ready Infrastructure Panel from the AWS Summit. Prasad Kalyanaraman: Well, first, for the avoidance of doubt, generative AI is an extremely transformative technology for us. You know, sometimes I liken it to the internet, right, the internet revolution. So, I think we're very early in that journey. I would say that, at least the way we've thought about generative AI, we think about it in three layers of the stack, right? The underlying infrastructure layer is one of them, and I'll get into more details there. And then there are the frameworks: we build a set of capabilities that makes it easy to run generative AI models. And then the third layer is the application layer, which is where, you know, many people are familiar with, like, chat applications and so on. That's the third layer of the stack, right? Digging into the infrastructure layer, it always starts from, you know, obviously, finding land and pouring concrete and building data centers out of it.
And then, on top of it, there's a lot more that goes on inside a data center in terms of the networks that you build, in terms of how you think about electrical systems that are designed for it, how you land a set of servers, what kind of servers do you land. It's not just the GPUs that many people are familiar with, because you need a lot more in terms of storage, in terms of network, in terms of other compute capability. And then you have to actually cluster these servers together, because it's not a single cluster that does these training models. You can broadly think about generative AI as training versus inference, and they both require slightly different infrastructure. Hope King: Okay, so talk about the generative, what that needs first, and then the inference. Prasad Kalyanaraman: Yeah, so the training models are typically large models that have, you know, you might have heard the term number of parameters, and typically there are billions of parameters. So you take the content which is actually available out there on the internet, and then the models start learning about it. And once they start learning about it, then they start associating weights with different parameters, with different parts of that content that's there. And when you ask the generative AI models to complete a set of tasks, that's the part which is inference. So you first create the model, which requires large clusters to be built, and then you have a set of capabilities that allows you to do inference on these models. So the outcome of the model training exercise is a model with a different set of parameters and weights. And then inference workloads require these models. And then you merge that with your own customer data, so that customers can actually go and look at it to say, okay, what does this model produce for my particular use case? Hope King: Okay, let's, you know, let's go backwards to just even finding the land.
Yeah. You know, where are the areas in the world where a company like Amazon AWS is first looking? What are the areas that are most ideal to actually build data centers that will end up producing these models and training, and all the applications on top of that? Prasad Kalyanaraman: There's a lot of parameters that go into picking those locations. Well, first, you know, we're a very customer-obsessed company, so our customers really tell us that they need the capacity. But land is one part of the equation. It's actually also about availability of green, renewable power, which I'm sure we'll talk about through the course of this conversation. Being able to actually provide enough amounts of power from renewable sources to be able to run these compute capabilities is a fairly important consideration. Beyond that, there are regulations about, like, what kind of content you can actually process. Then the availability of networks that allows you to connect these servers together, as well as connect them to users who are going to use those models. And finally, it's about the availability of hardware and chips that are capable of processing this. And, you know, I'd say that this is an area of pretty significant innovation over the last few years now. We've been investing in machine learning chips and machine learning for 12 years now. And so we have a lot of experience designing these servers. And so it takes network, land, power, regulations, renewable energy and so on. Hope King: David, I want to bring you in here, because obviously the chips are a very important part of building the entity of AI and the brain, and connecting that with the physical infrastructure. Where do you look geographically, you or your body of organizations, when you're looking at maybe even diversifying the supply chain to build, you know, even more chips as demand increases?
David Isaacs: Yeah, so I think it's around the world, quite frankly, and many of you may be familiar with the CHIPS Act, passed two years ago here in the US, something very near and dear to my heart. That's resulting in incentivizing significant investment here in the US, and we think that's extremely important to make the supply chain more resilient overall, to help feed the demand that AI is bringing about. I would also add that the green energy that was just alluded to will also require substantial semiconductor innovation and new demand. So we think that improving the diversity of chip output, right now it's overly concentrated in ways that are subject to geopolitical tensions, natural disasters, other disruptions. You know, we saw during the pandemic the problems that can arise from the supply chain, the problems, I guess, most prominently illustrated in the automotive industry. We don't want that holding up the growth in AI. And so we think that having a more diversified supply chain, including a robust manufacturing presence here in the US, is what we're trying to achieve. Hope King: Is there any area of the world, though, that is safe from any of those risks? I mean, you know, we're in the middle of a heat wave right now, right? And we're, you know, we're gonna talk about cooling because it's an important part. But do you, just to be specific, see any parts of the world that are more ideal to set up these systems and these buildings, these data centers, for resiliency going forward? David Isaacs: No, probably not. But just like, you know, your investment portfolio, rule one is to diversify. I think we need a diversified supply chain for semiconductors. And, you know, right now, the US and the world, for that matter, is reliant 92 percent on leading edge chips from the island of Taiwan, and the remaining 8 percent from South Korea.
You don't need to be a geopolitical genius or a risk analyst to recognize that is dangerous and a problem waiting to happen. So, as a result of the investments we're seeing under the CHIPS Act, we projected, in a report we issued last month with Boston Consulting Group, that the US will achieve 28 percent of leading edge chip production by 2032. That's, I think, good for the US and good for the global economy. Hope King: Alright, I'll ask this question one last time in a different way. Are there governments that are more proactive in reaching out to the industry to say, please come and build your plants here, data centers here? David Isaacs: I think there's sort of a global race to attract these investments. There are counterparts to the CHIPS Act being enacted in other countries. I think governments around the world view this as an industry of strategic importance, not just for AI, but for clean energy, for national defense, for telecom, and so on. And so there's a race for these investments, and, you know, we're just glad to see that the US is stepping up and implementing policy measures to attract some of these investments. Hope King: Sanjay, can you plug in any holes that maybe Prasad and David haven't mentioned in terms of looking just at the land, and where are the most ideal areas around the world to build new infrastructure to support the growth of generative AI and other AI? Sanjay Podder: Well, I can only talk from the perspective of building data centers which are, for example, greener, because, as you know, AI, classically, and now gen AI, consume a lot of energy, right? And based on the carbon intensity of the electricity used, it causes emissions. So wearing my sustainability hat, you know, I'm obviously concerned about energy, but I'm also concerned about carbon emissions. So to me, probably, a recent study we did with AWS in fact points to the regional variability of what, for example, AWS has today in various parts of the world.
So if you look at North America, that's US, Canada, EU, you will see that the data centers there, thanks to the cooler weather, the PUE, the Power Usage Effectiveness, is much better, lower, right? Because you don't need a lot of energy just to cool the data centers, for example. Whereas you will see that in AsiaPac, for example, because of warmer conditions, you might need more power not only to power your IT, but also to keep the data centers cooler. Right? So, purely from a geography perspective, you will see that there are areas of the world today where the carbon intensity of electricity is lower, because the electricity is largely powered with renewable energy, like the Nordics. But at the same time, if you go to certain parts of the world, even today, a lot of the electricity is generated through fossil fuels, which means the carbon intensity is high. So purely from that perspective, if I see, you know, some of the locations like the EU, North America, Canada, even Brazil, for example, a lot of their grid has renewable energy, the PUE factors are more favorable. But having said that, I have seen, for example, the government in Singapore creating new standards for how you run data centers in tropical climates. In fact, one of the interesting things that they have done is they have raised the accepted level of temperature in the data center by one degree Celsius, because that translates to a lot of energy savings. So, wearing my sustainability hat, if you ask me where the data centers should be, I would say they should be in locations where, you know, the carbon intensity of electricity is lower, so that we can keep the emissions low. That is very important. And obviously, there are various other factors, because one needs to also remember that these data centers are not small. They take a lot of space. And where will this space come from, you know?
Hopefully they don't cause a trade-off with other sustainability areas like nature and biodiversity preservation. So the last thing I would like to see is, you know, large pieces of forest making way for data centers, right? Hopefully good sense will prevail and those things won't happen. But these are some of the factors one needs to keep in mind, you know, if you bring in the sustainability dimension. How do I keep emissions lower? How do I make sure the impact on water resources is less? One of the studies shows that 40 to 50 inferences translate to half a liter of water. So how do I make sure natural resources are less impacted? How do I make sure the forest and biodiversity are preserved? These are the things one has to think about holistically. And obviously there are other factors that Prasad will know, proximity to water supply for cooling the centers. So it's a complex decision when you select a data center location. Hope King: I love the description and how detailed you went into it. I mean, I just, I think for all of us, you know, looking at our jobs as members of the press, right, we want to know where the future is going, what it's going to look like, and from what I am putting together from what everyone has said so far, I'm thinking more data centers are going to be closer to the poles, where they're cooler, and maybe more remote areas away from people, so that we're not draining resources from those communities. And Neil, you know, I don't know if this is just, well, it's probably a personality thing. But like, I sit there and I say to myself, I could ask ChatGPT, because I've been dabbling with it, you know, to help me with maybe restructuring the sentence that I'm writing. Is it worth taking away water from a community? Is me asking the query of it worth all the things that are powering it? I mean, these are things that I think about. I'm like an avid composter, like, this is my life, right? What are we ultimately doing, right?
Like, is, this, what are we all ultimately talking about when we now say AI is going to be a big part of our lives and it's going to be a forever part of our lives, but then you, know, you're hearing, you know, David and Sanjay and Prasad talk about everything that is required just to do that one thing to give me a grammar check? Neil Thompson: Yeah, so, I mean, for sure it is remarkable the sort of demand that AI can place on the resources that we need. And it's been a real change, right? You don't think of saying like, well, should I run Excel? You know, am I gonna use a bunch of water from a community because I'm using Excel, right? You don't think about that. And it's, but you know, some of the calculations, you know, there are things that you can do in Excel and you're like, oh, maybe I've, you know, I've put it on ChatGPT and I shouldn't, I should have done it, put it on Excel or something like that. So, yeah, so you absolutely have this larger appetite for resources that come in with AI. And the question is sort of what do you do about that, right? And so, I mean, one of the nice things is, of course, that we don't have to put everything on the data center, right? People are working very hard to build models that are smaller so that it would live on your phone, right? And then you have, you know, you still have the energy of recharging your phone, but it's not so disproportionate to training a model that is requiring tens of thousands of GPUs running for months. So, I think that's one of the things that we can be thinking about here is the efficiency gains that we're going to get and how we can do that and that's happening both at the chip level and also at the algorithmic level and I'm happy to go into a lot more detail on that if folks would like. But I think that's the trade off we have there is, okay, we're going to make these models more efficient. 
But the thing is, at the same time, there's this overwhelming trend that you see in AI, which is if you make your models bigger, they become more powerful. And this is the race that all of the big folks, OpenAI, Anthropic, are all in, which is scaling up these models. And what you see is that scaling up does produce remarkable changes, right? Even the difference between, say, ChatGPT and GPT-4, if you look at its performance on things like the LSAT or other tests that it's doing, big, big jumps up as they scale up these models. So that's quite exciting, but it does come with all of these resource questions that we're having. And so there's a real tension here. Prasad Kalyanaraman: Yeah, I would add that one of the things that's important for everyone to realize is that it is so critical to use responsible AI, right? Hope King: Is me using that for grammar, is that responsible use of AI? Well, I mean, I think I should know. Prasad Kalyanaraman: Responsible AI, yeah, what that means is that we have to be careful about how we use these resources, right? Because you asked a question about, like, how much water is it consuming when you use Excel or any other such application. The key is, you know, this is the reason why, when we looked at it, we said, look, we have a responsibility for the environment, and we were actually the ones that came back and said we need to get to net zero carbon by 2040 with the Climate Pledge, 10 years ahead of the climate accord. And then we said we have to get to water positive. I'll give you a couple of anecdotes on this. So we will be returning more water to the communities than what we actually take for AWS by 2030. That's quite impressive if you actually think about it. And that is a capability that you can really innovate on if you try to think about how you use cooling, and how you actually think about what you need to cool and so on.
The other day I was actually reading another article, in Dublin, in Ireland, where we actually used heat from a data center to help with community heating, right? And so district heating is another one. So, I think there are lots of opportunities to innovate on this thing, to try and actually get the benefits of AI, and at the same time be responsible in terms of how we actually use it. Hope King: So, talk more about the water. How exactly is AWS going to return more water than it takes? Prasad Kalyanaraman: Yeah, so I'll tell you a few things there. One is, so just the technology allowing us to look at leaks that we have in our pipes. It's a pretty significant amount of water that gets leaked and gets wasted today. And we've done a lot of research in this area, trying to actually use some of our models, trying to use some of the technology that we built, to go look for these leaks from municipalities when they actually transfer it. That's one area. Renewable water sources is another area. So there's a lot of innovation that has happened already in trying to get to water positive. We took a pretty bold stand on getting to water positive because we have a high degree of confidence that this research will actually get us there. Hope King: Neil, how do you, is this the first time you're hearing about the water being, I mean, this sounds like an incredible development. Neil Thompson: It is the first time I'm hearing it, so without more details, I'm not sure I can say more about it specifically. But Hope King: is it, but it's almost like, you know, we need AI to solve these big problems. It's almost, it's a quagmire. I mean, yeah, you have to use energy to save energy. It's a paradox. Neil Thompson: Sure, well, so let me, so I guess let me say two things here.
So, one is to say that it is absolutely true that there are a bunch of costs that are associated with using AI, but there are a bunch of benefits that come as well, right? And so we have this with almost all technologies, right? When we produce, you know, concrete roads and things like that, I mean, there's a bunch of stuff that goes into that, but it also makes us more efficient and the like. And in some of the modeling that my lab has done and others have done, the upside of using AI in the economy, and even more so in research and development to make new discoveries, can have a huge benefit to the economy, right? And so the question is, okay, if we can get that benefit, it may come with some cost, but then we need to think carefully about, okay, what are we going to do to mitigate the fact that these costs exist, and how can we deal with them? Hope King: All of these things are racing at the same time, right? So you've got the race to build these models, to use it to get the solutions, but then you've got to build the thing at the same time, and then you've got to find the chips. And, I mean, like, obviously it's not my job. My job is to ask the questions. I don't understand how we can measure which of these branches of this infrastructure is actually moving faster, and does one need to move faster than the other in order for everything to kind of follow along? I don't know if anybody in here... Prasad Kalyanaraman: Yeah. Look, there's sometimes a misconception that these things started over the last 18 months. That's just not true. Of course. Right. Yeah. Data centers have been there for a long period of time. Okay. Right now if I talk about cloud computing, you know, many of us are very familiar with that; like, ten years back, or a decade back, we were very early on that.
So, it's not something that is a change that has to happen overnight, right? It's something that has evolved over a long period of time. And so the investments that we've been doing over 15 years now are actually helping us do the next set of improvements that we have. So, you know, we say this internally: there's no compression algorithm for experience. And so you have to have that experience, and you have to actually have spent time going all the way down to the chip level, to the hardware level, to the cooling level, to the network level, and then you start actually adding up all of these things, then you start getting real large benefits. Hope King: So, on that point though, is it about building new or retrofitting old when it comes to... because I've seen reports that suggest that building new is more efficient in another way because, you know, rack space or whatever. So just really quickly on that, I know Neil wants to jump in. Prasad Kalyanaraman: It's a combination. It's never one size fits all. Hope King: Generally speaking, as you look at the development data. Prasad Kalyanaraman: I would say that there's obviously a lot of new capacity that's being brought online. But it's also about, like, efficiencies in existing capacity. Because when we design our data centers, we design our data centers for, like, 20-plus years. But then hardware typically is useful for about, like, six to seven years or so. After that you have to refresh, right. So you have an opportunity to go and refresh older data centers. Hope King: Yeah, it's not an overnight update. You know, what were you going to say? Neil Thompson: So you asked specifically about the race. I think that to me is one of the most interesting questions and something we spend a lot of time on. And it certainly is the case there's been an enormous escalation.
So since 2017, if you look at large language models, there has been something like a 10x increase in the amount of compute being used to train them each year, right? So that's a giant increase. Compared to that, some of these other increases in efficiency have not kept pace. So for what you're talking about at the most cutting edge, the resources required are going up. But it's also the case that if you look generally at the diffusion of technology that's going on, right? Maybe some capacity already exists, but other firms want to be able to use it; that's just spreading out. Well, there you get to take advantage of the efficiency improvements that are going on. And, for example, at the chip level, if you look in terms of the floating point operations that are done, the improvement is between 40 and 50 percent per year in terms of flops per dollar. That's pretty rapid, but even that is actually not that rapid compared to efficiency improvements. So efficiency improvements, for those who don't think about it in this way, think about it as: I have some algorithm that I have to run on the chip, and that's going to use a certain number of operations, and the question is, can I design something that achieves that same goal while asking for fewer operations, right? It's just a pure efficiency gain. And what we see is that in large language models, the efficiency is growing by 2 to 3x every year. Which is huge, right? So if you think about the diffusion side of things, actually there, efficiency gains are very high, and we should feel a little reassured that as that happens, the demands will drop. Hope King: Yeah, but then you have to contend with the volume. Because even if you're making each one more efficient, you're still multiplying it by other needs, so what does that net out to? Neil Thompson: Yeah, so, I mean, the question there is exactly how fast is AI growing? Sure.
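Neil's growth figures can be put side by side in a quick back-of-the-envelope calculation. The rates are the ones quoted above; treating them as smooth, independent annual multipliers is a simplifying assumption, not anything stated by the panel:

```python
# Back-of-the-envelope comparison of the growth rates quoted above.
# Frontier training compute: ~10x per year since 2017.
# Hardware cost-efficiency (FLOPs per dollar): ~40-50% per year.
# Algorithmic efficiency for LLMs: ~2-3x per year.

years = 5

frontier_compute = 10 ** years    # compute demand at the frontier, 10x/year
hw_efficiency = 1.45 ** years     # midpoint of the 40-50%/year range
algo_efficiency = 2.5 ** years    # midpoint of the 2-3x/year range

# Net resources needed at the frontier, relative to today:
frontier_net = frontier_compute / (hw_efficiency * algo_efficiency)

# Net resources for a *fixed* workload (the diffusion case),
# where the task stays the same and only efficiency compounds:
fixed_net = 1 / (hw_efficiency * algo_efficiency)

print(f"Frontier demand after {years} years: {frontier_net:.1f}x today")
print(f"Fixed workload after {years} years:  {fixed_net:.4f}x today")
```

The sketch makes the "wedge" concrete: resource needs at the frontier still balloon because 10x/year outruns combined efficiency gains, while a fixed, diffusing workload gets radically cheaper over the same window.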
And that actually turns out to be a very deep question that people are struggling with. So, unfortunately, many of the early surveys that were done in this area were sort of what you might call a convenience sample: you called the people you cared about because they were your customers. But that was not representative of the whole country. So a couple of years ago, the Census did some work and found that about six percent of firms had actually been operationalizing AI. Not that many. So we know that's going to be growing a lot, but exactly how fast, we're not sure. I think what we can say, though, is that as that happens, you know, we could have a moment where it's happening faster, but as long as these efficiency increases continue over the longer term, and so far we've seen them to be remarkably robust, that suggests that as that diffusion happens, then it will go down, so long as we don't all move to the biggest cutting-edge model in order to do it. Hope King: David, I think you probably have some insight into how quickly the chip makers themselves are trying to design more efficient chips, you know, to this end. David Isaacs: Yeah, let me just start with the caveat that I'm not a technologist. I'm a scientist, but a political scientist. But I'm glad the conversation has turned to innovation and efficiency gains, because some of the questions ten minutes ago were assuming that the technology would remain the same, and that's not the case. Many people in the room are probably familiar with Moore's Law and the improvements in computing power, the improvements in efficiency, and the reduced costs that have been happening for decades now. That innovation pathway is continuing. It's not necessarily the same in terms of scaling and more transistors on silicon, but it's advanced packaging. It's new designs, new architectures.
And then there are the software and algorithm gains and the like. So, we believe that will result in the efficiency gains and the resource savings. There have been a lot of third-party studies. We know for the semiconductor industry that the technologies we enable have a significant multiplier effect in reducing climate emissions and conserving resources, whether it's in transportation or energy generation or manufacturing. So we think there's a very substantial net gain from all these technologies. And I guess the other thing I would add is, you know, the CHIPS Act, the formal name is the CHIPS and Science Act, and there are very substantial research investments. Some of which have been appropriated and are getting up and running. Unfortunately some of the science programs have been, and this is Washington speak, authorized but not yet appropriated. We need to fund those programs so that this innovation trajectory can continue. Hope King: Yeah, I mean, just by nature, you know, we're the skeptics in the room, and we're looking at just the present day, and the imagination that is required for your jobs is one that's not easily accessible when a lot of us are concerned about the here and now. So, I appreciate that context, David. Sanjay, I want to talk about what it is going to take, though, in terms of energy, right? That is something that will remain the same in terms of the needs. So what are you seeing in terms of how new data centers are being built, or even maybe systems, clusters of infrastructure that support these data centers, that you're seeing emerging? And what is the sort of best solution as, you know, more companies are building and looking for land to actually grow all these AI systems? Like, what are the actual renewable sources of energy that are easy and sort of at hand right now to be built?
Sanjay Podder: Well, I'm not an expert on that topic, so I won't be able to comment much on that, but I can talk about the fact that you were discussing efficiencies. For the same energy, you know, you can do a lot more with AI. And what I mean by that is, the way we build AI today, whether it's training or inferencing, there are a number of easy things to do which have a huge impact on the amount of energy you need for that AI. I think there was a reference to, for example, large gen AI models. They are typically preferred because probably they give more accuracy. But the reality is, in different business scenarios, you don't necessarily have to go to the largest of the models, right? And, in fact, most of the LLM providers today are giving you LLMs of different t-shirt sizes: 7 billion parameters, 80 billion parameters. One of the intelligent things to do is fit for purpose: you select a model which is good enough for your business use case. You don't necessarily go to the largest of the models, and the energy need is substantially lowered in the process, for example, right? And those to me are very practical levers. The other thing, for example: I think Prasad referred to the fact that these models, at the end of the day, are deep learning models. You know, there are techniques like quantizing and pruning by which you can compress the models, such that smaller models will likely take less energy, for example. And then, of course, you know, inferencing. Traditionally, we have always been thinking about training with classical AI. But with generative AI, the game has changed. Because with gen AI, given how pervasive it is and everybody is using it, millions of queries, the inferencing part takes more energy than training. Now again, simple techniques can be used to lower that, like one-shot inferencing: if you batch your prompts, when you do these things, you come to your answers quicker.
So, you know, you don't have to go and query the large model again and again. So if you want to, you know, design your trip itinerary to San Francisco, instead of asking 15 prompts, you think about how you batch it: this is my purpose, this is why I'm going, this many days I'll be there, and, you know, it is very likely you'll get your itinerary in one shot, and in the process, a lot less energy will be used. So the point here is, and of course data centers, right? All the chips that we are talking about, all these custom silicons, can lower the energy needs. So while I may not be able to comment on, you know, what are the best sources of energy, because renewable energies are of various types, solar, wind, now people are talking about nuclear fusion and whatnot, I'm seeing that even within the energy that we have, we can use it in a very sustainable way, in a very intelligent way, so that you get the business value you're seeking without wasting that energy unnecessarily. Hope King: It sounds like you want companies to be more mindful of their own data needs, and not overshoot, but be even more efficient in how they're engineering what the applications can do for them. Sanjay Podder: Absolutely right. And you know, on that point, about 70 to 80 percent, maybe more, of the data is dark data. And what is dark data? These are data that organizations store with the hope that one day they will need them, and they never require them. So you are simply storing a lot of data, and these data also correspond to, you know, energy needs. Right? So, there is a lot of the rebound effect that you mentioned some time back: because the cost of compute went down and storage went down, the programming community became lazy programmers. So what happened as a result of that is, you know, you're not bringing efficiency into the way you're doing software engineering.
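Sanjay's "fit for purpose" lever from a moment ago can be sketched as a simple selection rule: pick the smallest model that clears the accuracy bar for the use case. All model names, accuracy scores, and per-query energy figures below are made-up illustrative numbers, not measurements from any provider:

```python
# Illustrative "fit for purpose" model selection: choose the
# lowest-energy model that is good enough for the business use case,
# rather than defaulting to the largest. Hypothetical numbers only.

models = [
    # (name, parameters, task accuracy, energy per query in joules)
    ("small", 7e9, 0.86, 40.0),
    ("medium", 80e9, 0.90, 300.0),
    ("large", 500e9, 0.92, 1500.0),
]

def pick_model(required_accuracy):
    """Return the lowest-energy model meeting the accuracy bar."""
    viable = [m for m in models if m[2] >= required_accuracy]
    if not viable:
        raise ValueError("no model meets the accuracy requirement")
    return min(viable, key=lambda m: m[3])

name, params, acc, energy = pick_model(0.85)
largest_energy = models[-1][3]
print(f"Chose {name}: {energy / largest_energy:.0%} of the largest model's per-query energy")
```

With these toy numbers, a use case that only needs 0.85 accuracy runs on the smallest model at a small fraction of the largest model's per-query energy; only a use case demanding the top accuracy forces the largest model. Prompt batching is the complementary lever: one well-structured query instead of fifteen multiplies the same kind of saving.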
You are having all these virtual machines which you are hardly using; they're hardly utilized. They all require energy to operate. So there's a lot of housekeeping we can do, and that can lower energy needs, you know, in a very big way. Right? And then, even if there's a lot of renewable energy, IT is not the only place, or AI is not the only place, where the renewable energy should be used. There are other human endeavors. So, we have to change our mindset, be more sustainable, more responsible in the way we do IT today and we do AI today. Hope King: Did you want to add to that? Prasad Kalyanaraman: Yeah, I would say that, you know, what he said is 100 percent right, which is that, as I said, you kind of have to actually think through the entire stack for this. And you have to start going through every layer of the stack and thinking about how you optimize it. Like, I'll give you an instance about cooling, and I know you wanted to talk about cooling. Hope King: Yes, air conditioning for the data centers, and now the liquid cooling that's coming in to try to... Prasad Kalyanaraman: Exactly. Hope King: Yes, go ahead. Prasad Kalyanaraman: So over the years, what we've actually done is, we have figured out that we don't need liquid cooling for a vast majority of compute. In fact, even today, pretty much all our data centers run on air cooling. And what it means is we use outside air to actually cool our data centers. None of our data centers are nearly as cool as this room, by the way, just to be very clear. Hope King: It's not as cold as this room? Prasad Kalyanaraman: Not even close. Hope King: What's the average temperature in a data center? Prasad Kalyanaraman: It's well above 80 degrees. Hope King: Really? Okay, what's the range? Prasad Kalyanaraman: It's between 80 and 85. Now, the thing is that you have to be careful about cooling the data centers too much, because you have to worry about relative humidity at that point as well.
And so, we've spent a lot of time on computational fluid dynamics to look at it and say, do we really need to cool the data centers as much? And it's one of the reasons why our data centers run primarily on air, and outside air actually. Now there's a certain point where you cannot do it with just air; that's where liquid cooling comes in. But liquid cooling comes into effect because some of these AI chips have to be cooled at the microscopic level and air cannot actually get there fast enough. But even there, if you think about it, you only need to liquid cool a particular AI chip. But as I said, AI does not require just the chip; you need networks, you need the storage and all that. Those things are still cooled by air. So, my estimate is that even in a data center that has primarily AI chips, only about 60 to 70 percent of it needs to be liquid cooled. The rest of it is just pure air cooled. Hope King: I think that's pretty fascinating, because I think it's been discussed more recently that liquid cooling is actually more crucial. I don't know if you saw Elon Musk tweeting a picture of the fans that he has. He made a pun about how his fans are helping. Whatever, anyways, go check it out. It's on X. Thank you for circling back on the cooling. I think that was definitely, you know, a big question for energy use, because those systems require a lot of energy. As we come to the last couple of minutes, you know, I want to turn the conversation forward-looking. The next two to five years: if I were to be back here with the four of you sitting on this stage, what would we be talking about?
And where still do you think, at that point, there are gaps that haven't been filled even, you know, since sitting here now? Especially because, as you mentioned, infrastructure is not easily upgradable like software is, and there will need to be those investments, physical investments, whether it's labor, whether it's land, physical resources. So in two years, what do you think we're going to be talking about? Prasad Kalyanaraman: I'll start. We're already starting to talk about that, so I expect that we'll talk more about those things. There's going to be a lot more innovation; generative AI will spur that as well, in terms of how to think about renewable sources and how to actually run these things very efficiently. Because one of the realities is that generative AI is actually really expensive to run. And so, you're not going to spend a lot of money unless you actually get value out of it. And so there's going to be a lot of innovation on the size of these models, there's going to be a lot of innovation on chips, we're already starting to see that, and of course nuclear will be a part of the energy equation. I think the path from here, at least in our minds, our path to get to net zero carbon by 2040, is going to be very non-linear. It's not going to be one size fits all; it's going to be extremely non-linear. But I expect that there will be a lot more efficiency in running it. It's one of the reasons why we harp so much on efficiency in how we actually run our infrastructure. One, it actually helps us in terms of our costs, which we translate to our customers. But it's also a very responsible thing for us to do.
Hope King: David, I actually want to jump over to you just for a second on this, because, you know, a lot could happen in the next couple of months when it comes to the administration here in the US. Is there anything that you see in the next two years, politically, that could change the direction or the pace of development when it comes to AI, infrastructure building, investments from corporations, maybe even pulling back, right, pressures from investors to see that ROI on the cost of these systems? David Isaacs: Yeah, I'm hesitant to engage in speculation on the political landscape, but I think things like the CHIPS Act and US leadership in AI are things that enjoy bipartisan support, and I think that will continue regardless of the outcome of elections and short-term political considerations. You know, I think, getting to your question on what we'll be talking about a few years down the road, I'm an optimist, so I think we'll be talking about how we have a more resilient supply chain for chips around the world. I think we'll be enjoying the benefits of some of the research investments that propel innovation. One additional point I'd like to raise real quickly is soft infrastructure and human talent. I think that's an important challenge, and, at least in the US, we have a huge skills gap among the workforce. Whether it's, you know, K-through-12 STEM education or, you know, retaining the foreign students at our top universities. And that's not just a semiconductor issue; that's all technology, advanced manufacturing throughout the economy. So I think that will be a continuing challenge. Hope King: Are you seeing governments willing and interested to increase funding in those areas? David Isaacs: On a limited basis, but I think we have a lot of work to do. Hope King: What do you mean by limited?
David Isaacs: I think there's a strong interest in this, but, you know, I'm not sure governments are willing to step up and invest in the way we need to as a society. Hope King: Sanjay, two years from now, we're talking again. What's going to be on our minds? Sanjay Podder: So we have not seen what gen AI will do to us. We are still, like, in kindergarten, talking about infrastructure. It's like the internet boom time, right? You know, and then we know what happened with that: our whole lives changed. With gen AI, enterprises will reinvent themselves. Our society will reinvent itself, and all that will be possible because of a lot of innovation happening at the hardware end as well as the software layer. What we need to, however, keep in mind as we transform: none of us in this room knows how the world will look. It will look very different; that's what's certain. But one thing that we need to keep in mind is that this transformation should be responsible. It should keep a human in the center of this transformation. We have to keep the environment in mind; in ESG, all three are very important, so that, you know, our AI does not disenfranchise communities, people. I think that is going to be the biggest innovation challenge for us, because I'm very certain that, human ingenuity being what it is, we will build a better world than what we have today. There'll be a lot of innovations in all spheres, but in the journey, let's make sure that responsible AI becomes the central aspect, that sustainable and responsible AI become a central theme as we go through this journey, right? So I'm as eagerly looking forward as all of us here to how this world will look. Hope King: Yeah. No digital hoarding. Lastly, Neil, we didn't talk about the last-mile customization problem today, but I don't know if that's something that you're looking for in the next two years to be solved. What other things? And you can speak to the last mile too, if you want.
Neil Thompson: Sure. So, for those who don't know, the idea of the last-mile problem in AI is that you can build, say, a large language model and say, this works really well, in general, for people asking questions on the internet, but that might still not mean that it works really well for your company for some really specific thing. You know, if you're interacting with your own customers, right, on your own products, with those specific terms, that system may not work that well. And you have to do some customization. That customization may be easy: you just need a little prompt engineering, or you feed it a little bit of information from your company. Or it could be more substantial: you could actually have to retrain it. And in those cases, that's going to slow the spread of AI. Because it's going to mean that we're going to get lots of improvement in one area, but then you say, okay, well, think about all of the different companies that might want to adopt. They have to figure out how they can adapt the systems to work for the things they do, and that's going to take time and effort. And so that last mile is going to be one that I think is going to be really important, because I think it's very easy to say, I read in the newspaper, or I saw a demonstration that said, boy, AI can do amazing things, much more than it could do even three months ago. And that's absolutely true. But then there's also this diffusion process, and that's going to take a lot longer. And so I think what we should expect over the next two years is more of this sort of wedge: there are going to be some folks who are leading, and these resource questions that we've been talking about are incredibly salient for them, and there are going to be people who are behind, for whom the economics of customization still don't work, and they're going to be in the model that they were in ten years ago.
And so that divide, I think, is going to get bigger over the next two years. Hope King: Alright. David, thank you for joining us, Sanjay, Neil, Prasad, and for all of you, hopefully you found this as informative as I did. Thanks, thanks everyone. Prasad Kalyanaraman: Thank you. Sanjay Podder: Hey, everyone. Thanks for listening. Just a reminder to follow CXO Bytes on Spotify, Apple, YouTube, or wherever you get your podcasts. And please do leave a rating and review if you like what we're doing. It helps other people discover the show. And of course, we want more listeners. To find out more about the Green Software Foundation, please visit greensoftware.foundation. Thanks again, and see you in the next episode.
Greening Digital Sustainability with Dr. Ong Chen Hui (39:07)
Welcome to the first episode of CXO Bytes! Join host Sanjay Podder as he talks to leaders in technology, sustainability, and AI in their pursuit of a sustainable future through green software. Joined by Dr. Ong Chen Hui, Assistant CEO of Singapore's Infocomm Media Development Authority (IMDA), the discussion focuses on Singapore's comprehensive approach to digital sustainability. Dr. Ong highlights IMDA's efforts to drive green software adoption across various sectors, emphasizing the importance of efficiency in data centers and the broader ICT ecosystem. So join us for an intriguing and thought-provoking conversation about the critical role of government and industry collaboration in achieving sustainability goals amidst the growing demand for digital technologies. Learn more about our people: Sanjay Podder: LinkedIn Dr. Ong Chen Hui: LinkedIn Find out more about the GSF: The Green Software Foundation Website Sign up to the Green Software Foundation Newsletter Resources: IMDA [01:20] Government Technology Agency | Singapore [01:54] Singapore Green Plan 2030 [02:55] IMDA and GovTech unveil new initiatives to drive digital sustainability | IMDA - Infocomm Media Development Authority [10:19] Your Guide to the Gartner Top Strategic Technology Trends in Software Engineering [10:56] Asia Tech x Singapore [23:37] Software Carbon Intensity (SCI) Specification Project | GSF [25:11] Welcome to Impact Framework [33:31] Green Software Foundation [34:00] Digital Sustainability Forum | ATxSummit [37:33] If you enjoyed this episode then please either: Follow, rate, and review on Apple Podcasts Follow and rate on Spotify Watch our videos on The Green Software Foundation YouTube Channel! Connect with us on Twitter, Github and LinkedIn!
TRANSCRIPT BELOW: Sanjay Podder: Hello and welcome to CXO Bytes, a podcast brought to you by the Green Software Foundation and dedicated to supporting chiefs of information, technology, sustainability, and AI as they aim to shape a sustainable future through green software. We will uncover the strategies and the big green moves that have helped drive results for business and for the planet. I am your host, Sanjay Podder. Hello everyone. Welcome to CXO Bytes. This is our inaugural podcast on how you use green software for building a sustainable future. This is a new podcast series, and the whole idea behind it is, you know, that embracing a culture of green software needs to come from the top. And we therefore want to talk with decision makers, with business leaders, with leaders who are running nation states like Singapore, for example, at the C-level. You know, how are they driving this culture change when it comes to digital sustainability and green software, for example? Today I am super excited to invite Dr. Ong. She is the Assistant CEO of IMDA, which is the Infocomm Media Development Authority of Singapore. And we are going to chat on how IMDA is championing digital sustainability as well as green software. Welcome, Dr. Ong. Dr. Ong Chen Hui: Thank you for having me on your inaugural podcast on green software. Sanjay Podder: And you know, I had my own selfish reason for inviting you, because while the Green Software Foundation has been interacting with many, many large businesses across the world, IMDA and Singapore GovTech, these are two members of the Green Software Foundation who represent the government, right? And we all know the very important role that government will play in sustainability in general. So I wanted to understand from you, you know, how you are looking into this space. So we will talk a lot about that. The other aspect is probably, to begin with, for our audience, a perspective on what IMDA is.
You know, what is your specific remit, what are you trying to do in Singapore, if you can give us, you know, a few insights into that. Dr. Ong Chen Hui: Okay, so here in Singapore, of course, climate change is actually something that is a bit of an existential thing for us, us being a small nation state. And we're also an island; to us, climate change and the associated rising sea level is a matter of concern. Right? So, as a result, we have put in a green plan that states our sustainability goals by the time we reach 2050. And this is actually a whole-of-government effort. So, I don't think it is a case where it's one ministry or one agency that's responsible for the whole thing. It is about the whole of government working together in order to make sure that we meet the goals of our Green Plan. Now, what are some of the things that we are doing? Many things. For example, the National Environment Agency is actually rolling out some of the regulations; we have things like e-waste management, for example. Just now you mentioned GovTech, which is our sister agency. GovTech is also rolling out green procurement when they're actually procuring software solutions. Within IMDA, we are responsible for some of the industry development. We're also what we call a sectoral lead of the ICT sector. So, our own green strategy comprises broadly three different strokes. The first is about greening ourselves as an organization. The second is really about greening the sector that we are responsible for, that we are leading. So, in that case, there will be things like the telecommunications sector, the media sector. And the third thing we want to do is to enable our ICT solution providers to provide green solutions to the broader economy so that we can scale the adoption, we can ease the friction out there in the ecosystem. So essentially, that's greening ourselves, greening the sector as the lead.
And the third is really to kind of provide solutions through the ecosystem so that the wider community can actually benefit. Sanjay Podder: Now this is really a full 360-degree kind of approach, and it is phenomenal. And I was wondering, you know, and you mentioned briefly Singapore being an island state. I was thinking, why digital sustainability? What would happen if Singapore decided not to do it, for example, right? Do you have a point of view? Because, you know, there are many different levers of sustainability. You know, I can understand the larger sustainability, but what is the importance of digital sustainability? Do you think it's an important enough lever, or maybe you can look at nature, biodiversity, or something else, right? So specifically for digital sustainability: what is it that triggers IMDA that this is an important initiative? And I'm seeing, this is my second year at Asia Tech, that, you know, this is something you give a lot of importance to. Bringing in leaders from various organizations. Doing deep deliberation. I also remember last year, you brought out your new data center standards, I think increasing the temperature by one degree; that has an implication. If you could throw a little bit more light on digital sustainability in particular, Dr. Ong Chen Hui: Mm hmm. Sanjay Podder: why do you feel that's a very important lever for a country like Singapore and maybe for many other countries around the world? Dr. Ong Chen Hui: Yeah. Well, I think you're actually exactly right that when we are trying to drive sustainability, actually there are many different strokes. Some of it includes looking at energy sources and all that, which actually is also very important for Singapore, because we are small. We do have to look at different kinds of energy sources and how we can potentially actually import some of them, right? Now, when it comes to digital sustainability, actually our journey, I would say, started many years ago.
Maybe more than a decade ago, when we started looking at some of the research work within the research community about making sure that our data centers can operate more efficiently in the tropical climate. Now, data centers comprise almost a fifth of the ICT carbon emissions. And because they are such a huge component of the carbon emissions, of course, their efficiency has always been top of mind. Now in a tropical climate like ours, a large part of the energy sometimes is attributed to the cooling systems, right? The air conditioning that's actually needed to bring the temperatures down. So as you rightly pointed out, what we found out is that actually if you were to increase the temperature by one degree, that can lead to a savings of between two to five percent of carbon emissions. So, as a result, we have been investing in research within our academia, funding some of the innovation projects with our ICT players, in order to look at what actually works and what doesn't. Because I think in Singapore, regulations always need to be balanced with innovation. So that has kind of led to what happened last year, which was that we released the first standards for tropical data centers. But we wanted to go a lot further, right? Because some of those standards, around cooling and all that, that's kind of like looking at how efficient the radiators are in a car. But we also need to look at how efficient the engines are. And the reality is that, if you look at the trends of ICT usage, of software applications: I mean, so much of our lives, whether it is watching videos, watching TikTok, right, our education, all of that, most of this has moved to be enabled by digital technologies. And when we look at the consumption of data centers and the kind of workload in them, it is increasing year by year. Now, with the explosion of AI, we know that the trend is probably that there will be more consumption of digital technologies.
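The two-to-five-percent-per-degree figure quoted above can be turned into a rough worked example. The 10 GWh/year baseline is a made-up illustrative number, and treating the per-degree saving as compounding multiplicatively across degrees is an assumption, not part of the standard:

```python
# Illustrative effect of raising a data-center temperature setpoint,
# using the 2-5% savings-per-degree range quoted above.
# The baseline energy figure is hypothetical.

baseline_kwh = 10_000_000            # example annual energy, 10 GWh
savings_per_degree = (0.02, 0.05)    # 2-5% per +1 degree, as quoted

for degrees in (1, 2):
    # Assume each extra degree compounds on the previous one.
    low = baseline_kwh * (1 - (1 - savings_per_degree[0]) ** degrees)
    high = baseline_kwh * (1 - (1 - savings_per_degree[1]) ** degrees)
    print(f"+{degrees} degree(s): saves {low:,.0f} to {high:,.0f} kWh/year")
```

Even at the conservative end of the range, a single degree on a 10 GWh facility is on the order of hundreds of thousands of kilowatt-hours a year, which is why a one-line setpoint change made it into a national standard.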
And those are the engines that sit within the data centers. And we need to make them efficient. And as a result of that, we have decided that we need to also get onto this journey of greening the software stack. And greening the software stack means a few things. The first is, of course, I think this is still a fairly nascent area: how do we make software more measurable, so that there's a basis of comparison, so that we can identify hotspots? That, I think, is important. The second part that I think is important is also, given all the trends today, GPUs, CPUs all needing to work together, how do you make them work efficiently? How do you process data efficiently? How do you make sure that the networks and the interconnects within the data centers are efficient? I think all of these are worthy problems to look at. Some of it will rightfully stay still in the research stage. So we'll be funding research programs, called the Green Computing Funding Initiative, around it. But at the same time, we also think that there are some practices that may be a bit more mature already, and we should encourage companies to actually innovate on top of it. So we're also conducting green software... Sanjay Podder: I've heard about that, you know, and that's so innovative. I myself try to engage with all my clients in embracing green software. It is not a trivial Dr. Ong Chen Hui: Yes. Sanjay Podder: challenge, you know, because I feel that it's a new way of doing things. In fact, I just read a Gartner report on top software engineering trends in which they say green software engineering is one of the five. 10 percent of the organizations they surveyed have green software or sustainability as one of the non-functional requirements of software, but they believe that in the next three years, by 2027, 30 percent of organizations will have green software engineering as a requirement for software development. So this is indeed growing very rapidly.
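The "make software measurable" goal Dr. Ong describes is what the Green Software Foundation's Software Carbon Intensity (SCI) specification, linked in the show notes, formalises: SCI = (E × I + M) / R, carbon per functional unit. A minimal sketch of that formula follows; the input values are made-up example numbers, not measurements:

```python
# Minimal sketch of the GSF Software Carbon Intensity formula:
#   SCI = (E * I + M) / R
# E = energy consumed (kWh), I = grid carbon intensity (gCO2e/kWh),
# M = embodied (hardware) emissions amortised to the period (gCO2e),
# R = the functional unit, e.g. API requests served.
# All example inputs below are hypothetical.

def sci(energy_kwh, grid_intensity_g_per_kwh, embodied_g, functional_units):
    """Carbon per functional unit (gCO2e), following the SCI formula's shape."""
    operational = energy_kwh * grid_intensity_g_per_kwh
    return (operational + embodied_g) / functional_units

# Example: a service that handled one million requests in a month.
score = sci(
    energy_kwh=1200,               # estimated energy over the period
    grid_intensity_g_per_kwh=450,  # regional grid carbon intensity
    embodied_g=90_000,             # amortised embodied hardware carbon
    functional_units=1_000_000,    # requests served
)
print(f"SCI: {score:.3f} gCO2e per request")
```

Because the score is per functional unit rather than an absolute total, it gives exactly the basis of comparison she asks for: two implementations of the same service can be ranked, and hotspots show up as units with outsized scores.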
But having said that, you know, there are a lot of adoption challenges, because even if people want to do it, there are a few challenges here. First of all, people may not want to do it, thinking, "Oh, this is a small problem, not worth solving. I will procure a lot of renewable energy, or I will buy offsets and my problem is gone," right? And then there will be people who will say, "Oh no, offsets are not the answer. Renewable energy is not the answer. We have to inherently lower the emissions, or the energy required for software, and make software carbon efficient." But then, where are the standards? How do we do it? Our people do not know how to do it. And I'm talking about organizations. You're talking about a country. It's a very big problem. Now, the question to you, therefore, is, you know, how are you getting people excited in Singapore? And you also mentioned the small and medium business ecosystem. So it's a diverse ecosystem. So what has been your approach to make people excited, to enable them? And what would be your North Star? Like, what will make you super happy, that "I've done my job, this is what I wanted to do"? So, how do you look at it, Dr. Ong?
Dr. Ong Chen Hui: Okay, I guess maybe I'll answer the question of the North Star first. A country like Singapore, we do have a very limited carbon budget, right? To me, if we can create some carbon headroom for Singapore, so that we can have more options for different aspects of our economic growth, I think that will be the best outcome that we can aim for.
Sanjay Podder: That's a great point. Yeah.
Dr. Ong Chen Hui: Now, in order to be able to do that, you're right, it isn't just about the government waving a flag and saying that that's very important. It may not be sufficient for just a few companies to go about doing it. What we want is to be able to drive this across the entire ecosystem.
Now, of course, there will be some companies that are much more forward-looking, and may have already embraced many of these practices. They may even have taken stock of what they have and where they can improve in terms of their carbon emissions. And there will be others who are still a bit more tentative and on the fence, right? And the question in my mind is, how can we help this first group? How can we help that group in the middle as well? I think for those in the first group, and we have seen some which are very advanced, right? They are talking about the fact that maybe they already have a very large applications footprint, e-commerce, for example. And they are constantly looking at how to refresh their stack, because they need to perhaps drive down the cost of operations. And this particular group, they sometimes have very advanced needs. They may be talking about increasing the level of control so that they can dynamically schedule their tasks and bring up the efficiency of the entire system. They may be talking about advanced partnerships with some of their vendors in order to make sure that they continue to leverage the best and most efficient from their supply chain. And for some of these, what we have tried to do with them is to orchestrate, or we call it matchmaking, together with our academia, right, to look at what projects they can do together so that they can create new IP in this area, so that they can continue to be at the forefront and leverage all these ideas and solutions that perhaps the researchers are better equipped to provide. But there's a group in the middle, I think, who may want to see something a bit more concrete. They may have read that there are things that they can implement, but they're not quite sure where to invest their time.
And if we think a bit about it from an organization's point of view, it's not like they can experiment indefinitely, right? So I think they want to be a bit more targeted. I think for this particular group, the question is, "how can we actually encourage innovation?" Are there solution providers who may know a bit more in this area, who can increase awareness and bring a bit more focus to the innovation? Are there best practices, guides, frameworks that we can put out there that can encourage the innovation and allow some of these companies to explain to their C-suite that this is, how should I say, a systematic approach to innovation? And so for the former, what we wanted to do with the green software trials is a little around that, right? To increase awareness, bring the solution providers to work together with the companies, let them see that in this green software world, this digital sustainability world, it's not just about doing good, but there's a possibility to do well as well, right? In that you're actually improving your bottom lines. And the other part is really to work with organizations like the Green Software Foundation to make sure that things like best practices, guides, broader ecosystem awareness, as well as standards, are something that we can collaborate on together as a whole ecosystem, so that over time, this being a journey of innovation, we'll be able to mature many of these practices. And maybe reduce some of the risk that organizations may perceive when they want more clarity about how things ought to be done.
Sanjay Podder: I think that's a very nice, comprehensive approach to what you're doing. You mentioned a few things, Dr. Ong, that caught my attention. One was, you mentioned AI, right? And the whole world is talking about AI now. Which is good. It's almost magical what we are seeing with LLMs.
But then there is a dark side to it. And the dark side is, when you look at some of the reports around the impact of large language models on the environment. You know, there was a very recent study from Hugging Face that says every time you generate an image with an LLM, it consumes a full charge of an iPhone, for example, right? And as consumers, we don't realize that. We are generating images, which we may not even look at the next time, but we don't realize that behind the scenes so much energy was used. And these are, we are talking about 176-billion-parameter models, and there's a mad rush everywhere in the world to create these large models, the bigger the better. But bigger also means more energy needed.
Dr. Ong Chen Hui: Yeah.
Sanjay Podder: Every time you do an inference, the whole machine gets fired up. And then the interesting bit is the environmental impact, because you need so much energy. In many parts of the world, you know, there is still fossil fuel used to generate it.
Dr. Ong Chen Hui: Hmm.
Sanjay Podder: You need some of the biggest data centers to be built for our new AI world, right? And then there's the impact on water, for example. Another very interesting study pointed out that every 30 to 50 prompts cost half a liter of water for cooling, for example. So, while there is no doubt that generative AI is a magical technology that is going to change our world, I'm sure, as a government body, as a regulator for media and info, this is something you're probably watching very closely. You know, how do we respond to this challenge that this magical technology brings? What has been your approach to Gen AI? I'm sure at Asia Tech X you're going to find a lot of answers, but yeah.
Dr. Ong Chen Hui: I think this thing about greening of AI is a very important problem. When it comes to greening of AI, I think there are a few different dimensions to it.
One is, can we actually design the AI a little bit differently so that the training of it doesn't take as much energy? Just like you mentioned about inference, right, generating an image. But I was reading some of the statistics about the training of AI models using previous generations of transformer technologies, and already it may be equivalent to the carbon emissions that a few cars make in their entire lifetimes. So I think when it comes to AI, perhaps some of the thinking that we are hearing from both the industry as well as the academia around us is that we may need to look at different phases of AI. So the training itself may be one kind, and it may require a certain technology stack. Today, the inference technology stack is exactly the same as the technology stack for training. And perhaps that may need to specialize, so that you have a far more efficient kind of technology stack that will be used for inference. And if there can be a more customized, more targeted kind of technology stack, perhaps that will lead to some kind of energy savings as well as reduction in emissions. But this, I think, is still very early, because we are talking about specialized AI chips, right? I think some of it may still be very much ideas in research or in early-stage startups. Yeah. And then there is, of course, the other point, which is really about how much generative AI is actually consuming in terms of data. And because of the amount of data that needs to be consumed, the efficiency of data processing also accounts for quite a bit of emissions. So around that, I think there may be a need to look at different kinds of architectures that can make do with less data. And as a result of making do with less data, the footprint may be a lot smaller and the amount of energy usage may be a lot smaller. Yeah, but that's it.
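To put the smartphone-charge comparison quoted earlier in perspective, here is a minimal arithmetic sketch. The 0.0127 kWh battery capacity is an assumption (a roughly 12.7 Wh smartphone battery), not a number from the study, and the one-charge-per-image figure is the rough equivalence quoted, not a measured value.

```python
# Illustrative arithmetic only: scaling the "one image is roughly one phone
# charge" equivalence quoted above. PHONE_CHARGE_KWH is an assumed value for
# a typical smartphone battery, not a figure from the Hugging Face study.

PHONE_CHARGE_KWH = 0.0127          # assumed energy for one full smartphone charge

def image_generation_kwh(n_images: int) -> float:
    """Energy, in kWh, to generate n images if each costs one phone charge."""
    return n_images * PHONE_CHARGE_KWH

# A thousand throwaway images: energy adds up quickly at consumer scale.
print(f"{image_generation_kwh(1000):.1f} kWh for 1,000 images")
```

The point of the sketch is the scaling, not the exact constant: whatever the per-inference cost turns out to be, billions of casual generations multiply it.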
I think these are all, because of how nascent it is, areas where we are looking at how the research community can participate and perhaps develop and mature some of the science around this. So that can actually lead to perhaps more innovation down the road. I'm fully cognizant that, AI being so hot now, a lot of people are also talking about the kind of environmental impact of AI. So I would imagine that perhaps next year, when we meet again at Asia Tech X, we can then compare notes and see where the whole ecosystem is heading.
Sanjay Podder: Absolutely. There's so much action happening on the custom silicon side as well, right? As you rightly pointed out, specific chips for inferencing, for training...
Dr. Ong Chen Hui: Even for data processing.
Sanjay Podder: Yeah, this is going to be... you also spoke about the dark data problem and data itself, right? Because so much of data today is never used. It's dark, but you're still storing it. And I sometimes suspect, with Gen AI, people will store it even more. So, there is just so much, you know, some of the challenges get amplified in the process. It's a year since you joined, I think, the Green Software Foundation. How has the experience been?
Dr. Ong Chen Hui: I think it's been great. Certainly, I think having that wider community that my team can tap on, to bounce ideas and figure out what the new wave of innovation is, has been very, very helpful. And it's something that we really want to be able to continue doing. And we want to be able to bring some of our other partners within government and within academia in as well. Because I think in areas like this, where we all have a shared responsibility to protect the environment, we really should tap on all the best ideas that are actually out there.
The second part, and I really want to congratulate you on this, is your ability to push for the SCI into the standards. Actually, on this, perhaps can I tap your brain a little bit on what sparked the need for SCI, and what do you want for the standards next?
Sanjay Podder: I think when we, in fact, last week was our third year of existence. We announced the Green Software Foundation at Microsoft Build 2021. Accenture, Microsoft, GitHub, ThoughtWorks, and a few of us, Goldman Sachs, we came together and announced the Green Software Foundation. To be very candid, I didn't think that we'd get this kind of response. Today we have more than 60 members, including some of the top companies from around the world. But right at the beginning, we were very clear that this area is so new that there was absolutely no standard. There was no language to express the challenge. There was no training for people, right? And no organization, unless you're, of course, a government, but no organization can say that this is a standard. You know, you have to have a consensus. And to me, the whole idea of the foundation was to build that consensus, to have a platform to deliberate, to share the challenges, to find an answer to the problem. The Software Carbon Intensity, I think, is an amazing way of expressing, in a very simple way, you know, what are the dimensions of a software system that you need to look into when you want to measure its carbon intensity? And this was something that people were asking for, saying, how do I measure? How do I express? What is the language? The whole idea of embodied carbon is very important. People forget about it. People only think about the carbon emission during usage. But, I think, earlier you also pointed out e-procurement, for example.
You know, the whole idea is about looking at it holistically from the factor of embodied carbon, so that you don't lose sight of it, because by the time a laptop comes to you, the bulk of its lifetime emissions has already happened during the manufacturing stage. So, you know, how do you look at it holistically? Embodied carbon. Then things like, it's not a carbon offset discussion or a renewable energy discussion. It's about making software inherently carbon efficient, which means looking into it from the language stack, from the architecture stack, the whole aspect of a software system. So if you look at SCI, it brings in, for example, embodied carbon. It brings in the carbon intensity of electricity, so that your software is more carbon aware, right? So the same software run in Singapore and run somewhere else will have very different emission levels. And then, you know, per unit, what we call the R. So I think it's a very actionable way of expressing it, and what we therefore saw was rapid adoption of SCI. In fact, some startups have started incorporating SCI, saying "we are SCI compliant." Large organizations are embracing SCI, and the best thing that happened was when we were blessed with recognition as an ISO standard. So SCI is now an ISO standard.
Dr. Ong Chen Hui: Yeah.
Sanjay Podder: Looking into the future, I think SCI is just the right analytical approach to think about how do you model emissions from LLMs and generative AI. So that's going to be another area of research and exploration for us: how do we further build upon SCI to give a very actionable way of looking into emissions coming out of AI, both during training and inferencing? So I think it's one journey that we are super proud of, and all the members came together to contribute. And that's what we want to replicate, you know, creating more such actionable deliverables from the Green Software Foundation, for people to make this space a reality, right? So, I agree with you.
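The pieces Sanjay lists, operational energy, grid carbon intensity, embodied carbon, and the functional unit R, combine in the SCI equation as SCI = ((E * I) + M) per R. Here is a minimal sketch; the grid intensities and workload figures below are illustrative assumptions, not measurements.

```python
# A minimal sketch of the Software Carbon Intensity (SCI) equation discussed
# above: SCI = ((E * I) + M) per R, where E is energy consumed (kWh),
# I is grid carbon intensity (gCO2e/kWh), M is embodied emissions (gCO2e),
# and R is the functional unit (e.g. API calls). All numbers are illustrative.

def sci(energy_kwh: float, grid_intensity: float,
        embodied_g: float, functional_units: int) -> float:
    """Carbon intensity in gCO2e per functional unit."""
    operational = energy_kwh * grid_intensity        # E * I
    return (operational + embodied_g) / functional_units

# Same software, two grids: carbon-aware placement changes the score,
# which is the "same software run in Singapore and run somewhere else" point.
calls = 10_000
print(sci(1.2, 470, 800, calls))   # hypothetical higher-carbon grid
print(sci(1.2, 50, 800, calls))    # hypothetical low-carbon grid
```

Because M (embodied carbon) sits inside the numerator, hardware manufacturing emissions stay visible in the score even when the grid is clean, which is exactly the holistic view described above.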
Congratulations to you as well for being a member of the GSF. It's something we all share and feel very proud about.
Dr. Ong Chen Hui: Yeah. Yeah, like I said, I think this is certainly a cause that IMDA is very passionate about. And I guess, from the Green Software Foundation side, it's certainly something that you have been driving for the past three years, right? So beyond the SCI, the Software Carbon Intensity, do you actually see a need to work together with the regulators of the world in order to get this adopted? Or do you feel that perhaps there may be something else that may be the next priority for the Green Software Foundation?
Sanjay Podder: Yeah, I think there are a couple of things here. One is, as I mentioned some time back at the start of the podcast, I think it's a culture change. Green software is a culture change. While the developers may want to do it, if there is no adequate support from the top, it is not possible to make your organization adopt green software in a very systematic fashion. So one of the areas, and the podcast is a part of that, is to spread this awareness among organizations that, you know, they need to have digital sustainability and green software either in their net zero goals or in the larger scheme of things. I do recall, you know, just after you joined the Green Software Foundation, you created your digital sustainability policies and things like that. I think that was great. So that, to me, is important. We obviously have to work with the regulators, by giving them inputs as and when we are called on, as to what we think about, whether it's greening of AI or what is possible. An example would be that I am personally here on, you know, invitation from IMDA to participate in Asia Tech X. It's very important to be part of these conversations, to understand how this space is going to evolve. How can we contribute to solve some of these challenges?
Because we all know that, as an example, generative AI is here to stay; the adoption will only increase. What can we do? You know, we cannot just sit here and give a doomsday scenario. We are here to solve the problem and say, "okay, so what can we do about the emissions? Is there a way to solve it? Is there a way to measure it? Are there best practices we can bring? Can we enable the ecosystem?" So I think, as I always say, we are the solution seekers, right? And to me, from a Green Software Foundation perspective, the members that we have, which are so diverse, not only governments, large organizations, nonprofits, academia, big businesses across industries. That is our strength, because we are getting that unique support to find solutions to these very difficult problems. So, very excited. I think I would love to focus on green AI. That's what we are trying to do. I'd love to focus on how do we give frameworks and enable organizations to embrace green AI and green software and transform themselves. Those are some of the... very recently we did the Impact Framework.
Dr. Ong Chen Hui: Oh, yes.
Sanjay Podder: Again, the idea was very science-based: complete transparency on how you measure emissions. We did a carbon hack around the world. We saw immense participation from, you know, the developer community, coming up with unique solutions. So there is a lot of excitement in this space. And that's what the Green Software Foundation will continue to champion, with all the members' support. And look out for our GSF Summit. It's coming soon. In October, we will be in all major cities around the world, including in Asia. So we look forward to all your support to further champion this cause.
Dr. Ong Chen Hui: You mentioned cultural change, right? And that certainly is also something that's very much top of mind for us. When it comes to cultural change, perhaps let's exchange some notes around this.
Do you find that it's more effective to have targeted conversations, let's say from the Green Software Foundation to the C-suite? Or is it more important to equip internal teams, right, with the measurable changes that are possible when they adopt green software or green AI practices?
Sanjay Podder: Yeah, no, I think it's a good question. And I think the answer is both. So if you look at our own journey in the Green Software Foundation, we started first with, you know, actionable, measurable tools and practices, because we want to empower the developer community. Very important. But then, you know, unless the organization embraces this as a priority, your sustainability priority is always competing with some other priority in the organization, which means that, you know, it will not receive the value or encouragement. So it is very important, given that most big organizations, most organizations today, have some kind of sustainability commitment, net zero or otherwise, how do we create, you know, in a very systematic fashion, that intervention right from the top? Because when there's, you know, support coming from the top leadership, everything
Dr. Ong Chen Hui: Yeah, everything falls into place, right? And all the innovation that you actually need, right? Both from an internal processes point of view, as well as providing visibility to your supply chain partners. All that happens.
Sanjay Podder: All that happens, right? Even beyond, as you have rightly said, beyond the four walls of your organization: supply chain, everything, right? This is important. Climate change is real. This is important. And, as you know, I look at it as a triangle, right? What I have seen is there is the whole thing of climate change, emissions, which appeals to people; people are concerned. And then there is the whole energy crisis. So much energy. And the energy bills. And then finally, once you embrace green software practices, green AI practices, that shows up in your bottom line.
And you'll see, you know, at the heart of it, green software practices and green AI practices are great software engineering practices. Which for some reason we have forgotten, given the era of abundance we have been in, right? With the falling cost of compute and storage, people will store everything they have without even wondering, you know, what do I really need? So these are great practices, and when you bring them all together, you know, we find the answer to some of the pressing problems. So this has been a great conversation, Dr. Ong, you know, something that fits the very first podcast of the series. I hope some of the messages you gave today convey to other CXO leaders how important this topic is and how you can make it a reality, how you can champion it. And every year I'm so delighted to see the Digital Sustainability Forum at Asia Tech take a very prime spot, and that shows a commitment from the top, you know. And that's what we have to do in every organization, because digital technology, which is fast growing, especially software, is becoming one of the major sources of greenhouse gas emissions. We can control it right now rather than allowing it to snowball. So thank you so much for your time. Thank you for inviting me to Singapore. And as the Green Software Foundation, we want to be a place of action, right? In this case, in Singapore: talk to the leaders, understand how we can collectively solve the problem. So, super excited with this conversation. And thanks for joining us on the CXO Bytes podcast. Thank you.
Dr. Ong Chen Hui: Thank you very much for having me. Thank you.
Sanjay Podder: Thank you. Hey, everyone. Thanks for listening. Just a reminder to follow CXO Bytes on Spotify, Apple, YouTube, or wherever you get your podcasts. And please do leave a rating and review if you like what we're doing. It helps other people discover the show. And of course, we want more listeners.
To find out more about the Green Software Foundation, please visit greensoftware.foundation. Thanks again, and see you in the next episode.
Tech leaders, your balancing act between innovation and sustainability just got a guide with the Green Software Foundation's latest podcast series, CXO Bytes. In each episode, Sanjay Podder, Chairperson of the Green Software Foundation and host of CXO Bytes, will be joined by industry leaders to explore strategies to green software and how to effectively reduce software's environmental impacts while fulfilling a drive for innovation and enterprise growth. So join us for an invigorating chat that will keep you both informed and entertained on CXO Bytes. Just search for CXO Bytes wherever you get your podcasts. Find out more about the GSF: The Green Software Foundation Website Sign up to the Green Software Foundation Newsletter If you enjoyed this episode then please either: Follow, rate, and review on Apple Podcasts Follow and rate on Spotify Watch our videos on The Green Software Foundation YouTube Channel! Connect with us on Twitter, Github and LinkedIn!