
Environment Variables podcast roundup: green kernels, and resource transparency in AI and cloud

Podcast roundup: the state of transparency in AI, kernels, and cloud computing

In addition to offering briefings and workshops on specific topics in digital sustainability, part of our work involves partnering with other organisations to share cutting-edge research in the field, exploring what is possible when it comes to building greener software and reducing the environmental impact of digital services. One example is the Environment Variables podcast we co-produce with the Green Software Foundation, where our director of Technology and Policy, Chris Adams, interviews experts to share insights from their work with the wider public. Read on for a roundup of some highlight episodes, and why they’re relevant to responsible technologists.

Lessons learned engaging at the industry level on cloud transparency, with Adrian Cockcroft, former VP of cloud sustainability at AWS

Early in July, we spoke to Adrian Cockcroft, one of the engineers whose name has been synonymous with cloud computing since the mid-2000s, having worked at Sun Microsystems, Netflix, eBay, and Amazon, among others. We have been working with him in the Green Software Foundation’s Real Time Cloud project, an initiative to define a standardised set of data for cloud companies to use when disclosing environmental impact figures, in a form that customers of cloud computing providers can use to optimise the emissions of their services.

Why would I listen to this?

Adrian has a unique perspective, first as the former VP of cloud sustainability at Amazon Web Services, where he had to collect the data needed for disclosure, and now as a free agent trying to request that data from all the largest players. If you have ever had to request information from a supplier, or respond to such a request, it’s an enlightening listen.

In the interview we cover trends from the last two years of the project, how new operating models can design out common forms of waste from operating services, and Adrian’s own experiences corralling swarms of AI agents to build a home automation system from scratch in his house.

Diving into the details of AI energy measurement, with Scott Chamberlin of Neuralwatt

Later in September, we also spoke to Scott Chamberlin, formerly of Microsoft and now leading his own startup, Neuralwatt, which focuses on optimising the energy usage of AI GPUs without reducing their performance.

Why is this worth a listen?

Scott led the work on the power and carbon tracking tooling in the Microsoft Windows operating system, getting deep into the weeds of measuring how devices consume electricity, and he was also key in helping Microsoft Azure work out its own internal carbon accounting standards. So he’s one of the best-informed people in the world right now about what getting a large organisation to move in this direction looks like.

In the interview, we also cover why getting direct energy figures might be hard, how Microsoft used an internal carbon tax to build a “carbon war chest” to fund digital sustainability initiatives, and some provocative takes on whether it’s realistic to expect developers to act on sustainability at all.

Creating the building blocks for sustainability certification of software in Germany with Didi Hoffmann

Finally, in October, we caught up with Didi Hoffmann of Green Coding Solutions, the company behind the open source Green Metrics Tool, to talk about software certification, organising conferences in Germany about greener digital services, and, er… animal husbandry and preserving rare breeds of livestock on farms.

Why is this worth a listen?

Didi Hoffmann is one of the maintainers of the Green Metrics Tool, the software used by the German national “Blue Angel” ecolabel scheme for its software certification programme. In the episode we cover how certification works for software and why it’s different to certifying other products, as well as what’s involved.

We also talk about his most recent work to extend the Linux kernel so it can report energy usage the way it reports usage of other resources in the system. Because most of the servers in the world now run Linux, building this tooling into the core of the system would make understanding and reducing the energy consumption of digital services possible in a way it currently is not, and it’s long overdue.
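
To give a sense of the gap: on many Linux servers today, the closest thing to a built-in energy counter is the whole-package figure exposed by the RAPL powercap interface, with no per-process breakdown of the kind CPU time or memory usage get. The sketch below is just an illustration of that status quo, not the Green Metrics Tool or the kernel work discussed in the episode; the sysfs path shown is one common layout, and both the domain names and the permissions needed to read them vary by hardware and kernel version.

```python
import time
from pathlib import Path

# One common location for the first RAPL "package" energy counter exposed by
# the Linux powercap interface. The exact domain layout varies by CPU vendor
# and kernel version, and reading it usually requires root privileges.
RAPL_ENERGY = Path("/sys/class/powercap/intel-rapl:0/energy_uj")

def read_energy_uj() -> int:
    """Read the cumulative package energy counter, in microjoules."""
    return int(RAPL_ENERGY.read_text())

# Sample the counter over one second to estimate whole-package power draw.
# Note this is machine-wide: there is no per-process or per-service figure,
# unlike CPU or memory accounting, which is the kind of gap that per-resource
# energy reporting in the kernel would address.
before = read_energy_uj()
time.sleep(1)
after = read_energy_uj()

print(f"Approximate package power: {(after - before) / 1_000_000:.2f} W")
```

Even this crude reading only answers “how much energy did the whole machine’s CPU package use?”, which is why attributing energy to individual workloads, the way the kernel already attributes CPU time, is such an appealing next step.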