Driving sustainable water management

From semiconductor manufacturing to mining, water is an essential commodity for industry. It is also a precious and constrained resource. According to the UN, more than 2.3 billion people faced water stress in 2022. Drought has cost the United States $249 billion in economic losses since 1980. 

Climate change is expected to worsen water problems through drought, flooding, and water contamination caused by extreme weather events. “I can’t think of a country on the planet that doesn’t have a water scarcity issue,” says Rob Simm, senior vice president at Stantec, an engineering consultancy focused on sustainability, energy solutions, and renewable resources. 

Economic innovations, notably AI and electric vehicles, are also increasing industrial demand for water. “When you look at advanced manufacturing and the way technology is changing, we’re requiring more, higher volumes of ultrapure water [UPW]. This is a big driver of the industrial water market,” Simm says. AI, computing, and the electric vehicle industries all generate immense quantities of heat and require sophisticated cooling and cleaning. Manufacturing silicon wafers for semiconductor production involves intricate cleaning processes, requiring up to 5 million gallons of high-quality UPW daily. With rising demand for semiconductors, improvements in water treatment and reuse are imperative to prevent waste.   

Data-driven industrial water management technologies are revolutionizing how enterprises approach conservation and sustainability. These systems layer sensors, data, and cloud-based platforms onto physical water infrastructure, optimizing how it runs and allowing industrial and human users to share access. Integrating AI, machine learning (ML), data analytics, internet of things (IoT) sensors, digital twins, and even social media enables not only rapid data analysis but also lets manufacturers measure water quality in fine detail, forecast demand, and track progress against sustainability goals.
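
To make this concrete, the sketch below shows in simplified form how hourly sensor readings might feed a basic demand forecast. It is a minimal illustration only: the file name, column schema, and choice of model are assumptions, not a description of any vendor’s system.

```python
# Minimal sketch: forecasting daily industrial water demand from hourly
# meter readings. The CSV layout and feature choices are hypothetical.
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hourly meter readings: timestamp, site_id, flow_m3 (assumed schema)
readings = pd.read_csv("flow_sensors.csv", parse_dates=["timestamp"])

# Roll hourly readings up to daily totals per site
daily = (
    readings.set_index("timestamp")
    .groupby("site_id")["flow_m3"]
    .resample("D")
    .sum()
    .reset_index()
)

# Simple lag features: demand one day and one week earlier
daily["lag_1"] = daily.groupby("site_id")["flow_m3"].shift(1)
daily["lag_7"] = daily.groupby("site_id")["flow_m3"].shift(7)
daily = daily.dropna()

X, y = daily[["lag_1", "lag_7"]], daily["flow_m3"]
model = LinearRegression().fit(X, y)

# Example: predict demand for the most recent day at each site from its lags
latest = daily.groupby("site_id").tail(1)
print(latest.assign(predicted_m3=model.predict(latest[["lag_1", "lag_7"]])))
```

In practice, a production system would use far richer features (weather, production schedules, maintenance windows) and a more capable model, but the pipeline shape is the same: sensor stream, aggregation, forecast.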

More integrated industrial water management solutions, including reuse, industrial symbiosis, and zero liquid discharge (ZLD), will all be crucial as greenfield industrial projects look toward water reuse. “Water is an input commodity for the industrial process, and wastewater gives you the opportunity to recycle that material back into the process,” says Simm. 

Treating a precious resource

Water filtration systems have evolved during the past century, especially in agriculture and industry. Processes such as low-pressure membrane filtration and reverse osmosis are boosting water access for both human and industrial users. Membrane technologies, which continue to evolve, have halved the cost of desalinated water during the past decade, for example. New desalination methods run on green power and are dramatically increasing water output rates.

Advances in AI, data processing, and cloud computing could open a new chapter in water access. The automation they permit allows for quicker and more precise decision-making, and automated, preset parameters let facilities operate at capacity with less risk. “Digital technology and data play a crucial role in developing technology for water innovations, enabling better management of resources, optimizing treatment processes, and improving efficiency in distribution,” says Vincent Puisor, global business development director at Schneider Electric.
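
As a simplified illustration of what “automated, preset parameters” can look like in practice, the sketch below encodes hypothetical water-quality limits and the actions they would trigger. The thresholds, sensor names, and responses are illustrative assumptions, not taken from any specific facility or product.

```python
# Minimal sketch of rule-based automation: preset water-quality parameters
# trigger actions without waiting on a human operator. All thresholds,
# sensor names, and actions are illustrative, not vendor specifications.
from dataclasses import dataclass

@dataclass
class Limits:
    max_turbidity_ntu: float = 1.0        # preset operating parameter
    max_conductivity_us_cm: float = 50.0  # preset operating parameter

def evaluate(reading: dict, limits: Limits) -> list:
    """Return the automated actions a sensor reading would trigger."""
    actions = []
    if reading["turbidity_ntu"] > limits.max_turbidity_ntu:
        actions.append("divert flow to re-treatment")
    if reading["conductivity_us_cm"] > limits.max_conductivity_us_cm:
        actions.append("add reverse-osmosis pass")
    return actions or ["continue at full capacity"]

print(evaluate({"turbidity_ntu": 1.4, "conductivity_us_cm": 32.0}, Limits()))
# ['divert flow to re-treatment']
```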

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Scaling green hydrogen technology for the future

Unlike conventional energy sources, green hydrogen offers a way to store and transfer energy without emitting harmful pollutants, positioning it as essential to a sustainable and net-zero future. By converting electrical power from renewable sources into green hydrogen, these low-carbon-intensity energy storage systems can release clean, efficient power on demand through combustion engines or fuel cells. When produced emission-free, hydrogen can decarbonize some of the most challenging industrial sectors, such as steel and cement production, industrial processes, and maritime transport.

“Green hydrogen is the key driver to advance decarbonization,” says Dr. Christoph Noeres, head of green hydrogen at global electrolysis specialist thyssenkrupp nucera. This promising low-carbon-intensity technology has the potential to transform entire industries by providing a clean, renewable fuel source, moving us toward a greener world aligned with industry climate goals.

Accelerating production of green hydrogen

Hydrogen is the most abundant element in the universe, and its availability is key to its appeal as a clean energy source. However, hydrogen does not occur naturally in its pure form; it is always bound to other elements in compounds like water (H2O). Pure hydrogen can be extracted and isolated from water through an energy-intensive process called electrolysis.

Hydrogen is typically produced today via steam-methane reforming, in which high-temperature steam is used to produce hydrogen from natural gas. Emissions from this process weigh heavily on hydrogen’s overall carbon footprint: worldwide hydrogen production is currently responsible for as much CO2 as the United Kingdom and Indonesia combined.

A solution lies in green hydrogen—hydrogen produced by electrolysis powered by renewable sources. This unlocks the benefits of hydrogen without the dirty fuels. Unfortunately, very little hydrogen production is currently powered by renewables: less than 1% came from non-fossil-fuel sources in 2022.

A massive scale-up is underway. According to McKinsey, an estimated 130 to 345 gigawatts (GW) of electrolyzer capacity will be necessary to meet green hydrogen demand by 2030, with 246 GW of this capacity already announced. This stands in stark contrast to the current installed base of just 1.1 GW. Looking further out, for green hydrogen to constitute at least 14% of total energy consumption by 2050—a target the International Renewable Energy Agency (IRENA) estimates is required to meet climate goals—cumulative installed electrolyzer capacity will have to reach 5,500 GW.
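
A quick back-of-envelope calculation, sketched below, shows what these figures imply. It assumes the 1.1 GW installed base as a 2024 starting point (an assumption, since the article does not date that figure) and is meant only to illustrate the pace of growth required.

```python
# Rough check on the scale-up implied by the figures above. Assumes the
# 1.1 GW installed base is the 2024 starting point; the compound-growth
# arithmetic is illustrative only.
installed_gw = 1.1
target_2030_low, target_2030_high = 130, 345
target_2050 = 5_500

years_to_2030 = 2030 - 2024
for target in (target_2030_low, target_2030_high):
    cagr = (target / installed_gw) ** (1 / years_to_2030) - 1
    print(f"{target} GW by 2030 -> ~{target / installed_gw:.0f}x today's base, "
          f"~{cagr:.0%} average annual growth")

print(f"{target_2050} GW by 2050 -> ~{target_2050 / installed_gw:.0f}x today's base")
```

Even the low end of the 2030 range implies the installed base growing by more than 100x in roughly six years, which is why cost, manufacturing capacity, and policy support dominate the discussion that follows.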

However, scaling up green hydrogen production to these levels requires overcoming cost and infrastructure constraints. Becoming cost-competitive means improving and standardizing the technology, harnessing the scale efficiencies of larger projects, and encouraging government action to create market incentives. Moreover, the expansion of renewable energy in regions with significant solar, hydro, or wind energy potential is another crucial factor in lowering renewable power prices and, consequently, the costs of green hydrogen.

Electrolysis innovation

While electrolysis technologies have existed for decades, scaling them up to meet the demand for clean energy will be essential. Alkaline water electrolysis (AWE), the dominant and most developed electrolysis method, is poised for this transition. It has been used for decades, demonstrating efficiency and reliability in the chemical industry. It is also more cost-effective than other electrolysis technologies and is well suited to running directly on fluctuating renewable power. For large-scale applications in particular, AWE offers significant advantages in investment and operating costs. “Transferring small-scale manufacturing and optimizing it towards mass manufacturing will need a high level of investment across the industry,” says Noeres.

Industries that already practice electrolysis, as well as those that already use hydrogen, such as fertilizer production, are well positioned for conversion to green hydrogen. For example, thyssenkrupp nucera benefits from a decades-long heritage of electrolyzer technology in the chlor-alkali process, which produces chlorine and caustic soda for the chemical industry. The company “is able to use its existing supply chain to ramp up production quickly, a distinction that not all providers share,” says Noeres.

Alongside scaling up existing solutions, thyssenkrupp nucera is developing complementary techniques and technologies. Among these are solid oxide electrolysis cells (SOEC), which perform electrolysis at very high temperatures. While the need for high temperatures means this technique isn’t right for all customers, in industries where waste heat is readily available—such as chemicals—Noeres says SOEC offers up to 20% enhanced efficiency and reduces production costs.

Thyssenkrupp nucera has entered into a strategic partnership with the renowned German research institute Fraunhofer IKTS to move the technology toward applications in industrial manufacturing. The company envisages SOEC as a complement to AWE in the areas where it is cost effective to reduce overall energy consumption. “The combination of AWE and SOEC in thyssenkrupp nucera’s portfolio offers a unique product suite to the industry,” says Noeres.

While advancements in electrolysis technology and the diversification of its applications across various scales and industries are promising for green hydrogen production, a coordinated global ramp-up of renewable energy sources and clean power grids is also crucial. Although AWE electrolyzers are ready for deployment in large-scale, centralized green hydrogen production facilities, these must be integrated with renewable energy sources to truly harness their potential.

Making the green hydrogen market

Storage and transportation remain obstacles to a larger market for green hydrogen. While hydrogen can be compressed and stored, its low density presents a practical challenge: for the same energy content, hydrogen takes up nearly four times the volume of natural gas, and storing it requires either ultra-high compression or costly refrigeration. Overcoming the economic and technical hurdles of high-volume hydrogen storage and transport will be critical to its potential as an exportable energy carrier.
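
The rough calculation below shows where the “nearly four times” comparison comes from. It uses approximate textbook lower heating values and gas densities at atmospheric pressure; the constants are standard approximations, not project data.

```python
# Rough check of the volume comparison above, using approximate lower
# heating values (LHV) and gas densities near atmospheric conditions.
H2_LHV_MJ_PER_KG, H2_DENSITY_KG_M3 = 120.0, 0.09
CH4_LHV_MJ_PER_KG, CH4_DENSITY_KG_M3 = 50.0, 0.72

h2_energy_per_m3 = H2_LHV_MJ_PER_KG * H2_DENSITY_KG_M3     # ~10.8 MJ/m^3
ch4_energy_per_m3 = CH4_LHV_MJ_PER_KG * CH4_DENSITY_KG_M3  # ~36 MJ/m^3

volume_ratio = ch4_energy_per_m3 / h2_energy_per_m3
print(f"Hydrogen needs ~{volume_ratio:.1f}x the volume of natural gas "
      f"to deliver the same energy")  # ~3.3x, i.e. 'nearly four times'
```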

In 2024, several high-profile green hydrogen projects launched in the U.S., advancing the growth of green hydrogen infrastructure and technology. The landmark Inflation Reduction Act (IRA) provides tax credits and government incentives for producing clean hydrogen and the renewable electricity used in its production. In October 2023, the Biden administration announced $7 billion for the country’s first clean hydrogen hubs, and the U.S. Department of Energy further allocated $750 million for 52 projects across 24 states to dramatically reduce the cost of clean hydrogen and establish American leadership in the industry. The potential economic impact from the IRA legislation is substantial: thyssenkrupp nucera expects the IRA to double or triple the U.S. green hydrogen market size.

“The IRA was a wake-up call for Europe, setting a benchmark for all the other countries on how to support the green hydrogen industry in this startup phase,” says Noeres. Germany’s H2Global scheme was one of the first European efforts to facilitate hydrogen imports with the help of subsidies, and it has since been followed up by the European Hydrogen Bank, which provided €720 million for green hydrogen projects in its pilot auction. “However, more investment is needed to push the green hydrogen industry forward,” says Noeres.

China, which has installed more renewable power than any other country, is a major force in the current green hydrogen market. With lower capital expenditure costs, it produces 40% of the world’s electrolyzers. Additionally, state-owned firms have pledged to build an extensive 6,000-kilometer network of pipelines for green hydrogen transportation by 2050.

Coordinated investment and supportive policies are crucial to move green hydrogen from a niche technology to a globally scalable solution. The Chinese green hydrogen market, along with those of other regions such as the Middle East and North Africa, has advanced significantly, garnering global attention for the competitive edge of its large-scale projects. To compete effectively, the EU must create a level playing field for European technologies through attractive investment incentives. Supportive policies must also ensure that green products made with hydrogen, such as steel, are sufficiently incentivized and protected against carbon leakage.

A comprehensive strategy, combining investment incentives, open markets, and protection against market distortions and carbon leakage, is crucial for the EU and other countries to remain competitive in the rapidly evolving global green hydrogen market and achieve a decarbonized energy future. “To advance several gigawatt scale or multi-hundred megawatts projects forward,” says Noeres, “we need significantly more volume globally and comparable funding opportunities to make a real impact on global supply chains.”

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

The data practitioner for the AI era

The rise of generative AI, coupled with the rapid adoption and democratization of AI across industries this decade, has emphasized the singular importance of data. Managing data effectively has become critical to this era of business—making data practitioners, including data engineers, analytics engineers, and ML engineers, key figures in the data and AI revolution.

Organizations that fail to use their own data will fall behind competitors that do and miss out on opportunities to uncover new value for themselves and their customers. As the quantity and complexity of data grows, so do its challenges, forcing organizations to adopt new data tools and infrastructure, which, in turn, change the roles and mandate of the technology workforce.

Data practitioners are among those whose roles are experiencing the most significant change, as organizations expand their responsibilities. Rather than working in a siloed data team, data engineers are now developing platforms and tools whose design improves data visibility and transparency for employees across the organization, including analytics engineers, data scientists, data analysts, machine learning engineers, and business stakeholders.

This report explores, through a series of interviews with expert data practitioners, key shifts in data engineering, the evolving skill set required of data practitioners, options for data infrastructure and tooling to support AI, and data challenges and opportunities emerging in parallel with generative AI. The report’s key findings include the following:

  • The foundational importance of data is creating new demands on data practitioners. As the rise of AI demonstrates the business importance of data more clearly than ever, data practitioners are encountering new data challenges, increasing data complexity, evolving team structures, and emerging tools and technologies—as well as establishing newfound organizational importance.
  • Data practitioners are getting closer to the business, and the business closer to the data. The pressure to create value from data has led executives to invest more substantially in data-related functions. Data practitioners are being asked to expand their knowledge of the business, engage more deeply with business units, and support the use of data in the organization, while functional teams are finding they require their own internal data expertise to leverage their data.
  • The data and AI strategy has become a key part of the business strategy. Business leaders need to invest in their data and AI strategy—including making important decisions about the data team’s organizational structure, data platform and architecture, and data governance—because every business’s key differentiator will increasingly be its data.
  • Data practitioners will shape how generative AI is deployed in the enterprise. The key considerations for generative AI deployment—producing high-quality results, preventing bias and hallucinations, establishing governance, designing data workflows, ensuring regulatory compliance—are the province of data practitioners, giving them outsize influence on how this powerful technology will be put to work.

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Industry- and AI-focused cloud transformation

For years, cloud technology has demonstrated its ability to cut costs, improve efficiencies, and boost productivity. But today’s organizations are looking to cloud for more than simply operational gains. Faced with an ever-evolving regulatory landscape, a complex business environment, and rapid technological change, organizations are increasingly recognizing cloud’s potential to catalyze business transformation.

Cloud can transform business by making it ready for AI and other emerging technologies. The global consultancy McKinsey projects that a staggering $3 trillion in value could be created by cloud transformations by 2030. Key value drivers range from innovation-driven growth to accelerated product development.

“As applications move to the cloud, more and more opportunities are getting unlocked,” says Vinod Mamtani, vice president and general manager of generative AI services for Oracle Cloud Infrastructure. “For example, the application of AI and generative AI are transforming businesses in deep ways.”

No longer simply a software and infrastructure upgrade, cloud is now a powerful technology capable of accelerating innovation, improving agility, and supporting emerging tools. In order to capitalize on cloud’s competitive advantages, however, businesses must ask for more from their cloud transformations.

Every business operates in its own context, and so a strong cloud solution should have built-in support for industry-specific best practices. And because emerging technology increasingly drives all businesses, an effective cloud platform must be ready for AI and the immense impacts it will have on the way organizations operate and employees work.

An industry-specific approach

The imperative for cloud transformation is evident: In today’s fast-paced business environment, cloud can help organizations enhance innovation, scalability, agility, and speed while simultaneously alleviating the burden on time-strapped IT teams. Yet most organizations have not fully made the leap to cloud. McKinsey, for example, reports a broad mismatch between leading companies’ cloud aspirations and realities—though nearly all organizations say they aspire to run the majority of their applications in the cloud within the decade, the average organization has so far relocated only 15–20% of them.

Cloud solutions that take an industry-specific approach can help companies meet their business needs more easily, making cloud adoption faster, smoother, and more immediately useful. “Cloud requirements can vary significantly across vertical industries due to differences in compliance requirements, data sensitivity, scalability, and specific business objectives,” says Deviprasad Rambhatla, senior vice president and sector head of retail services and transportation at Wipro.

Health-care organizations, for instance, need to manage sensitive patient data while complying with strict regulations such as HIPAA. As a result, cloud solutions for that industry must ensure features such as high availability, disaster recovery capabilities, and continuous access to critical patient information.

Retailers, on the other hand, are more likely to experience seasonal business fluctuations, requiring cloud solutions that allow for greater flexibility. “Cloud solutions allow retailers to scale infrastructure on an up-and-down basis,” says Rambhatla. “Moreover, they’re able to do it on demand, ensuring optimal performance and cost efficiency.”

Cloud-based applications can also be tailored to meet the precise requirements of a particular industry. For retailers, these might include analytics tools that ingest vast volumes of data and generate insights that help the business better understand consumer behavior and anticipate market trends.
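
The sketch below gives a simplified sense of what such a retail analytics tool does—aggregating raw transactions to surface seasonal demand patterns. The file name, column names, and seasonality measure are hypothetical assumptions used only for illustration.

```python
# Illustrative sketch: aggregate raw transactions to surface seasonal
# demand by product category. Schema and data source are hypothetical.
import pandas as pd

# Assumed columns: order_ts (timestamp), category, revenue
sales = pd.read_parquet("transactions.parquet")

monthly = (
    sales.assign(month=sales["order_ts"].dt.to_period("M"))
    .groupby(["category", "month"])["revenue"]
    .sum()
    .reset_index()
)

# Flag categories whose peak month runs well above their average month
stats = monthly.groupby("category")["revenue"].agg(["mean", "max"])
stats["seasonality_ratio"] = stats["max"] / stats["mean"]
print(stats.sort_values("seasonality_ratio", ascending=False).head())
```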

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

AI-readiness for C-suite leaders

Generative AI, like predictive AI before it, has rightly seized the attention of business executives. The technology has the potential to add trillions of dollars to annual global economic activity, and its adoption for business applications is expected to improve the top or bottom lines—or both—at many organizations.

Generative AI offers an impressive and powerful new set of capabilities, but its business value is not a given. Some powerful foundation models are open to public use, yet these alone do not differentiate organizations looking to get ahead of the competition and unlock AI’s full potential. To gain those advantages, organizations must enhance AI models with their own data to create unique business insights and opportunities.

Preparing an organization’s data for AI, however, brings its own set of challenges and opportunities. This MIT Technology Review Insights survey report investigates whether companies’ data foundations are ready to garner benefits from generative AI, as well as the challenges of building the necessary data infrastructure for this technology. In doing so, it draws on insights from a survey of 300 C-suite executives and senior technology leaders, as well as on in-depth interviews with four leading experts.

Its key findings include the following:

Data integration is the leading priority for AI readiness. In our survey, 82% of C-suite and other senior executives agree that “scaling AI or generative AI use cases to create business value is a top priority for our organization.” The number-one challenge in achieving that AI readiness, survey respondents say, is data integration and pipelines (45%). Asked about challenging aspects of data integration, respondents named four: managing data volume, moving data from on-premises to the cloud, enabling real-time access, and managing changes to data.

Executives are laser-focused on data management challenges—and lasting solutions. Among survey respondents, 83% say that their “organization has identified numerous sources of data that we must bring together in order to enable our AI initiatives.” Though data-dependent technologies of recent decades drove data integration and aggregation programs, these were typically tailored to specific use cases. Now, however, companies are looking for something more scalable and use-case agnostic: 82% of respondents are prioritizing solutions “that will continue to work in the future, regardless of other changes to our data strategy and partners.”

Data governance and security is a top concern for regulated sectors. Data governance and security concerns are the second most common data readiness challenge (cited by 44% of respondents). Respondents from highly regulated sectors were two to three times more likely to cite data governance and security as a concern, and chief data officers (CDOs) say this is a challenge at twice the rate of their C-suite peers. And our experts agree: Data governance and security should be addressed from the beginning of any AI strategy to ensure data is used and accessed properly.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Download the full report.

Unlocking the trillion-dollar potential of generative AI

Generative AI is poised to unlock trillions in annual economic value across industries. This rapidly evolving field is changing the way we approach everything from content creation to software development, promising never-before-seen efficiency and productivity gains.

In this session, experts from Amazon Web Services (AWS) and QuantumBlack, AI by McKinsey, discuss the drivers fueling the massive potential impact of generative AI. Plus, they look at key industries set to capture the largest share of this value and practical strategies for effectively upskilling their workforces to take advantage of these productivity gains. 

Watch this session to:

  • Explore generative AI’s economic impact
  • Understand workforce upskilling needs
  • Integrate generative AI responsibly
  • Establish an AI-ready business model

Learn how to seamlessly integrate generative AI into your organization’s workflows while fostering a skilled and adaptable workforce. Register now to learn how to unlock the trillion-dollar potential of generative AI.

Register here for free.

Optimizing the supply chain with a data lakehouse

When a commercial ship travels from the port of Ras Tanura in Saudi Arabia to Tokyo Bay, it’s not only carrying cargo; it’s also transporting millions of data points across a wide array of partners and complex technology systems.

Consider, for example, Maersk. The global shipping container and logistics company has more than 100,000 employees and offices in 120 countries, and it operates about 800 container ships, each able to hold 18,000 tractor-trailer containers. From manufacture to delivery, the items within these containers carry hundreds or thousands of data points, highlighting the amount of supply chain data organizations manage on a daily basis.

Until recently, access to the bulk of an organization’s supply chain data has been limited to specialists, distributed across myriad data systems. Constrained by the limitations of traditional data warehouses, organizations need considerable engineering effort, heavy oversight, and substantial financial commitment to maintain that data. Today, a huge amount of data—generated by an increasingly digital supply chain—languishes in data lakes without ever being made available to the business.

A 2023 Boston Consulting Group survey notes that 56% of managers say that although investment in modernizing data architectures continues, managing data operating costs remains a major pain point. The consultancy also expects data-deluge issues to worsen as the volume of data generated grows at a rate of 21% from 2021 to 2024, reaching 149 zettabytes globally.

“Data is everywhere,” says Mark Sear, director of AI, data, and integration at Maersk. “Just consider the life of a product and what goes into transporting a computer mouse from China to the United Kingdom. You have to work out how you get it from the factory to the port, the port to the next port, the port to the warehouse, and the warehouse to the consumer. There are vast amounts of data points throughout that journey.”

Sear says organizations that manage to integrate these rich sets of data are poised to reap valuable business benefits. “Every single data point is an opportunity for improvement—to improve profitability, knowledge, our ability to price correctly, our ability to staff correctly, and to satisfy the customer,” he says.

Organizations like Maersk are increasingly turning to a data lakehouse architecture. By combining the cost-effective scale of a data lake with the capability and performance of a data warehouse, a lakehouse promises to help companies unify disparate supply chain data—structured, semi-structured, and unstructured—and open access to a larger group of users. Building analytics on top of the lakehouse not only advances supply chain efficiency through better performance and governance, but also supports easy, immediate data analysis and helps reduce operational costs.
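
For a concrete sense of the pattern, the sketch below uses Spark with Delta Lake—one common way to implement a lakehouse, and not necessarily the stack Maersk uses—to land semi-structured tracking events in low-cost storage and expose them as a governed, SQL-queryable table. Paths, column names, and table names are hypothetical, and the Delta Lake package is assumed to be available on the cluster.

```python
# Minimal lakehouse sketch: raw JSON events in the lake become a governed,
# queryable Delta table. Paths, schemas, and table names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("supply-chain-lakehouse")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Raw, semi-structured container-tracking events landed in the lake as JSON
raw_events = spark.read.json("s3://lake/raw/container_events/")

# Cleaned, warehouse-style table on top of low-cost lake storage
spark.sql("CREATE DATABASE IF NOT EXISTS logistics")
(raw_events
    .withColumn("event_ts", F.to_timestamp("event_time"))
    .select("container_id", "port_code", "status", "event_ts")
    .write.format("delta")
    .mode("append")
    .saveAsTable("logistics.container_events"))

# Analysts can now query the same data directly with SQL
spark.sql("""
    SELECT port_code, COUNT(*) AS events_last_day
    FROM logistics.container_events
    WHERE event_ts > current_timestamp() - INTERVAL 1 DAY
    GROUP BY port_code
    ORDER BY events_last_day DESC
""").show()
```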

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Multimodal: AI’s new frontier

Multimodality is a relatively new term for something extremely old: how people have learned about the world since humanity appeared. Individuals receive information from myriad sources via their senses, including sight, sound, and touch. Human brains combine these different modes of data into a highly nuanced, holistic picture of reality.

“Communication between humans is multimodal,” says Jina AI CEO Han Xiao. “They use text, voice, emotions, expressions, and sometimes photos.” That’s just a few obvious means of sharing information. Given this, he adds, “it is very safe to assume that future communication between human and machine will also be multimodal.”

A technology that sees the world from different angles

We are not there yet. The furthest advances in this direction have occurred in the fledgling field of multimodal AI. The problem is not a lack of vision. While a technology able to translate between modalities would clearly be valuable, Mirella Lapata, a professor at the University of Edinburgh and director of its Laboratory for Integrated Artificial Intelligence, says “it’s a lot more complicated” to execute than unimodal AI.

In practice, generative AI tools use different strategies for different types of data when building large models—the complex neural networks that organize vast amounts of information. For example, those that draw on textual sources segment the text into individual tokens, usually words. Each token is assigned an “embedding” or “vector”: an array of numbers representing how and where the token is used compared with others. Collectively, these numbers create a mathematical representation of the token’s meaning. An image model, on the other hand, might use pixels as its tokens for embedding, while an audio model might use sound frequencies.
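
The toy example below illustrates the embedding idea for text: each token maps to a vector of numbers, and similarity between vectors stands in for relatedness between tokens. The vocabulary and the vectors are random stand-ins, not a real trained model.

```python
# Toy illustration of embeddings: each token maps to a vector. Real models
# learn these vectors during training; here they are random stand-ins.
import numpy as np

vocab = {"the": 0, "old": 1, "oak": 2, "tree": 3}
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(len(vocab), 8))  # 8-dimensional embeddings

tokens = "the old oak tree".split()
vectors = np.stack([embedding_table[vocab[t]] for t in tokens])
print(vectors.shape)  # (4, 8): one 8-dimensional vector per token

# Cosine similarity between vectors stands in for relatedness of tokens
cos = lambda a, b: a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
print(round(cos(vectors[2], vectors[3]), 3))  # 'oak' vs 'tree'
```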

A multimodal AI model typically relies on several unimodal ones. As Henry Ajder, founder of AI consultancy Latent Space, puts it, this involves “almost stringing together” the various contributing models. Doing so involves various techniques to align the elements of each unimodal model, in a process called fusion. For example, the word “tree”, an image of an oak tree, and audio in the form of rustling leaves might be fused in this way. This allows the model to create a multifaceted description of reality.
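
The sketch below shows one simple form of this: unimodal embeddings for the word “tree,” an oak-tree image, and a rustling-leaves audio clip are projected into a shared space and combined. The encoders and projection matrices here are random stand-ins; real systems learn them, and production fusion is usually more sophisticated than the simple average used in this illustration.

```python
# Minimal sketch of late fusion: unimodal embeddings are projected into a
# shared space and combined. All vectors and projections are random
# placeholders standing in for learned encoders.
import numpy as np

rng = np.random.default_rng(1)
text_emb  = rng.normal(size=300)   # e.g. output of a text encoder
image_emb = rng.normal(size=512)   # e.g. output of an image encoder
audio_emb = rng.normal(size=128)   # e.g. output of an audio encoder

SHARED_DIM = 256
# One projection per modality aligns the different spaces (learned in practice)
proj = {
    name: rng.normal(size=(emb.shape[0], SHARED_DIM)) * 0.01
    for name, emb in [("text", text_emb), ("image", image_emb), ("audio", audio_emb)]
}

aligned = [
    text_emb @ proj["text"],
    image_emb @ proj["image"],
    audio_emb @ proj["audio"],
]

# Fusion step: a simple average of the aligned vectors; real systems may
# concatenate, attend across modalities, or train the fusion end to end
fused = np.mean(aligned, axis=0)
print(fused.shape)  # (256,): one joint representation of all three inputs
```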

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.
