2025-05-03 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 11/24 Report --
Beijing time, February 13 -- With the popularity of ChatGPT, Microsoft, Google and Baidu have all announced major overhauls of their search engines, attempting to integrate large artificial-intelligence models into search to give users a richer and more accurate experience. But behind the excitement over the new tools, there may be a "dirty secret."
Applying AI to search will drive up carbon emissions. Foreign media have pointed out that the race to build high-performance artificial-intelligence search engines is likely to require a dramatic increase in computing power, and with it a significant increase in the energy technology companies consume and the carbon they emit.
Alan Woodward, a professor of cyber security at the University of Surrey in the UK, said: "Huge resources are already used to index and search Internet content, but integrating artificial intelligence requires firepower of a different kind. It requires processing power, storage and efficient search. Every time we see a step change in online processing, we see a significant increase in the power and cooling resources required by large processing centers. I think this integration of artificial intelligence could be such a step."
The carbon surge comes from training large language models (LLMs), which means parsing and computing connections within vast amounts of data. This is why such models tend to be developed by companies with enormous resources, such as OpenAI's ChatGPT, which powers Microsoft's Bing search, and the language model behind Bard, Google's chatbot.
"training these models requires a lot of computing power," says Carlos G ó mez- Rodriguez, a computer scientist at the University of University of Coru ñ an in Spain. "now, only large technology companies can train them."
Microsoft has already integrated ChatGPT into Bing. Neither OpenAI nor Google has disclosed the computing costs of its products, but a third-party analysis by researchers estimates that training GPT-3, the model ChatGPT is partly built on, consumed 1,287 megawatt-hours of electricity and produced more than 550 tonnes of carbon-dioxide equivalent, about as much as a single person taking 550 round trips between New York and San Francisco.
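The two figures above are easy to sanity-check against each other. A minimal sketch, assuming a grid carbon intensity of about 0.43 kg CO2e per kWh (our assumption for illustration; the article does not say which intensity the analysis used):

```python
# Back-of-envelope check of the GPT-3 training estimate quoted above.
training_energy_mwh = 1_287          # reported training energy for GPT-3
carbon_intensity_kg_per_kwh = 0.43   # assumed average grid intensity

energy_kwh = training_energy_mwh * 1_000
emissions_tonnes = energy_kwh * carbon_intensity_kg_per_kwh / 1_000

print(f"{emissions_tonnes:.0f} tonnes CO2e")  # ≈ 553 tonnes
```

At that assumed intensity the arithmetic lands at roughly 553 tonnes, consistent with the "more than 550 tonnes" estimate.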
"this number doesn't look so bad, but you have to consider the fact that you not only have to train it, but also execute it to serve millions of users." Rodriguz said.
Moreover, using ChatGPT as a stand-alone product is very different from integrating it into Bing. UBS, an investment bank, estimates that ChatGPT has an average of 13 million unique visitors a day. By contrast, Bing handles 500 million searches a day.
Martin Bouchard, co-founder of Canadian data center company QScale, believes that based on his knowledge of Microsoft and Google's search plans, adding generative artificial intelligence to the search process requires "at least four to five times the amount of computation per search".
To meet the needs of search-engine users, companies will have to change. "If they are going to retrain the model frequently, add more parameters and so on, it's a completely different scale of things," says Bouchard. "It will require a significant investment in hardware. Our existing data centers and infrastructure will not be able to cope with generative artificial intelligence; it demands too much performance."
How can carbon emissions be reduced? According to the International Energy Agency, data centers already account for about 1% of global greenhouse-gas emissions. That share is expected to rise as demand for cloud computing grows, but the companies running search engines have pledged to reduce their net contribution to global warming.
Microsoft has pledged to achieve negative carbon emissions by 2050 and plans to buy 1.5 million tonnes of carbon credits this year. A carbon credit, also known as a carbon right, is the right to emit one tonne of carbon-dioxide equivalent of greenhouse gases. Google has committed to reaching net-zero emissions across its entire business and value chain by 2030.
For these giants, one way to shrink the environmental footprint and energy cost of integrating artificial intelligence into search is to move data centers to cleaner energy sources and to redesign neural networks to be more efficient, reducing so-called "inference time": the computing power an algorithm needs each time it processes new data.
"We have to study how to reduce the inference time required for such a large model," says Nafis Sadat Mosavi, a lecturer in natural language processing at the University of Sheffield, who works on the sustainability of natural language processing. "now is a good time to focus on efficiency."
Google spokesperson Jane Park said that Google is initially releasing Bard backed by a lightweight version of its large language model. "We have also published research detailing the energy costs of state-of-the-art language models, including earlier and larger versions of LaMDA," Park said. "Our results show that combining efficient models, processors and data centers with clean energy sources can reduce the carbon footprint of a machine-learning system by as much as 1,000 times."
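The "1,000 times" figure is the product of several independent improvements multiplying together rather than a single breakthrough. A minimal sketch with hypothetical individual factors (our assumption; the article reports only the combined number):

```python
# How independent efficiency gains compound multiplicatively.
# The individual factors are illustrative placeholders, not reported values.
factors = {
    "efficient model architecture": 10.0,  # assumed
    "ML-optimized processors": 10.0,       # assumed
    "efficient cloud data center": 2.0,    # assumed
    "low-carbon energy supply": 5.0,       # assumed
}

combined = 1.0
for name, factor in factors.items():
    combined *= factor

print(f"combined reduction: {combined:.0f}x")  # 1000x
```

The design point is that no single factor needs to deliver three orders of magnitude; modest gains at each layer of the stack multiply into the headline number.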