According to news on December 4, Nvidia CEO Jensen Huang recently said that superhuman artificial intelligence (AI) will overtake humans within five years. Yann LeCun, chief AI scientist at Facebook parent company Meta and a pioneer of deep learning, takes the diametrically opposite view: he believes superintelligence is not coming any time soon.
Image source: Pixabay
Meta recently held a media event in San Francisco to celebrate the 10th anniversary of its fundamental AI research team. LeCun said at the event that he believes it will take decades for current AI systems to reach anything like human-level understanding. When that happens, such common-sense AI systems will be far more capable than today's models, which are largely limited to summarizing mountains of text in creative ways.
On Huang's view, LeCun commented: "I know Jensen Huang, the CEO of Nvidia, who has benefited greatly from the AI craze. This is an AI war, and he is supplying the weapons."
Speaking of technologists trying to develop artificial general intelligence (AGI), LeCun said: "If you want to develop AGI, you have to buy more GPUs." AGI refers to AI that matches human-level intelligence. As long as researchers at companies such as OpenAI keep pursuing AGI, they will need more Nvidia chips.
LeCun said that "cat-level" or "dog-level" AI is more likely to arrive before human-level AI, and that the tech industry's current focus on language models and text data will not be enough to create the kind of advanced human-like AI systems researchers have dreamed about for decades.
"text is a very bad source of information," Yang Likun explained. "it may take 20,000 years for humans to read the amount of text used to train modern language models. And even if a system is trained with the equivalent of 20,000 years of reading material, they may still not understand: if An and B are the same, then B is the same as A. there are many very basic things in the world, and the big model has not yet passed this kind of training."
As a result, LeCun and other Meta AI executives have been working on adapting the transformer models that underpin applications such as ChatGPT to process many kinds of data, including audio, images, and video. They believe that the more these AI systems can discover the billions of hidden correlations among these different types of data, the more likely they are to accomplish harder tasks.
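As a rough illustration of the idea (not Meta's actual architecture), a multimodal model can project each modality into a shared embedding space with per-modality encoders and then let a single transformer attend over the combined sequence. All class names, dimensions, and feature types below are invented for the sketch:

```python
# Minimal sketch of a multimodal transformer: each modality gets its own encoder
# that maps raw features into a shared d_model space, and one transformer then
# attends over the concatenated token sequence. All sizes are illustrative.
import torch
import torch.nn as nn

class MultimodalTransformer(nn.Module):
    def __init__(self, d_model=256, text_vocab=32000, audio_dim=80, image_dim=768):
        super().__init__()
        self.text_embed = nn.Embedding(text_vocab, d_model)   # token ids -> vectors
        self.audio_proj = nn.Linear(audio_dim, d_model)        # e.g. spectrogram frames
        self.image_proj = nn.Linear(image_dim, d_model)        # e.g. image patch features
        layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)

    def forward(self, text_ids, audio_frames, image_patches):
        # Project every modality into the shared space, then concatenate along the
        # sequence dimension so self-attention can relate tokens across modalities.
        tokens = torch.cat([
            self.text_embed(text_ids),
            self.audio_proj(audio_frames),
            self.image_proj(image_patches),
        ], dim=1)
        return self.encoder(tokens)

# Toy usage: a batch of 2 examples with 16 text tokens, 10 audio frames, 4 image patches.
model = MultimodalTransformer()
out = model(torch.randint(0, 32000, (2, 16)),
            torch.randn(2, 10, 80),
            torch.randn(2, 4, 768))
print(out.shape)  # torch.Size([2, 30, 256])
```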
Meta's research includes software that can teach people to play tennis better while wearing the company's Project Aria augmented reality glasses, which overlay digital graphics onto the real world. Executives demonstrated how a tennis player wearing the AR glasses can see visual cues showing how to grip the racket correctly and swing with proper form. Beyond text and audio, the AI model needed to power this kind of digital tennis assistant also has to incorporate three-dimensional visual data so the assistant can interact with the user.
These so-called multimodal AI systems represent the next frontier, but developing them is not cheap. As more companies such as Meta and Google parent Alphabet work on more advanced AI models, Nvidia stands to gain an even bigger advantage, especially if no serious competitor emerges.
Nvidia has been the biggest beneficiary of the generative AI boom, and its expensive graphics processing units have become the standard tool for training large language models. Meta relied on 16,000 Nvidia A100 GPUs to train its Llama AI software.
Asked whether the technology industry will need more hardware suppliers as Meta and other researchers continue to develop such complex AI models, LeCun said it is not required, but it would be welcome. He added that GPU technology is still the gold standard for AI, though the computer chips of the future may not be called GPUs.
LeCun is also skeptical of quantum computing, despite the resources that tech giants such as Microsoft, IBM, and Google have poured into it. Many researchers outside Meta believe quantum computers could drive major advances in data-intensive fields such as drug discovery, because they compute with so-called qubits rather than the conventional binary bits used in classical computing.
But LeCun is doubtful. "The problems you can solve with quantum computing, you can solve more efficiently with classical computers," he said. "Quantum computing is a fascinating scientific topic, but it is not clear whether it has practical significance or whether it is even possible to build a genuinely useful quantum computer."
Mike Schroepfer, a senior fellow at Meta and its former chief technology officer, agrees. He said he evaluates quantum technology every few years and believes a useful quantum machine "may come at some point, but it's on a time horizon too long to be relevant to what we are doing."
"the reason we set up the artificial intelligence lab ten years ago is that it is clear that the technology will be commercialized within the time frame of the next few years," Shropf said. "