When you browse through your Instagram feed on your smartphone, the app’s algorithms are not only figuring out what you like, but also what you would like to see next and how to better curate your feed. While it can be a bit scary to think that such Machine Learning (ML) and Artificial Intelligence (AI) algorithms are learning what we like or dislike, avoiding them is next to impossible. However, for those who dare to grow and keep up with the changing times, there’s no better time to be alive.
Banking on the skyrocketing demand for intensive computing worldwide, Alibaba has introduced a new AI inference chip, the Hanguang 800. The processor is the company’s first chip built on its self-developed hardware framework and algorithms, and it reportedly speeds up ML tasks on the company’s platform by a factor of 12. Here’s what you need to know.
New chip for faster AI inferencing
Alibaba’s new Hanguang 800 chip was announced at the company’s Apsara Conference 2019 in Hangzhou, China. It is already handling processing-heavy tasks on the company’s e-commerce website, such as product search, image analysis, personalised recommendations and automatic translations. “The launch of Hanguang 800 is an important step in our pursuit of next-generation technologies, boosting computing capabilities that will drive both our current and emerging businesses, while improving energy efficiency,” said Alibaba Chief Technology Officer Jeff Zhang.
As per the company, traditional GPUs would need around an hour to categorise the 1 billion new product images uploaded daily to Taobao, an online shopping website owned by Alibaba. The new Hanguang 800 is claimed to perform the same task in just five minutes.
Who is the new chip aimed at?
While the Hanguang 800 is currently being used on Alibaba’s online retail platform, the company says there are no immediate plans to sell it as a standalone product. Instead, it will be deployed in the data centres of the company’s cloud computing arm, so that its clients can reap the benefits of faster AI inferencing and ML processing. The chip was created by T-Head, the unit handling the development of chips for cloud and edge computing within Alibaba DAMO Academy, Alibaba’s global research and development initiative.
The announcement comes after Alibaba DAMO commenced developing its first ML-focused chip last year. Five months later, the T-Head division was unveiled; it went on to launch the XuanTie 910 processor, an Internet of Things (IoT) chip based on the RISC-V architecture, which began as an open-source project at UC Berkeley. DAMO also teams up with universities around the world, including Tel Aviv University and UC Berkeley, on research projects focused on machine learning, Natural Language Processing (NLP), network security, and visual computing.
Image credits: Alibaba
Stay tuned to Silicon Canals for more European technology news.