Artificial Intelligence dominates the world today. From automated human-resource management to streamlined manufacturing, and from virtual assistants to self-driving cars, AI has become indispensable across almost every sector. According to “The Global Artificial Intelligence Trends 2020” survey conducted by Analytics Insight, 37% of Artificial Intelligence technologies are adopted by the high-tech industry. But the ever-increasing use of this technology consumes vast amounts of data and computing power while depending on centralized cloud systems. Because training modern AI models requires extensive energy, the process produces large amounts of carbon emissions. Researchers at the University of Massachusetts, Amherst, found that training a single AI model can emit as much as 284 tonnes of carbon dioxide, roughly five times the lifetime carbon emissions of an average car. This is not only hazardous to the environment but also limits the speed and privacy of AI applications. To counter this problem, AI researchers, along with tech giants, are focusing on developing Tiny AI.
What is Tiny AI?
Tiny AI, also known as TinyML (Tiny Machine Learning), runs on far less energy. It is an effort by academic researchers to develop compressed AI algorithms that reduce the size of existing machine-learning models, which otherwise consume large datasets and substantial computational power. Tiny AI is a step towards ‘Green Computing’: it involves not only shrinking AI models but also accelerating their inference while preserving their capabilities. The compression techniques used, known as distillation methods, can shrink a model to a tenth of its existing size.
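To make the idea of distillation concrete, here is a minimal sketch of the core of knowledge distillation: a large “teacher” model’s softened predictions are used as a training signal for a much smaller “student” model, alongside the true labels. The function names, temperature `T`, and weight `alpha` below are illustrative choices for this sketch, not values prescribed by any particular framework.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T produces softer probabilities,
    # exposing more of the teacher's "dark knowledge" about similar classes.
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, true_label, T=4.0, alpha=0.7):
    """Weighted sum of (a) the KL divergence between the teacher's softened
    distribution and the student's, and (b) ordinary cross-entropy on the
    hard label. Minimizing this trains the small student to mimic the
    large teacher while still fitting the ground truth."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL term is conventionally scaled by T^2 to keep gradient magnitudes comparable.
    soft_loss = float(np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)))) * T * T
    hard_loss = float(-np.log(softmax(student_logits)[true_label]))
    return alpha * soft_loss + (1 - alpha) * hard_loss

# A student that matches the teacher incurs a lower loss than one that diverges.
loss_matching = distillation_loss([4.0, 1.0, 0.2], [4.0, 1.0, 0.2], true_label=0)
loss_diverging = distillation_loss([4.0, 1.0, 0.2], [0.0, 2.0, 1.0], true_label=0)
```

In practice this loss would be minimized by gradient descent over a compact student architecture, which is how a model many gigabytes in size can be traded for one a fraction as large with only a modest drop in accuracy.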
This is best illustrated by the Google Assistant, previously a program of approximately 100 gigabytes. In May 2019, Google CEO Sundar Pichai announced that the Assistant had been reduced to roughly half a gigabyte, so that users no longer need to send requests to a remote server. This also drastically reduced the software’s carbon footprint. Another example is Apple’s virtual assistant, Siri, whose speech-recognition capabilities run locally on the iPhone.
Because the models are smaller, programs can be installed directly on the device itself, with no need to send user data to the cloud or a remote server. Tiny AI can therefore play a major role in reducing AI technology’s environmental footprint.