
What is Tiny AI?

  • Vanshika Kaushik
  • May 07, 2021
  • Artificial Intelligence

Introduction

 

The climate crisis is overshadowing the positive aspects of technology. The dark clouds of global warming hover over the world, posing an extreme danger.

The frosted, snow-capped mountains of Iceland that once enticed tourists today merely stand tall and iceless.

But is technology a major contributor to the demon of global warming? Research suggests the answer is yes: modern technology emits huge amounts of carbon dioxide and is partly responsible for putting humanity's future at stake.

Researchers worldwide could not overlook technology's role in the climate crisis, so to curb its negative impact they developed a new form of AI: tiny AI.


 

About Tiny AI 

 

Tiny refers to something minute or little. Tiny AI refers to a new breed of AI and ML models that use compressed algorithms to minimize the need for large quantities of data and computational power.

 

It is an emerging area in the field of machine learning. The main aim of tiny AI is to reduce the size of artificial intelligence algorithms, especially the ones that cater to voice or speech recognition.
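One standard way such models are shrunk is network pruning, which this article touches on below: the smallest-magnitude weights are zeroed out so the model can be stored and run sparsely. Below is a toy sketch in plain Python; the function name and sparsity level are illustrative, not taken from any particular library.

```python
def magnitude_prune(weights, sparsity=0.5):
    """Zero out the given fraction of weights with the smallest magnitudes
    (unstructured magnitude pruning)."""
    k = int(sparsity * len(weights))
    if k == 0:
        return list(weights)
    # k-th smallest absolute value becomes the pruning threshold
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [w if abs(w) > threshold else 0.0 for w in weights]

# Half of these four weights (the two smallest in magnitude) are removed:
print(magnitude_prune([0.1, -2.0, 0.3, 4.0], sparsity=0.5))
```

A real pipeline would prune a trained network gradually and fine-tune between pruning steps; this sketch only shows the core selection rule.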

 

Components of Tiny AI

 

Tiny AI, or tiny machine learning (TinyML), comprises the following three components:

 

  1. Tiny Data

 

Big data that researchers compress through distillation in machine learning is known as tiny data. Using tiny data is synonymous with smarter data usage, and compressing big data through network pruning is an inherent part of the conversion from big data to tiny data.

 

  2. Tiny Hardware

 

Thanks to advances in technology, tiny AI can help developers produce tiny hardware such as firewalls and routers, which are simple and keep devices safe even while travelling.

 

  3. Tiny Algorithm

 

The Tiny Algorithm, or Tiny Encryption Algorithm (TEA), is a block cipher known for the simplicity of its description and implementation; it delivers the desired results in typically just a few lines of code.
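TEA really is only a few lines. The sketch below is a plain-Python transcription of the published reference routine: a 64-bit block is handled as two 32-bit halves, and the 128-bit key as four 32-bit words.

```python
def tea_encrypt(v0, v1, key):
    """Encrypt one 64-bit block (two 32-bit words) under a 128-bit key
    (four 32-bit words), using TEA's 32 rounds."""
    delta, mask = 0x9E3779B9, 0xFFFFFFFF
    total = 0
    for _ in range(32):
        total = (total + delta) & mask
        v0 = (v0 + (((v1 << 4) + key[0]) ^ (v1 + total) ^ ((v1 >> 5) + key[1]))) & mask
        v1 = (v1 + (((v0 << 4) + key[2]) ^ (v0 + total) ^ ((v0 >> 5) + key[3]))) & mask
    return v0, v1

def tea_decrypt(v0, v1, key):
    """Invert tea_encrypt by running the rounds in reverse."""
    delta, mask = 0x9E3779B9, 0xFFFFFFFF
    total = (delta * 32) & mask  # sum after 32 encryption rounds
    for _ in range(32):
        v1 = (v1 - (((v0 << 4) + key[2]) ^ (v0 + total) ^ ((v0 >> 5) + key[3]))) & mask
        v0 = (v0 - (((v1 << 4) + key[0]) ^ (v1 + total) ^ ((v1 >> 5) + key[1]))) & mask
        total = (total - delta) & mask
    return v0, v1
```

Note that TEA is a cryptographic primitive rather than a machine-learning algorithm; it appears here as the classic example of how much work a deliberately tiny algorithm can do on a constrained device.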

 

 

Need for Tiny AI

 

Training a sophisticated AI model takes a huge amount of energy, and with AI adoption spanning multiple fields it is important for the technology to be efficient and green. The GPU (graphics processing unit) is the major contributor to the heat.

 

The new AI models that help with translation, writing, and speech and voice recognition have a detrimental impact through their CO2 emissions.

 

According to researchers at the University of Massachusetts Amherst, training a single algorithm can emit five times the lifetime carbon dioxide emissions of an average car, or the equivalent of 300 round-trip flights between New York and San Francisco.

 

The table below shows the estimated costs of training an AI model (Data Source):

| Model | Date of original paper | Energy consumption (kWh) | Carbon footprint (lbs of CO2e) | Cloud compute cost (USD) |
| --- | --- | --- | --- | --- |
| Transformer (65M parameters) | Jun 2017 | 27 | 26 | $41-$140 |
| Transformer (213M parameters) | Jun 2017 | 201 | 192 | $289-$981 |
| ELMo | Feb 2018 | 275 | 262 | $433-$1,472 |
| BERT (110M parameters) | Oct 2018 | 1,507 | 1,438 | $3,751-$12,571 |
| Transformer (213M parameters, w/ architecture search) | Jan 2019 | 656,347 | 626,155 | $942,973-$3,201,722 |
| GPT-2 | Feb 2019 | not reported | not reported | $12,902-$43,008 |
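The energy and carbon columns above are linked by a fixed conversion: the figures multiply kWh by roughly 0.954 lbs of CO2e per kWh, the average US power-mix factor used in the Amherst study, which the table's own rows bear out. A quick check in Python (the constant name is illustrative):

```python
# Average US power-mix conversion factor implied by the table above
# (assumption: lbs of CO2e ~= kWh * 0.954, per the Amherst study's method).
US_GRID_LBS_CO2_PER_KWH = 0.954

def carbon_footprint_lbs(energy_kwh):
    """Estimate lbs of CO2e emitted for a given training energy in kWh."""
    return energy_kwh * US_GRID_LBS_CO2_PER_KWH

# BERT-base's reported 1,507 kWh reproduces the 1,438 lbs in the table:
print(round(carbon_footprint_lbs(1507)))
```

The same multiplication reproduces every populated row, e.g. 27 kWh for the small Transformer gives 26 lbs.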

The data above shows the alarming rate at which computational and environmental costs rise in step with an AI model's increasing size.

 

In the attempt to achieve maximum accuracy from an AI model, developers are responsible for producing roughly 700-1,400 pounds of carbon dioxide.

 

Large-scale NLP experiments are causing serious damage to the environment.

 

BERT, a Transformer-based machine learning model that helps Google process conversational queries, produces roughly 1,400 pounds of carbon dioxide in training, making it one of the most carbon-intensive of the standard models measured.

 

The world is in dire need of tiny AI to reduce the carbon emissions that are degrading the environment in every possible way.

 

(Must read: Google BERT)

 

 

Applications of Tiny AI

 

  1. Finance:

 

A lot of investment banks already use AI for data collection and predictive analytics. Tiny AI can help financial institutions convert large datasets into smaller ones, simplifying predictive analysis.

 

(Also check: Banking on AI)

 

  2. Teaching:

 

Devices built on simple ML algorithms, such as specialized AI-based tutoring systems, can help reduce the workload of teachers. VR headsets are also widely used and provide an enriching experience for students.

 

 

  3. Manufacturing Industry:

 

With advances in technology, robots will collaborate with human beings to ease their workload. TinyML can also help companies by analyzing their sensor data.

 

 

Advantages of Tiny AI or Tiny ML

 

  • Energy Efficient:

 

A single large AI model can give out 284 tonnes of carbon dioxide, around five times the lifetime emissions of an average car.

 

Tiny AI produces minimal carbon emissions and thus contributes far less to global warming. TinyBERT is an energy-efficient version of BERT that is 7.5 times smaller than the original, yet it retains over 96% of the performance of Google's main BERT model.

 

  • Cost Effective: 

 

Artificial intelligence models are extremely costly, and heavy expenditures are incurred on them to ensure maximum accuracy.

 

The production of Alexa and Siri cost nearly a million dollars. Tiny AI models are cheap compared to these big-budget voice assistants.

 

  • Speedy: 

 

Tiny AI is not only energy efficient and cheap; it is also faster than traditional AI models.

 

TinyBERT is overall about 9.4 times faster than the original BERT model. It seems fair to say that tiny AI is the future of AI: it is energy efficient, cost friendly, and speedy. ML these days is used in all sorts of places; nearly every application has some kind of machine learning happening somewhere.

 

According to Pete Warden, Staff Research Engineer at Google and co-author of the book TinyML, the future of machine learning is tiny. He emphasizes that deep learning can be made energy efficient with simple, tiny algorithms. Voice interfaces rely on a wake-word system (the detection task of recognizing the “hot word” that activates a speech assistant).
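A wake-word system is an always-on loop that scores short audio frames and only wakes the full assistant when a trigger fires. The toy sketch below substitutes a simple RMS-energy threshold for the real keyword-spotting model, just to show the frame-by-frame structure; every name and number here is illustrative, not from any shipping system.

```python
import math

def first_active_frame(audio, frame_len=400, threshold=0.1):
    """Toy always-on loop: return the index of the first audio frame whose
    RMS energy crosses the threshold, or -1 if none does. A real wake-word
    system runs a tiny neural keyword-spotting model at this point instead."""
    for i in range(len(audio) // frame_len):
        frame = audio[i * frame_len:(i + 1) * frame_len]
        rms = math.sqrt(sum(s * s for s in frame) / frame_len)
        if rms > threshold:
            return i
    return -1

# Two silent frames followed by one loud frame: the third frame (index 2) triggers.
print(first_active_frame([0.0] * 800 + [0.5] * 400))
```

The point of running this loop on a tiny model is power: the expensive full recognizer stays asleep until the cheap detector fires.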

 

In the past, speech assistant systems were built on big datasets, but a recently developed full speech recognition system that runs locally on a Pixel phone (fitting within 80 megabytes) is a small victory for TinyML researchers.

 

In the video below, Pete Warden talks about how developing simple algorithms on embedded microprocessors and other devices is transforming the traditional ML field.



Key Developments in the field of Tiny ML

 

  1. SparkEdge:

 

SparkEdge is built on a single algorithm that can run speech recognition functions through a simple “wake word” feature. It is developed on a tiny embedded microprocessor.

 

  2. Arduino devices:

 

These devices can perform complex functions like gesture recognition and voice recognition. They are built on simple algorithms and are very easy to set up.

 

TinyML can be used in practically any field, and researchers all over the world are working to develop more complex devices based on simple algorithms.

Problems in a Full-fledged Transition to Tiny AI

 

Developing speech and voice assistants that run entirely on phones sounds dreamy. It was a long-sought dream of software developers and is now a reality. But there are many challenges in a full-fledged transition to tiny AI.

 

The biggest challenge for both researchers and software developers is managing the trade-off.

 

The trade-off involves reducing the size of the model through distillation while at the same time maintaining the high accuracy needed for a high-performance interface.
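In knowledge distillation, a small student model is trained to match the softened output distribution of a large teacher, which is where the size/accuracy trade-off is negotiated. A minimal sketch of the distillation loss in plain Python follows; the function names and the temperature value are illustrative, not from any particular framework.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: higher temperature softens the distribution."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy between the teacher's softened outputs (the targets)
    and the student's softened outputs."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    return -sum(t * math.log(s + 1e-12)
                for t, s in zip(p_teacher, p_student))
```

The loss is minimized when the student reproduces the teacher's distribution exactly; in practice it is combined with the ordinary hard-label loss during training.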

 

Tiny AI cannot replace conventional AI in certain domains. One such example is the automobile industry: self-driving electric cars are the future, but they cannot be programmed with simple algorithms.

 

Consider also that a minute mistake in the code could be the reason behind someone's demise. Even for diagnostics and medical imaging, the code cannot be reduced to a simple algorithm.

 

If it were, the results would not be accurate. Therefore, although tiny AI may be a breakthrough invention, a complete transition from big datasets to tiny ones is not possible.

 

(Recommended read: Branches of Artificial Intelligence)

 

 

Green AI


(Image: Green AI, a simple algorithm that produces minimal CO2 emissions)


“The proper use of science is not to conquer nature but to live in it.” – Barry Commoner

 

It all sounds cool until we are left with nothing…

In a rapidly changing environment it is important to go green. Sustainable environmental approaches will go a long way.

 

Green AI is a newly coined term for AI research that delivers the best results with a sustainable approach and low computational costs.

 

At a time when environmental crises have become a global concern, it is important to educate communities about ways to minimize the CO2 emissions that result from big AI models.

 

A large number of NLP experiments cause harm to the environment. Roy Schwartz (PhD student at the University of Jerusalem) and Jesse Dodge (PhD student at Cornell University) suggest that the AI research community has paid little attention to computational efficiency. (Source)

 

The prime focus of AI experts has been accuracy rather than efficiency, and this is the major reason behind the growing carbon footprint.

 

The research paper titled “Green AI,” written by Roy Schwartz, Jesse Dodge, Noah A. Smith, and Oren Etzioni, draws attention to how the AI community has pursued state-of-the-art results while neglecting the equally important aspects of cost and efficiency.


(Figure: growth in the number of research papers on AI, source)


All this compiled data compelled researchers to rethink how to achieve sustainability through AI. In the paper, the authors also suggest software measures that increase efficiency and reduce energy costs.

 

 

Last Note

 

AI is the gift of science to mankind. It is transforming every key sphere of life: it simplifies processes for data analysts and helps students witness the evolution of technology while sitting at home.

 

It even provides music producers with a platform where they can adjust the auto-tune to deliver the best blend of music and video. But huge algorithms and large datasets are posing innumerable threats to the environment.

 

It is important that technology in the present does not hamper the lives of future generations. Therefore, tiny AI and green AI are the sustainable solutions scientists should be focusing on. Hence, it won't be an exaggeration to say that the future of AI is green and tiny.
