Insights into Editorial: The rise of AI chips – INSIGHTSIAS


 Introduction:

The adoption of Artificial Intelligence (AI) chips has risen, with chipmakers designing different types of these chips to power AI applications such as natural language processing (NLP), computer vision, robotics, and network security across a wide variety of sectors, including automotive, IT, healthcare, and retail.

Market leader Nvidia recently announced its H100 GPU (graphics processing unit), which is said to be one of the world’s largest and most powerful AI accelerators, packed with 80 billion transistors.

Earlier this month, Nvidia’s rival Intel launched new AI chips to provide customers with deep learning compute choices for training and inferencing in data centres.

The increasing adoption of AI chips in data centres is one of the major factors driving the growth of the market.

 

What are AI chips?

  1. AI chips are built with specific architecture and have integrated AI acceleration to support deep learning-based applications.
  2. Deep learning, carried out by artificial neural networks (ANNs), also called deep neural networks (DNNs), is a subset of machine learning and comes under the broader umbrella of AI.
  3. It uses a series of computer algorithms that simulate the structure and activity of the brain.
  4. DNNs go through a training phase, learning new capabilities from existing data.
  5. DNNs can then perform inference, applying the capabilities learned during training to make predictions on previously unseen data.
  6. Deep learning can make the process of collecting, analysing, and interpreting enormous amounts of data faster and easier.
  7. These chips, with their hardware architectures and complementary packaging, memory, storage and interconnect technologies, make it possible to infuse AI into a broad spectrum of applications to help turn data into information and then into knowledge.
  8. There are different types of AI chips such as application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), central processing units (CPUs) and GPUs, designed for diverse AI applications.
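
The training-then-inference workflow described in points 4 and 5 can be sketched in a few lines of Python. This is a minimal illustrative example, not the toolchain of any particular AI chip: a single learned weight stands in for a network, "trained" by gradient descent on a toy task and then used to predict on unseen inputs.

```python
import random

# Toy data for the training phase: learn the mapping y = 2x.
random.seed(0)
x_train = [random.uniform(-1, 1) for _ in range(100)]
y_train = [2.0 * x for x in x_train]

# A single weight stands in for a (very small) neural network.
w, lr = 0.0, 0.1

# Training phase: nudge w downhill on the mean squared error.
for _ in range(200):
    grad = sum(2 * (x * w - y) * x for x, y in zip(x_train, y_train)) / len(x_train)
    w -= lr * grad

# Inference phase: apply the learned weight to previously unseen inputs.
for x_new in (0.5, -0.3):
    print(f"f({x_new}) = {x_new * w:.2f}")  # close to 1.00 and -0.60
```

Real deep learning workloads do the same two phases, only with billions of weights, which is why training and inferencing place such different demands on hardware.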

 

Information about Artificial Intelligence (AI):

Artificial Intelligence (AI) describes the action of machines accomplishing tasks that have historically required human intelligence.

It includes technologies such as machine learning, pattern recognition, big data analytics, neural networks, and self-learning algorithms.

AI involves feeding data into a machine so that it can respond appropriately to different situations.

It is essentially about creating self-learning systems in which the machine can answer questions it has never encountered before, much as a human would.

Example: millions of algorithms already operate around humans, interpreting their commands and performing human-like tasks.

Facebook’s list of suggested friends, or a pop-up page that appears while browsing the internet to announce an upcoming sale of a favourite brand of shoes or clothes, are both the work of artificial intelligence.

 

Are AI chips different from traditional chips?

  1. When traditional chips, containing processor cores and memory, perform computational tasks, they continuously move commands and data between the two hardware components.
  2. These chips, however, are not ideal for AI applications, as they cannot handle the higher computational demands of AI workloads, which involve huge volumes of data.
  3. That said, some higher-end traditional chips may be able to process certain AI applications.
  4. In comparison, AI chips generally contain processor cores as well as several AI-optimised cores (depending on the scale of the chip) that are designed to work in harmony when performing computational tasks.
  5. The AI cores are optimized for the demands of heterogeneous enterprise-class AI workloads with low-latency inferencing, due to close integration with the other processor cores, which are designed to handle non-AI applications.
  6. AI chips, essentially, reimagine traditional chips’ architecture, enabling smart devices to perform sophisticated deep learning tasks such as object detection and segmentation in real-time, with minimal power consumption.
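
The data movement described in point 1 is the classic bottleneck that AI-optimised architectures attack. A rough way to see why architecture matters is to count memory transfers for a matrix multiply: a naive schedule re-reads operands from memory far more often than a tiled (blocked) schedule of the kind on-chip accelerator memory enables. This is an illustrative back-of-envelope model, not a description of any specific chip:

```python
def naive_transfers(n):
    # Each of the n^3 multiply-adds reads one element of A and one of B
    # from main memory: 2 * n^3 transfers in total.
    return 2 * n ** 3

def tiled_transfers(n, b):
    # With b x b tiles held in fast on-chip memory, each pair of tiles is
    # loaded once per tile-level multiply: (n/b)^3 multiplies, 2*b^2 words each.
    return 2 * (n // b) ** 3 * b ** 2

n, b = 1024, 32
ratio = naive_transfers(n) // tiled_transfers(n, b)
print(f"tiling with {b}x{b} blocks cuts memory traffic {ratio}x")
```

The reduction factor equals the tile size b, which is why keeping data close to the compute cores, as AI chips do, pays off so directly.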

 

What are their applications?

  1. Semiconductor firms have developed various specialized AI chips for a multitude of smart machines and devices, including ones that are said to deliver the performance of a data centre-class computer to edge devices.
  2. Some of these chips help in-vehicle computers run state-of-the-art AI applications more efficiently.
  3. AI chips are also powering applications of computational imaging in wearable electronics, drones, and robots.
  4. Additionally, the use of AI chips for NLP applications has increased due to the rise in demand for chatbots and online channels such as Messenger and Slack, which use NLP to analyse user messages and conversational logic.
  5. Then there are chipmakers who have built AI processors with on-chip hardware acceleration, designed to help customers achieve business insights at scale across banking, finance, trading, insurance applications and customer interactions.
  6. As AI becomes pervasive across different workloads, having a dedicated inference accelerator that includes support for major deep learning frameworks would allow companies to harness the full potential of their data.

 

What can be expected in the future?

  1. AI company Cerebras Systems set a new standard with its brain-scale AI solution, paving the way for more advanced solutions in the future.
  2. Its CS-2, powered by the Wafer Scale Engine (WSE-2), is a single wafer-scale chip with 2.6 trillion transistors and 850,000 AI-optimised cores.
  3. The human brain contains on the order of 100 trillion synapses, the firm said, adding that a single CS-2 accelerator can support models of over 120 trillion parameters (synapse equivalents) in size.
  4. Another AI chip design approach, neuromorphic computing, utilises an engineering method based on the activity of the biological brain.
  5. An increase in the adoption of neuromorphic chips in the automotive industry is expected in the next few years, according to Research And Markets.
  6. Additionally, the rise in the need for smart homes and cities, and the surge in investments in AI start-ups are expected to drive the growth of the global AI chip market, as per a report by Allied Market Research.
  7. The worldwide AI chip industry accounted for $8.02 billion in 2020 and is expected to reach $194.9 billion by 2030, growing at a compound annual growth rate (CAGR) of 37.4% from 2021 to 2030.
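
The projection in the last point can be sanity-checked with the standard compound-growth formula, future = present × (1 + CAGR)^years. A quick check in Python, using the figures quoted above (the small gap from $194.9 billion comes from rounding in the reported 37.4% rate):

```python
present = 8.02   # USD billion, 2020 market size from the report
cagr = 0.374     # 37.4% compound annual growth rate
years = 10       # 2020 -> 2030

# Compound the 2020 figure forward ten years at the reported CAGR.
future = present * (1 + cagr) ** years
print(f"${future:.1f} billion")  # ~$192 billion, close to the reported $194.9 billion
```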

 

Conclusion:

The UN Secretary-General’s Roadmap on Digital Cooperation is a good starting point: it lays out the need for multi-stakeholder efforts on global cooperation so that AI is used in a manner that is “trustworthy, human rights-based, safe and sustainable, and promotes peace”.

UNESCO, for its part, has developed a global, comprehensive standard-setting draft Recommendation on the Ethics of Artificial Intelligence, submitted to Member States for deliberation and adoption.

Agreeing on common guiding principles is an important first step, but it is not the most challenging part.

It is where principles meet reality that the ethical issues and conundrums arise in practice, and for which we must be prepared for deep, difficult, multi-stakeholder ethical reflection, analyses and resolve. Only then will AI provide humanity its full promise.
