Nvidia’s B200: The New AI Powerhouse


Blackwell B200: Nvidia's flagship AI chip

In the realm of artificial intelligence, Nvidia has again raised the bar with the launch of its flagship AI chip, the Blackwell B200, which runs AI workloads up to 30 times faster than its predecessor.

Nvidia CEO Jensen Huang unveiled the new Blackwell series of AI processors at the company’s annual GTC event in San Jose, promising unsurpassed speed and efficiency and marking a big leap forward in how artificial-intelligence hardware performs. He stated that the Blackwell B200 chips are between seven and thirty times faster than the H100 while consuming up to 25 times less energy, making the line a new AI powerhouse.

The DGX B200 system built around the new chip packs eight B200 Tensor Core GPUs, interconnected by fifth-generation Nvidia NVLink, and offers three times the training performance and fifteen times the inference performance of its predecessor. Each B200 chip carries 208 billion transistors, more than two and a half times the 80 billion on the company’s prior processor. All of those transistors can access the chip’s memory almost simultaneously, increasing productivity.
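For context, here is a short Python sketch (an illustration, not Nvidia code) of how a training script might confirm that all eight GPUs in such a system are visible before launching a distributed job; the device names and memory sizes printed will depend on the actual machine.

```python
import torch

# Illustrative check (not Nvidia code): confirm that a multi-GPU node,
# such as an eight-GPU system, is fully visible to PyTorch before
# launching a distributed training job.
if torch.cuda.is_available():
    count = torch.cuda.device_count()
    print(f"Visible CUDA devices: {count}")
    for i in range(count):
        props = torch.cuda.get_device_properties(i)
        print(f"  GPU {i}: {props.name}, {props.total_memory / 1e9:.0f} GB")
else:
    print("No CUDA devices visible on this machine.")
```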

Huang stressed the critical role of Blackwell GPUs in driving the “new Industrial Revolution,” emphasizing the transformative potential of generative AI. Nvidia aims to unlock the full capabilities of AI, transforming industries and driving innovation with the help of prominent companies across various sectors.

The technological strength of the Blackwell chips stems from their unparalleled performance capabilities. Delivering up to 20 petaflops of AI compute, Blackwell outperforms the H100 by a significant margin, roughly quadrupling computing capability.

This quantum leap is accomplished by the integration of 208 billion transistors, a significant increase over the H100’s 80 billion. Nvidia achieved this by linking two large silicon dies, allowing them to communicate at speeds of up to 10 terabytes per second. Nvidia has yet to reveal pricing for the Blackwell processors, but demand for the technology is already tremendous.
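As a rough sanity check on the multiples quoted above, the following Python sketch recomputes the ratios from the article’s own figures; note that the H100 petaflops value is back-solved from the “quadrupling” claim rather than taken from an official spec sheet, and real-world comparisons depend on numeric precision and workload.

```python
# Sanity check of the multiples quoted in this article. All inputs come
# from the article's own figures; the H100 petaflops value is implied by
# the "quadrupling" claim, not an official specification.
b200_transistors = 208e9   # Blackwell B200 transistor count
h100_transistors = 80e9    # Hopper H100 transistor count
b200_petaflops = 20.0      # peak AI compute quoted for Blackwell
h100_petaflops = 5.0       # implied by "quadrupling computing capability"

print(f"Transistor ratio: {b200_transistors / h100_transistors:.1f}x")  # 2.6x
print(f"Compute ratio:    {b200_petaflops / h100_petaflops:.1f}x")      # 4.0x
```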

In addition, Huang unveiled a new set of software tools at the same GTC event. These tools, known as microservices, boost system performance across a range of uses and make it easier for companies to incorporate an AI model into their operations.
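To illustrate what incorporating an AI model via a microservice can look like in practice, here is a minimal, hypothetical Python sketch that sends a chat request to a locally hosted inference microservice exposing an OpenAI-compatible endpoint; the URL, port, and model name are placeholders, not details from Nvidia’s announcement.

```python
import requests

# Hypothetical example: query a locally deployed inference microservice
# that exposes an OpenAI-compatible chat endpoint. The URL, port, and
# model name are placeholders; substitute values for your own deployment.
ENDPOINT = "http://localhost:8000/v1/chat/completions"  # placeholder URL
payload = {
    "model": "example-llm",  # placeholder model id
    "messages": [{"role": "user", "content": "Summarize our Q3 sales report."}],
    "max_tokens": 128,
}

response = requests.post(ENDPOINT, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```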

Nvidia is also shifting from selling individual chips to selling entire systems. Its most recent system houses 72 of its AI chips and 36 central processors, contains 600,000 parts, and weighs 3,000 pounds (1,361 kg). Though Nvidia is best known for its chip designs, the company has also developed a significant battery of software products.

Over the past 12 months, Nvidia’s stock has risen by about 240%, making it the third-most-valuable company on the US stock market, after Microsoft and Apple.

That remarkable 12-month rally puts Nvidia’s stock at risk of a decline if the Santa Clara, California-based company fails to expand its AI business as much as investors anticipate. Nvidia’s market share is predicted to fall by several percentage points in 2024 as new products from competitors such as Intel and Advanced Micro Devices enter the market.

Huang also announced partnerships with the design software companies Ansys, Cadence, and Synopsys.

Source: www.analyticsinsight.net
