In a recent announcement, Apple, led by CEO Tim Cook, revealed a collaboration with OpenAI to integrate OpenAI's advanced artificial intelligence models into Siri, its voice assistant. The partnership underscores Apple's push to enhance Siri with cutting-edge technology. However, a technical document released after the event reveals the significant role Google played in Apple's AI training efforts.

According to the document, Apple's engineers used the company's own framework software in combination with a range of hardware, including Apple's own on-premise GPUs and Google's tensor processing units (TPUs), which are available through Google's cloud platform.

Google has been developing TPUs for about a decade and has established itself as a leading provider of AI training chips. Its fifth-generation TPUs deliver training performance competitive with Nvidia's H100 AI chips, and a sixth generation, set to launch this year, signals continued innovation in AI hardware. These specialized processors are purpose-built for training AI models and running AI applications efficiently, reflecting Google's commitment to enabling advanced AI solutions.

Apple did not disclose the extent of its reliance on Google's hardware and software, but using Google's chips typically means renting access to them through the cloud, much as customers buy compute time from other leading cloud providers such as AWS and Azure.

The arrangement underscores how even rival tech giants depend on one another to push the boundaries of AI. As AI continues to reshape industry after industry, such strategic partnerships and hardware advances will be pivotal to unlocking the full potential of artificial intelligence in the digital era.
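For readers unfamiliar with how cloud TPU access works in practice: capacity is typically rented by provisioning a TPU VM on Google Cloud with the `gcloud` command-line tool. The sketch below is illustrative only; the VM name, zone, accelerator type, and runtime version are placeholders, not details from Apple's document.

```shell
# Illustrative sketch: provision a Cloud TPU VM with gcloud.
# All values below are placeholders — substitute your own project's
# zone, accelerator type, and TPU runtime version.
gcloud compute tpus tpu-vm create my-tpu-vm \
    --zone=<zone> \
    --accelerator-type=v5litepod-8 \
    --version=<runtime-version>

# Once created, connect to the VM over SSH to launch training jobs:
gcloud compute tpus tpu-vm ssh my-tpu-vm --zone=<zone>
```

Billing is per-hour while the VM exists, which is why the document's distinction between Apple's on-premise GPUs and rented cloud TPUs matters for cost.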