Great post. What about Marvell? Can its ASIC chips gain market share in AI, given that GPUs are projected to lose share to ASICs according to a recent McKinsey report?
I'm very curious to hear more about XLA. Why and how did NVIDIA keep such a strong hold on the space for so long? Doesn't Google have a harder problem, since XLA potentially needs to target all hardware, or is it mainly optimized for TPUs?
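To make the XLA part of my question concrete, here is roughly what programming against XLA looks like from JAX, as I understand it (a toy sketch of mine, assuming jax is installed; in principle the same code compiles for CPU, GPU, or TPU backends):

    # Toy example (mine, not from the post): jax.jit traces the function
    # and hands it to XLA, which fuses the matmul, add, and relu into
    # compiled kernels for whatever backend is available.
    import jax
    import jax.numpy as jnp

    @jax.jit
    def layer(x, w, b):
        # XLA compiles these three ops together rather than dispatching
        # them one at a time.
        return jax.nn.relu(jnp.dot(x, w) + b)

    x = jnp.ones((4, 8))
    w = jnp.ones((8, 2))
    b = jnp.zeros((2,))
    print(layer(x, w, b))

The first call triggers XLA compilation; later calls with the same shapes reuse the compiled kernels.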
Also, how come optical computing is still a niche? Free inference sounds like something everyone would jump on. Are there any programmer-friendly tutorials you'd recommend here? What does programming an optical computer look like, and what are the key differences a practitioner should be aware of?
Excellent post