In the last 18 months, a plethora of startups have begun working on their own variations of AI hardware and/or deep learning algorithms. Few of these startups have yet established a significant installed base, and most have yet to ship a product, but they have had no trouble raising finance. Looking to optimize inference and training, two key stages of workloads such as image and speech recognition, these startups are seeking methods that make those processes faster, more power-efficient and generally better suited to the next generation of artificial intelligence-powered devices. Instead of traditional computational architectures, investment has so far flowed into GPUs, which have become one of the go-to pieces of silicon for the rapid-fire calculations that AI workloads require. That shift is now extending to entirely new architectures, drawing large bets from venture capital firms. Cerebras Systems picked up funding from Benchmark Capital […]