Artificial intelligence services often run on a powerful server, while client devices such as Amazon Alexa, Google Home, and mobile apps send their commands and requests to that server.
This design rests on the premise that client devices such as phones and home assistants have limited processing power and should therefore let a server do the heavy lifting for them. The approach has worked, but it has its drawbacks, and there is room for improvement.
If you use a home assistant, what you say to it is stored on the service provider's servers, which raises privacy concerns for many people. Another drawback of this cloud-centric model is that every request consumes bandwidth or mobile data.
Samsung's new Exynos 9820 phone SoC (system on chip) incorporates a neural processing unit (NPU) for on-chip AI acceleration. This means phones will be able to execute at least some AI-related functions locally instead of passing everything through a cloud server.
AI-related applications do require a large amount of processing power, but that level of performance is now attainable in fairly small chips thanks to various technological advances, most notably improved processor designs.
On-device processing can provide a substantial performance improvement because it isn't slowed down by the user's Internet connection (unless the workload still partially relies on a server). Any Internet connection is very slow compared to a phone's internal data-transfer rates, such as the rate at which it can move data between RAM and the CPU.
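To make that gap concrete, here is a rough back-of-the-envelope sketch. All of the bandwidth and payload figures below are assumptions chosen for illustration, not measured values for any particular device or network.

```python
# Illustrative comparison of transfer times over a mobile Internet
# connection vs. a phone's internal memory bus. The numbers are
# assumptions for the sake of the example.

def transfer_time_ms(payload_mb: float, bandwidth_mbps: float) -> float:
    """Time to move a payload of payload_mb megabytes at bandwidth_mbps
    megabits per second, returned in milliseconds."""
    payload_megabits = payload_mb * 8  # megabytes -> megabits
    return payload_megabits / bandwidth_mbps * 1000

PAYLOAD_MB = 5.0          # assumed size of e.g. a short audio clip
INTERNET_MBPS = 50.0      # assumed typical mobile connection
INTERNAL_MBPS = 100_000.0 # assumed RAM-to-CPU bandwidth (~12.5 GB/s)

cloud_ms = transfer_time_ms(PAYLOAD_MB, INTERNET_MBPS)
local_ms = transfer_time_ms(PAYLOAD_MB, INTERNAL_MBPS)

print(f"Over the Internet: {cloud_ms:.1f} ms")   # 800.0 ms
print(f"Inside the phone:  {local_ms:.3f} ms")   # 0.400 ms
```

Under these assumed figures, moving the same payload internally is three orders of magnitude faster than sending it over the network, and that ignores server queueing and round-trip latency entirely.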
In addition, the Exynos 9820 delivers a 20% improvement in single-core performance, or a 40% improvement in power efficiency, over its predecessor. Its LTE modem also supports downlink speeds of up to 2.0 Gbps, which is incredibly fast if you have a connection to match.