Iris CPU vs. GPU
Does Iris utilize the CPU or the GPU more?
GrayMeta Iris does not use the GPU for hardware acceleration. None of the third-party codecs and software we use take advantage of the GPU, and we pride ourselves on delivering a professional player without a dependency on expensive GPU hardware.
Iris QC Pro has not used the GPU in its 10 years of development, and even the latest codecs we use for Dolby, ProRes, and JPEG 2000 require only enhanced CPU instruction sets such as SSE, SSE2, SSE3, SSE4, AVX, AVX2, and AVX-512, which nearly all modern CPUs support. Iris is quite thread- and memory-intensive, so the more threads and the faster the memory you have, the better.
The Iris Anywhere product is based on the backend processing of Iris QC Pro, so it is no different. One possible way we could use GPU hardware in the future would be to use the on-board H.264, HEVC, VP8, or VP9 encoders available on certain families of chipsets. Unfortunately, the open-source WebRTC libraries we use do not currently support hardware encoding, although there are requests from the WebRTC community to add it.