I am trying to make a graph showing how much compute power you would need to “perfectly” emulate a system (per core), based on the PassMark CPU Mark score, perhaps assuming a basic GPU for now. Ideally there would be one graph per CPU architecture, but for now it would just cover x86_64 and maybe ARM.
For each system, the graph would show the per-core score needed to “perfectly” emulate the least demanding games, the most demanding games, and the average power needed to emulate ~80% of the titles under RetroArch.
By “perfectly”, I mean the best result possible for each emulation core, WITHOUT shader overhead and at the console’s native resolution. A different graph would be used for upscaled resolutions, since that one might also need to list GPU power as well.
The biggest questions are: 1.) Does such a graph or this info already exist? 2.) What would be the best way to test this? Is there something built into RetroArch that I can use?
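In case it helps frame the measurement side: one common approach is to run a core unthrottled (vsync and audio sync off, uncapped fast-forward) and time how many frames it renders, then scale against your CPU's PassMark single-thread score. The sketch below is only a first-order linear-scaling heuristic I'm assuming, not an established method, and the function name and numbers are made up for illustration:

```python
# Crude linear-scaling estimate (an assumption, not an established method):
# if a core runs at `achieved_fps` unthrottled on a CPU whose single-thread
# PassMark score is `your_mark`, the score needed to just hold the game's
# native frame rate is assumed to scale linearly.

def required_single_thread_mark(your_mark: float,
                                achieved_fps: float,
                                target_fps: float = 60.0) -> float:
    """Estimate the single-thread score needed to sustain target_fps.

    Assumes performance scales linearly with single-thread score,
    which is only a rough first-order approximation.
    """
    if achieved_fps <= 0:
        raise ValueError("achieved_fps must be positive")
    return your_mark * target_fps / achieved_fps

# Example: a CPU with a single-thread mark of 3000 runs a core at
# 120 fps unthrottled, so ~1500 is the estimated minimum score to
# hold a native 60 fps.
print(required_single_thread_mark(3000, 120))  # -> 1500.0
```

Repeating this per core over a sample of ROMs would give the easiest-game, hardest-game, and ~80th-percentile numbers the graph needs, though real scaling is rarely perfectly linear across CPU generations.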