Well, it depends what you are measuring …
There have been many attempts to quantify whether brains or machines are faster, but the results tend to be apples and oranges. To really communicate the difference, we need some common ground for comparison. There are several ways to test processing speed, so let’s jump in.
For starters, we can look at something unfair to humans: arithmetic. A good fifth grader should be able to answer addition, subtraction, and multiplication problems at a rate of about one per second. A 1 GHz computer, like your old cellphone, can do a billion operations per second. So clearly conscious arithmetic is not our strong point.
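The back-of-envelope gap can be made concrete. This sketch just restates the article's two illustrative figures (one answer per second for the student, one operation per cycle for a 1 GHz chip); neither number is a real benchmark.

```python
# Rough arithmetic-throughput comparison using the article's estimates.
human_rate = 1              # answers per second (a practiced fifth grader)
cpu_rate = 1_000_000_000    # simple ops per second (~1 GHz, one op per cycle)

speedup = cpu_rate / human_rate
print(f"The CPU is roughly {speedup:,.0f}x faster at raw arithmetic.")

# How long would 100 practice problems take each of them?
problems = 100
print(f"Human: {problems / human_rate:.0f} s")
print(f"CPU:   {problems / cpu_rate:.1e} s")
```

Of course, real CPUs rarely sustain exactly one useful operation per cycle, and real students tire, so treat the ratio as an order-of-magnitude statement.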
However, let’s turn that number on its head: how many operations per second does the brain perform in the subconscious visual cortex? If you imagine a 3D figure and rotate, zoom, or transform it, you should find that it isn’t that hard and it all happens in real time. The graphics processing unit of the human imagination still puts high-end computer GPUs to shame. This is strange, though: there are only ~140 million neurons in each lobe of the visual cortex, while the latest processors carry over 30 billion transistors. Clearly our brains are more efficient at this sort of processing, but is it fair to compare hardware without attention to software?
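To see how lopsided the raw component counts are, we can divide the figures quoted above. This is only a count comparison, assuming both lobes of the visual cortex against a single chip; a neuron and a transistor are of course not equivalent units of computation.

```python
# Component-count comparison using the figures cited in the text.
neurons_per_lobe = 140_000_000   # ~neurons in each lobe of the visual cortex
transistors = 30_000_000_000     # ~transistors on a recent high-end processor

# Both lobes together versus one chip.
ratio = transistors / (2 * neurons_per_lobe)
print(f"The chip has roughly {ratio:.0f}x more transistors "
      f"than the visual cortex has neurons.")
```

Despite being outnumbered about a hundred to one in components, the cortex wins at imagery, which is the efficiency point the paragraph is making.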
So we see that humans are in fact running on out-of-date hardware, but it is far more efficient where it counts. Today’s computers, on the other hand, are beastly machines, but their software/hardware mix is inefficient and slow. The result is that humans still hold a significant processing advantage in areas we have already adapted to. In games that don’t come naturally to us, we are outmatched: in every deterministic game of historical significance, such as chess and Go, humans now play orders of magnitude less proficiently than modern AIs.