Assume that the speed of light (and the uncertainty principle, or the product of the Planck constant and the speed of light) is the processing-speed limit of the computer that simulates the universe.
Also assume that the universe has finite computational resources. Resources therefore have to be diminished in one location when they are in demand in another location.
Simulating a complex event, meaning one that is highly involved, very low in redundancy, fast-moving, and extensive (for example, a very large, very fast collision between two intricate structures; there are better examples), will place significant local load on the universe computer.
Given the two assumptions above, we have a measurable result:
Either the speed of light or the Planck constant will be diminished in that local region, or one of them, or their product, will be diminished in some other region.
If the diminishment is local (or within a testable neighbourhood), this can be tested.
If the diminishment is not in a testable neighbourhood, perhaps other experimental constructions will work.
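To make the resource argument concrete, here is a minimal toy sketch in Python of the two outcomes predicted above. Everything in it is assumed for illustration: the `Region` objects, the fixed cycle `budget`, and the two allocation policies (`throttle_locally`, which diminishes the limit at the site of the complex event, and `spread_globally`, which diminishes it everywhere else too) are stand-ins, not claims about how any actual simulator would be built.

```python
"""Toy model of the 'finite compute budget' argument above.

All names and numbers here are hypothetical illustrations, not claims
about how a real simulator (if any) would work.
"""

from dataclasses import dataclass


@dataclass
class Region:
    name: str
    demand: float  # cycles of computation this region's events require per tick


def effective_rate(regions: list[Region], budget: float, policy: str) -> dict[str, float]:
    """Return each region's fraction of demanded updates actually performed.

    A value of 1.0 means the region is simulated at full fidelity; anything
    lower stands in for a locally 'diminished' speed limit (c, or h*c).
    """
    total_demand = sum(r.demand for r in regions)
    if total_demand <= budget:
        return {r.name: 1.0 for r in regions}

    if policy == "throttle_locally":
        # Starve only the region responsible for the overload.
        hog = max(regions, key=lambda r: r.demand)
        spare = budget - (total_demand - hog.demand)
        return {r.name: (max(spare, 0.0) / hog.demand if r is hog else 1.0)
                for r in regions}

    # "spread_globally": scale every region down by the same factor,
    # so the diminishment shows up far from the complex event.
    scale = budget / total_demand
    return {r.name: scale for r in regions}


if __name__ == "__main__":
    regions = [Region("collision_site", 90.0), Region("quiet_lab", 10.0)]
    budget = 50.0  # finite resources (assumption 2)
    for policy in ("throttle_locally", "spread_globally"):
        print(policy, effective_rate(regions, budget, policy))
```

Running this shows the two cases in the prediction: under `throttle_locally` only the collision site runs at reduced fidelity, while under `spread_globally` the quiet region is slowed down even though nothing complex is happening there, which is the non-local case that would need a different experimental construction.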
This is a wholly unfounded assumption. If the universe is a simulation, there is no reason why any particular constant would be the processing-speed limit.
Indeed, even talking about a processing-speed limit is probably incoherent when considering whether the universe is a simulation. If our universe is a simulation, then some other entity is simulating it. We know absolutely nothing about that entity's world, and have no grounds to conclude or assume anything whatsoever about the machinery of the simulator.
The egregious mistake in this "test" is the implicit assumption that the machinery some entity (in a universe we know nothing about) would use to simulate our universe would be a computer of the same basic architecture as the one sitting in your bedroom.