Hmm…
So that's cool, but now I'm thinking: the distant galaxies are redshifted and time-dilated in equal proportion, and also more densely packed because the universe was smaller in the past, so I expect the actual rate of supernovas we observe to be significantly smaller than what you get by simply multiplying 1/century/galaxy by 1e11 galaxies (rough scaling sketch below).
Edit: also I don't know whether the rate of supernovas changes over cosmic history, thanks to the different stellar environments that produced the Population I/II/III generations of stars…
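
On the time-dilation point, here's a minimal sketch (my own illustrative redshifts, assuming only the standard (1+z) cosmological time-dilation factor) of how the observed event rate scales:

    # How cosmological time dilation alone changes the *observed* rate of
    # events happening at redshift z: clocks there appear stretched by (1+z),
    # so an intrinsic rate R is seen from here as R / (1 + z).
    def observed_rate(intrinsic_rate_per_century, z):
        return intrinsic_rate_per_century / (1.0 + z)

    for z in (0.5, 1.0, 2.0):  # illustrative redshifts, not from the thread
        print(f"z={z}: {observed_rate(1.0, z):.2f} per century observed")
    # e.g. a z=1 galaxy with an intrinsic 1/century rate only looks like 0.5/century to us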
I would imagine the supernova rate was higher in the early universe: we've already passed the peak of stellar formation, and the heavier (and shorter-lived) stars were more likely to form earlier, when the average density of the universe was higher.
It probably isn't wildly lower today; we know of at least five or six big supernovae in the Milky Way in the past millennium. For the Milky Way's ~200B stars, the size-normalized rate implied by that (scaled down to a typical ~100B-star galaxy) would be something like one every 300 years. So if you extrapolated from the Milky Way alone in (cosmological) modernity you would get 10/sec, not 30/sec.
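
Spelling out that arithmetic (a quick sketch; reading "size-normalized" as scaling the Milky Way's 5-6 per millennium down to a typical ~100B-star galaxy is my interpretation of the comment):

    SECONDS_PER_YEAR = 3.15e7
    N_GALAXIES = 1e11

    # Textbook-style figure: ~1 supernova per century per galaxy
    textbook_rate = 1 / 100  # per galaxy per year
    print(N_GALAXIES * textbook_rate / SECONDS_PER_YEAR)  # ~30 per second across the observable universe

    # Milky Way record: ~5-6 seen in the last millennium among ~200B stars,
    # i.e. roughly one every ~300 years when scaled to a ~100B-star galaxy
    milky_way_rate = 1 / 300  # per galaxy per year
    print(N_GALAXIES * milky_way_rate / SECONDS_PER_YEAR)  # ~10 per second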
There is dust between us and most stars in the Milky Way that blocks them from view in visible light, so we can only see a fraction of the supernovae that go off in our own galaxy.
It is substantially easier for us to see supernovae in other galaxies, especially ones we don't see edge-on, and we have a large sample of such galaxies. That's why our best estimates of supernova frequency are based on observations of other galaxies rather than on our observations of the Milky Way.