I've got the following set up on my Firebase web app (it's a Single Page App built with React):
firebase.json

"rewrites": [{
  "source": "/**",
  "function": "ssrApp"
}]
Basically every request should go into my ssrApp function, which detects crawler user agents and decides whether to respond with the SSR version for robots or the regular JS version for everyone else.
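Simplified, the function looks something like this (a rough sketch of my setup; the BOT_UA regex and the renderSsr()/SPA_SHELL placeholders stand in for my real renderer and client shell):

const functions = require('firebase-functions');

// Very loose crawler detection; the real list is longer.
const BOT_UA = /googlebot|bingbot|whatsapp|facebookexternalhit|twitterbot|linkedinbot/i;

// Placeholder for the real SSR renderer (e.g. ReactDOMServer.renderToString).
async function renderSsr(path) {
  return `<!doctype html><html><body><h1>SSR for ${path}</h1></body></html>`;
}

// Placeholder for the client-side shell normally served as index.html.
const SPA_SHELL =
  '<!doctype html><html><body><div id="root"></div><script src="/bundle.js"></script></body></html>';

exports.ssrApp = functions.https.onRequest(async (req, res) => {
  const userAgent = req.get('user-agent') || '';
  console.log('User-Agent:', userAgent); // this is what shows up in the Firebase Console logs

  if (BOT_UA.test(userAgent)) {
    // Crawler: respond with the server-rendered HTML
    res.status(200).send(await renderSsr(req.path));
  } else {
    // Regular user: respond with the plain SPA shell
    res.status(200).send(SPA_SHELL);
  }
});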
It is working as intended. Google is indexing my pages, and I always log some info about the user agents from my ssrApp function. For example, when I share a URL on WhatsApp, I can see the WhatsApp crawler in my logs in the Firebase Console.
But the weird thing is that I'm not able to mimic Googlebot using Chrome's Network Conditions tab. When I try to access my site with Googlebot's user agent, I get a 500 - Internal error, and my ssrApp function isn't even triggered, since NOTHING is logged from it.
Is this some built-in Firebase Hosting protection against spoofed Googlebots? What could be happening?
NOTE: I'm trying to mimic Googlebot's user agent because I want to inspect the SSR version of my app in production. I know there are other ways to do that (including some Google Search Console tools), but I thought this would work.
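For example, I expected a plain request with a spoofed User-Agent to return the SSR HTML (Node 18+ with the global fetch; the URL below is just a placeholder for my production domain):

// test-ssr.mjs - fetch a page while pretending to be Googlebot
const GOOGLEBOT_UA =
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';

const res = await fetch('https://your-app.web.app/some-page', {
  headers: { 'User-Agent': GOOGLEBOT_UA },
});

console.log(res.status);       // expecting 200
console.log(await res.text()); // expecting the server-rendered HTML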
Could you check that your pages are still in the Google index? I have the exact same experience, and 80% of my pages are now gone... When I look up a page in Google Search Console (https://search.google.com/search-console), it indicates there was an issue during the last crawl. When I use "Test it Live", it spins, reports the same 500 error, and asks me to "try again later"...