I have an SPA built with create-react-app, and I want a robots.txt served at:
http://example.com/robots.txt
I see on this page that:
You need to make sure your server is configured to catch any URL after it's configured to serve from a directory.
But for firebase hosting, I'm not sure what to do.
In my /public directory, I created a robots.txt.
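For reference, a minimal robots.txt that allows all crawlers might look like this (adjust the rules to your own site):

```text
# Allow every crawler to index the whole site
User-agent: *
Disallow:
```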
In my /src directory, I did the following:
I created /src/index.js:
import React from 'react'
import ReactDOM from 'react-dom'
import {TopApp} from './TopApp'
import registerServiceWorker from './registerServiceWorker'
import {BrowserRouter} from 'react-router-dom'
ReactDOM.render(
  <BrowserRouter>
    <TopApp/>
  </BrowserRouter>,
  document.getElementById('react-render-root')
)
registerServiceWorker()
I created /src/TopApp.js:
import React from 'react'
import {
  Switch,
  Route
} from 'react-router-dom'
import {ComingSoon} from './ComingSoon'
import {App} from './App'

export class TopApp extends React.Component {
  render() {
    return (
      <div className="TopApp">
        <Switch>
          <Route path='/MyStuff' component={App}/>
          <Route exact path='/' component={ComingSoon}/>
        </Switch>
      </div>
    )
  }
}
Because the path /robots.txt is not matched by any of the router's routes, Firebase served the file straight from my public directory, and the robots.txt file was published as desired.
The same could be done for sitemap.xml.
Just add the following rules to the "rewrites" section of firebase.json:
"rewrites": [
  {
    "source": "/robots.txt",
    "destination": "/robots.txt"
  },
  {
    "source": "**",
    "destination": "/index.html"
  }
]
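For context, the rules above live inside the "hosting" section of firebase.json. A minimal sketch, assuming the default "build" output directory that create-react-app produces, might look like:

```json
{
  "hosting": {
    "public": "build",
    "rewrites": [
      {
        "source": "/robots.txt",
        "destination": "/robots.txt"
      },
      {
        "source": "**",
        "destination": "/index.html"
      }
    ]
  }
}
```

Note that Firebase Hosting applies the first matching rewrite, so the /robots.txt rule must appear before the ** catch-all; it also serves existing static files before consulting rewrites at all, which is why a robots.txt deployed in the hosting directory is reachable even under an SPA catch-all.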