The robots.txt file tells search engines what they can and cannot crawl on your website. It’s a simple text file, but it plays a significant role in SEO. If you want to control how crawlers see your site, you can manually edit or overwrite this file.
By default, WordPress serves a virtual robots.txt file; no physical file exists on the server. You can create your own file and upload it to your site, and it will take priority over the virtual one. In this guide, I’ll show you how to overwrite the robots.txt file in WordPress manually.
Step 1: Check If You Already Have a Robots.txt File
First, you need to check if there is an existing file.
How to Check:
- Open your browser
- Go to: https://yourdomain.com/robots.txt
- If you see rules, a robots.txt is already being served (most likely WordPress’s virtual file).
- If you see a blank page or an error, there is no physical file at all.
Either way, you can overwrite it by creating your own version.
Step 2: Create a New Robots.txt File
You can create the file using any plain-text editor, such as Notepad (Windows) or TextEdit (Mac; switch it to plain-text mode first).
Steps:
1. Open your text editor
2. Add this sample code or write your own:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourdomain.com/sitemap.xml
3. Click Save As
4. Name the file robots.txt
5. Make sure the file type is set to All Files so it isn’t saved as robots.txt.txt
Your file is ready to upload.
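Before uploading, you can sanity-check the sample rules with Python’s built-in urllib.robotparser (a quick offline sketch; yourdomain.com is a placeholder). One caveat: Python applies rules in the order they appear, while Google uses the most specific matching rule, so the Allow exception for admin-ajax.php is best verified in Search Console rather than here.

```python
from urllib.robotparser import RobotFileParser

# The sample rules from Step 2 (the Sitemap line is stored separately
# by the parser and does not affect crawl permissions).
rules = """User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Regular pages stay crawlable; the admin area is blocked.
print(rp.can_fetch("*", "https://yourdomain.com/blog/post/"))             # True
print(rp.can_fetch("*", "https://yourdomain.com/wp-admin/settings.php"))  # False
```

If both checks print the expected values, the file is behaving as intended for normal pages and the admin area.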
Step 3: Upload Robots.txt to Your WordPress Site
Now, you need to upload the file to your site’s root directory.
How to Upload via File Manager (cPanel):
- Log in to your hosting panel (like cPanel)
- Open File Manager
- Go to the public_html folder
- Click Upload
- Select your robots.txt file and upload it
- Confirm it appears in the root folder
How to Upload via FTP (FileZilla):
- Open FileZilla or any FTP client
- Connect to your site using your FTP login details
- Navigate to the public_html folder
- Drag and drop the robots.txt file into the folder
- Wait for the upload to complete
Now, your manual file has overwritten the virtual one.
Step 4: Check If the File Is Live
After uploading, you need to make sure it’s working.
Steps to Check:
- Open your browser
- Go to: https://yourdomain.com/robots.txt
- You should now see your custom rules.
- If not, try clearing your browser cache or wait a few minutes.
Step 5: Edit Robots.txt Anytime You Want
If you want to make changes later, edit the file and re-upload it with the new rules.
Block a folder:
Disallow: /private-folder/
Allow a specific bot:
User-agent: Googlebot
Allow: /
Block all bots:
User-agent: *
Disallow: /
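If you’d like to confirm what each of these variants does before uploading, the same built-in Python parser can check them (a sketch; the domain and paths are placeholders):

```python
from urllib.robotparser import RobotFileParser

def crawlable(rules: str, agent: str, url: str) -> bool:
    """Parse a robots.txt string and ask whether `agent` may fetch `url`."""
    rp = RobotFileParser()
    rp.parse(rules.splitlines())
    return rp.can_fetch(agent, url)

# The three variants from Step 5.
block_folder = "User-agent: *\nDisallow: /private-folder/\n"
block_all = "User-agent: *\nDisallow: /\n"
googlebot_only = "User-agent: Googlebot\nAllow: /\n\nUser-agent: *\nDisallow: /\n"

print(crawlable(block_folder, "Googlebot", "https://yourdomain.com/private-folder/x"))  # False
print(crawlable(block_folder, "Googlebot", "https://yourdomain.com/blog/"))             # True
print(crawlable(block_all, "Googlebot", "https://yourdomain.com/blog/"))                # False
print(crawlable(googlebot_only, "Googlebot", "https://yourdomain.com/blog/"))           # True
print(crawlable(googlebot_only, "Bingbot", "https://yourdomain.com/blog/"))             # False
```

Note how the per-bot variant pairs the Googlebot rule with a catch-all block, so only the named bot gets through.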
Just save and upload the updated file using the same steps.
Step 6: Avoid These Common Mistakes
Here are a few things to avoid when editing robots.txt:
- Don’t block search engines from important content
- Don’t forget to include your sitemap URL with a Sitemap: line
- Watch your capitalization – paths in rules are case-sensitive, so /Blog/ and /blog/ are different
- Don’t confuse a blank Disallow: line (which allows everything) with Disallow: / (which blocks everything)
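The case-sensitivity pitfall is easy to demonstrate with Python’s built-in parser (the /Private/ path is a made-up example):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /Private/"])

# Paths in robots.txt rules are case-sensitive: only the exact
# capitalization used in the rule is blocked.
print(rp.can_fetch("*", "https://yourdomain.com/Private/file"))  # False (blocked)
print(rp.can_fetch("*", "https://yourdomain.com/private/file"))  # True (not matched)
```

If your WordPress URLs are lowercase, keep the paths in robots.txt lowercase too.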
Use Google Search Console to Test the File
You can test your robots.txt file with Google Search Console.
Steps:
- Go to Google Search Console
- Choose your property (your domain)
- Open Settings, then the robots.txt report (Google retired the old robots.txt Tester tool in late 2023)
- Review the fetched file and any errors or warnings
- Fix any issues in your file and re-upload it
This helps you make sure search engines can crawl your site correctly.
Conclusion
You now know how to overwrite the robots.txt file in WordPress manually.
- First, check if you already have one.
- Create a new file with clear rules.
- Upload it to the root folder using File Manager or FTP.
- Test the file to make sure it works.
- Edit and re-upload anytime you want.
This simple file gives you complete control over how bots and search engines interact with your site. Use it wisely to boost your SEO and hide sensitive content from crawlers.