It will crawl us at scale.
It's obviously a bot, rather than a human pretending to be a bot.
It will crawl at different times of the day.
This means that:
- A smart website may be able to distinguish between us and Googlebot.
- Because Googlebot puts more pressure on our web servers, they may behave differently when it visits. When websites are hit by too many bots or visitors at once, they may take certain actions to keep the website online: they may turn on more computers to power the website (this is called scaling), they may limit the rate at which users can request pages, or they may serve reduced versions of pages (see the sketch after this list).
- Servers run tasks periodically. For example, a listings website might run a task every day at 01:00 to purge all of its old listings, which can impact server performance.
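To probe that second point, something like the following minimal sketch can help. It is written in Python against a hypothetical staging URL: it fires a small burst of concurrent requests and reports status codes and response sizes, so rate limiting (429/503 responses) or reduced pages become visible.

```python
# Minimal load probe: fire a small burst of concurrent requests and report
# status codes and response sizes. Rate limiting tends to show up as
# 429/503 responses; "reduced" pages show up as smaller response sizes.
import concurrent.futures

import requests

URL = "https://staging.example.com/"  # hypothetical; probe a staging copy, not production
BURST_SIZE = 20

def fetch(_):
    resp = requests.get(URL, timeout=10)
    return resp.status_code, len(resp.content)

with concurrent.futures.ThreadPoolExecutor(max_workers=BURST_SIZE) as pool:
    for status, size in pool.map(fetch, range(BURST_SIZE)):
        print(f"status={status} bytes={size}")
```

Run it against a staging copy where you can: deliberately bursting a production site can trigger the very protections you are trying to observe.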
It can be hard to work out what's going on with these intermittent effects. You'll probably need to talk to a backend developer.
Depending on your skill level, you may not know where to take the discussion. A useful structure is often to talk through how a request moves through your technology stack, and then to look at the edge cases we discussed above:
What happens to servers under heavy load?
When are important scheduled tasks due?
Two useful pieces of information to bring into this conversation:
Depending on how regularly the problem appears in the logs, it is often worth trying to reproduce it by crawling the website yourself at the same speed/intensity that Google is using, to see if you can find or cause the same problems. This won't always be possible, depending on the size of the site, but it will be for some. Being able to consistently reproduce a problem is the best way to get it resolved.
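A minimal sketch of such a paced crawl in Python, assuming a hypothetical URL list and a request rate estimated from your own logs:

```python
# Paced crawl: fetch each URL at a steady rate that mimics what the logs
# show Googlebot doing, and print status codes and timings as we go.
import time

import requests

URLS = [  # hypothetical; in practice, feed in URLs sampled from your logs or sitemap
    "https://www.example.com/page-1",
    "https://www.example.com/page-2",
]
REQUESTS_PER_SECOND = 5  # assumption: match the rate your logs show for Googlebot

for url in URLS:
    started = time.monotonic()
    try:
        resp = requests.get(url, timeout=10)
        print(f"{resp.status_code} {url} in {resp.elapsed.total_seconds():.2f}s")
    except requests.RequestException as exc:
        print(f"ERROR {url}: {exc}")
    # Sleep off the remainder of this request's time slot to hold the rate steady.
    time.sleep(max(0.0, 1.0 / REQUESTS_PER_SECOND - (time.monotonic() - started)))
```

It's the sustained pressure rather than any single request that tends to surface these problems, which is why holding the rate steady matters.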
If you can't reproduce the problem, try to provide the exact times when Googlebot was seeing the issues. This will give the developer the best chance to tie the issue to other logs so they can debug what's going on.
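If the server logs are in the widespread combined format, a short script can pull those timestamps out. This is a sketch with an assumed log path, keeping only lines where a Googlebot user-agent received a 5xx response:

```python
# Pull timestamps of Googlebot errors out of an access log in the
# combined format: IP - - [time] "request" status size "referer" "user-agent"
import re

LOG_PATH = "/var/log/nginx/access.log"  # assumed path; adjust for your setup
PATTERN = re.compile(r'\[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" (?P<status>\d{3})')

with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = PATTERN.search(line)
        if match and match.group("status").startswith("5"):
            print(match.group("time"), match.group("status"), match.group("request"))
```

Bear in mind that the user-agent string can be spoofed, so treat this as a first pass for finding candidate timestamps rather than proof of genuine Googlebot traffic.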
If our website is blocking Googlebot