Imagine you are looking for cooking recipes on a website. An unfriendly URL is typically a long string of parameters and ID numbers, which doesn't offer much information about the content of the page. A friendly URL for the same content, on the other hand, could be:
"Ejemplo.com/recetas/paella-de-marisco"
Friendly URLs are web addresses designed to be easy to read and understand. Instead of being long and full of strange characters, they are structured in a clear, descriptive way that makes them easier to interpret. In other words, we move from what could be technical URLs to logical, user-friendly ones.
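In practice, a CMS usually builds these friendly URLs automatically by "slugifying" the page title. Here is a minimal sketch of that idea in Python (the `slugify` helper is our own illustration, not any particular CMS's implementation):

```python
import re
import unicodedata

def slugify(title: str) -> str:
    """Turn a page title into a friendly URL slug."""
    # Strip accents so "fáciles" becomes "faciles"
    normalized = unicodedata.normalize("NFKD", title)
    ascii_text = normalized.encode("ascii", "ignore").decode("ascii")
    # Lowercase and replace runs of non-alphanumeric characters with hyphens
    slug = re.sub(r"[^a-z0-9]+", "-", ascii_text.lower()).strip("-")
    return slug

print(slugify("Paella de Marisco"))  # paella-de-marisco
```

The result can then be appended to a category path, e.g. `recetas/` + `paella-de-marisco`, to form the friendly URL shown above.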
What are the advantages of friendly URLs?
They help avoid errors: Friendly URLs, although not mandatory, are easier to work with and leave less room for confusion.
They improve the user experience: Descriptive URLs help users understand what a page is about before clicking the link, allowing them to make more informed decisions about whether the content is relevant to them.
They are easier to share and remember: Friendly URLs are easier to share on social networks and other media, since they have a logical structure and are simpler to copy elsewhere.

And how does Google crawl and index those URLs?
Google crawls and indexes links through Googlebot, robots also known as spiders. They crawl the Internet, find pages and save essential data to form Google's index. Note that these bots do not always have the same objective. Googlebot spiders have two main jobs:
Discover URLs
They do this through links, which are the key to crawling and indexing. Every time a bot reaches a page through a link, it looks not only at the content but also at the other links the page contains, saving them (or not, as we will see in Part II) in a list we know as the crawl queue. Which brings us to the second job.
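The crawl-queue idea described above is essentially a breadth-first traversal: pop a URL from the queue, record it, and enqueue any links not seen before. A minimal sketch in Python, using a small hypothetical set of pages in place of the real web (Googlebot's actual scheduling is far more sophisticated):

```python
from collections import deque
from urllib.parse import urljoin

# A tiny in-memory "web" standing in for real pages (hypothetical data)
PAGES = {
    "https://ejemplo.com/": ["recetas/", "contacto/"],
    "https://ejemplo.com/recetas/": ["paella-de-marisco"],
    "https://ejemplo.com/contacto/": [],
    "https://ejemplo.com/recetas/paella-de-marisco": [],
}

def crawl(start_url: str) -> list[str]:
    """Discover URLs breadth-first, keeping unvisited links in a crawl queue."""
    queue = deque([start_url])   # the "crawl queue"
    seen = {start_url}
    visited_order = []
    while queue:
        url = queue.popleft()
        visited_order.append(url)
        for link in PAGES.get(url, []):
            absolute = urljoin(url, link)  # resolve relative links
            if absolute not in seen:       # only enqueue URLs we haven't seen
                seen.add(absolute)
                queue.append(absolute)
    return visited_order

print(crawl("https://ejemplo.com/"))
```

The `seen` set is what decides whether a discovered link joins the queue at all, which is the filtering step ("or not") we will return to in Part II.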