Two of the important things that search engines evaluate
when assessing your site’s “search relevancy” are:
- How many other sites link to your content.  Search engines assume that if a lot of people around the web are linking to your content, then it is probably useful, and they weight it higher in relevancy rankings.
- The uniqueness of the content on your site.  If search engines find that the same content is duplicated in multiple places around the Internet (or on multiple URLs within your site), they are likely to lower its relevancy.
One thing you want to be very careful to avoid when building public-facing sites is allowing different URLs to retrieve the same content (for example, http://example.com/, http://www.example.com/, and http://www.example.com/index.html all returning the same page). Doing so hurts you on both counts above.
In particular, if external sites can link to the same content through multiple URLs, your inbound-link count and page rank will be split across those different URLs, leaving each one with a smaller page rank than a single canonical URL would have earned. Preventing external sites from linking to you in different ways sounds easy in theory, but you might wonder what exactly this means in practice and how you avoid it.