MSN’s Live Search is a little more confusing, because its support for wildcards in robots.txt files is unclear. Based on their docs, wildcards appear to be supported; for example, these are valid robots.txt rules for MSN’s Live.com:
User-agent: msnbot
Disallow: /*.PDF$
Disallow: /*.jpeg$
Disallow: /*.exe$
However, “MSNdude” recently stated on WebmasterWorld that “Live Search does not support wildcards in robots.txt today; we are thinking about it.”
An asterisk that stands in for a set of characters is a wildcard by definition, so this statement directly contradicts their documented rules.
I think that wildcards should be added to the robots.txt standard. Wildcards are essential for blocking whole classes of dynamic URLs with a single rule, which is impossible with plain path prefixes. The original robots.txt standard should be updated, and MSN should fully jump on board.
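To make the semantics concrete, here is a small sketch (my own illustration, not any search engine’s actual implementation) of how a wildcard rule like those above might be interpreted: “*” matches any run of characters, and a trailing “$” anchors the rule to the end of the URL path.

```python
import re

def wildcard_rule_to_regex(rule: str) -> re.Pattern:
    """Translate a robots.txt wildcard rule (e.g. "/*.PDF$") into a
    regular expression. "*" matches any run of characters; a trailing
    "$" anchors the match to the end of the URL. Matching is
    case-sensitive, as robots.txt paths conventionally are."""
    anchored = rule.endswith("$")
    if anchored:
        rule = rule[:-1]
    # Escape regex metacharacters in the rule, then restore "*" as ".*"
    pattern = re.escape(rule).replace(r"\*", ".*")
    return re.compile("^" + pattern + ("$" if anchored else ""))

pat = wildcard_rule_to_regex("/*.PDF$")
print(bool(pat.match("/docs/report.PDF")))      # True  — blocked
print(bool(pat.match("/docs/report.PDF?x=1")))  # False — "$" anchors the rule
```

Note how the “$” anchor matters: without it, the rule would also block URLs where “.PDF” appears mid-path, such as dynamic URLs with query strings appended.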