In particular, you should notice that the syntax there involves some intrasymbol syntax, which it expects to be visible to the macro.
Specifically, if you make intrasymbol syntax part of the reader, you get the following potential problem:
intrasymbol syntax will have to be completely regular across the entire Arc environment. You can't do tricks like I did in 'w/html, where div.bar doesn't mean (div bar) but rather means "the <div> element with the bar id". Yes, you could probably modify w/html so that it understands (div bar): but what if the programmer wants to, say, redefine the intrasymbol syntax foo#bar to mean (en-number foo bar)? Then suddenly w/html will break. And what if programmer B wants to redefine foo#bar to mean (number-en foo bar) instead? How will anything that expects a slightly different intrasymbol syntax work with that?
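A sketch of that clash in Arc-flavored pseudocode (def-ssyntax is a hypothetical redefinition hook; en-number and number-en are the made-up names from above):

```lisp
; hypothetical: library A claims #\# as intrasymbol syntax
(def-ssyntax #\# (l r) `(en-number ,l ,r))   ; foo#bar => (en-number foo bar)

; hypothetical: library B, loaded later, silently clobbers it
(def-ssyntax #\# (l r) `(number-en ,l ,r))   ; foo#bar => (number-en foo bar)

; every macro written against A's reading of foo#bar now sees B's
; expansion instead -- and nothing signals an error at load time
```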
Programmer A decides he or she wants to specially treat #\@. Programmer B decides he or she doesn't. Now, load Programmer B's code into Programmer A's environment. Oh, and Programmer B has been writing a lot of functions with "@" in their names.
If you're not going to allow #\@ to be specially treated, why should you specially treat #\., #\!, #\~ or #\: ?
#\' and friends, after all, aren't intrasymbol syntax. In fact, #\. is treated differently within the context of a symbol than within the context of a list.
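For instance, in Arc the very same character reads differently depending on where it sits:

```lisp
(a . b)   ; in a list, the dot is dotted-pair notation: a cons cell
a.b       ; inside a symbol, ssyntax expands it to the application (a b)
```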
This is where "code is spec" fails badly. Me, I say someone has to choose one or the other, define it as "this is spec!", and have everyone follow it. Your move, PG?
If the reader can be configured (e.g. by specifying which read table to use), then two modules that use different reading conventions can coexist, each simply using its own configuration.
Now programmer C wants to use both programmer A's module and programmer B's module. Which readtable does he use so that he can freely intermix macros from A with macros from B, which have different expectations on the reader?
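In Common Lisp terms the trouble is that *readtable* is a single dynamic variable, so while C's file is being read only one convention can be active at a time (a sketch; the #\@ handler body is made up):

```lisp
(defvar *readtable-a* (copy-readtable nil))
(defvar *readtable-b* (copy-readtable nil))

;; A treats #\@ specially; B leaves it an ordinary symbol character
(set-macro-character #\@
                     (lambda (stream char)
                       (declare (ignore char))
                       `(at ,(read stream t nil t)))
                     nil *readtable-a*)

;; C must pick one table per file: under A's table, code that spells
;; symbols with "@" (B's style) no longer reads as B intended
(let ((*readtable* *readtable-a*))
  (load "module-b.lisp"))
```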
Reader hacking is nice, but I don't see it often in CL libraries (note: counterexamples are welcome; it's not like I've made an exhaustive search for them). Any reader hack must make the cut of being good and generic enough that everyone will use it; take for example the Arc-style [ ... _ ... ] syntax.
CLSQL modifies the read table to let you write embedded SQL queries such as [select "A" [where [= ...]]] and similar (I've never studied the exact syntax, but this should give you the idea). The special reader in CLSQL can be activated/deactivated through function calls that modify the default reader.
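For reference, these are CLSQL's actual toggles (the query follows the style of CLSQL's own documentation):

```lisp
(clsql:locally-enable-sql-reader-syntax)   ; turn [...] on, saving prior state

(clsql:select [first-name] :from [employee]
              :where [= [emplid] 1])

(clsql:restore-sql-reader-syntax-state)    ; put the readtable back
```

There are also clsql:enable-sql-reader-syntax and clsql:disable-sql-reader-syntax for flipping the global default rather than saving and restoring it.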
It looks like CLSQL needs reader macros to switch the syntax on and off locally. If Arc had reader macros, then you could do this:
#.(with-A (mac macro-A ..blah..blah..in special A syntax))
Assuming 'with-A is a function that sets the read table locally, and macro-A uses quasiquote to generate its result, this will produce a macro that expands to standard Arc syntax, even though it's written in A syntax.
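Spelled out in Common Lisp flavor (with-a and the A readtable are hypothetical, and the mechanics of with-a getting its body read under A's readtable are hand-waved here just as above; only #. and quasiquote are standard):

```lisp
;; read-time: with-a is assumed to arrange for its body to be read
;; under A's readtable while this form is processed
#.(with-a
    (defmacro macro-a (x)
      `(+ ,x 1)))   ; body written in A syntax, expansion built by quasiquote

;; later, from any module, under any readtable:
(macro-a 3)         ; expands to (+ 3 1) -- plain, portable syntax
```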
With reader macros, 'w/html could be implemented even if de-sugaring were moved to the reader, although you'd have to call it with #. all the time.
It makes sense to me that macros should always expand to vanilla Arc syntax (or maybe even pure s-exps without any ssyntax) so that they are portable across environments.