Jacob Fugal wrote:

>Hence my disclaimer. :) I'm just demonstrating that it's possible,
>making it efficient will be a big project.

It'd be something interesting to see at any rate, even if it weren't
as efficient as possible immediately. It's another of those things I'm
not sure I'm up to programming myself though. ^_^;

As I understand it, each color would have its own bit, right?

So for ANSI color that'd be 8 bits for the basic colors, plus a bit each for
bold, italic, underline, and strikethrough. I'm not sure what else other
protocols might need, which is probably one of the things complicating
this...
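To make the bit-per-attribute idea concrete, here's a minimal sketch in Python. Everything here is illustrative: the bit assignments, the constant names, and the `describe` helper are all made up for the example, not taken from any existing MUD library.

```python
# One bit per attribute, as described above: 8 foreground colors
# plus four style flags. The particular bit positions are arbitrary.
FG_BLACK, FG_RED, FG_GREEN, FG_YELLOW, FG_BLUE, FG_MAGENTA, FG_CYAN, FG_WHITE = (
    1 << i for i in range(8)
)
BOLD, ITALIC, UNDERLINE, STRIKETHROUGH = (1 << i for i in range(8, 12))

_NAMES = {
    FG_BLACK: "black", FG_RED: "red", FG_GREEN: "green",
    FG_YELLOW: "yellow", FG_BLUE: "blue", FG_MAGENTA: "magenta",
    FG_CYAN: "cyan", FG_WHITE: "white",
    BOLD: "bold", ITALIC: "italic",
    UNDERLINE: "underline", STRIKETHROUGH: "strikethrough",
}

def describe(state: int) -> list[str]:
    """Return the names of all attributes set in `state`."""
    return [name for bit, name in _NAMES.items() if state & bit]

state = FG_BLUE | BOLD   # "bold blue text"
state &= ~BOLD           # clear bold, keep the color bit
```

The nice property of this representation is that checking or changing any single attribute is one bitwise operation, regardless of how many attributes a protocol ends up defining.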

A user might also find it desirable to define their own coding to be put
into the text, to simplify later use. (As in, why shouldn't I be able to
tell it "when it changes to bold blue text, send me this" and so forth,
instead of it sending me some other set of tags that I then have to figure
out how to parse?)
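That "when it changes to bold blue text, send me this" idea could be sketched as a user-supplied rule table mapping attribute states to replacement tags. This is purely a sketch under my own assumptions; the rule format, the `(state, text)` event shape, and all names are hypothetical.

```python
def make_translator(rules: dict[int, str]):
    """Build a translator from a user-supplied map of
    attribute bitmask -> tag to emit when that state begins."""
    def translate(events):
        # events: a sequence of (new_state, text) pairs, one per
        # span of text that shares a single attribute state.
        out = []
        for state, text in events:
            out.append(rules.get(state, "") + text)
        return "".join(out)
    return translate

BOLD, BLUE = 1 << 0, 1 << 1   # toy bit assignments for this example

translate = make_translator({BOLD | BLUE: "<hilite>"})
result = translate([(0, "plain "), (BOLD | BLUE, "bold blue")])
```

Here `result` would be `"plain <hilite>bold blue"` — the user gets their own tag instead of having to parse whatever markup the decoder would otherwise emit.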

For that matter, I'm wondering just how configurable it could be made
without impacting speed. After all, such a thing could have uses other
than this MUD stuff...

> > At least, the behavior I would expect is to preserve the existing
> > coloration. (Of course, then I specifically picked an example where
> > there's no obvious sensible way to do that. Still, I think keeping
> > the color would generally be expected.)
>
>Well, one possible workaround for this is to pass initial state when
>decoding raw ASCII. However, as you state, this example is ambiguous.
>Assume we replaced the shorter substring "d and bl" with the raw
>"ostrich", using this technique. We'd end up with something like:

Doing it that way would make more sense, I think. In most sensible
applications that's probably the behavior you'd want, and as for
insensible applications... well, you can only protect the user so much.

-Morgan 
