Hi,

At Wed, 18 May 2005 03:30:31 +0900,
Jonathan Paisley wrote in [ruby-talk:142938]:
> > Is this function called each time a .so is loaded, and the
> > area pointed to by _NSGetArgv() shouldn't be changed?
> > 
> > I suspect that system dependent initialization of arguments
> > should be integrated.
> 
> The _NSGetArgv function would appear to be a private function (inferring 
> this from the leading underscore), so perhaps it'd be inappropriate to 
> use it in the ruby core. Having said that, it is declared in <crt_externs.h>...
> 
> Perhaps an alternative would be to modify set_arg0() to change the
> non-first argument to be empty C strings ("") rather than NULL?

Then wouldn't "-AppleLanguages" disappear?

> More below:
> 
> > +#if defined(__APPLE__) && (defined(__MACH__) || defined(__DARWIN__)) && !defined(__MacOS_X__)
> 
> What is the purpose of the !defined(__MacOS_X__) ?

Just copied from process.c.

> > +	int i, n = *argc, len = 0;
> > +	char **v1 = *argv, **v2 = ALLOC_N(char*, n + 1);;
> > +	for (i = 0; i < n; ++i) {
> > +	    *v2[i] = strdup(*v1[i]);
> 
> I think the above line should be:
>             
>             v2[i] = strdup(v1[i]);

Yes, of course.

What about this?

void
ruby_sysinit(argc, argv)
    int *argc;
    char ***argv;
{
#if defined(__APPLE__) && (defined(__MACH__) || defined(__DARWIN__))
    int i, n = *argc;
    char **v1 = *argv, **v2 = ALLOC_N(char*, n + 1);
    /* keep the original string pointers in a fresh vector... */
    MEMCPY(v2, v1, char*, n);
    v2[n] = 0;
    /* ...and leave duplicates in the area _NSGetArgv() points to */
    for (i = 0; i < n; ++i) {
	v1[i] = strdup(v1[i]);
    }
    *argv = v2;
#elif defined(__MACOS__) && defined(__MWERKS__)
    *argc = ccommand(argv);
#endif
}

-- 
Nobu Nakada