> include every single Unicode character?
Masamichi - Gavin means "all". The vast majority of fonts cover basic
European characters. That's not the issue.
Gavin - there are a few fonts that "aim to" include every character,
though none actually does. Here's a page with some basic info:
http://unix.st
> For example, if you want to use Japanese characters,
> I think that it is possible to set the Japanese font in txi-ja.tex.
To reiterate: as far as I know, it is not possible to set the font for
Japanese only in texinfo[.tex]. Thus the ja font, wherever it is
specified, would be used for the whole document.
On 15 January 2016 at 18:13, Masamichi HOSODA wrote:
>> the following is created in the output auxiliary table of contents file:
>>
>> @numchapentry{f@"ur}{1}{}{1}
>>
>> Without it, it would be
>>
>> @numchapentry{für}{1}{}{1}
>>
>> Do you understand now how changing the active definitions can change
>> what's written to the output files?
> the following is created in the output auxiliary table of contents file:
>
> @numchapentry{f@"ur}{1}{}{1}
>
> Without it, it would be
>
> @numchapentry{für}{1}{}{1}
>
> Do you understand now how changing the active definitions can change
> what's written to the output files?
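The mechanism described above (active definitions expanding bytes into ASCII sequences when an auxiliary file is written) can be sketched in plain TeX. This is only an illustration under assumed names; it is not the actual texinfo.tex code, and the file name and byte chosen are hypothetical:

```tex
% Sketch only: make byte 0xFC (Latin-1 "u-umlaut") active and define it
% to expand to the ASCII Texinfo sequence @"u.  When the byte is written
% to an auxiliary file, the active definition expands, so the umlaut
% comes out as @"u, plain ASCII that a later pass can read back.
\catcode`\^^fc=\active
\def^^fc{@"u}
\newwrite\tocfile
\immediate\openout\tocfile=\jobname.toc
\immediate\write\tocfile{@numchapentry{f^^fcr}{1}{}{1}}% writes f@"ur
\immediate\closeout\tocfile
\bye
```

With the active definition in force, the write produces `@numchapentry{f@"ur}{1}{}{1}`; without it, the raw byte goes out and the file contains `für` instead.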
Thank you for yo
On 15 January 2016 at 17:15, Masamichi HOSODA wrote:
>> I think it could be done by changing the active definitions of bytes
>> 128-256 when writing to an auxiliary file to read a single Unicode
>> character and write out an ASCII sequence that represents that
>> character, probably involving the
(something like ``Table of Contents'' broken etc.)
That can be fixed in other ways, without resorting to native UTF-8.
>>>
>>> I agree.
>>
>> In the case of LuaTeX, exactly, it can be fixed.
>> In the case of XeTeX, unfortunately,
>> it cannot be fixed if I understand correctly.
> Date: Fri, 15 Jan 2016 14:48:09 +
> From: Gavin Smith
> Cc: Texinfo, Ken Brown
>
> On 15 January 2016 at 14:47, Eli Zaretskii wrote:
> >> Date: Fri, 15 Jan 2016 12:30:59 +
> >> From: Gavin Smith
>> Cc: Texinfo, Ken Brown
> >>
> >> On 14 January 2016 at 19:12, Gavin Smith wrote:
On 15 January 2016 at 15:19, Masamichi HOSODA wrote:
>>> (something like ``Table of Contents'' broken etc.)
>>>
>>> That can be fixed in other ways, without resorting to native UTF-8.
>>
>> I agree.
>
> In the case of LuaTeX, exactly, it can be fixed.
> In the case of XeTeX, unfortunately,
> it cannot be fixed if I understand correctly.
>> By switching to native UTF-8, the support in texinfo.tex for characters
>> outside the base font is lost, as far as I can see. Yes, you get some
>> characters "for free" (the ones in the lmodern*.otf fonts now being
>> loaded instead of the traditional cm*) but you also lose some characters
>>
On 15 January 2016 at 14:47, Eli Zaretskii wrote:
>> Date: Fri, 15 Jan 2016 12:30:59 +
>> From: Gavin Smith
>> Cc: Texinfo, Ken Brown
>>
>> On 14 January 2016 at 19:12, Gavin Smith wrote:
>> > I'm inclined to add "-L$(PERL_INC) -lperl", with "-lperl" determined
>> > from perl -V:libperl, with a special case for cygwin to change
>> > "cygperl5_22.dll" into -lperl.
> Date: Fri, 15 Jan 2016 12:30:59 +
> From: Gavin Smith
> Cc: Texinfo, Ken Brown
>
> On 14 January 2016 at 19:12, Gavin Smith wrote:
> > I'm inclined to add "-L$(PERL_INC) -lperl", with "-lperl" determined
> > from perl -V:libperl, with a special case for cygwin to change
> > "cygperl5_22.dll" into -lperl.
On 1/15/2016 7:30 AM, Gavin Smith wrote:
On 14 January 2016 at 19:12, Gavin Smith wrote:
I'm inclined to add "-L$(PERL_INC) -lperl", with "-lperl" determined
from perl -V:libperl, with a special case for cygwin to change
"cygperl5_22.dll" into -lperl.
Done.
This works on Cygwin. Thanks.
K
On 14 January 2016 at 19:12, Gavin Smith wrote:
> I'm inclined to add "-L$(PERL_INC) -lperl", with "-lperl" determined
> from perl -V:libperl, with a special case for cygwin to change
> "cygperl5_22.dll" into -lperl.
Done.
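The special case being discussed can be sketched as a small shell helper. `to_lflag` is a hypothetical function, not the actual configure code; only the `perl -V:libperl` query and the `cygperl5_22.dll` name come from the thread:

```shell
# Sketch of the approach described in the thread: map the libperl name
# reported by "perl -V:libperl" to a -l linker flag, special-casing
# Cygwin, where the library is named like "cygperl5_22.dll".
# to_lflag is a hypothetical helper, not the actual build code.
to_lflag() {
  case $1 in
    cygperl*.dll)
      echo "-lperl" ;;                  # Cygwin DLL name maps to -lperl
    lib*)
      # libperl.so / libperl.a: strip the "lib" prefix and the extension
      echo "-l$(printf '%s' "$1" | sed -e 's/^lib//' -e 's/\..*$//')" ;;
    *)
      echo "-lperl" ;;                  # fall back to the common case
  esac
}

# The real name would come from: perl -V:libperl
to_lflag "cygperl5_22.dll"   # prints -lperl
to_lflag "libperl.so"        # prints -lperl
```

The resulting flag would then be combined with the `-L$(PERL_INC)` search path quoted above when linking.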
On 15 January 2016 at 00:11, Karl Berry wrote:
> it means that you want to use native UTF-8 support in my humble opinion.
>
> Not necessarily. The problem isn't encodings, it's fonts. The two
> things are intimately and fundamentally tied together, and that cannot
> be escaped.
>
> By switching to native UTF-8, the support in texinfo.tex for characters
> outside the base font is lost, as far as I can see. Yes, you get some
> characters "for free" (the ones in the lmodern*.otf fonts now being
> loaded instead of the traditional cm*) but you also lose some characters