GNOME Bugzilla – Bug 766148
CT font weights do not map correctly to PangoWeight
Last modified: 2017-11-01 03:54:39 UTC
In pangocoretext-fontmap.c, a CoreText floating-point weight number has to be mapped to a PangoWeight when the CT cascade list is converted into a Pango one, before selection is done. The end result should be that the font gets described the same on OS X as it would on other systems, but the way a CT weight is translated into a Pango weight is wrong. Here's what it currently is:

    200 PANGO_WEIGHT_ULTRALIGHT   -1.00 <= w < -0.75
    300 PANGO_WEIGHT_LIGHT        -0.75 <= w < -0.50
    350 PANGO_WEIGHT_SEMILIGHT    -0.50 <= w < -0.25
    380 PANGO_WEIGHT_BOOK         -0.25 <= w < -0.10
    400 PANGO_WEIGHT_NORMAL       -0.10 <= w <  0
    500 PANGO_WEIGHT_MEDIUM        0    <= w <  0.10
    600 PANGO_WEIGHT_SEMIBOLD      0.10 <= w <  0.25
    700 PANGO_WEIGHT_BOLD          0.25 <= w <  0.50
    800 PANGO_WEIGHT_ULTRABOLD     0.50 <= w <  0.75
    900 PANGO_WEIGHT_HEAVY         0.75 <= w <  1.00

However, I wrote a little utility [1] that prints out the CoreText weight of a font alongside the OS/2 weight, and found there is a 1-to-1 mapping between them which is incompatible with the list above. For example, a font with the weight 600 is actually 0.3 in CT, and a font with the weight 700 is 0.4 in CT. Those both map to PANGO_WEIGHT_BOLD, so you would never be able to select one of them!

I ran my tool on more than 1,000 fonts in the Google Fonts library. Here are the results:

    # of fonts tested   OS/2   CT
                    8   100    -0.700000
                   21   200    -0.500000
                   32   250    -0.365000
                    1   260    -0.338000
                    4   275    -0.297500
                    1   280    -0.284000
                  111   300    -0.230000
                    1   350    -0.115000
                  950   400     0.000000
                  102   500     0.200000
                   75   600     0.300000
                  317   700     0.400000
                   47   800     0.600000
                   66   900     0.800000

Since this is clearly 1-to-1, I think it could be used as a lookup table instead of the range check that currently happens.

Aside: I ran into this because I'm working on a program that needs to create a PangoFontDescription that will resolve to a specific TTF, given that TTF file. It isn't possible right now because of the weight collisions shown here.

[1] https://gist.github.com/chearon/4e3042d5d752a8eaa1a1ea6fecc98010
Thanks. Feel like cooking a patch? I think we should do linear interpolation instead of bucketing. That's what we do in the pangofc backend these days.
Not sure it's the same problem, but here's a blob of code we have sitting in Ardour, which computes a scalar factor by which we multiply the width values returned by Pango to avoid ugly results when laying out text on our canvas object. We're using this on OS X (10.5-10.latest) with Pango 1.36.8.

    #ifdef __APPLE__
        if (_width_correction < 0.0) {
            // Pango returns incorrect text width on some OS X
            // So we have to make a correction
            // To determine the correct indent take the largest symbol for
            // which the width is correct and make the calculation
            Gtk::Window win;
            Gtk::Label foo;
            win.add (foo);
            win.ensure_style ();

            int width = 0;
            int height = 0;
            Glib::RefPtr<Pango::Layout> test_layout = foo.create_pango_layout ("H");
            if (_font_description) {
                test_layout->set_font_description (*_font_description);
            }
            test_layout->get_pixel_size (width, height);
            _width_correction = width * 1.5;
        }
    #else
        /* don't bother with a conditional here */
        _width_correction = 0.0;
    #endif
This is for weight, not width, but I bet you are experiencing a similar problem. Maybe I'll look into that too when I get the chance.

@behdad, so basically I would do something like this?

    if (w < (-0.7 + -0.5) / 2)
        return PANGO_WEIGHT_THIN;
    ...
I was a little confused in my previous comment because pangofc-fontmap was doing linear interpolation inside an #if block that I did not see. I've looked at FcWeightToOpenType and based my implementation on that. So here is a patch that fixes the issue. Selection is *much* more accurate now, but it still isn't as good as on Linux. At first glance, I think it is in fact the width being mapped incorrectly, which would be another easy fix.
Created attachment 327747 [details] Patch to fix weight issues on OS X
Created attachment 327748 [details] [review] Patch to fix weight issues on OS X (corrected typo in previous patch)
Thanks.
I know this issue is closed, but I'm wondering, because as of 10.11 (which was released before this bug was filed) OS X's CoreText.framework comes with a bunch of named constants for predefined weights. Equivalently, they are

    #define NSFontWeightUltraLight  -0.800000
    #define NSFontWeightThin        -0.600000
    #define NSFontWeightLight       -0.400000
    #define NSFontWeightRegular      0.000000
    #define NSFontWeightMedium       0.230000
    #define NSFontWeightSemibold     0.300000
    #define NSFontWeightBold         0.400000
    #define NSFontWeightHeavy        0.560000
    #define NSFontWeightBlack        0.620000

except that OS X uses extern const variables instead of #defines.

You'll notice two things. First, OS X does not define a named constant for all the weights that Pango, GDI, DirectWrite, OS/2, etc. do. In fact, I made a quick table:

              Core Text   DirectWrite   Pango   Pango Core Text
    Minimum   -1          1             1       -0.7
    Thinnest  -0.8        100           100     -0.7
    Thin      -0.6        200           200     -0.5
    Light     -0.4        300           300     -0.23
    Book                  350           380     -0.115
    Regular   0           400           400     0
    Medium    0.23        500           500     0.2
    Semibold  0.3         600           600     0.3
    Bold      0.4         700           700     0.4
    Heavy                 800           800     0.6
    Heavier   0.56        900           900     0.8
    Heaviest  0.62        950           1000
    Maximum   1           999           1000    0.8

(I arbitrarily decided where to put the spaces when the Core Text constant names did not include the equivalent type; this is what I'm using for libui right now, interpolating between the two adjacent spots.)

Second, these values don't seem to match the measurements taken above.

Now to make things even weirder, let's write a program to produce the same table as above, except instead of reading fonts from a set of named files, we instead read from the system font library. The Swift program to do this is basic; the Xcode playground is perfect for this:

    import Cocoa
    import CoreText

    let descs = CTFontCollectionCreateMatchingFontDescriptors(CTFontCollectionCreateFromAvailableFonts(nil)) as! NSArray as! [CTFontDescriptor]
    var counts = [Float: Int]()
    for desc in descs {
        let t = CTFontDescriptorCopyAttribute(desc, kCTFontTraitsAttribute) as! CFDictionary as NSDictionary
        let v = t[kCTFontWeightTrait] as! CFNumber as NSNumber
        if counts[v.floatValue] == nil {
            counts[v.floatValue] = 0
        }
        counts[v.floatValue]! += 1
    }
    counts.keys.sorted().forEach { (d) in
        print("\(d) - \(counts[d]!) fonts")
    }

and produces

    -0.8 - 7 fonts
    -0.6 - 7 fonts
    -0.4 - 52 fonts
    -0.23 - 1 fonts
    -0.1 - 1 fonts
    0.0 - 242 fonts
    0.23 - 24 fonts
    0.24 - 1 fonts
    0.3 - 25 fonts
    0.4 - 155 fonts
    0.5 - 2 fonts
    0.56 - 12 fonts
    0.62 - 18 fonts

which mostly corresponds to the values listed by Apple.

Of course, more useful would be to only consider TrueType and OpenType fonts. So I made a modification of Caleb's original program. https://gist.github.com/andlabs/253a5b712959cdbb5a8ccc9bb555b823

Running this on my 10.11.6 box gives me

    count   OS2    Core Text
        1   0       0.000000
       13   5       0.000000
        2   7       0.400000
        2   100    -0.800000
        2   200    -0.600000
        2   250    -0.400000
        7   250     0.000000
        4   275    -0.400000
        1   300    -0.100000
       18   300    -0.400000
        4   300     0.000000
       14   400    -0.400000
      159   400     0.000000
        1   400     0.240000
        2   400     0.300000
       15   400     0.400000
        2   400     0.560000
        2   400     0.620000
        2   500    -0.400000
        9   500     0.000000
       10   500     0.230000
       10   600     0.300000
        3   600     0.400000
       99   700     0.400000
        2   700     0.500000
        3   800     0.620000
        6   900     0.560000
        7   900     0.620000
        1   1000    0.620000

So either

- I'm not processing the OS/2 table right (and I'm not using FreeType because it does not seem to like DFONTs, and I can't seem to find a way to get the TTC index out of a CTFontDescriptor), or
- Apple ISN'T using the OS/2 weight to determine these numbers.

And I wonder why all the Google Web Fonts said other things...
Please file a new bug if something needs fixing in pango. I don't understand all the details you describe.
I'm asking because I don't know *if* something needs fixing in pango. I'm trying to make sense of the font weight values that Core Text gives and why the values I get from the fonts installed by default on OS X don't match the ones given in the original bug report. I'll have to run my own tests on the Google Web Fonts and possibly some others, it seems... What about the details I describe don't you understand?
Same as you: I don't know what to make of this information. :)
Okay, I just ran Caleb's original tool on the Google Fonts archive and got the same results as in the OP. However, I noticed that Caleb went through Core Graphics's deprecated CGFont API to create a CTFont. So I modified it to skip the middleman and go right to CTFont: https://gist.github.com/andlabs/253a5b712959cdbb5a8ccc9bb555b823#file-whatsthefont-files-c and... also got the same results. This is really weird; I need to do some more investigation.

For what it's worth, here are the results from the Adobe Font Folio 11:

      #   OS2   Core Text
     18   250   -0.365000
     10   275   -0.297500
    281   300   -0.230000
      2   350   -0.115000
    729   400    0.000000
      8   450    0.100000
    219   500    0.200000
      7   550    0.250000
    232   600    0.300000
      2   650    0.350000
    617   700    0.400000
     72   750    0.500000
     73   800    0.600000
     30   850    0.700000
    116   900    0.800000
     12   950    0.900000

which correlates with Caleb's results, but not with what Apple says...
So here's the situation:

- Caleb is right about Core Text's OS/2 font weight mappings.
- Of course, if Core Text can't use the OS/2 font weight for whatever reason, it will use values matching the NSFontWeight constants I listed above, using a number of different search methods, some of which will result in values that differ slightly.
- HOWEVER, if the font is installed, then Core Text doesn't even bother to peek inside; instead, it just asks OS X's "FontServer" for cached information! Try it: copy /Library/Fonts/Impact.ttf to another folder and run Caleb's tool on that copy — you'll see values that match what Caleb got, but run it on /Library/Fonts/Impact.ttf directly and you'll get values that don't match. This explains why I was getting different values from the system fonts.
- So what Pango has now is probably right, and no further action would be necessary on its part. I'll need to figure out what I'm going to do with libui, since I need to do font matching with Core Text weights (since it seems Core Text does not do so itself... unless I'm using the APIs wrong).
- Of course there might still be cases where Caleb's values don't quite match the actual font weight for a given system CTFontDescriptorRef, but it shouldn't differ by more than one weight class :S

I'll write a blog post with the technical details and post it sometime in the coming months. (It will have the full definition of weight matching, so you can decide then if you want to address that last bullet point.) Thanks in the meantime!
Two more sets of data, just for completeness's sake.

All the ttf and otf files that come with Windows 7, 8.1, and 10 combined:

     18   300   -0.230000
     11   350   -0.115000
    343   400    0.000000
     12   600    0.300000
    243   700    0.400000
      3   800    0.600000
      7   900    0.800000

All the ttf and otf files that come with OS X, evaluated on their own from another folder so as to make Core Text think they aren't installed:

     1   0     -1.000000
     1   0      0.000000   (this file might be hardcoded)
     2   5      0.200000   (Core Text takes the OS/2 weight and multiplies it by 100 if it's less than 10, so 5 becomes 500)
     2   100   -0.700000
     1   200   -0.500000
     1   300   -0.100000
     3   300   -0.230000
    84   400    0.000000
     1   400    0.240000   (this file might be hardcoded)
     2   410    0.020000
     2   420    0.040000
     2   430    0.060000
     6   500    0.200000
     3   600    0.300000
    37   700    0.400000
     2   710    0.420000
     2   720    0.440000
     2   730    0.460000
     3   800    0.600000
     2   900    0.800000

Now I just wonder if there are any fonts with an OS/2 weight of 380 (Pango's "book"), 999, or 1000...
and the 1 300 -0.100000 might be hardcoded as well (can't seem to edit comments)
Pango does query the system for fonts, though, never from a file (I'm pretty sure about this). Wouldn't that mean the current Pango code is wrong, if CT really does give different results for system fonts than for fonts created from a file?

I have a theory on why you're seeing some weird stuff for system fonts. Apple had their own competitor to TrueType from a long time ago which used a different table than OS/2 for weights, based on "axes" instead of discrete numbers like 400. I think maybe some of the system fonts on macOS have both tables, but you probably wouldn't see both on any fonts in Google Fonts or in Windows fonts.

So basically, my original table reveals the "backwards compatible" mapping that macOS does for fonts that have OS/2 but not the Apple tables (nearly every font except some macOS ones). I think this is supported by your most recent data.
Do you mean the fvar table? I considered that feature at first, and the latest version of the OpenType standard resurrects that feature, but /Library/Fonts/Skia.ttf is the only font left on OS X with an fvar table — in fact it's the only file in either /System/Library/Fonts or /Library/Fonts that has the bytes that make up 'fvar' in its table directory! :/ The NSFontWeight constants match with other weight-determining code in the CoreText dylib that... uses a list of (substring, floating-point value) pairs and strstr() of the font subfamily name; I assume this is where OS X is getting its values from for the system fonts, but I'm not sure if there is a way to ever find out.
One more thing for completeness. Here's which OS/2 weights would produce the respective NSFontWeight constants:

    (between 66 and 67)    NSFontWeightUltraLight  -0.800000
    150                    NSFontWeightThin        -0.600000
    (between 237 and 238)  NSFontWeightLight       -0.400000
    400                    NSFontWeightRegular      0.000000
    530                    NSFontWeightMedium       0.230000
    600                    NSFontWeightSemibold     0.300000
    700                    NSFontWeightBold         0.400000
    780                    NSFontWeightHeavy        0.560000
    810                    NSFontWeightBlack        0.620000

I'll try not to post any more of these for-completeness notes here, to reduce further noise.
Nice debugging on this! I did mean fvar, but it looks like you've disproven that.

I ran your whatsthefont-impact.c and see that it gives different results when in a system folder than when not, weird! However, in comment #13 you said that my original program would do the same thing, but for me it doesn't. I think what's happening is as follows. There are two ways to create a font, and they'll give you different results:

1. Use CTFontManager to query the OS for a font, OR create a font from a system folder.
2. Use CoreGraphics to create a font from any file, or use CTFontManager to create a font from a *non-system* file.

If you use method #2, the weight from the resulting font seems to always match my original table. But if you use method #1, there are some fonts that have wild values, like your Impact example: it seems like OS X is figuring out that it's a heavy face and assigning it a determined weight by some algorithm we don't know. It does that for a handful of fonts I have, but not for most of them.

I think the new 10.11 constants are a bit of a red herring, because that's just what they're labeling the numbers.
Right. For what it's worth I managed to trace the acquisition of the traits dictionary for system fonts down to libFontRegistry.dylib, in a method TGlobalFontRegistry::CopyPropertiesForFonts() that uses Mach ports to communicate to a "com.apple.FontServer" server which is where I assume the real work happens; I'd need to dig around for *that* code. The Core Text dylib will call into libFontRegistry.dylib first, and if that fails, then it uses some hardcoded PostScript names like LucidaGrande, and then it uses the OS/2 table, and then it uses the subfamily name thing I mentioned. For all the fonts we've tried that aren't in a system folder, it's the OS/2 table step that Core Text stops at, which is why our values are consistent. It's just that *that* table's mappings don't match any of the other mappings. So you say using the old CGFont APIs will return consistent values for the system fonts? I'll have to try that then... It does make sense, since most of the logic I've been talking about happens in the Core Text dylib. It just seems weird, since that still doesn't explain why the system fonts are skipping the OS/2 table weights.
Also I'm sure the values with the names predate 10.11; it's just that in 10.11 they became public API, which means who knows what changes Apple might make in the future... I'd have to dig out a version of CoreText from 10.7 or so to confirm.
One last thing, since I keep forgetting to write all these the first time: the fvar variant information in Skia.ttf isn't even used by Core Text — the OS/2 weight is always 400 and the Core Text weight is always 0.000 regardless of variant. I don't know what this means for an *actual* OS X program that tries to use Skia, but... (Skia was the first font to use the fvar table when it debuted with QuickDraw GX; Apple acquired NeXT shortly thereafter, which is why fvar never caught on. Skia got grandfathered into the OS X distribution, I guess. Though I could swear the Microsoft-maintained OpenType spec uses Skia in some sample images...)
Yeah the CGFont method always gave me the same weights, system or not. So are you saying that this independent font server is doing its own weight determination differently? I can't see any correlation between some of these system font weights and anything in the file, so it'll be interesting if you find anything
Take a look in /System/Library/Frameworks/ApplicationServices.framework/Frameworks/ATS.framework/Resources/FontInfo — in particular the *.ATSD files. Here's an excerpt from the Impact.ttf one:

    0005650: 0000 0024 4d54 445f 5479 7065 6661 6365  ...$MTD_Typeface
    0005660: 5f57 6569 6768 745f 5669 7375 616c 4465  _Weight_VisualDe
    0005670: 7363 7269 7074 6f72 0000 0007 0000 0005  scriptor........
    0005680: 626c 6163 6b00 0000 0700 0000 214d 5444  black.......!MTD

In a number of the ATS-related dylibs, there's code that reads the *.ATSD files, pulls the string associated with MTD_Typeface_Weight_VisualDescriptor, and assigns it a weight. (MTD is an abbreviation for "metadata" in this case, judging from the debug symbols.) One of these *is* run in-process; the others are part of the com.apple.FontServer server. I can try debugging again to see if the in-process one breaks when reading this value. If it does, we have our culprit.

But here's the odd part: **there does not seem to be any code anywhere that actually creates these FontInfo files**! If someone installed a font themselves, they could see if that winds up creating one anywhere (be it in /System or in a user directory), but I highly doubt it... I wonder if these files are actually part of the OS X distribution, and thus are made by Apple internally as part of the dev process.
I think I may have found a potential correlation that would let us figure out which set of weight values is correct. The only requirement is that we have either

- a CTFontRef, or
- a completely filled-in CTFontDescriptorRef (for instance, from a CTFontRef, or from CTFontCollection).

This Swift 3 program tries to print the style name, OS/2 weight, and Core Text weight of all the fonts on the system, all the standard UI fonts, and a handful of font files on my hard drive: https://github.com/andlabs/misctestprogs/blob/master/weightsPriorities.swift

To use this, change the members of the filenames array (on line 156) to files that you have on hand. Here's some (prettified with column -t -s$'\t') output from my system: https://hastebin.com/axefolovav.txt

So what I think is the situation involves the kCTFontPriorityAttribute and kCTFontRegistrationScopeAttribute attributes of a font. The former determines what order the system looks in when it tries to find a font. The latter determines how long a font can be used for (I think). You'll notice that for explicitly installed fonts, kCTFontPriorityAttribute loosely corresponds to where a font file is on the system. But some of these values vary strangely, so I'm not sure if this is part of the correlation or not.

That being said, the setup appears to be:

- If the file was loaded with CTFontManagerCreateFontDescriptorsFromURL(), it will have a registration scope of kCTFontManagerScopeNone.
- If the registration scope is kCTFontManagerScopeNone, the font is not installed, and so for TrueType and OpenType we use the OS2 table conversion formula. (Not sure what is used for PostScript fonts; my guess is that it's the method used by the next step.)
- Otherwise, the font is installed, and we use the special Core Text constants to determine the weight. I'm not sure if this is purely based on the font style string or if there's some interpolation going on.
I'm not 100% confident about this (some more figuring out would be needed, and I have to go back and find that font style -> weight value table again), but this should be a step in the right direction?

Also, the registration scope stuff in Core Text seems to require 10.6, so on 10.5 we would probably need to drop down to ATS.framework to get the registration scope. I'm not 100% sure either; it depends on what minimum version of OS X Pango wants to require. I'm also not sure whether methods of creating a CTFont other than CTFontManagerCreateFontDescriptorsFromURL() will produce different results.
Update: using CTFontManagerRegisterFontsForURL() to register the fonts will result in Core Text using the Core Text constants instead of the ones observed here. You can test this with my above Swift program by replacing

    let cfdescs = CTFontManagerCreateFontDescriptorsFromURL(url)

with

    var err = UnsafeMutablePointer<Unmanaged<CFError>?>.allocate(capacity: 1)
    if !CTFontManagerRegisterFontsForURL(url, CTFontManagerScope.process, err) {
        if err.pointee != nil {
            print("** \(basename) failed: \(err.pointee!)")
        } else {
            print("** \(basename) failed: error unknown")
        }
        continue
    }
    let matchdesc = CTFontDescriptorCreateWithAttributes([
        kCTFontURLAttribute as String: url,
    ] as CFDictionary)
    let collection = CTFontCollectionCreateWithFontDescriptors([matchdesc] as CFArray, nil)
    let cfdescs = CTFontCollectionCreateMatchingFontDescriptors(collection)

Note: using kCTFontManagerScopeNone will cause the function to fail with a CFErrorRef representing the OSStatus paramErr.
Interesting, so if Pango were to get more accurate font matching, it would need to check the font's registration scope. If the font isn't installed, it would use the current method. But if the font is installed, it should use some other method to map the float to an OS/2 style weight? Did you figure out what that mapping looks like? I ran the change from your last comment too, and in that case the scope is "process", which might be useful too. BTW would still love to see a blog post about this! Nice work.
The blog post(s) will come when I finish all this ;) Now that I have gone through Core Text and ATS and mapped out exactly how and in what order they match fonts, I can safely say that Core Text's claimed weight values (or values very close to them) are always used, **except** when all of the following conditions are met:

1) The font is not registered.
2) The font's PostScript name doesn't match a few special hardcoded cases.
3) The font has an OS2 table.
4) The size of the OS2 table is either
   a) less than 78 bytes, in which case the actual usWeightClass is ignored and 0 is used, or
   b) greater than or equal to 78 bytes, in which case the usWeightClass must be <= 1000.

If all these conditions are met, then Core Text uses its internal WeightOfClass() function to get values that match what Caleb and I found originally, and Pango's existing code will work fine.

So that leaves how to handle the situation where the Core Text weights map to what Core Text says they should be. Fortunately, Core Text and ATS only use hardcoded values (no fma() calls or anything), and I've compiled a list of them here: https://github.com/andlabs/libui/blob/utflib-and-attrstr/doc/export/ctweightsannotated

They are sorted to show floating-point values (and IEEE 754 forms, since clang seems to have weird IEEE 754 code), and what maps to them. Unless otherwise annotated, a string means either a literal match for one of those MTD files, or a case-insensitive, Unicode-normalized match against font family and subfamily strings. You can assume that if there's a Title Case string and a lowercase string, the Title Case string is for registered fonts and the lowercase string is for unregistered fonts. You'll notice there are some inconsistencies in a few places, like the case of W1 (which is for registered fonts) and w1 (which is for unregistered fonts).
You'll also notice there isn't really info about what order things are tested in; the other files in that directory will have that. (ctweights itself is my pseudocode recreation of the weight determination code; I was hoping to make it self-contained, but some of the undocumented or private helper functions are rather complicated to reimplement, or require outside help from other private dylibs that would require me to reimplement the whole dylib =P) I will say that PANOSE values are used for a registered font if the OS2 weight is either 0 or >= 1000.

I'm not sure if there should be a unified way to approach these, but I'm pretty sure from this list we can figure out a way to map from Core Text weights to OS2-like weights. We probably *shouldn't* require exact matches, because new values may come along in the future or may already have come and gone; I'm not sure how to approach it otherwise, though.

There is one other oddity: you'll notice that there is no PANOSE weight value of 4 in the above list. If it were there, it would have the values

    -0.300000   0xbe99999a

However, for whatever reason, libFontRegistry.dylib's code is returning false in that case instead of true, meaning that value will be ignored. Odd...

Note to self: do the same thing for width (stretch) values; while they do seem to be much more sensible and to reflect intuition, I thought I saw some outliers...
Oh, I can also confirm that yes, checking the registration scope seems to be the way to determine if the font is registered or not (I'm at least 95% confident about this after reading through the disassemblies).
Okay, so width/stretch values are also coming from two different sources, like the weight values, and the sets of values are mostly similar. https://github.com/andlabs/libui/blob/utflib-and-attrstr/doc/export/ctwidthsprocessed is the list of widths for when the weight rules above are not met.

You'll notice one major inconsistency: there are two "Semi Condensed" entries in the registered font set. This is probably a typo, but since the -0.7 one appears first on the list (and the list is scanned in order), these fonts will return a width of -0.7 instead of what I assume is the correct -0.1. Not sure if I should handle this case specially, especially since it seems to only apply to TrueType and OpenType fonts (since the code in libFontRegistry.dylib seems to specifically ask for the 'name' table).

Either way, here's what I'm guessing is the mapping between Pango's named constants and Core Text widths:

    PANGO_STRETCH_ULTRA_CONDENSED   -0.7
    PANGO_STRETCH_EXTRA_CONDENSED   -0.5
    PANGO_STRETCH_CONDENSED         -0.2
    PANGO_STRETCH_SEMI_CONDENSED    -0.1
    PANGO_STRETCH_NORMAL             0.0
    PANGO_STRETCH_SEMI_EXPANDED      0.1
    PANGO_STRETCH_EXPANDED           either 0.2 (subfamily names) or 0.4 (OS/2)
    PANGO_STRETCH_EXTRA_EXPANDED     either 0.4 (subfamily names) or 0.6 (OS/2)
    PANGO_STRETCH_ULTRA_EXPANDED     0.8

Of course, I'm still not sure what to do about values that don't exactly match these, especially since Core Text can also return -0.4, and I still doubt there's a guarantee that these single-digit decimals are the only possible ones. And this table doesn't consider PANOSE values either; I probably should do that too...

In the case of the above formula that produces the different weights, the Core Text width is just (usWidthClass / 10) - 0.5. usWidthClass must be between 0 and 10 inclusive, even though 0 and 10 are technically disallowed by both TrueType and OpenType. And an OS2 table that's less than 78 bytes long is still treated as if usWidthClass == 0.
Fixed a typo in the ctweights files. I've also started aggregating everything to see if I can set up an implementation of a function that produces an approximate OS/2 weight and width from a Core Text weight and width, for my own uses, and to see just how divergent everything is.

STXihei is just a lighter version of STHeiti, which is either Regular or Medium (I can't quite tell), so I don't think we would need to worry about those hardcoded names... See also http://www.typophile.com/node/93813. (It also seems macOS 10.12, at least, uses different versions of the SinoType fonts...)
*different versions of the SinoType fonts that have different PostScript names