================
@@ -199,13 +183,13 @@ bool lldb_private::formatters::WCharSummaryProvider(
   options.SetBinaryZeroIsTerminator(false);
 
   switch (wchar_size) {
-  case 8:
+  case 1:
----------------
Michael137 wrote:

> No, this wasn't wrong. It used the bit size before, and the libc++ summary 
> used the byte size. I'm not sure which is better (I suppose 8 bits/byte is 
> true for all targets supported by LLDB).

Ah I see. Making them consistent makes sense

> The *CharSummaryProviders used valobj.GetCompilerType().GetBasicTypeFromAST() 
> and the libc++ formatters used ScratchTypeSystemClang::GetBasicType(). Is 
> there a big difference between them, and is one preferred over the other?

The `ScratchTypeSystemClang` is the AST that is shared between all expressions 
run in the target, whereas the `TypeSystem` underlying a `CompilerType` only 
contains the AST nodes created when the `ValueObject` and its type were 
created. It shouldn't make a difference for primitive types, and it especially 
shouldn't make a difference when all we need to do is get the byte size. That 
being said, I'd prefer using `valobj.GetCompilerType().GetBasicTypeFromAST()` 
(assuming that doesn't break anything). Reaching into the scratch typesystem 
feels out of place here.
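
Concretely, the preferred variant would look roughly like this (an untested sketch pieced together from the method names quoted above; exact signatures in-tree may differ):

```cpp
// Sketch: query wchar_t's byte size via the ValueObject's own
// typesystem instead of the target-wide scratch AST.
CompilerType wchar_type =
    valobj.GetCompilerType().GetBasicTypeFromAST(lldb::eBasicTypeWChar);
// nullptr exe scope: a builtin's size doesn't depend on execution context.
std::optional<uint64_t> wchar_byte_size = wchar_type.GetByteSize(nullptr);
if (!wchar_byte_size)
  return false;
switch (*wchar_byte_size) {
// case 1: / case 2: / case 4: ...
}
```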

https://github.com/llvm/llvm-project/pull/144258
_______________________________________________
lldb-commits mailing list
lldb-commits@lists.llvm.org
https://lists.llvm.org/cgi-bin/mailman/listinfo/lldb-commits
