NS_LITERAL_STRING, as its name suggests, only ever gets used on string literals, or macros that expand to string literals.
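For instance (a minimal sketch; K_GREETING is a made-up macro, not an in-tree name):

#define K_GREETING "hello"
nsAutoString a(NS_LITERAL_STRING("hello"));      // literal: fine
nsAutoString b(NS_LITERAL_STRING(K_GREETING));   // expands to a literal: fine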

NS_LITERAL_CSTRING gets (ab?)used in all sorts of ways. Should we be consistent and require NS_LITERAL_CSTRING to be used only on string literals? I found the following in-tree examples, none of which would have worked with NS_LITERAL_STRING (a sketch of one way to enforce this follows the list):

// On a named const char array rather than a literal:
static const char f00[] = "f00";
F001(NS_LITERAL_CSTRING(f00));
// Nested inside NS_NAMED_LITERAL_CSTRING:
NS_NAMED_LITERAL_CSTRING(f002, NS_LITERAL_CSTRING("F00"));
// On a char-array reference that is a template parameter:
template<size_t N>
void F003(const char (&f00)[N])
{
  F003(NS_LITERAL_CSTRING(f00));
}
// Concatenating a literal with a non-literal:
F004(NS_LITERAL_CSTRING("F00") + NS_LITERAL_CSTRING(f00));
// As a conversion-constructor argument:
NS_ConvertASCIItoUTF16 f005(NS_LITERAL_CSTRING(f00));
// NS_NAMED_LITERAL_CSTRING directly on a non-literal:
NS_NAMED_LITERAL_CSTRING(f006, f00);
// As an Assign() argument:
f007.Assign(NS_LITERAL_CSTRING(f00));
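One way to enforce literal-only arguments (a minimal sketch; STRICT_LITERAL_CSTRING is a made-up name, not what I actually patched) is the empty-literal-concatenation trick:

// "" s only compiles when s is itself a string literal (or a macro
// that expands to one), so every example above fails to build.
#define STRICT_LITERAL_CSTRING(s) ("" s)

const char* ok = STRICT_LITERAL_CSTRING("F00");    // compiles
// const char* bad = STRICT_LITERAL_CSTRING(f00);  // error: f00 isn't a literal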

When I tried enforcing string literals, the compilers also tripped up over this line, probably because __FUNCTION__ isn't an ordinary macro (GCC and Clang treat it as a predefined identifier, effectively a static const char array, rather than something that expands to a string literal):
nsCString f008 = NS_LITERAL_CSTRING(__FUNCTION__);
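The same failure reproduces outside the tree (a quick sketch; LITERAL_ONLY is a made-up stand-in for the enforcing macro):

#include <cstdio>

// "" s requires s to be a string literal after macro expansion.
#define LITERAL_ONLY(s) ("" s)

void Demo()
{
  std::puts(LITERAL_ONLY("f00"));            // fine: genuine literal
  // std::puts(LITERAL_ONLY(__FUNCTION__));  // fails on GCC/Clang, where
  //   __FUNCTION__ is a predefined identifier (a variable), not a literal;
  //   MSVC expands it to a literal and accepts it.
}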
