On 08/07/2014 11:19 AM, Darac Marjal wrote:
> On Thu, Aug 07, 2014 at 10:01:41AM -0400, Steve Litt wrote:
>> Oh geez, I'm sorry, I thought your post was flippant sarcasm, so I
>> did what I thought was extending it. OK, you really do mean the log
>> should go into Postgres.
>>
>> I don't necessarily disagree, but I very strongly believe its first
>> step should be to go to a text file with one line per event, or
>> perhaps some sublines. If that text file were designed correctly,
>> perhaps with field separators, it would be trivial to write a C or
>> Python program to input it into Postgres. I just want to make sure
>> that I can read that log on any Linux, BSD, or even (ugh) Mac and
>> Windows.
>
> You see, though, this isn't the design goal of the journal.

It should be, if not "the" design goal, then certainly "a" design goal.
The canonical forms of the legacy log formats (plain text) can be read
and manipulated by generic tools, in any environment; if the journal
cannot provide the same (with respect to its canonical data files, not
to something it can export to), that is a regression from the legacy
feature set.

For fundamental, low-level system components (from the kernel on up), no
regression is acceptable, ever. At most, one may be an unavoidable,
undesirable side effect of something which is itself absolutely
necessary; even then, that only works as "choosing the lesser of two
evils", which is not how those developing and advocating journald et al.
seem to consider it.

> The journal is designed to be resistant against corruption (hashes
> are used to preserve message integrity), quick to access (there is an
> index, so you don't have to spool through the whole file looking for
> the event that happened at 10:00, say) and well defined (times, for
> example, are defined as µsec since the epoch, not some
> let's-define-another-parser text string).
>
> Consider it to be another database format.
> You wouldn't necessarily try to cat a MySQL or PostgreSQL datastore;
> you'd use the appropriate tools to select all from it. In a similar
> vein, you'd use journalctl to select all the entries from the journal
> file, and you'd expect it to do much of the hard work, such as telling
> you if a line has been altered (either tampered with or simply
> corrupted), adjusting the timestamps for time zones, and so on.

Approached from that angle, I don't inherently have a problem with the
journal's being stored (optionally and/or partly) as binary files.
(Though I don't necessarily want automatic timestamp adjustment and so
forth; I may very well want verbatim logs, for one reason or another.)

However, I do still have the problem that the tools used to interact
with those files are dedicated, "proprietary", single-purpose tools.
Nothing other than journald uses the journal's file format; by contrast,
many, many databases use the Postgres format(s), and though I'm not
familiar with Postgres in any detail, it wouldn't in the least surprise
me if there were non-Postgres-project tools which can read and/or
manipulate those formats.

I'm sure there are advantages to using a designed, dedicated format for
the specific purpose at hand, and to writing tools which work
specifically with that format. I simply believe that those advantages
almost inherently cannot outweigh the matching downsides of using and
requiring special-purpose tools and formats.

If the journal's file format became even vaguely standardized and came
to be used for other purposes (e.g. perhaps as a generic indexing /
metadata format, if it might be suitable for that), I'd have much less
of a problem with its being used for storing and handling log messages
in and by the journal. The dedicated, and apparently OS-specific (?),
nature of the format and its tools is IMO a problem all of its own.

> "cat"ing the journal doesn't have to lose information, either.
> Journalctl can export as JSON or a serialised "export" format
> (plain-text).
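As an aside, that serialised "export" format is at least simple enough
for generic tools to consume without journalctl: as I understand it,
each record is a set of FIELD=value lines, records are separated by a
blank line, and binary-valued fields use a length-prefixed variant. A
rough Python sketch of a text-only parser (the sample record and its
field values here are invented for illustration):

```python
from datetime import datetime, timezone

def parse_export(text):
    """Parse text-only records of the journal 'export' serialisation:
    FIELD=value lines, with a blank line separating records.
    (Binary-valued fields use a length-prefixed form not handled here.)"""
    records = []
    current = {}
    for line in text.splitlines():
        if not line:
            # Blank line closes the current record.
            if current:
                records.append(current)
                current = {}
            continue
        field, _, value = line.partition("=")
        current[field] = value
    if current:
        records.append(current)
    return records

# Invented sample record, for illustration only.
sample = (
    "__REALTIME_TIMESTAMP=1407424781000000\n"
    "PRIORITY=6\n"
    "MESSAGE=example log line\n"
    "\n"
)

for rec in parse_export(sample):
    # __REALTIME_TIMESTAMP is microseconds since the Unix epoch,
    # per the "µsec since the epoch" definition quoted above.
    usec = int(rec["__REALTIME_TIMESTAMP"])
    stamp = datetime.fromtimestamp(usec / 1e6, tz=timezone.utc)
    print(stamp.isoformat(), rec["MESSAGE"])
```

(Which, of course, only reinforces the point: a format this close to
plain text could just as well have been the native one.)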
But those aren't the journal itself; they're exports of the journal, and
they don't provide the journal's full functionality. (Otherwise, it
would be more appropriate for the journal to use those file formats
natively, rather than using binary natively and treating those formats
as exports.) And accessing them from zero still requires you to have a
functioning special-purpose tool (journalctl) in your available
environment, which is something the legacy systems never required.

Debian apparently avoids (or at least minimizes) this latter problem at
present by exporting to plain-text logs by default, and never storing
the binary journal files anywhere on disk. That doesn't eliminate the
underlying issue, though.

--
   The Wanderer

Secrecy is the beginning of tyranny. A government exists to serve its
citizens, not to control them.