> -----Original Message-----
> From: mailop <[email protected]> On Behalf Of Hans-Martin Mosner
> via mailop
> Sent: Sunday 14 March 2021 07:43
> To: mailop <[email protected]>
> Subject: [mailop] Reliability of DMARC reports?
> 
> Hello,
> 
> due to the recent GMX mail rejection incident (for which I still don't
> have a satisfactory explanation from GMX) I've enabled DMARC on our mail
> server in the hopes of getting better deliverability.

DMARC does not improve deliverability by itself. However, the discipline it 
enforces in maintaining SPF and DKIM alignment can help, because you are 
clearly identifying your mail stream; this matters particularly for high-volume 
senders. In fact, DMARC with an enforcing policy may cause some legitimate 
messages that were previously delivered to be rejected. DMARC is intended to 
prevent unauthorised use of a domain in the 5322.From header, nothing else.
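
For reference, a monitoring-only record looks something like the fragment 
below (the domain and report address are placeholders, not your real values):

```
_dmarc.example.com.  IN  TXT  "v=DMARC1; p=none; rua=mailto:[email protected]"
```

With p=none the receivers evaluate and report but take no action, which is the 
safe starting point discussed further down.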

> But some of our outgoing mails were rejected, and the aggregate DMARC
> reports we were getting weren't too helpful (again :-( )
> 

Why? The aggregate reports should provide enough information to work out why a 
DMARC evaluation is failing.
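
As a rough illustration of what is in there, the fields that matter can be 
pulled out with nothing but the standard library. The XML below is a 
hand-written sample in the shape defined by RFC 7489 Appendix C, not a real 
report:

```python
# Minimal sketch: read the key fields of a DMARC aggregate report.
# The sample XML is illustrative only; real reports are gzip/zip attachments.
import xml.etree.ElementTree as ET

SAMPLE = """<feedback>
  <report_metadata>
    <org_name>example-receiver.net</org_name>
    <date_range><begin>1615420800</begin><end>1615507200</end></date_range>
  </report_metadata>
  <policy_published>
    <domain>example.com</domain><p>reject</p><sp>reject</sp>
  </policy_published>
  <record>
    <row>
      <source_ip>192.0.2.25</source_ip>
      <count>3</count>
      <policy_evaluated><disposition>reject</disposition>
        <dkim>fail</dkim><spf>fail</spf></policy_evaluated>
    </row>
    <identifiers><header_from>example.com</header_from></identifiers>
    <auth_results>
      <spf><domain>forwarder.example.org</domain><result>pass</result></spf>
    </auth_results>
  </record>
</feedback>"""

root = ET.fromstring(SAMPLE)
# policy_published reflects what the reporter read at evaluation time.
published_p = root.findtext("policy_published/p")
for rec in root.findall("record"):
    ip = rec.findtext("row/source_ip")
    spf_alignment = rec.findtext("row/policy_evaluated/spf")  # DMARC view
    spf_domain = rec.findtext("auth_results/spf/domain")      # raw SPF domain
    print(ip, published_p, spf_alignment, spf_domain)
```

Note how this sample already shows the pattern discussed below: raw SPF 
passed for the forwarder's domain, but DMARC alignment against the From: 
domain failed.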

> Since this is a completely new area for me, I'm trying to make sense of
> the report content, and of course I'm trying to adjust our DNS records
> to limit damage.
> 
> As far as I understand, the report contains a copy of our published
> policy as well as records per sending IP. In the report I'm just looking
> at, it's stated that our domain and subdomain policy is "reject"
> although I changed it to "quarantine" within the same DNS update in
> which I changed the rua address from a generic one to a special receiver
> address, so I know the reporter must have read the new version of the
> DMARC DNS record because they sent to that special address.

That may be an incorrect assumption. The RUA address could have been re-read 
just before the report was sent, but the report still describes messages 
evaluated against the historic DMARC policy. Each report records the p= and 
sp= in effect at the time the messages were evaluated, not at the time the 
report was sent. Look at the time period stated in the report.

> The report also claims that SPF failed, although our SPF record included
> the outgoing mailserver from the beginning, of course.

For diagnosing SPF issues, the report will tell you the source IP address, the 
envelope sender (5321.MailFrom) address, how the SPF domain was derived (MAIL 
FROM or HELO), the result of the receiver's (i.e. the report sender's) SPF 
check, and the SPF alignment result according to DMARC.

Typically, SPF fails alignment because the recipient's MTA forwards the 
message to a DMARC-aware final-hop MTA, which now sees a different source IP. 
This is pretty common, in particular when a recipient has their hosting 
provider forward their vanity domain's email on to a freemail account.

So it should be pretty easy to determine why SPF failed.
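
The alignment test itself is simple to sketch. In relaxed mode the 
SPF-authenticated domain only needs to share an organizational domain with the 
5322.From domain. The two-label suffix comparison below is a stand-in for a 
real Public Suffix List lookup, so it is a simplification (it breaks for 
suffixes like co.uk):

```python
# Sketch of SPF alignment as DMARC applies it (RFC 7489 section 3.1.2).
# Real implementations derive the organizational domain from the Public
# Suffix List; taking the last two labels is a deliberate simplification.
def org_domain(domain: str) -> str:
    return ".".join(domain.lower().rstrip(".").split(".")[-2:])

def spf_aligned(header_from: str, spf_domain: str, mode: str = "r") -> bool:
    if mode == "s":  # strict mode: exact domain match required
        return header_from.lower() == spf_domain.lower()
    # relaxed mode: organizational domains must match
    return org_domain(header_from) == org_domain(spf_domain)

# A forwarder re-sends with its own envelope domain, so even a passing
# SPF check no longer aligns with the From: domain:
print(spf_aligned("example.com", "mail.example.com"))       # True
print(spf_aligned("example.com", "forwarder.example.org"))  # False
```

This is why a message can show SPF result "pass" in the report's auth_results 
yet still fail the DMARC SPF evaluation.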

> So this report looks like a red herring to me - not enough information
> to debug what may have been wrong (ok for an aggregate report) but also
> containing highly questionable data.

Why specifically do you claim that the data is questionable? If you wish, 
please share the un-redacted report with me off-list so that I can better 
understand your claim.

> I'm about to switch DMARC off again, or at least change the policy to
> "none", as it seems to hurt more than help.

I think this is a very good idea. Nobody should run DMARC with an enforcing 
policy until they fully understand their mail flow and how DMARC could affect 
it. This is specifically what "none" is intended for.

> What's your experience with reliability of DMARC reports? Mostly
> helpful? Too much nonsense?

I have never experienced a problem with the reliability of DMARC aggregate 
reports, and I look at them for a living.

Ken.
_______________________________________________
mailop mailing list
[email protected]
https://list.mailop.org/listinfo/mailop