Google Not Perfect With Duplicate Content
Think Google is as good as they'd have you believe at detecting duplicate copy? Then take a look here [fireproof safe with power]. Since result two links to result one, I'd say result two is probably the original and Google got it wrong.
In this case the comments probably played a big role in the uniqueness of the documents when compared to one another. From a purely hypothetical point of view, what if you cloaked the unique factors and served them only to the bots? What if you took some standard, non-unique data feeds and uniquified them on the fly?
Serve your real visitors lovingly written marketing copy, and serve the bots a really unique substitute: your copy replaced with a bit of relevant, mashed-up RSS-style content. Hypothetically speaking, of course.
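Purely as a thought experiment, the hypothetical above could be sketched in a few lines: sniff the user agent, and if it looks like a crawler, serve a per-page shuffle of feed snippets instead of the real copy. Everything here is illustrative — the bot signatures, the `serve_copy` function, and the seed-based shuffling are all made-up stand-ins, not anyone's actual cloaking setup (real cloakers reportedly check IP ranges, not just user agents).

```python
import random

# Illustrative list only -- a few well-known crawler user-agent fragments.
BOT_SIGNATURES = ("googlebot", "slurp", "bingbot")

def is_bot(user_agent: str) -> bool:
    """Crude user-agent sniffing; purely for illustration."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def uniquify(feed_items, seed):
    """Shuffle feed snippets so each page's bot copy reads 'unique'.

    Seeding per page keeps the output stable for a given URL, so the
    crawler sees the same 'unique' copy on every visit.
    """
    rng = random.Random(seed)
    items = list(feed_items)
    rng.shuffle(items)
    return " ".join(items)

def serve_copy(user_agent, marketing_copy, feed_items, page_id):
    """Humans get the real marketing copy; bots get the RSS mash-up."""
    if is_bot(user_agent):
        return uniquify(feed_items, seed=page_id)
    return marketing_copy
```

The point isn't the code, of course — it's how little machinery the hypothetical requires, which is exactly why comment-driven "uniqueness" can fool a duplicate detector in the first place.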