
Hadoop uses HashPartitioner as the default Partitioner implementation to calculate the distribution of the intermediate data to the Reducers. HashPartitioner partitions keys based on their hashCode(): the key's hash code, modulo the number of Reducers, selects the target Reducer.
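The hash-based assignment above can be sketched in plain Java. This is an illustrative reimplementation of the well-known formula used by Hadoop's HashPartitioner, not the Hadoop class itself; the class and method names below are stand-ins.

```java
// Minimal sketch of how Hadoop's default HashPartitioner assigns a key
// to a Reducer. Illustrative only; the real class lives in
// org.apache.hadoop.mapreduce.lib.partition.HashPartitioner.
public class HashPartitionerSketch {

    // Returns the index of the Reducer that will receive this key.
    // Masking with Integer.MAX_VALUE clears the sign bit so a negative
    // hashCode() cannot produce a negative partition index.
    static int getPartition(Object key, int numReduceTasks) {
        return (key.hashCode() & Integer.MAX_VALUE) % numReduceTasks;
    }

    public static void main(String[] args) {
        // Equal keys always land on the same Reducer.
        System.out.println(getPartition("cat", 4));
        System.out.println(getPartition("cat", 4));
    }
}
```

Because the assignment depends only on the key's hash code, every record with the same key is guaranteed to reach the same Reducer, which is what makes the subsequent per-key aggregation correct.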

From: Hadoop MapReduce v2 Cookbook - Second Edition

Note

Because HashPartitioner relies on the key's hashCode(), two key objects that are equal must return the same hash code; otherwise, equal keys could be routed to different Reducers. The hash code is a computed value, not the object's memory address.