Hi David,
Please refer to the method "DFSInputStream#blockSeekTo"; it has the
same purpose as yours.
***************************************************************************
LocatedBlock targetBlock = getBlockAt(target, true);
assert (target == this.pos) : "Wrong position " + pos + " expect " + target;
long offsetIntoBlock = target - targetBlock.getStartOffset();

DNAddrPair retval = chooseDataNode(targetBlock);
chosenNode = retval.info;
InetSocketAddress targetAddr = retval.addr;

try {
  s = socketFactory.createSocket();
  NetUtils.connect(s, targetAddr, socketTimeout);
  s.setSoTimeout(socketTimeout);
  Block blk = targetBlock.getBlock();
  Token<BlockTokenIdentifier> accessToken = targetBlock.getBlockToken();

  blockReader = BlockReader.newBlockReader(s, src, blk.getBlockId(),
      accessToken,
      blk.getGenerationStamp(),
      offsetIntoBlock, blk.getNumBytes() - offsetIntoBlock,
      buffersize, verifyChecksum, clientName);
  // ... remainder of blockSeekTo elided ...
***************************************************************************
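Note that in the excerpt above the Token<BlockTokenIdentifier> is not built by hand: the LocatedBlock returned from the NameNode already carries it, via targetBlock.getBlockToken(). If you mainly need the block layout and the DataNodes holding each replica, the public FileSystem API may already be enough before you drop down to BlockReader. A minimal sketch, assuming a Hadoop 1.x-era client classpath and a placeholder file path passed as the first argument:

```java
import java.io.IOException;
import java.util.Arrays;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ListBlocks {
  public static void main(String[] args) throws IOException {
    // args[0] is a placeholder, e.g. an hdfs:// URI to a file
    Path path = new Path(args[0]);
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(path.toUri(), conf);

    FileStatus status = fs.getFileStatus(path);
    // One BlockLocation per block: byte offset, length,
    // and the hosts that hold replicas of that block
    BlockLocation[] blocks =
        fs.getFileBlockLocations(status, 0, status.getLen());

    for (BlockLocation blk : blocks) {
      System.out.println("offset=" + blk.getOffset()
          + " length=" + blk.getLength()
          + " hosts=" + Arrays.toString(blk.getHosts()));
    }
    fs.close();
  }
}
```

If you do need the token itself (e.g. to call BlockReader.newBlockReader directly), fetch LocatedBlocks through the lower-level ClientProtocol.getBlockLocations(src, start, length) and read the token off each LocatedBlock with getBlockToken(), exactly as blockSeekTo does.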
-Regards
Denny Ye
2012/1/6 David Pavlis <[email protected]>
> Hi,
>
> I am relatively new to Hadoop and I am trying to utilize HDFS for my own
> application, where I want to take advantage of the data partitioning HDFS
> performs.
>
> The idea is that I get list of individual blocks - BlockLocations of
> particular file and then directly read those (go to individual DataNodes).
> So far I found org.apache.hadoop.hdfs.DFSClient.BlockReader to be the way
> to go.
>
> However I am struggling with instantiating the BlockReader() class, namely
> creating the "Token<BlockTokenIdentifier>".
>
> Is there example Java code showing how to access individual blocks of a
> particular file stored on HDFS?
>
> Thanks in advance,
>
> David.
>