I'm trying to use IBM BigInsights (essentially Hadoop 0.20.2) as HDFS file storage. While I can access all my files without problems through the web console and JAQL inside the box, the same does not apply to the Java application I've created, which runs *externally*, on another machine outside the Hadoop environment. There I get the well-known and dreaded "INFO: Could not obtain block blk_.................._.... from any node: java.io.IOException: No live nodes contain current block".

I've raised dfs.datanode.max.xcievers to 4096 (and even to 8192) and the max open file limit to 32K, to no avail. I'm kinda frustrated already, and Google doesn't seem to help anymore. There are no exceptions logged in either the namenode or the datanode logs, and hadoop fsck says the filesystem is clean.
My configuration: a single datanode running on the same machine as the namenode.
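For reference, the xcievers change mentioned above goes into hdfs-site.xml on the datanode machine; a minimal sketch of what I set (the value 4096 is the first one I tried, and the datanode was restarted after each edit):

```xml
<!-- hdfs-site.xml (sketch): raise the datanode transceiver limit -->
<configuration>
  <property>
    <name>dfs.datanode.max.xcievers</name>
    <value>4096</value>
  </property>
</configuration>
```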