I'm new to the whole Hadoop stack so please bear with me.
I'm trying to import a whole table from MySQL into HDFS using Sqoop 1.4.5, and I believe I have everything set up correctly, but Sqoop is telling me this:
Error: java.io.IOException: Cannot run program "mysqldump": error=2, No such file or directory
I can import without the --direct flag, but for some reason it can't find mysqldump.
Need help on this, please!
PS: I've been googling for a whole day and nothing has pointed me in the right direction.
PS2: I'm using a single node distribution on a Mac.
Best How To :
The mysqldump utility is used by the "direct connector". It cannot be found because the mysqldump binary is either not installed on your system or not on the PATH environment variable when Sqoop runs the MapReduce job. Things that will help:
- Since you seem to be running a Mac, try installing MySQL with Homebrew: brew install mysql.
- Run which mysqldump to locate the mysqldump utility. If it does turn up this way, the binary may only be on your user's PATH; try creating a soft link in /usr/bin via ln -s $(which mysqldump) /usr/bin/mysqldump. Otherwise, uninstall MySQL and reinstall it via Homebrew.
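The steps above can be sketched as a short shell session. The database, user, and table names in the final command are placeholders, not from your post; also note that on newer macOS releases System Integrity Protection makes /usr/bin read-only, in which case /usr/local/bin (which is normally on PATH) is a common alternative:

```shell
# Install MySQL via Homebrew; this includes the mysqldump client.
brew install mysql

# Locate the mysqldump binary; prints nothing if it is not on your PATH.
which mysqldump

# If it was found, link it into a directory that the MapReduce tasks
# Sqoop launches will have on their PATH. If /usr/bin is read-only
# (SIP on newer macOS), use /usr/local/bin instead.
sudo ln -s "$(which mysqldump)" /usr/bin/mysqldump

# Re-run the import with the direct connector.
# mydb, myuser, and mytable are placeholders for your own values.
sqoop import \
  --connect jdbc:mysql://localhost/mydb \
  --username myuser \
  --table mytable \
  --direct
```

If the symlink alone doesn't help, it usually means the Sqoop child tasks run with a stripped-down PATH, so linking into a standard system directory like this is the most reliable fix.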