Friday, June 15, 2012

Issues you may encounter while migrating to Cassandra using DataStax/Sqoop, and how to fix them.


My previous blog post, Moving data from MySQL to Cassandra, discusses how I migrated my database from MySQL to Cassandra. In this post, I will discuss some issues that I encountered as I started to use DataStax, and how easily they can be fixed.

If you fail to indicate the primary key to Sqoop, the exception below will be thrown.
ERROR tool.ImportTool: Error during import: No primary key could be found for table Category. Please specify one with --split-by or perform a sequential import with '-m 1'.
Solution: As the error message itself suggests, either name a split column with --split-by, or run a sequential import with '-m 1'.
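For illustration, either option might look like the following. The connection URL, credentials, and the split column name ("id") are assumptions for this sketch, not taken from my actual setup.

```shell
# Option 1: name a column Sqoop can split the import on
# (the column name "id" is assumed for illustration)
bin/dse sqoop import \
    --connect jdbc:mysql://localhost/mydb \
    --username root -P \
    --table Category \
    --split-by id

# Option 2: skip splitting entirely and run a single sequential mapper
bin/dse sqoop import \
    --connect jdbc:mysql://localhost/mydb \
    --username root -P \
    --table Category \
    -m 1
```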

Exceptions similar to the one below will be thrown if you try to use Sqoop as above without first starting Cassandra properly.
Exception in thread "main" java.io.IOException: Failed to retrieve RMIServer stub: javax.naming.ServiceUnavailableException [Root exception is java.rmi.ConnectException: Connection refused to host: pradeeban; nested exception is:
    java.net.ConnectException: Connection refused]
    at javax.management.remote.rmi.RMIConnector.connect(RMIConnector.java:338)
    at javax.management.remote.JMXConnectorFactory.connect(JMXConnectorFactory.java:248)
    at org.apache.cassandra.tools.NodeProbe.connect(NodeProbe.java:141)
    at org.apache.cassandra.tools.NodeProbe.<init>(NodeProbe.java:111)
    at com.datastax.bdp.tools.DseTool.<init>(DseTool.java:136)
    at com.datastax.bdp.tools.DseTool.main(DseTool.java:562)
Caused by: javax.naming.ServiceUnavailableException [Root exception is java.rmi.ConnectException: Connection refused to host: pradeeban; nested exception is:
    java.net.ConnectException: Connection refused]
    at com.sun.jndi.rmi.registry.RegistryContext.lookup(RegistryContext.java:101)
    at com.sun.jndi.toolkit.url.GenericURLContext.lookup(GenericURLContext.java:185)
    at javax.naming.InitialContext.lookup(InitialContext.java:392)
    at javax.management.remote.rmi.RMIConnector.findRMIServerJNDI(RMIConnector.java:1886)
    at javax.management.remote.rmi.RMIConnector.findRMIServer(RMIConnector.java:1856)
    at javax.management.remote.rmi.RMIConnector.connect(RMIConnector.java:255)
    ... 5 more
Caused by: java.rmi.ConnectException: Connection refused to host: pradeeban; nested exception is:
    java.net.ConnectException: Connection refused
    at sun.rmi.transport.tcp.TCPEndpoint.newSocket(TCPEndpoint.java:601)
    at sun.rmi.transport.tcp.TCPChannel.createConnection(TCPChannel.java:198)
    at sun.rmi.transport.tcp.TCPChannel.newConnection(TCPChannel.java:184)
    at sun.rmi.server.UnicastRef.newCall(UnicastRef.java:322)
    at sun.rmi.registry.RegistryImpl_Stub.lookup(Unknown Source)
    at com.sun.jndi.rmi.registry.RegistryContext.lookup(RegistryContext.java:97)
    ... 10 more
Caused by: java.net.ConnectException: Connection refused
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
    at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
    at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
    at java.net.Socket.connect(Socket.java:529)
    at java.net.Socket.connect(Socket.java:478)
    at java.net.Socket.<init>(Socket.java:375)
    at java.net.Socket.<init>(Socket.java:189)
    at sun.rmi.transport.proxy.RMIDirectSocketFactory.createSocket(RMIDirectSocketFactory.java:22)
    at sun.rmi.transport.proxy.RMIMasterSocketFactory.createSocket(RMIMasterSocketFactory.java:128)
    at sun.rmi.transport.tcp.TCPEndpoint.newSocket(TCPEndpoint.java:595)
    ... 15 more
Unable to run : jobtracker not found
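The root cause is that the DSE node, and with it the job tracker, is not running. A sketch of bringing it up with analytics enabled follows; the -t flag and the dsetool subcommands are from DataStax Enterprise 2.x, so verify against your version.

```shell
# Start DSE with Hadoop (analytics) enabled, so the job tracker comes up
sudo bin/dse cassandra -t

# Verify the node is up and locate the job tracker
bin/dsetool ring
bin/dsetool jobtracker
```

Once dsetool can reach the node and report the job tracker, the Sqoop import above should no longer fail with the RMI connection error.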

If you try to run the above migration example once more, it will complain as below, and the migration will halt.
12/06/15 15:39:56 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
12/06/15 15:39:56 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
12/06/15 15:39:56 INFO tool.CodeGenTool: Beginning code generation
12/06/15 15:39:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `Category` AS t LIMIT 1
12/06/15 15:39:56 INFO orm.CompilationManager: HADOOP_HOME is /home/pradeeban/programs/dse-2.1/resources/hadoop/bin/..
Note: /tmp/sqoop-pradeeban/compile/5ddc038aef3f4db8ed8f643cdba0786d/Category.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/06/15 15:39:57 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-pradeeban/compile/5ddc038aef3f4db8ed8f643cdba0786d/Category.jar
12/06/15 15:39:59 INFO manager.DirectMySQLManager: Beginning mysqldump fast path import
12/06/15 15:39:59 INFO mapreduce.ImportJobBase: Beginning import of Category
12/06/15 15:40:00 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
12/06/15 15:40:01 INFO mapred.JobClient: Cleaning up the staging area cfs:/tmp/hadoop-root/mapred/staging/pradeeban/.staging/job_201206151241_0006
12/06/15 15:40:01 ERROR security.UserGroupInformation: PriviledgedActionException as:pradeeban cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory Category already exists
12/06/15 15:40:01 ERROR tool.ImportAllTablesTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory Category already exists

Solution: Delete the output directory, "Category", as well as the generated source files in the working directory, before running the import again. Hadoop refuses to overwrite an existing output directory.

The output directory can be deleted as below.
$ bin/dse hadoop dfs -rmr Category
12/06/15 15:41:59 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Deleted cfs:/user/pradeeban/Category
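The generated sources can be cleaned out of the working directory in the same way; the file names below are assumed to match the table name, as that is how Sqoop names the ORM code it generates.

```shell
# Remove the ORM code Sqoop generated for the Category table
rm -f Category.java Category.jar
```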

Further assistance on Hadoop troubleshooting can be found here.
