Archive for category bigdata

PHP access to HiveServer2 via Thrift

Others have done this before.

When you try to access HiveServer2 without SASL, you may see the following alert:
'TTransportException' with message 'TSocket: timed out reading 67108844 bytes from

Try adding the following lines to /etc/hive/conf/hive-site.xml:
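The lines in question set the HiveServer2 authentication mode. A minimal hive-site.xml fragment, assuming you want to disable SASL entirely so a plain-Thrift PHP client can connect:

```xml
<property>
  <name>hive.server2.authentication</name>
  <!-- NOSASL: raw Thrift transport, no SASL negotiation at all -->
  <value>NOSASL</value>
</property>
```

After changing this, restart HiveServer2 so the new transport takes effect.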

Client authentication types:
NONE: no authentication check
LDAP: LDAP/AD based authentication
KERBEROS: Kerberos/GSSAPI authentication
CUSTOM: custom authentication provider (use with property hive.server2.custom.authentication.class)

Note: NOSASL is not equal to NONE. NONE is the default and means plain SASL.


(fixed) Sqoop import from Oracle to Hive throws heap size error

If you use CDH and Sqoop, you have probably met the following issue:

15/08/07 15:21:55 INFO manager.SqlManager: Using default fetchSize of 1000
15/08/07 15:21:55 INFO tool.CodeGenTool: Beginning code generation
15/08/07 15:21:56 INFO manager.OracleManager: Time zone has been set to GMT
15/08/07 15:21:56 INFO manager.SqlManager: Executing SQL statement: select xxx where (1 = 0)
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.lang.reflect.Array.newArray(Native Method)
at java.lang.reflect.Array.newInstance(
at oracle.jdbc.driver.BufferCache.get(
at oracle.jdbc.driver.PhysicalConnection.getCharBuffer(
at oracle.jdbc.driver.OracleStatement.prepareAccessors(
at oracle.jdbc.driver.T4CTTIdcb.receiveCommon(

This is related to the HDFS client heap settings; you can fix it by increasing the heap size for the HDFS client.

In CDH (Cloudera Manager), increase "Client Java Heap Size in Bytes".
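If you are not using Cloudera Manager, the same fix can be applied from the shell by raising the client JVM heap before invoking Sqoop. A sketch, where the 2g value and the connection string are illustrative, not from the original post:

```shell
# Raise the heap for client-side Hadoop tools; Sqoop runs inside this JVM.
# 2g is an illustrative value; size it to your import.
export HADOOP_CLIENT_OPTS="-Xmx2g"
echo "HADOOP_CLIENT_OPTS=$HADOOP_CLIENT_OPTS"

# Hypothetical invocation; replace host/SID/credentials with your own:
# sqoop import --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
#   --username scott --table EMP --hive-import
```

HADOOP_CLIENT_OPTS only affects client-side commands, so it raises the Sqoop launcher's heap without touching the cluster daemons.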
