Why isn't the itests pom connected to the root pom?

It would be great to have it connected, but it would make it harder to use mvn test locally. The best option would be to utilize the Failsafe plugin for integration testing, but it needs a somewhat different setup and is harder to use for now. There is an option to attach all the itests subprojects to the main project by enabling this with -Pitests (HIVE-13490). If you'd like to give that a try, by all means, go ahead. There are some good and bad sides to using this; it is introduced as a profile to clearly communicate that the integration tests are attached to the main project. The good side is that you may freely use -Pitests to run integration tests from the root project without the need of mvn install.

Why do Spark unit tests fail with a SecurityException?

If you get the following errors in the unit tests:

T09:51:49,365 ERROR ]: spark.SparkContext (Logging.scala:logError(96)) - Error initializing SparkContext.
class ""'s signer information does not match signer information of other classes in the same package
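As a hedged sketch of the two -Pitests workflows described above (the goals and paths here are illustrative, not an exact recipe):

```shell
# Without -Pitests: install the main project first, then run the integration
# tests from the itests tree separately.
mvn clean install -DskipTests
cd itests && mvn test

# With the -Pitests profile (HIVE-13490), the itests subprojects attach to the
# root pom, so integration tests run from the root without a prior mvn install.
mvn test -Pitests
```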
You might have to set the following Maven options on certain systems to get the build working: set MAVEN_OPTS to "-Xmx2g -XX:MaxPermSize=256M".

The way Maven is set up differs between the master branch and branch-1. In branch-1, since both Hadoop 1.x and 2.x are supported, you need to specify whether you want to build Hive against Hadoop 1.x or 2.x. There is a profile for each version of Hadoop, hadoop-1 and hadoop-2. For most Maven operations one of these profiles needs to be specified or the build will fail. In master, only Hadoop 2.x is supported, so there is no need to specify a Maven profile for most build operations.

The following Python snippet groups a list of failed q-file tests by driver and prints one mvn command per driver:

```python
from itertools import groupby

L = []  # (driver, qfile) pairs for the failed tests
for driver, q in groupby(sorted(L), key=lambda a: a[0]):
    print("""mvn clean test -Dtest=%s '-Dqfile=%s' =true""" % (driver, ",".join([a[1] for a in q])))
```

From the root of the source tree: find . Logs go to /tmp/$USER/ (Linux) or $TMPDIR/$USER/ (MacOS). See Hive Logging for details about log files, including alternative configurations.

How do I add a test case?

First, add the test case to the qfile test suite:

1. Copy the test to a new file under ql/src/test/queries/clientpositive/.q (or /clientnegative if it is a negative test).
2. If the new test creates any table, view, function, etc., make sure that the name is unique across tests. For instance, name a table in the test file foo.q foo_t1 instead of simply t1. This will help reduce flakiness in the test runs, since Jenkins will run tests in batches, and currently it does not restore to its former state after running each of the q files.
3. If there is any interaction with the file system, use unique folders for the test to avoid any collision with other tests.
4. Add the test to itests/src/test/resources/testconfiguration.properties under the appropriate variable (ex.

With the above steps, you can create a patch which has a.
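As an illustration of the driver-grouping logic above, here is a self-contained run with made-up entries (the driver and q-file names are hypothetical, and extra flags are trimmed for brevity):

```python
from itertools import groupby

# Hypothetical failed (driver, qfile) pairs, as might come from a test report.
failed = [
    ("TestCliDriver", "foo.q"),
    ("TestNegativeCliDriver", "bad.q"),
    ("TestCliDriver", "bar.q"),
]

# Sorting first is required: groupby only merges *adjacent* equal keys.
commands = []
for driver, q in groupby(sorted(failed), key=lambda a: a[0]):
    commands.append("mvn clean test -Dtest=%s '-Dqfile=%s'" % (driver, ",".join(a[1] for a in q)))

for c in commands:
    print(c)
```

Note that all q files for the same driver collapse into a single comma-separated -Dqfile value, so each driver's JVM is started only once.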
See Getting Started: Building Hive from Source for detailed information about building Hive releases 0.13 and later with Maven. See Installing from Source Code (Hive 0.12.0 and Earlier) for detailed information about building Hive 0.12 and earlier with Ant.
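The MAVEN_OPTS setting quoted earlier can be applied in the shell before building; a minimal sketch (the build goal is illustrative, and -XX:MaxPermSize only matters on Java 7 and earlier JVMs):

```shell
# Give Maven a 2 GB heap (and, on pre-Java-8 JVMs, a larger permgen) for the build.
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=256M"

# Then build as usual, for example:
#   mvn clean install -DskipTests
```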
For FAQ about how to run tests, see HiveDeveloperFAQ#Testing below. See MiniDriver Tests for information about MiniDriver and Beeline tests.