cricket_007 · 4 years ago · 124
Scala Question

'new HiveContext' is wanting an X11 display? com.trend.iwss.jscan?

Spark 1.6.2 (YARN master)

Package name: com.example.spark.Main

Basic SparkSQL code

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val conf = new SparkConf()
conf.setAppName("SparkSQL w/ Hive")
val sc = new SparkContext(conf)

val hiveContext = new HiveContext(sc)
import hiveContext.implicits._

// val rdd = <some RDD making>
val df = rdd.toDF()

And the stack trace...

No X11 DISPLAY variable was set, but this program performed an operation which requires it.
at java.awt.GraphicsEnvironment.checkHeadless(
at java.awt.Window.<init>(
at java.awt.Frame.<init>(
at java.awt.Frame.<init>(
at com.trend.iwss.jscan.runtime.BaseDialog.getActiveFrame(
at com.trend.iwss.jscan.runtime.AllowDialog.make(
at com.trend.iwss.jscan.runtime.PolicyRuntime.showAllowDialog(
at com.trend.iwss.jscan.runtime.PolicyRuntime.stopActionInner(
at com.trend.iwss.jscan.runtime.PolicyRuntime.stopAction(
at com.trend.iwss.jscan.runtime.PolicyRuntime.stopAction(
at com.trend.iwss.jscan.runtime.NetworkPolicyRuntime.checkURL(
at com.trend.iwss.jscan.runtime.NetworkPolicyRuntime._preFilter(
at com.trend.iwss.jscan.runtime.PolicyRuntime.preFilter(
at com.trend.iwss.jscan.runtime.NetworkPolicyRuntime.preFilter(
at org.apache.commons.logging.LogFactory$
at Method)
at org.apache.commons.logging.LogFactory.getProperties(
at org.apache.commons.logging.LogFactory.getConfigurationFile(
at org.apache.commons.logging.LogFactory.getFactory(
at org.apache.commons.logging.LogFactory.getLog(
at org.apache.hadoop.hive.shims.HadoopShimsSecure.<clinit>(
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(
at org.apache.hadoop.hive.shims.ShimLoader.createShim(
at org.apache.hadoop.hive.shims.ShimLoader.loadShims(
at org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(
at org.apache.spark.sql.hive.client.ClientWrapper.overrideHadoopShims(ClientWrapper.scala:116)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:69)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(
at java.lang.reflect.Constructor.newInstance(
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:249)
at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:345)
at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:255)
at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:459)
at org.apache.spark.sql.hive.HiveContext.defaultOverrides(HiveContext.scala:233)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:236)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
at com.example.spark.Main1$.main(Main.scala:52)
at com.example.spark.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(
at sun.reflect.DelegatingMethodAccessorImpl.invoke(
at java.lang.reflect.Method.invoke(
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
ivysettings.xml file not found in HIVE_HOME or HIVE_CONF_DIR,/etc/hive/ will be used

This same code was working a week ago on a fresh HDP cluster, and it works fine in the sandbox. The only thing I remember doing was trying to change the JAVA_HOME variable, but I am fairly sure I undid those changes.

I'm at a loss - not sure how to start tracking down the issue.

The cluster is headless, so of course it has no X11 display, but what piece of new HiveContext even needs to pop up a window?

Based on the logs, I'd say it's a Java configuration issue I messed up: something within com.trend.iwss.jscan got triggered and is trying to show a Java security dialog, but I don't know for sure.

Can't do X11 forwarding, and I tried

export SPARK_OPTS="-Djava.awt.headless=true"

before a spark-submit, but that didn't help.
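For what it's worth, the SPARK_OPTS environment variable may not be picked up by spark-submit at all in Spark 1.6; JVM flags for the driver and executors are normally passed with --driver-java-options and --conf. A sketch of forcing headless mode that way (the class name comes from the question; app.jar is a placeholder for the actual assembly JAR):

```shell
# Pass java.awt.headless to both the driver and executor JVMs explicitly,
# rather than relying on an environment variable spark-submit may ignore.
spark-submit \
  --master yarn \
  --class com.example.spark.Main \
  --driver-java-options "-Djava.awt.headless=true" \
  --conf "spark.executor.extraJavaOptions=-Djava.awt.headless=true" \
  app.jar
```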

Tried those suggestions, but again, I can't forward X11 and don't have a display.

The error seems to be reproducible on two of the Spark clients.

Only on one machine did I try changing JAVA_HOME.

Ran an Ambari Hive service check. It didn't fix anything.

I can connect fine to the Hive database via the Hive/Beeline CLI.
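That Beeline sanity check looks roughly like this (host, port, and database are placeholders for the cluster's actual HiveServer2 endpoint):

```shell
# Connect to HiveServer2 and run a trivial query; if this works,
# Hive itself is healthy and the problem is on the Spark client side.
beeline -u "jdbc:hive2://hiveserver2.example.com:10000/default" \
        -e "SHOW DATABASES;"
```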

Answer Source

Found this post: java.awt.HeadlessException in Spring 3.0.5

Basically, Trend Micro is inserting a com.trend.iwss.jscan package into the JAR files that are downloaded via Maven through a company firewall, and I have no control over that.

(link not working)

Wayback Machine to the rescue...

If anyone else has input, I would also like to hear it.

When downloading some .JAR files via IWSA, a directory filled with .class files, which is unrelated to what is being downloaded, is added to the JAR file (com\trend\iwss\jscan\runtime\).

This happens because, if a JAR file is originally unsigned, IWSA inserts code into the presumed applet to monitor and restrict potentially harmful actions.

To IWSS/IWSA, every "GET" request looks the same, so it cannot tell whether you are downloading an archive or an applet that will be executed by your browser.

This code is added for security reasons, to monitor the behavior of the "possible" applet and make sure it does not do any harm to the machine and its environment.

To prevent this issue, please follow these steps:

  1. Log on to the IWSS web console.

  2. Go to HTTP > Applets and ActiveX > Policies > Java Applet Security Rules.

  3. Under Java Applet Security, change the value of "No signature" to either "Pass" or "Block", depending on what you want to do with the unsigned .JAR files.

  4. Click Save.
