
Spark Framework and relative path

I'm using the Spark framework to create a web app and I've been going through the tutorials on their site. At the moment I'm working on their templates section, but I can't seem to get the project to recognize that my CSS and my templates are in my resources folder.

I'm using NetBeans and Maven to manage my dependencies.

Can anyone help me figure out how to set up my relative paths/create my project folders appropriately in this environment? I'm a newbie to both Maven and Spark, so go easy please.

Answer

To tell Spark where your static files are, you can use staticFiles.location("/directory_in_resources").

If you want an external location on the filesystem instead, you can use staticFiles.externalLocation("directory_on_filesystem").

(There is more info about these two methods in the Spark documentation on static files.)
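
For example, here is a minimal sketch, assuming your static files live under src/main/resources/public (the folder names and the "/hello" route are just placeholders for whatever you actually use):

import static spark.Spark.*;

public class App {
    public static void main(String[] args) {
        // Serve files from src/main/resources/public at the site root;
        // this has to be set before any routes are mapped.
        staticFiles.location("/public");

        // Or serve files from an absolute path on disk instead:
        // staticFiles.externalLocation("/var/www/public");

        get("/hello", (request, response) -> "Hello");
    }
}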

but...

If you are already doing this (I'm guessing you are) and Spark is still having trouble finding your templates, it may be because your template engine is misconfigured.

I ran into this problem myself, and after looking through the examples I noticed that all of the templates were stored in the resources root under spark/template/freemarker. (I'm using FreeMarker; you might not be, but hopefully this still applies.)
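
In a Maven project that resources root is src/main/resources, so the layout ends up looking roughly like this (hello.ftl is just an example template name):

src/main/resources/
    spark/
        template/
            freemarker/
                hello.ftl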

It seems that Spark's default template engine looks for templates in that directory. You can either:

- move all of your templates to that directory, or
- configure your own template engine (the example below is for FreeMarker):

import freemarker.cache.ClassTemplateLoader;
import freemarker.template.Configuration;
import spark.template.freemarker.FreeMarkerEngine;

FreeMarkerEngine freeMarkerEngine = new FreeMarkerEngine();
Configuration freeMarkerConfiguration = new Configuration();
// Look for templates in /templatedir on the classpath instead of the default
freeMarkerConfiguration.setTemplateLoader(
        new ClassTemplateLoader(YOUR_CLASS.class, "/templatedir"));
freeMarkerEngine.setConfiguration(freeMarkerConfiguration);

You would then have to pass this engine when defining routes, for example:
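
(The "/hello" path, the hello.ftl template name, and the model values below are just placeholders.)

import spark.ModelAndView;
import static spark.Spark.get;

import java.util.HashMap;
import java.util.Map;

// Pass the configured engine as the third argument so Spark uses it
// to render the ModelAndView returned by the route.
get("/hello", (request, response) -> {
    Map<String, Object> model = new HashMap<>();
    model.put("name", "world");
    return new ModelAndView(model, "hello.ftl");
}, freeMarkerEngine);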

I hope this answer helped; sorry if it is a bit of a mess. If you are using a different engine this should still mostly apply, but if not I can try to revise my answer.

Apologies if you have already tried this; I am new to both Maven and Spark as well.