I have a HotSpot JVM heap dump that I would like to analyze. The VM ran with
Normally, what I use is ParseHeapDump.sh, included in Eclipse Memory Analyzer, and I run it on one of our more beefed-up servers. The shell script needs fewer resources than parsing the heap from the GUI, plus you can run it on your beefy server with more resources (you can allocate more resources by adding something like
-vmargs -Xmx40g -XX:-UseGCOverheadLimit to the end of the last line of the script).
For instance, the last line might look like this after modification:
./MemoryAnalyzer -consolelog -application org.eclipse.mat.api.parse "$@" -vmargs -Xmx40g -XX:-UseGCOverheadLimit
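The parsing step itself is then just the script plus the path to the dump. A minimal sketch (the dump path is a placeholder, not a real file):

```shell
# Headless parse on the server; writes the index files next to the dump.
# /path/to/big_dump.hprof is a placeholder for your actual heap dump.
./ParseHeapDump.sh /path/to/big_dump.hprof
```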
When it succeeds, it creates a number of "index" files.
After getting the indices, I generate reports from them as well, scp those to my local machine, and try to see if I can find the culprit just from those (just the reports, not the indices). Here's a tutorial on creating the reports.
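Concretely, the reports can be produced from the same script by naming the report to run after the dump path. A sketch, assuming the standard Eclipse MAT report IDs (check your version) and placeholder host/paths:

```shell
# Each report run writes a zip file next to the dump.
./ParseHeapDump.sh /path/to/big_dump.hprof org.eclipse.mat.api:suspects
./ParseHeapDump.sh /path/to/big_dump.hprof org.eclipse.mat.api:overview
./ParseHeapDump.sh /path/to/big_dump.hprof org.eclipse.mat.api:top_components

# Copy just the report zips down for a first look (server name is a placeholder).
scp server:/path/to/big_dump_Leak_Suspects.zip .
```

The zip contains a small self-contained HTML report, so this first pass needs nothing heavyweight on the local machine.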
If those reports are not enough and I need to dig further (e.g. via OQL), I scp the indices as well as the hprof file to my local machine, and then open the heap dump (with the index files in the same directory as the heap dump) in my local Eclipse MAT. Opened that way, it does not need too much memory to run.
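As a sketch of the kind of OQL I mean (the class names are just examples; com.example.MyCache is hypothetical):

```sql
-- MAT OQL: find HashMap instances with many entries ("size" is a field of java.util.HashMap)
SELECT * FROM java.util.HashMap h WHERE h.size > 1000

-- Or filter a suspect class by retained heap size (bytes)
SELECT * FROM com.example.MyCache c WHERE c.@retainedHeapSize > 1048576
```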
EDIT: I just wanted to add two notes: