Thursday, December 22, 2011

OutOfMemory with Maven Compiler?

When using Maven with JDK 6 (1.6.0_14) inside Eclipse, an OutOfMemoryError is encountered during compilation.
Setting MAVEN_OPTS as per the usual suggestions was of no help.
The following configuration setting inside the maven-compiler-plugin worked wonders!

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <fork>true</fork>
    <meminitial>256m</meminitial>
    <maxmem>512m</maxmem>
  </configuration>
</plugin>

Upon investigation, the plugin documentation has a note on the <fork> attribute:
<fork>, when set to true, runs the compiler in a separate process that uses the defined heap values, instead of the default built-in compiler running inside Maven's own JVM. (By default, fork is false.)

Hence, the solution is to have the compiler run in a separate process, as opposed to the default in-process compiler. Another important observation is that I haven't encountered this error when using JDK 1.6.0_23 and later versions.

Sunday, October 9, 2011

Java Memory Manager

I am yet to see a Java application that hasn't suffered memory issues on its journey to production. Do we always get to build applications whose data and memory requirements are static and known up front? I am sure we are talking about something that is next to impossible.

Whether an application has already been built or is still in development, it is very important to ensure the code is optimally written to control Java memory usage. We are talking about following standards!


What if we had an option to ensure the application uses no more than 60% of the total max heap (-Xmx) for its cache of (probably static) objects? Wouldn't that be really awesome?
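As a rough sketch of the arithmetic, that budget can be derived at runtime from the JVM's own view of -Xmx; nothing cache-specific yet, just plain JDK calls:

public class CacheBudget {
    public static void main(String[] args) {
        // maxMemory() reflects the configured -Xmx, in bytes.
        long maxHeap = Runtime.getRuntime().maxMemory();
        double occupancy = 0.60; // the 60% cap discussed above
        long cacheBudget = (long) (maxHeap * occupancy);
        System.out.printf("Max heap: %d MB, cache budget: %d MB%n",
                maxHeap >> 20, cacheBudget >> 20);
    }
}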


Memory governance is what we need here. There are three ways we can hold objects in a cache: in-memory on-heap, in-memory off-heap, and disk-based storage.
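To make the three tiers concrete, here is a minimal sketch using the EHCache 2.5 byte-based sizing API (the cache name and sizes are illustrative; the off-heap tier is commented out because it needs the licensed BigMemory extension):

import net.sf.ehcache.Cache;
import net.sf.ehcache.CacheManager;
import net.sf.ehcache.config.CacheConfiguration;
import net.sf.ehcache.config.MemoryUnit;

public class TieredCacheExample {
    public static void main(String[] args) {
        CacheConfiguration tiered = new CacheConfiguration()
                .name("tieredCache")
                .maxBytesLocalHeap(256, MemoryUnit.MEGABYTES)     // tier 1: in-memory, on-heap
                // .maxBytesLocalOffHeap(1, MemoryUnit.GIGABYTES) // tier 2: off-heap (BigMemory licence needed)
                .maxBytesLocalDisk(2, MemoryUnit.GIGABYTES);      // tier 3: disk-based storage

        CacheManager manager = CacheManager.create();
        manager.addCache(new Cache(tiered));
        manager.shutdown();
    }
}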


As an application architect, I would usually, or rather definitely, need to define the memory requirements. To avoid memory contention, I should be able to define a cap on memory occupancy; for example, 60% of memory is what I would want to use for object storage / caching, and the rest I would like to keep for processing requirements. The number I define here mostly depends on how processing-intensive my application is.
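EHCache 2.5 can express exactly this kind of cap at the CacheManager level, so that all caches share a single heap pool and the remaining heap stays free for processing. A minimal sketch, assuming the 60/40 split from above:

import net.sf.ehcache.CacheManager;
import net.sf.ehcache.config.CacheConfiguration;
import net.sf.ehcache.config.Configuration;
import net.sf.ehcache.config.MemoryUnit;

public class PooledCacheManagerExample {
    public static void main(String[] args) {
        // Cap all caches, combined, at 60% of -Xmx; the other 40% is left for processing.
        long cap = (long) (Runtime.getRuntime().maxMemory() * 0.60);

        Configuration config = new Configuration()
                .maxBytesLocalHeap(cap, MemoryUnit.BYTES)
                .defaultCache(new CacheConfiguration()); // unsized caches draw from the shared pool

        CacheManager manager = CacheManager.create(config);
        manager.addCache("referenceData"); // sized out of the shared 60% pool
        manager.shutdown();
    }
}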


For most Java applications, data increases over time, handling the processing of that data gets tougher, and there comes a stage where your JVM tuning options have all been tried, but in vain.


I have built a wrapper over EHCache 2.5.6, wrapping the base CacheManager, that is capable of controlling the maximum memory usage across all the caches within the application's scope.

I have introduced new attributes that can be used to define the memory occupancy threshold; just by configuring these, developers can seamlessly manage memory without having to do it programmatically.
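The wrapper itself isn't published in this post, so the following is only a hypothetical sketch of the idea; the class and attribute names (ManagedCacheManager, heapOccupancy) are illustrative, not the actual ones. It turns the occupancy threshold into the manager-level pool shown earlier:

import net.sf.ehcache.Cache;
import net.sf.ehcache.CacheManager;
import net.sf.ehcache.config.CacheConfiguration;
import net.sf.ehcache.config.Configuration;
import net.sf.ehcache.config.MemoryUnit;

// Hypothetical sketch; not the actual wrapper described above.
public class ManagedCacheManager {

    private final CacheManager delegate;

    // heapOccupancy: fraction of -Xmx that all caches together may use, e.g. 0.60
    public ManagedCacheManager(double heapOccupancy) {
        long cap = (long) (Runtime.getRuntime().maxMemory() * heapOccupancy);
        Configuration config = new Configuration()
                .maxBytesLocalHeap(cap, MemoryUnit.BYTES)
                .defaultCache(new CacheConfiguration());
        this.delegate = CacheManager.create(config);
    }

    // Returns a cache drawing from the shared pool, creating it on first use.
    public synchronized Cache getCache(String name) {
        if (delegate.getCache(name) == null) {
            delegate.addCache(name);
        }
        return delegate.getCache(name);
    }

    public void shutdown() {
        delegate.shutdown();
    }
}

With something like this, a developer only ever sets one number (the threshold) and never touches the sizing API directly.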

All the in-memory processing data needs to be held within this cache, which optimally uses a percentage of the heap and spills the remaining objects over to either disk or off-heap storage (off-heap is a feature of BigMemory, a licensed extension to core EHCache from Terracotta).
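Usage then stays the plain EHCache API; spilling between the tiers is transparent to the calling code. A small sketch (the cache instance is assumed to come from one of the managers above):

import net.sf.ehcache.Cache;
import net.sf.ehcache.Element;

public class CacheUsageExample {

    // Holds rows in the cache instead of a List; entries beyond the heap
    // budget spill over to disk (or off-heap) without any extra code here.
    static void loadRows(Cache cache) {
        for (int i = 0; i < 2000000; i++) {
            cache.put(new Element("row-" + i, "payload-" + i));
        }
        Element hit = cache.get("row-42");
        System.out.println(hit != null ? hit.getObjectValue() : "evicted/expired");
    }
}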


This works well for any in-memory processing and UI data display requirement where an existing, already-optimal query returns a couple of million rows and the application cannot hold all of this data in memory while displaying it on screen or writing it to a file for offline access.


Note: Spring's API for EHCache can be used to further improve usage with IoC.
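For example, a minimal sketch with Spring 3's Java config (the ehcache.xml location is an assumption):

import org.springframework.cache.ehcache.EhCacheManagerFactoryBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;

@Configuration
public class CacheSpringConfig {

    // Spring creates the EHCache CacheManager as a singleton bean and
    // shuts it down along with the application context.
    @Bean
    public EhCacheManagerFactoryBean cacheManager() {
        EhCacheManagerFactoryBean factory = new EhCacheManagerFactoryBean();
        factory.setConfigLocation(new ClassPathResource("ehcache.xml")); // assumed location
        factory.setShared(true);
        return factory;
    }
}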