Performance of setting Java Initial and Maximum memory to the same value
In my work environment, we have a number of enterprise Java applications that run on Windows Server 2008 hardware. An external consultant has been reviewing the performance of our Java applications and has suggested that we change our initial and maximum memory values to be identical.
So, at present, we're running an application with 1GB initial and 2GB maximum memory. Their recommendation is to change the initial and maximum memory both to 2GB.
Their reasoning is twofold:
- By allocating both the initial and maximum at 2GB, when the Java application is started it will grab a 2GB block of memory straight up. This prevents any other application from taking that memory, so it will always be available to the Java application. This is opposed to the current situation, where it would initially get only a 1GB block, meaning other applications could consume the remaining available memory on the server and the enterprise Java application would never be able to grow to its 2GB maximum.
- There is a small performance hit every time Java needs to allocate memory between the initial and maximum size, because it needs to go to Windows, have a new block of the required size assigned, and then use it. This hit occurs every time memory is requested between the initial size and the maximum size, so setting them to the same value removes it entirely.
I think their 1st point is valid from a safety perspective, as it means the Java application will always have access to the maximum memory regardless of what other applications are doing on the server. However, I also think this could be a negative, as it means there is a big block of memory that can't be used by other applications even if they need it, which could cause general server performance problems.
I think the performance impact they describe in point 2 is probably so negligible that it isn't worth worrying about. If they're concerned about performance, they would be better off tuning things like garbage collection rather than worrying about the tiny cost of allocating a block of memory.
Could anyone tell me whether there is a real benefit in their recommendation to set the initial and maximum memory to the same value? Are there any general recommendations either way?
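For reference, the figures in question can be checked from inside the JVM with a small snippet like this (the class name and the launch commands in the comment are illustrative, not part of our actual setup):

```java
// Hypothetical launches: java -Xms1g -Xmx2g HeapReport   (current setup)
//                    vs: java -Xms2g -Xmx2g HeapReport   (consultant's suggestion)
public class HeapReport {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // totalMemory() is the heap currently claimed from the OS;
        // maxMemory() is the -Xmx ceiling it may grow to.
        System.out.println("total: " + rt.totalMemory() / (1024 * 1024) + " MB");
        System.out.println("max:   " + rt.maxMemory() / (1024 * 1024) + " MB");
    }
}
```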
Setting them the same increases predictability. If you don't set them the same, then when the GC decides it needs more memory it will take time to allocate the new space and shuffle objects around to make room, on top of its normal GC activity. While this is happening requests are slower and your throughput will suffer. On a 2GB heap you will probably be allocating around 400MB each time more memory is needed and releasing around 100MB each time it isn't. Increase the heap size and these numbers increase. The heap is constantly resizing between your two values; it isn't as if it just allocates once and keeps that memory.
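You can watch this resizing yourself via the standard management API: the heap's "committed" size is what the JVM has actually claimed from the OS, and it is only pinned when the initial and maximum match. A minimal sketch (the printed numbers depend entirely on your flags and collector):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

public class HeapWatch {
    public static void main(String[] args) {
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        // init roughly corresponds to -Xms, max to -Xmx;
        // committed floats between them unless -Xms == -Xmx.
        System.out.println("init:      " + heap.getInit());
        System.out.println("used:      " + heap.getUsed());
        System.out.println("committed: " + heap.getCommitted());
        System.out.println("max:       " + heap.getMax());
    }
}
```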
As for argument #1, ensuring the OS always has the memory available for you is, I believe, a moot point. If your server is hurting for memory and you are already maxing out the machine to run the software, then you are running it on the wrong server. Get the hardware you need and give it room to grow. If you say your application could use 2GB, personally I would run that on a machine with 4GB or more free. If my client/user base grows, the freedom is there to increase the heap to accommodate new users.
IMO, even the 1st suggestion is not that important. Modern OSes have virtual memory, so even if you allocate 2GB of memory to your Java process, there is no guarantee that those 2GB will always reside in physical memory. If there are other applications on the same box that consume a lot of memory, you'll get poor performance regardless of when you allocate the memory.
The 2nd point is VERY valid. Grabbing memory from the OS is a slow operation, especially for large chunks. However, 2GB is a large block. What kind of hardware are you running this on? If it has plenty of memory, pre-allocating would be a good idea.
HOWEVER, let's say your machine doesn't really have a lot of memory; allocating a 2GB block up front can be dangerous and greedy. You should instead consider an object pooling scheme. Remember, the garbage collector takes memory and maintains its own memory pool, but with a block as large as 2GB it is likely some memory will be released back to the OS.
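An object pooling scheme can be sketched in a few lines; this is an illustrative toy, not a library API, and a real pool would need a size bound and a thread-safety strategy:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.function.Supplier;

// Minimal object pool: reuse expensive-to-allocate objects (e.g. large buffers)
// instead of repeatedly asking the allocator/GC for fresh ones.
public class Pool<T> {
    private final Deque<T> free = new ArrayDeque<>();
    private final Supplier<T> factory;

    public Pool(Supplier<T> factory) { this.factory = factory; }

    public T borrow() {
        T obj = free.poll();                       // reuse if one is free
        return obj != null ? obj : factory.get();  // otherwise allocate
    }

    public void release(T obj) { free.push(obj); } // return for later reuse

    public static void main(String[] args) {
        Pool<byte[]> buffers = new Pool<>(() -> new byte[1024 * 1024]);
        byte[] b = buffers.borrow();   // 1MB allocated once
        buffers.release(b);
        byte[] again = buffers.borrow();
        System.out.println(b == again); // same instance handed back
    }
}
```

Borrow, release, borrow again hands back the same instance, so the large buffer is allocated exactly once instead of churning through the heap.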
Do some profiling; you would be surprised how much time can be saved by not constantly allocating and discarding memory. The garbage collector will shield you from a lot of it, but not as much with a 2GB block.
Ultimately, if possible, you should look into using LESS memory. For REAL performance, look into keeping buffers and arrays in the CPU cache. This is a bit more complex, but it can yield significant gains.
Best of luck
When you start a JVM it reserves the maximum heap size as one continuous block whether you set the initial size or not. How much it uses depends on usage, e.g. a hello world with an initial size of 2G will not use 2G. What the initial size does is stop the JVM from attempting to limit heap usage until the initial size is reached. (But main memory won't actually be committed to the heap unless it is actually needed.)
Setting the initial size can improve startup time. I tend to set the initial size to the size the JVM grows to anyway (or not bother if the startup time is short).
The real impact is that a small heap size can result in temporary objects being promoted into tenured space, which is expensive to clean up. A larger heap size means temporary objects are more likely to be discarded while still young, rather than being copied many times and eventually removed by a full GC.
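The generational spaces described here can be inspected directly: the JVM exposes its young/survivor/tenured areas as named memory pools, with names that vary by collector (e.g. "G1 Eden Space" vs "PS Old Gen"). A small sketch:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;
import java.lang.management.MemoryUsage;

public class PoolNames {
    public static void main(String[] args) {
        // Lists each memory pool (eden, survivor, tenured, ...) with its
        // current usage, so you can see where temporary objects end up.
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            MemoryUsage u = pool.getUsage();
            if (u == null) continue; // pool no longer valid on this JVM
            System.out.println(pool.getName() + ": "
                    + u.getUsed() / 1024 + " KB used");
        }
    }
}
```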