Dealing with Exceptions
Java Crashes
Java crashes refer to errors that the Java VM is unable to handle "gracefully" by itself. In such a case the Java VM terminates and creates a crash report. Native libraries, which may include the FACT-Finder library, are often the cause of these errors. The Java parameter -XX:ErrorFile= defines where these crash reports are written (see Configuring Java).
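A minimal sketch of how this might look, assuming Tomcat picks up its options from a setenv.sh script and that /var/log/tomcat7 is the desired target directory (both are assumptions; adjust to your installation):

# bin/setenv.sh (assumed location)
# %p is replaced by the process ID of the crashed VM
JAVA_OPTS="$JAVA_OPTS -XX:ErrorFile=/var/log/tomcat7/hs_err_pid%p.log"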
Out of Memory Killer
The OOM-Killer is a Linux kernel feature through which the operating system (forcefully!) ensures that applications cannot use more memory than is available. Thus, if FACT-Finder (and therefore the Java process, as seen by the operating system) tries to allocate too much memory, the operating system kills the process. Normally checkserver will then restart FACT-Finder and the problem resolves itself.
However, it can also happen (particularly under Linux) that FACT-Finder has been allocated so much memory that hardly any free space is left. If another system-relevant process such as cron or syslog then requests even a small amount of memory, it is that process which gets killed by the operating system. Consider a situation in which Java has acquired a large amount of memory and not released it again, and cron is subsequently killed because it requested memory: in that case checkserver.sh will be unable to restart FACT-Finder after a crash, because whenever checkserver.sh (for whatever reason) requests memory, it will itself be terminated. It is then only a matter of time until the next Java memory request, which means the end of the FACT-Finder service. It is therefore highly advisable to monitor memory usage from the operating system's point of view.
The OOM-Killer records log data to syslog. If processes that are expected to be running on the server disappear, it is recommended to check the syslog for OOM-Killer entries.
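A minimal sketch of such a check, assuming a Debian/Ubuntu-style syslog under /var/log/syslog (path and message wording may differ between distributions):

# Look for OOM-Killer activity in the kernel log and in syslog
dmesg | grep -i "out of memory"
grep -iE "oom-killer|killed process" /var/log/syslog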
Tomcat Exceptions
If the Tomcat server is not operating correctly, it is recommended to study the file catalina.out, which is normally located in /var/log/tomcat7. You will often find an explanation for the exception there. Frequently recurring problems include:
Error Message | Explanation |
---|---|
java.lang.OutOfMemoryError: Java heap space | See: Configuring Java and the example after this table. Most likely the values for Xms and Xmx are poorly chosen. |
java.lang.OutOfMemoryError: PermGen space | Old FACT-Finder versions running on Java 7 saw this error when the values for the Java heap and PermGen worked at cross purposes. This error should no longer appear, as Java 8 removed the separate PermGen space. |
de.factfinder.jni.FactFinderException: Out of memory! | This is a separate type of memory problem: the FACT-Finder library has insufficient memory available. The Java VM and the FACT-Finder library run in the same process but do not share the same memory. The memory reserved for Java (see Configuring Java) is not available to the FACT-Finder library. For example, if the system has 4 GB of RAM and Java is assigned 3.5 GB, only the remaining 500 MB are available to the FACT-Finder library. |
All Threads (150) busy | This message is largely self-explanatory and generally means that the server does not have enough CPU power. If the CPU load has not increased significantly, one reason could be a poor network connection between FACT-Finder and the shop system. If in doubt, please consult the FACT-Finder Service Desk. |
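As a rough illustration of the heap sizing mentioned in the table (the figures and the setenv.sh location are assumptions; the authoritative guidance is in Configuring Java), the heap can be capped so that native memory remains for the FACT-Finder library:

# bin/setenv.sh (assumed location) on a machine with e.g. 4 GB RAM:
# whatever is not assigned to the Java heap stays available as native
# memory for the FACT-Finder library and the operating system
JAVA_OPTS="$JAVA_OPTS -Xms3g -Xmx3g"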
BeforeAccessLogValve
If an error occurs repeatedly, you can configure an additional LogValve that logs requests immediately upon arrival, not only after they have been processed. This makes some errors easier to identify, because it also records requests whose completion could not be logged. This Valve should be used only for debugging and is not recommended for normal operation. An example (added to the server.xml of the Tomcat):
<Valve className="org.apache.catalina.valves.BeforeAccessLogValve"
       directory="logs"
       prefix="localhost_pre_access_log." suffix=".txt"
       pattern="%h %t %S &quot;%r&quot; &quot;%{Referer}i&quot;"
       resolveHosts="false"
       buffered="true"/>