Thursday, September 25, 2008

Why does the JVM crash with a core dump or a Dr.Watson error?

Any problem in pure Java code throws a Java exception or error, and Java exceptions and errors will not cause a core dump (on UNIX systems) or a Dr. Watson error (on Win32 systems). A serious Java memory problem instead results in an OutOfMemoryError thrown by the JVM with a stack trace, after which the JVM typically exits. These Java stack traces are very useful for identifying the cause of an abnormal exit of the JVM. So is there a way to know that an OutOfMemoryError is about to occur? JDK 1.5 has a package called java.lang.management, which contains useful JMX beans for managing the JVM. One of these beans is the MemoryMXBean.
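As an illustration, here is a minimal sketch (the class name and the 90% threshold are my own, not part of any standard) that polls the MemoryMXBean for early warning that heap space is running low:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class HeapWatcher {
    public static void main(String[] args) {
        MemoryMXBean memoryBean = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = memoryBean.getHeapMemoryUsage();
        long used = heap.getUsed();
        long max = heap.getMax(); // -1 if the maximum is undefined
        // Warn when more than 90% of the maximum heap is in use
        if (max > 0 && used > max * 0.9) {
            System.err.println("Heap nearly full: " + used + " of " + max + " bytes used");
        }
    }
}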
An OutOfMemoryError can be thrown for one of the following four reasons:
• The JVM may have a memory leak due to a bug in its internal heap management implementation. But this is highly unlikely, because JVMs are well tested for this.
• The application may not have enough heap memory allocated for it to run. You can allocate a larger JVM heap (with the -Xmx parameter to the JVM) or decrease the amount of memory your application uses. To increase the heap size:
java -Xms1024M -Xmx1024M
Care should be taken not to make the -Xmx value too large, because an oversized heap can lengthen garbage collection pauses and slow down your application. The goal is to tune the maximum heap size to what the application actually needs.
• Another, less common, cause is running out of a memory area called the “perm” generation, which sits next to the heap. The binary code of all currently loaded classes is stored in the “perm” area, so it matters if your application, or any of the third-party jar files you use, dynamically generates classes.
For example, “perm” space is consumed when XSLT templates are dynamically compiled into classes, when J2EE application servers, JasperReports, JAXB, etc. use Java reflection to dynamically generate classes, or when your application simply loads a large number of classes. (The sketch after this list shows how to watch the “perm” pool at runtime.)
To increase perm space:
java -XX:PermSize=256M -XX:MaxPermSize=256M
• The fourth, and the most common, reason is that you may have a memory leak in your application itself.
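To see where memory actually goes at runtime, including the “perm” area mentioned above, you can list the JVM's memory pools. A minimal sketch (pool names such as “Perm Gen” vary by JVM vendor):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

public class PoolWatcher {
    public static void main(String[] args) {
        // Prints every memory pool, e.g. "Perm Gen", "Tenured Gen" on the Sun JVM
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            System.out.println(pool.getName() + ": "
                    + pool.getUsage().getUsed() + " of "
                    + pool.getUsage().getMax() + " bytes used");
        }
    }
}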
So why does the JVM crash with a core dump or a Dr. Watson error?
Both the core dump on UNIX operating systems and the Dr. Watson error on Win32 systems mean the same thing. The JVM is a process like any other, and when a process crashes, a core dump is created. A core dump is a memory map of a running process.
This can happen for one of the following reasons:
• Using JNI (Java Native Interface) code that has a fatal bug in its native part. Examples: Oracle OCI drivers, which are written partially in native code, or JDBC-ODBC bridge drivers, which are written in non-Java code. Using 100% pure Java drivers (which communicate directly with the database instead of going through client software via JNI) instead of native drivers can solve this problem. For Oracle, we can use the thin driver, which is a 100% pure Java driver (see the sketch after this list).
• The operating system on which your JVM is running might require a patch or a service pack.
• The JVM implementation you are using may have a bug in translating system resources such as threads, file handles, and sockets from platform-neutral Java byte code into platform-specific operations. If the JVM's translated native code performs an illegal operation, the operating system instantly kills the process and usually generates a core dump file, a hexadecimal file recording the program's state in memory at the time of the error. Core dump files are generated by the operating system in response to certain signals; operating system signals are how certain events are reported to threads and processes. The JVM can also intercept certain signals, such as SIGQUIT (kill -3 <pid>), from the operating system, and it responds to this signal by printing out a Java stack trace and then continuing to run.
The JVM continues to run because it has a special built-in debug routine that traps signal 3. On the other hand, signals like SIGSTOP (kill -STOP <pid>) and SIGKILL (kill -9 <pid>) will cause the JVM process to stop or die. The following JVM argument tells the JVM not to pause on a SIGQUIT signal from the operating system:
java -Xsqnopause
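As mentioned in the driver point above, moving to a pure Java driver is usually just a matter of changing the driver class and the connection URL. A minimal sketch for the Oracle thin driver (host, port, SID, and credentials are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;

public class ThinDriverExample {
    public static void main(String[] args) throws Exception {
        // Pure Java (Type 4) driver: no native client libraries, so no JNI crashes
        Class.forName("oracle.jdbc.driver.OracleDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "password");
        // ... use the connection ...
        conn.close();
    }
}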

Friday, September 12, 2008

Software Requirements Keep Changing? Frustrated Software Engineer?

  1. So client requirements keep changing? They don't have the right process? The designer hasn't done their work properly? The client doesn't know what he wants? It's making my life miserable? How many times will I move one text box and its label to the left, right, left, right? Like me, many software engineers bemoan the fact that software requirements keep changing.

    But we chose this profession to develop software, to make people's lives easier, to remove some complexity from their lives, and changing requirements serve that goal. So changing requirements are not really the problem. The problem is that I am not in the habit of accommodating change, that I and my process are not agile. I have written code that is not agile, code that requires a lot of changes to accommodate change.

"Life is 10 percent what happened to you and 90 percent how you respond to it''

So how should I write code that is agile in nature? By minimizing dependencies between the layers of my code? Should I write my code in such a way that shuffling elements around requires just a configuration change in some file? So that if the database schema changes, it does not affect the other layers, or requires only small changes? What should my attitude be to be effective in modern Java development?

Be Disciplined:


By discipline, I don't mean arriving at the office at 8:45 AM sharp, before your manager comes in, and leaving only after your manager leaves.
By discipline I mean following the basic rules of software development for every line of code you deliver. The simplest basic rule is: write a unit test for the code before actually writing the code, then code the particular requirement, run your unit test cases, and confirm that your code does exactly what it is supposed to do. Follow this basic principle for every line of code you write.
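A minimal sketch of that rule in JUnit 4 (the class and method names are made up for illustration):

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class PriceCalculatorTest {

    // Written first: this test states what the code is supposed to do
    @Test
    public void discountOfTenPercentIsApplied() {
        PriceCalculator calculator = new PriceCalculator();
        assertEquals(90.0, calculator.applyDiscount(100.0, 0.10), 0.001);
    }
}

Only after this test fails do I write PriceCalculator.applyDiscount(), then run the test again until it passes.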

Think of Code as Design, Not a Product:

Code is not just a product that solves a particular client or business requirement; think of it as a great, marvelous civil construction in a virtual world. So how should I beautify the code I have written?

I believe the basic rules are:

  • Each line of code should be covered by proper comments.
  • Each layer of your application code should be decoupled as much as possible, i.e., dependencies should be minimized.
  • Follow standard design patterns and principles to avoid common software problems.
  • Be agile, be adaptive, be fast, but don't lose design principles and guidelines.

Monday, August 4, 2008

Transactions, Spring, Hibernate, and Databases like Oracle

Understanding Different Types of Dependency Injection

There are three main types of DI:
• Interface injection (Type 1 IoC)
• Setter injection (Type 2 IoC)
• Constructor injection (Type 3 IoC)
Of these, setter injection and constructor injection are widely accepted and supported by most IoC containers.

How It Works

For comparison, it's better to introduce the DI types in order of their popularity and efficiency, rather than by the type number.
Setter Injection (Type 2 IoC)

Setter injection is the most popular type of DI and is supported by most IoC containers. The container injects the dependency via a setter method declared in a component. For example, ReportService can implement setter injection as follows:
package com.apress.springrecipes.report;

public class ReportService {

    private ReportGenerator reportGenerator;

    public void setReportGenerator(ReportGenerator reportGenerator) {
        this.reportGenerator = reportGenerator;
    }
    ...
}
The container has to inject dependencies by calling the setter methods after instantiating the components. For example:

public class Container {

    public Container() {
        ...
        ReportService reportService = new ReportService();
        reportService.setReportGenerator(reportGenerator);
        components.put("reportService", reportService);
    }
    ...
}
Setter injection is popular for its simplicity and ease of use, since most Java IDEs support automatic generation of setter methods. However, there are some minor issues with this type. The first is that, as a component designer, you cannot be sure that a dependency will be injected via the setter method. If a component user forgets to inject a required dependency, the evil NullPointerException will be thrown and will be hard to debug. The good news is that some advanced IoC containers (e.g., the Spring IoC container) can help you check for particular dependencies during component initialization.
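For instance, Spring can enforce this with its @Required annotation, provided a RequiredAnnotationBeanPostProcessor is registered (e.g., via <context:annotation-config/>). A minimal sketch:

import org.springframework.beans.factory.annotation.Required;

public class ReportService {

    private ReportGenerator reportGenerator;

    @Required // Spring fails fast at startup if this setter is never called
    public void setReportGenerator(ReportGenerator reportGenerator) {
        this.reportGenerator = reportGenerator;
    }
}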
Another shortcoming of setter injection has to do with code security. After the first injection, a dependency may still be modified by calling the setter method again, unless you have implemented your own security measures to prevent this. The careless modification of dependencies may cause unexpected results that can be very hard to debug.
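One possible security measure of that kind, not part of the original example, is a simple write-once guard in the setter:

public void setReportGenerator(ReportGenerator reportGenerator) {
    if (this.reportGenerator != null) {
        throw new IllegalStateException("reportGenerator has already been injected");
    }
    this.reportGenerator = reportGenerator;
}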
Constructor Injection (Type 3 IoC)

Constructor injection differs from setter injection in that dependencies are injected via a constructor rather than setter methods. This type of injection, too, is supported by most IoC containers. For example, ReportService may accept a report generator as a constructor argument. But if you do it this way, the Java compiler will not add a default constructor for this class, because you have defined an explicit one. The common practice is to also define a default constructor explicitly, for code compatibility.
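A sketch of what the constructor-injected ReportService might look like, including the explicit default constructor recommended above:

package com.apress.springrecipes.report;

public class ReportService {

    private ReportGenerator reportGenerator;

    // Explicit default constructor, kept for code compatibility
    public ReportService() {
    }

    public ReportService(ReportGenerator reportGenerator) {
        this.reportGenerator = reportGenerator;
    }
}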