TomcatExpert

Tomcat Admin

Blog : Year in Review 2011

posted by Stacey Schneider on January 4, 2012 07:31 AM

2011 has been a great year for the Tomcat Expert community. After almost two years of operation, Tomcat Expert has hit its stride, delivering an array of new information and keeping you up to date with the newest releases of Apache Tomcat 6 and Apache Tomcat 7. With the addition of two new Tomcat Expert contributors, Channing Benson and Daniel Mikusa, the community continues to build on its reputation as the leading source of fresh perspectives and new information on how to best leverage Apache Tomcat in the enterprise.

Developers, Executives | Cross-site Scripting, Java Development, Parallel Deployment

Blog : 4 Mistakes To Avoid On Apache Tomcat

posted by mthomas on August 3, 2011 06:50 AM

As a VMware engineer dedicated to building Apache Tomcat and vFabric tc Server, I get the opportunity to see a lot of issues across the official Apache Tomcat public mailing lists, as well as VMware’s private professional support queue for both Apache Tomcat and tc Server. As with any software issue tracker, many of the issues logged could be avoided with a little better understanding of how Tomcat applications are configured. Here are a few tips that may be useful to keep in mind:

Understanding Global vs Application Context.xml Files

There are two different types of context.xml files: one is global, and the other is specific to each web application. The problem with editing the global context.xml file is, as its name implies, that it affects every web application running on that Tomcat instance. For instance, if you have 10 web applications and define a new JNDI datasource with 50 database connections in the global context.xml file, you have essentially created 10 JNDI datasources with a total of 500 connections to your database, which will likely overwhelm it. If you want to add a datasource to a single application, remember to define it in the application-level context.xml file; doing so avoids serious performance problems.
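As a minimal sketch of what that looks like, assuming a hypothetical application and database (the resource name, driver, URL, credentials, and pool sizes below are illustrative placeholders, not values from this post), the datasource goes in the application's own META-INF/context.xml:

    <!-- META-INF/context.xml packaged inside the web application -->
    <!-- This datasource is visible only to this one application -->
    <Context>
        <Resource name="jdbc/MyAppDB"
                  auth="Container"
                  type="javax.sql.DataSource"
                  driverClassName="com.mysql.jdbc.Driver"
                  url="jdbc:mysql://dbhost:3306/myappdb"
                  username="myapp_user"
                  password="secret"
                  maxActive="50"
                  maxIdle="10"
                  maxWait="10000"/>
    </Context>

Because the definition lives inside the WAR, only that one application creates a connection pool; the other applications on the instance are unaffected.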

Creating a Single Global Datasource for Application Sharing

Occasionally companies will deploy 3 or 4 related applications on a Tomcat server that are designed to share a single datasource. As described above, placing the datasource definition either in the global context.xml file or in 3 or 4 application-specific context.xml files will always create multiple instances of that datasource. To truly share a single datasource, define it in the server.xml file and then place a single resource link in the global context.xml file. This link ensures that only one instance of the datasource is ever created, and every application that uses it gets that same single instance.
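A rough sketch of that arrangement, again with illustrative names and connection details rather than anything from the post: the pool is defined once under GlobalNamingResources in server.xml, and the global context.xml exposes it to every application through a ResourceLink:

    <!-- conf/server.xml: add to the existing GlobalNamingResources section -->
    <GlobalNamingResources>
        <Resource name="jdbc/SharedDB"
                  auth="Container"
                  type="javax.sql.DataSource"
                  driverClassName="org.postgresql.Driver"
                  url="jdbc:postgresql://dbhost:5432/shared"
                  username="shared_user"
                  password="secret"
                  maxActive="50"/>
    </GlobalNamingResources>

    <!-- conf/context.xml: every application links to that one pool -->
    <Context>
        <ResourceLink name="jdbc/SharedDB"
                      global="jdbc/SharedDB"
                      type="javax.sql.DataSource"/>
    </Context>

Applications still look the resource up under java:comp/env/jdbc/SharedDB, but behind the link there is only ever one pool of 50 connections.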

Developers, Operations | Java Development, Tomcat Admin, Tomcat Configuration

Blog : Parallel Deployment with Apache Tomcat 7

posted by mthomas on May 31, 2011 07:44 AM

Upgrading web applications can be very expensive if your storefront is the web. Weekend maintenance windows, or downtime in general, can give an entire company heartburn. Survey data shows that web application downtime can cost some companies up to $72,000 per minute. Yet the cost of not constantly rolling out new features and bug fixes can penalize a company just as heavily in today's competitive online markets.

Previously, to upgrade an application on Tomcat without downtime, system administrators had to set up multiple Tomcat instances and do some very clever work with load balancers. That adds extra hardware costs as a permanent part of the company’s infrastructure.

Now, with the advent of parallel deployment in Tomcat 7, you can have multiple versions of the same application installed at the same time on a single server. Users with active sessions continue to use the old version while new users are routed to the new one. This way, no user sessions are interrupted, and the old application can be phased out gracefully.

Setting Up Parallel Deployment

Parallel deployment is a function of the Context container. The Context element represents a web application, mapping a context path to a particular Web Application Archive (WAR) file that contains the application logic. Parallel deployment allows you to deploy multiple versions of a web application with the same context path concurrently. When choosing which version of the application serves any given request, Tomcat will (see the configuration sketch after the list below):

  1. Route new requests to the latest version, so new sessions go to the new application.
  2. If session information is in the request, check the session manager for a matching version, so existing sessions will go to the original application for the request.
  3. If session information is in the request but the corresponding application version is no longer present, route the request to the latest version.
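In practice, enabling this is largely a naming convention. As a sketch (the application name is illustrative), each version is deployed as a WAR whose file name carries a double-hash version suffix, and Tomcat serves all of them under the same context path:

    webapps/
        myapp##001.war    <- original version, keeps serving its existing sessions
        myapp##002.war    <- new version, receives all new sessions

    Both versions answer at the same path, e.g. http://localhost:8080/myapp/

The string after ## is compared to decide which version is latest, so a simple zero-padded counter works well. Once the last session bound to the old version has expired, that WAR can be undeployed (later Tomcat releases can automate this via the undeployOldVersions attribute on the Host element).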

Operations | Parallel Deployment, Tomcat 7, Tomcat Admin

Blog : Crawler Session Manager Valve

posted by mthomas on May 18, 2011 07:25 AM

For organizations with large publicly searchable websites, such as ecommerce companies with large product catalogues or companies with active online communities, web crawlers or bots can trigger the creation of many thousands of sessions as they crawl these large sites. Because these bots typically crawl without relying on cookies or session IDs, they can create a session for each page crawled, which, depending on the size of the site, may result in significant memory consumption. New in Apache Tomcat 7, the Crawler Session Manager Valve ensures that crawlers are associated with a single session - just like normal users - regardless of whether or not they provide a session token with their requests.
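Enabling the valve is a one-line change to server.xml. A minimal sketch, assuming a default Host definition (sessionInactiveInterval is the number of seconds of crawler inactivity before the shared session may be expired):

    <!-- conf/server.xml: map all requests from recognized crawlers onto one session -->
    <Host name="localhost" appBase="webapps" unpackWARs="true" autoDeploy="true">
        <Valve className="org.apache.catalina.valves.CrawlerSessionManagerValve"
               sessionInactiveInterval="60"/>
        <!-- existing Host configuration (contexts, other valves) stays as it is -->
    </Host>

Crawlers are recognized by their User-Agent header; the default pattern matches the common bots and can be overridden with the crawlerUserAgents attribute if needed.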

A Relevant Example

One of the roles I play in the Apache Tomcat project is managing the issues.apache.org servers, which run the two Apache issue trackers we have—two instances of Bugzilla and one instance of JIRA. Not surprisingly, JIRA runs on Tomcat. A few months ago, while looking at the JIRA management interface, I noticed that we were seeing around 100,000 concurrent sessions. Given that there are only 60,000 registered users and fewer than 5,000 active users in any month, this number appeared extremely inflated.

After a bit of investigation, the access logs revealed that many of the web crawlers (e.g., googlebot, bingbot) were creating a new session for every request as they crawled the JIRA site. For our JIRA instance, this meant that about 95% of the open sessions were left over from a bot making a single request. A bot requesting 100 pages, for instance, would open 100 sessions. Each of these sessions would hang around in memory for about 4 hours, chewing up tremendous memory resources on the server.

Developers, Operations | JIRA, Tomcat 7, Tomcat Admin

Blog : Tomcat Expert's Top 10 of 2010

posted by joannad on December 30, 2010 08:53 PM

2010 has been an exciting year for the Tomcat Expert community site. Created by the Apache Tomcat Experts at SpringSource, Tomcat Expert was launched in March to improve the adoption, performance and value of Apache Tomcat for enterprise users. After almost ten months of operation, we’ve been able to provide you with content from Tomcat Expert Contributors weighing in on top Apache Tomcat news and topics, including several relating to June's release of Tomcat 7.0.0 Beta, the first Tomcat 7 release.  As the year winds down, we've put together a list of the most popular blog posts of the year. Additionally, we're asking you to tell us what topics you'd like to see covered more in 2011 with a content request form below. 

Developers, Executives | Tomcat 7, Tomcat Admin, Tomcat Cloud

Blog : How To Migrate JEE Applications to Apache Tomcat

posted by avanabs on July 7, 2010 05:42 AM

I’ve been sharing some thoughts about what has become a significant trend in many IT organizations, and in particular among my clients: converting Java applications from JEE Application Servers to Tomcat, and more typically Tomcat plus add-ons.

Many IT organizations are re-thinking their commitment to commercial JEE Application Servers, driven both by challenging business environments that demand more cost-effective application architectures and by the need to transition to more responsive, agile application development. When IT organizations talk about “migrating” their applications, I’ve noted that they are generally focusing on one or more of three distinct situations. These are:

  • Moving existing applications (or slices of applications) off of JEE servers and onto lightweight, modular, horizontally scalable container infrastructures.
  • Expanding access to existing JEE applications by adding service layers in lightweight containers.
  • Transitioning new development away from JEE application servers and toward lightweight containers.

I’ve been focusing on the migration of existing JEE applications to the most popular of the lightweight containers, Tomcat. There are many excellent reasons to consider moving applications off of the commercial JEE servers sold by Oracle/BEA, IBM, and others. While we are focusing on the migration process, many of the business and technical decision factors apply equally well to the second and third situations.

This time, I will be discussing the technologies involved in migrating JEE application code from commercial JEE servers to Tomcat. I’d like to thank the kind (and very expert) folks at SpringSource, as well as a number of other friends around the industry, for their valuable insight regarding the technologies involved. Any errors (and opinions) are mine alone. Additionally, some of the material draws on information published by SpringSource and other open source materials found on the internet.

Developers, Operations | EJB, Hibernate, JEE

Blog : Migrating JEE Applications to Apache Tomcat: Motivation for Migrating

posted by avanabs on June 3, 2010 03:09 PM

In my prior blog on migrating JEE to Tomcat, I discussed the fact that organizations are increasingly migrating from JEE Application Servers to lighter-weight, simpler, faster, more scalable, and definitely less costly Java deployment environments. Today, I’ll take a more detailed look at the reasons for such a change and the associated costs.

Reasons to Migrate from JEE to Tomcat

Organizations that choose to migrate existing applications to a new application server are typically motivated by one or more of the following goals:

  • Costs—Infrastructure costs are frequently mentioned as a primary motivator for migration, and they are certainly important. That said, these costs can be subtle, particularly since in most cases the license itself is a “sunk cost” and the maintenance fees probably continue if you use any of your licenses (contract “non-retirement” provisions).
  • Capacity Expansion—The need to expand deployment of an application in a cost-effective way frequently drives interest in alternative infrastructures.
  • Application Replacement—When an application “wears out” and is being replaced entirely, there are opportunities to consider alternatives.
  • Vendor Replacement—While relatively rare, some IT organizations are choosing to replace their IT infrastructure vendors, for a variety of reasons. The cost advantages of replacing obsolete architectures and equipment are an important part of the cost analysis.

Developers, Operations | JEE, migration, Tomcat Admin

Knowledge Base : WAR Files Not Completely Extracted on Deployment on Windows

posted by SpringSource on May 26, 2010 07:15 AM

File locking in Windows may prevent the directory from being deleted.

File locking on Windows is different from file locking on Unix in that, on Windows, a file can be locked on read. If you are redeploying a second version of an already deployed web application, you may see something like the following in the webapps directory:

    webapps/
        myapp.war
        myapp/
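One commonly used Tomcat option in this situation (mentioned here as general background, not necessarily the fix this article prescribes) is to have Tomcat copy resources and JARs aside before serving them, so the originals in the deployment directory are not held open by read locks:

    <!-- META-INF/context.xml: work around Windows read locks during redeployment -->
    <Context antiResourceLocking="true" antiJARLocking="true"/>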

Operations | deployment, Tomcat Admin, Tomcat Memory

Knowledge Base : Setting up mod_proxy_balancer

posted by SpringSource on May 25, 2010 03:54 PM

Setting up Apache Web Server and Tomcat Application Server for load balancing using mod_proxy_balancer.

This example uses mod_proxy_balancer as the load balancer. This configuration is useful for applications that are stateless and therefore do not require clustering or sticky sessions. For more information, see:

http://httpd.apache.org/docs/2.2/mod/mod_proxy_balancer.html
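As a minimal sketch of such a configuration for Apache HTTP Server 2.2 (hostnames, ports, and the /myapp path are placeholders), the balancer pools two Tomcat instances and forwards one application path to them:

    # httpd.conf: load the modules mod_proxy_balancer depends on
    LoadModule proxy_module modules/mod_proxy.so
    LoadModule proxy_http_module modules/mod_proxy_http.so
    LoadModule proxy_balancer_module modules/mod_proxy_balancer.so

    # Define the pool of Tomcat instances
    <Proxy balancer://tomcatcluster>
        BalancerMember http://tomcat1.example.com:8080
        BalancerMember http://tomcat2.example.com:8080
    </Proxy>

    # Send requests for the application to the pool
    ProxyPass /myapp balancer://tomcatcluster/myapp

Because the example assumes stateless applications, no stickysession parameter is set; requests may land on either instance.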

Operations | load balancing, mod_proxy_balancer, Tomcat

Blog : Migrating JEE Applications to Apache Tomcat: Deciding to Move Forward

posted by avanabs on May 24, 2010 03:07 PM

I’ve been researching one of the most interesting trends in IT development and deployment architectures: the migration from JEE Application Servers to lightweight Java containers. Many IT organizations have been re-thinking their commitment to commercial JEE Application Servers, driven both by challenging business environments that demand more cost-effective application architectures and, more importantly, by the need to transition to more responsive, agile application development. When we hear IT organizations talk about “migrating” their applications, they are generally focusing on one or more of three distinct situations. These are:

  • Migrating existing applications. Moving existing applications (or slices of applications) off of their commercial JEE servers and onto lightweight, modular, horizontally scalable container infrastructures. This trend has been accelerating, particularly with the emergence of Java frameworks that replace the “only a computer scientist could love” JEE standards with far more productive (and performant) technologies.
  • Extending existing applications and services. Expanding access to existing JEE applications by adding service layers in lightweight containers. This is an even more important trend, which gained significant momentum when IT consulting firms went SOA crazy back in the mid-2000s. Even without the overheads and complexity of SOA products (from many of the same folks that brought us JEE), the idea of horizontally scaled distributed services is an excellent one. Even where JEE servers maintain their hold on the “back office” business systems, the trend is to convert them to service providers, enabling far more flexible and agile development.
  • New development on better architecture. Transitioning new development away from JEE application servers and focusing on lightweight containers. Let’s face it: JEE was just plain hard. It required writing lots of redundant (and mostly unneeded) structure and learning to use overly complex interfaces. Today’s Java frameworks offer most of the useful power of JEE, with code sizes running as much as 50% smaller, dramatic increases in developer productivity, and in most cases significant performance improvements.

In the next few blogs, I'll be focusing on the migration of existing JEE applications to the most popular of the lightweight containers, Apache Tomcat. There are many excellent reasons to consider moving applications off of commercial JEE servers sold by Oracle/BEA, IBM, and others. While we are concentrating on the JEE-to-Tomcat migration process, many of the business and technical decision factors apply equally well to the second and third situations, and many IT organizations are pursuing some or all of them in parallel.

Developers, Operations | Application Servers, java, JEE
