
Solved: Unable to Locate Spring Namespace Handler

I attempted to run a Spring WebMVC application, and on startup it complained that it didn't know how to handle the MVC namespace in my XML configuration. The project runs on JDK 7 and Spring 4.0.6, with Maven as the build system.
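For reference, the Spring MVC pieces come into the build via a dependency along these lines (a sketch; check your own pom.xml for the exact version):

<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-webmvc</artifactId>
    <version>4.0.6.RELEASE</version>
</dependency>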

The following is my XML configuration file:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:mvc="http://www.springframework.org/schema/mvc"
       xsi:schemaLocation="
        http://www.springframework.org/schema/beans
        http://www.springframework.org/schema/beans/spring-beans.xsd
        http://www.springframework.org/schema/mvc
        http://www.springframework.org/schema/mvc/spring-mvc.xsd">
    
    <mvc:annotation-driven/>
    
</beans>
The real file defines a few more beans than this, but their details aren't relevant here. The application failed at startup with a message indicating that Spring did not know how to process the MVC namespace declared in the XML configuration file:

org.springframework.beans.factory.parsing.BeanDefinitionParsingException: Configuration problem: Unable to locate Spring NamespaceHandler for XML schema namespace [http://www.springframework.org/schema/mvc]
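For background, Spring resolves custom XML namespaces at runtime by scanning the classpath for META-INF/spring.handlers files, which map namespace URIs to NamespaceHandler classes. The copy inside the spring-webmvc jar contains a mapping roughly like this (exact contents vary by version):

http\://www.springframework.org/schema/mvc=org.springframework.web.servlet.config.MvcNamespaceHandler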
After a good bit of searching, I found the answer on StackOverflow. I am using the Maven Shade plugin to build a bundled jar, and every Spring module jar carries its own META-INF/spring.handlers and META-INF/spring.schemas files. By default, Shade keeps only one copy of each conflicting file when it merges the jars, so the namespace mappings from the other Spring modules, including spring-webmvc's MVC mapping, were silently dropped from the bundle. The solution was to add AppendingTransformer entries to the Shade configuration so those files get merged rather than overwritten:
<transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
    <resource>META-INF/spring.handlers</resource>
</transformer>
<transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
    <resource>META-INF/spring.schemas</resource>
</transformer>
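For context, those transformers live inside the Shade plugin's <configuration> block in pom.xml. A minimal sketch of the full plugin entry (the plugin version here is an assumption; use whatever your build already declares):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.3</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- Concatenate every jar's copy of these files instead
                         of keeping just the first one encountered. -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                        <resource>META-INF/spring.handlers</resource>
                    </transformer>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                        <resource>META-INF/spring.schemas</resource>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>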
Here is the full discussion on StackOverflow. This solution was frustratingly hard to find online, so hopefully this post saves somebody else some effort.




