Friday, June 29, 2012

FileAdapter pipelines and valves for pre- and postprocessing

Introduction

In a previous post I wrote about using the Spring component to process files which the FileAdapter cannot handle (http://javaoraclesoa.blogspot.nl/2012/03/file-processing-using-spring-component.html). An alternative is to add a pre-processing step to the FileAdapter by implementing pipelines and valves. This is described here: http://docs.oracle.com/cd/E17904_01/integration.1111/e10231/adptr_file.htm#BABJCJEH

I of course had to try this out in order to make an informed judgement when asked about the preferred method for a specific use case. I used the example provided by Oracle in their documentation: encrypting and decrypting files. I created two processes: one for encrypting and one for decrypting.

The image below, taken from the Oracle documentation, shows how the mechanism of pipelines and valves works. The FileAdapter can be configured (via a property in the JCA file) to call a pipeline, which is described in an XML file. The pipeline consists of references to valve Java classes and some configuration properties. Valves can be chained. It is also possible to do debatching in the form of a so-called re-entrant valve. This can, for example, be used when the FileAdapter picks up a ZIP file and the individual files need to be offered to the subsequent processing steps one at a time. I would suggest reading the documentation on this.



In this post I will describe my tryout of the FileAdapter pipelines and valves and the problems I encountered. I will describe the steps I have taken and provide a sample project for download. In doing so, I will repeat some of the actions described in the manual.

Implementation

Java

First you need to create a Java project containing the code for the (custom) valves. You need to include the libraries shown in the screenshot below. I needed to add bpm-infra.jar, which is located in <JDev home>\soa\modules\oracle.soa.fabric_11.1.1.

I noticed the SimpleEncryptValve example code provided by Oracle was missing some code. In the sample projects available for download at the end of this post, I have corrected this.
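
To give an idea of the structure of a valve, below is a minimal pass-through sketch based on the AbstractValve / InputStreamContext API used in the documentation examples (the class name is made up here, and the method signatures should be double-checked against the Oracle sample):

package valves;

import java.io.IOException;

import oracle.tip.pc.services.pipeline.AbstractValve;
import oracle.tip.pc.services.pipeline.InputStreamContext;
import oracle.tip.pc.services.pipeline.PipelineException;

public class PassThroughValve extends AbstractValve {

    // Called by the pipeline for every file the FileAdapter picks up. A real
    // valve wraps or replaces the stream in the context (for example with a
    // CipherInputStream or ZipInputStream) via in.setInputStream(...).
    public InputStreamContext execute(InputStreamContext in)
            throws IOException, PipelineException {
        return in;
    }

    // Called when the pipeline is done with the modified input stream.
    public void finalize(InputStreamContext in) {
    }

    // Called when the valve is no longer needed; release resources here.
    public void cleanup() throws PipelineException, IOException {
    }
}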

When you have created the valves package and added the Java code examples, you can create a new JAR deployment profile to package the files.


When you have the JAR, place it in $MW_HOME/user_projects/domains/soainfra/lib on the application server.

In the Oracle-supplied example, the cipher key needs to be exactly 8 bytes long; otherwise (for example, if you use 9 bytes) the following error occurs:

faultName: {{http://schemas.oracle.com/bpel/extension}remoteFault} messageType: {{http://schemas.oracle.com/bpel/extension}RuntimeFaultMessage} parts: {{ summary=<summary>Exception occured when binding was invoked. Exception occured during invocation of JCA binding: "JCA Binding execute of Reference operation 'Write' failed due to: Unable to execute outbound interaction. Unable to execute outbound interaction. Unable to execute outbound interaction. Please make sure that the file outbound interaction has been configured correctly. ". The invoked JCA adapter raised a resource exception. Please examine the above error message carefully to determine a resolution. </summary> ,detail=<detail>Invalid key length: 9 bytes</detail> ,code=<code>null</code>}
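
The 8-byte restriction comes from DES itself rather than from the valve mechanism: the JCE provider only accepts 8-byte DES keys. Below is a minimal, standalone illustration of that constraint (this is not the Oracle sample code, just plain javax.crypto):

import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class DesKeyLengthDemo {
    public static void main(String[] args) throws Exception {
        // DES keys must be exactly 8 bytes; a key of any other length
        // (for example 9 bytes) is rejected by the provider, which is
        // what surfaces in the fault detail above.
        byte[] keyBytes = "12345678".getBytes();
        SecretKeySpec key = new SecretKeySpec(keyBytes, "DES");
        Cipher cipher = Cipher.getInstance("DES");
        cipher.init(Cipher.ENCRYPT_MODE, key);
        System.out.println("8-byte DES key accepted");
    }
}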

A nice feature that can be used to obtain the filename and path from the context inside a valve is described here: https://forums.oracle.com/forums/thread.jspa?messageID=10410343. inputStreamContext.getMessageOriginReference() returns a String containing the path and filename of the file being processed.
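
For example, inside a valve's execute method (a sketch; the println is just for illustration):

    public InputStreamContext execute(InputStreamContext in)
            throws IOException, PipelineException {
        // Path and filename of the file currently being processed by the FileAdapter.
        String origin = in.getMessageOriginReference();
        System.out.println("Processing file: " + origin);
        return in;
    }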

SCA

There are two options for configuring the FileAdapter to call the created valves. The first is to specify them, comma separated, in the JCA FileAdapter property PipelineValves. For example:

<property name="PipelineValves" value="valves.SimpleUnzipValve,valves.SimpleDecryptValve"/>

This is however not very flexible; it is not possible to specify additional parameters. The second option is to create a pipeline definition and refer to that definition with the property PipelineFile. For example:

<property name="PipelineFile" value="simpleencryptpipeline.xml"/>

Pipeline definition simpleencryptpipeline.xml:

<?xml version="1.0"?>
<pipeline xmlns="http://www.oracle.com/adapter/pipeline">
<valves>
        <valve>valves.SimpleEncryptValve</valve>
</valves>
</pipeline>


If a valve is re-entrant (it can be called more than once, returning a new InputStreamContext each time, for example when unzipping multiple files), you can specify that as follows:

<?xml version="1.0"?>
<pipeline xmlns="http://www.oracle.com/adapter/pipeline">
<valves>
        <valve reentrant="true">valves.ReentrantUnzipValve</valve>
        <valve>valves.SimpleDecryptValve</valve>
</valves>
</pipeline>


My tryout did not result in correct encryption and decryption back to the original file. After encrypting the file and offering it to the decrypting process, the result differed from the original file. Because the decrypted result differs from the encrypted file I offered to the process, and I did not do any further processing on the file I read, one can conclude that the valve was executed, but that the logic in the valve is incorrect.

Since I'm not interested in diving deeply into security algorithms (that is a different, although related, specialty), I have not spent more time on finding out what the actual problem is. Suggestions are welcome ;)

Conclusion

Using pipelines and valves allows pre- and post-processing of files. This allows the FileAdapter to be used in more situations, which can reduce the need to build certain functionality from scratch in (for example) a Spring component when the input/output files differ slightly from what the FileAdapter can handle.

Valves and pipelines also have several other nice uses, as listed for example here: http://technology.amis.nl/2011/10/24/soa-suite-file-adapter-pre-and-post-processing-using-valves-and-pipelines/

Valves, however, are placed on the application server classpath and not deployed as part of a composite, which makes them available to all deployed composites. This limits flexibility: replacing a valve will impact all composites using it.

If application-specific libraries are required, putting JARs in a composite and using them from a Spring component can be preferable to making these libraries available to all applications by implementing them in valves placed on the application server.

Debugging and error handling of pipelines/valves, however, is quite nice. Error messages are clear, and inside a valve you can use properties defined in the composite by calling methods like getPipeline().getPipelineContext().getProperty("myCipherKey"). These properties can be maintained at runtime: http://beatechnologies.wordpress.com/tag/persist-the-values-of-preferences-in-bpel/. When using Spring components, you do not have the SCA context available without feeding it to the component as parameters (maybe it is available, but I did not spend enough time looking for it; please correct me if I'm wrong on this).
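
For example, inside a valve (a sketch; myCipherKey is just an example property name that I assume is defined in the composite, and the cast to String is an assumption as well):

    public InputStreamContext execute(InputStreamContext in)
            throws IOException, PipelineException {
        // Read a property defined in the composite; adjust the cast if
        // getProperty already returns a String.
        String cipherKey = (String) getPipeline().getPipelineContext().getProperty("myCipherKey");
        // ... use cipherKey to set up the cipher ...
        return in;
    }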

You can download my sample projects here.

1 comment:

  1. Hi, is it possible to keep the valve classes in the project and deploy everything together?

    I have tried compiling my valve as SOA/zipValves/Compress.class and calling it from the corresponding XML pipeline, but in the WebLogic logs I find:

    [reference_name: ftpWrite] Error in creating pipeline[[ oracle.tip.pc.services.pipeline.PipelineException: Invalid class at oracle.tip.pc.services.pipeline.PipelineFactory.getValveInstance(PipelineFactory.java:288)
    ...
    Caused by: java.lang.ClassCastException: zipValves.Compress cannot be cast to oracle.tip.pc.services.pipeline.Valve

    Could you add more details on the read and write adapter settings?
    Thanks
