Showing posts with label scm. Show all posts

Thursday, April 16, 2020

Artifact Dashboard: Make it pretty

In this blog post I'll cover how to use webjars in conjunction with sbt. The plan is to download Bootstrap - and maybe other web libraries - as Maven artifacts. 

This blog post is a follow-up to one I published only recently; I felt the result deserved better looks, so here we go.

Motivation

I'm well aware that there are package managers for JavaScript - at least I've heard of them, not overly good things, but I've heard of them. Chances are high that by the time you reach the end of this post, a new one will have been invented.

In my bubble, for many years I've had to fiddle around with the de facto standard JVM build systems like Maven or Gradle, and if I neglect countless hours of frustration with them, I have to admit they served me well. As such, I'll stay with this ecosystem, though I'll use it in creative ways. Well, sort of: I'll use sbt to download the dependencies and build the example project. I hope I got your head spinning in the meantime.

Reality check: one thing I would advise is to avoid having too many package managers in your corporate setting; the maintenance burden gets multiplied with each new system. Naturally, every build system wants to outmatch the other ones, either in complexity or in the wealth of features it provides. (One piece of advice: sometimes it pays off to bet on the most conservative option, but I doubt you'll find such a thing in JavaScript land. 😉)

In some bright moments, however, different worlds work together, and this blog post wants to shed some light on how to combine web techniques with tools used in corporate settings.

To put it bluntly, I'll download a jar file of prepackaged static content - a bunch of JavaScript and other resources - and put it in the right place, on demand.

For example, I want to use Bootstrap in my project - just go to their website, download it, extract the things you are interested in, move on? 

Of course this is a valid - and probably the most common - approach among developer teams, but it has some drawbacks. 

First, it is unclear where such frameworks come from. It is also hard to tell afterwards which file belongs to which framework; you are left to best guesses based on directory or zip file names. Another concern is licensing - how can you tell whether a bunch of code creates legal issues for your project? And what if you want to update a certain library? Often you can be glad if you even know which version you currently use. 

Of course you can document all of it, but odds are high that this documentation won't really stay in sync with your code - at the very least you have to check manually that it does. Another anti-pattern which comes to mind is third-party code being checked into your repository, no matter how hard you fight it, by some innocent but mindless colleague. The best thing about it is that you'll surely find the same library downloaded and checked in multiple times, maybe even in different versions - open doors for subtle bugs, oh my god, in PRODUCTION. Insane! Mind blowing! (Yes, I play bass, and yes, I slap, and yes, I know Davie504.)

There has to be a reason why such a zoo of package managers exists - several of them for each programming language, along with their ecosystems, everybody fighting for the attention of the very confused, poor Mr. Bob Developer.

As such, for all of the good reasons cited above, there is an excuse to automate it yet again and write a blog post about it - mainly for you, reading this with intense boredom, waiting for the anticlimax which is yet to come.

Implementation

I'll continue to work on the artifact-dashboard project, and enhance version 0.4 by adding necessary commands to the sbt build file to download bootstrap.

Before reading on, maybe you want to inform yourself about webjars.org, whose mission statement is 
WebJars are client-side web libraries (e.g. jQuery & Bootstrap) packaged into JAR (Java Archive) files.
This is exactly what we need - what I tried to convey above with my ranting about the problems you face if you don't use any sort of package manager. Webjars.org is a really cool project for all JVM engineers who don't want to get their hands too dirty with front-end development, prefer to stay cozy in their own world, and would rather not touch those strange JavaScript package managers altogether.
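A webjar is an ordinary jar whose assets live under a conventional path, so you can inspect it like any other archive. As a sketch (output abbreviated; the paths follow the webjars convention of META-INF/resources/webjars/&lt;name&gt;/&lt;version&gt;/):

```shell
# list the contents of the downloaded bootstrap webjar
jar tf bootstrap-4.4.1.jar
# ... entries include, among many others:
# META-INF/resources/webjars/bootstrap/4.4.1/css/bootstrap.min.css
# META-INF/resources/webjars/bootstrap/4.4.1/js/bootstrap.min.js
```

This layout is also why the name filters below match with a `**/` prefix.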

Besides, source code management is trivial, right?

Don't worry - downloading Bootstrap via sbt takes far less text than the rant above, and it's maybe more interesting.

To download and unpack it, not much is necessary, witness it here:


enablePlugins(ScalaJSPlugin)

name := "artifact-dashboard"
scalaVersion := "2.12.10"

// This is an application with a main method
scalaJSUseMainModuleInitializer := true

libraryDependencies ++=
  Seq("org.scala-js" %%% "scalajs-dom" % "1.0.0"
    , "org.webjars" % "bootstrap" % "4.4.1")

// defines sbt name filters for unpacking
val bootstrapMinJs: NameFilter = "**/bootstrap.min.js"
val bootstrapMinCss: NameFilter = "**/bootstrap.min.css"
val bootstrapFilters: NameFilter = bootstrapMinCss | bootstrapMinJs

// magic to invoke unpacking stuff in the compile phase
resourceGenerators in Compile += Def.task {
  val jar = (update in Compile).value
    .select(configurationFilter("compile"))
    .filter(_.name.contains("bootstrap"))
    .head
  val to = (target in Compile).value
  unpackjar(jar, to, bootstrapFilters)
  Seq.empty[File]
}.taskValue

// a helper function which unzips files defined in given namefilter
// to a given directory, along with some reporting
def unpackjar(jar: File, to: File, filter: NameFilter): File = {
  val files: Set[File] = IO.unzip(jar, to, filter)
  // print it out so we can see some progress on the sbt console
  println(s"Processing $jar and unzipping to $to")
  files foreach println
  jar
}

This is part of v0.5 of the artifact-dashboard project - the complete sbt build file necessary to download a version of Bootstrap (at the time of writing the most current one), unpack it to the right place and make it available to the JavaScript code.

As usual, you only have to point the paths in your HTML file to the unpacked JavaScript and CSS files, and you can profit from all the goodies Bootstrap has to offer.
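For reference, such references could look roughly like this - the paths are hypothetical, since webjars keep their assets under META-INF/resources/webjars/..., and the actual prefix depends on where the sbt task unpacked the files relative to your HTML file:

```html
<!-- hypothetical paths: adjust to your unpack/serve directory -->
<link rel="stylesheet"
      href="META-INF/resources/webjars/bootstrap/4.4.1/css/bootstrap.min.css">
<script src="META-INF/resources/webjars/bootstrap/4.4.1/js/bootstrap.min.js"></script>
```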

As a side note, I also tweaked a little how Scala.js is used - have a look at the source code / HTML file.

I think it is a nice combination of technologies - I don't know if it is popular, but at least it works for me.

Using Bootstrap, our download page looks far better; users will surely be more willing to click on a download link than ever before.

Here is a little screenshot of the current state of affairs with v0.5 of artifact dashboard in action for my test instance:

Screenshot of Artifact Dashboard v0.5


Thanks for reading.

Wednesday, January 6, 2016

Encapsulate OpenCV 3.1 as Android AAR

In this post I describe how to encapsulate OpenCV as an Android AAR package such that it is easier to include it as a maven dependency.

[High Street, Guildford, England]  (LOC)


Disclaimer: there are many other tutorials about OpenCV; my approach is a little unconventional. I also have to mention that there is a very well maintained library called JavaCV which does essentially the same thing.

The motivation for this blog post is that I want a convenient way to use OpenCV in my Android applications. Below I describe what I had to do to achieve this.

Step 1: Download the OpenCV library


On www.opencv.org there is a link to download the library for Android. Download it, unpack it.

If you've followed the post about compiling OpenCV yourself, you already have a directory in your home directory somewhere:

opencv/
opencv/opencv-3.1/
opencv/build/

now, add the unpacked Android SDK:

opencv/
opencv/opencv-3.1/
opencv/build/
opencv/OpenCV-android-sdk-3/

You'll find the usual suspects in the directory: some samples, Javadoc for the API, and pre-built APKs:

apk/
samples/
sdk/

Now I want to discuss briefly the contents of those directories.

Directory apk: OpenCV Manager

OpenCV encourages you to use a separate application called OpenCV Manager, whose sole purpose is to make sure a compatible OpenCV library is installed on your phone. This approach is fine, but it requires your users to install a second app on their phone. For the technically inclined this is no problem, but for end users it may seem a little awkward. I prefer to deliver a self-contained app which has no apparent third-party dependencies.

The apk directory contains this Manager application for environments where you don't want to use the OpenCV Manager from the play store.


Directory samples: OpenCV Android Samples


The samples directory contains several example apps which demonstrate various aspects of the OpenCV API for android.

./samples/example-15-puzzle.apk
./samples/example-camera-calibration.apk
./samples/example-color-blob-detection.apk
./samples/example-face-detection.apk
./samples/example-image-manipulations.apk
./samples/example-tutorial-1-camerapreview.apk
./samples/example-tutorial-2-mixedprocessing.apk
./samples/example-tutorial-3-cameracontrol.apk

I recommend installing some of the APKs on your device:

  adb install <example.apk>

This is the best way to get a feeling for what can be done with the OpenCV Android API, so I suggest playing around with the samples. The source code for these samples is also included.

Directory sdk: Android OpenCV Java API

Here you'll find what you need for your own app: pre-built native libraries for various architectures and a Java API to use the native code. The directory structure at the top level looks as follows:

etc/ ... some configuration for special routines you could use 
java/ ... the java glue code you will program against
native/ ... prebuilt binaries for the android platform

Since I use Maven for most of my projects and all of my open-source stuff, I need some way to use the provided Java glue code and binaries in my projects. As long as you just use the default OpenCV API, you can take the provided code almost off the shelf.

Create an OpenCV AAR ready to use with Maven

The following approach shows how to create a Maven module containing the OpenCV bindings - thanks to the android-maven-plugin it can then be used like a 'normal' Maven dependency. The plugin takes care of including the AAR in the final APK; you just have to declare it as a dependency (see below).

For this to happen, I've restructured the source code in the following way:

Restructuring of the sdk subfolder
This is the default structure which works with the android-maven-plugin and includes only the code and binaries you'll need at runtime. The pom looks as follows:

pom.xml for an opencv aar

You can see that I'm referring to a certain version of the standard Android API - this is needed to properly compile the OpenCV Java API.

In order to get the standard Android API, you have to clone yet another project, named maven-android-sdk-deployer, and install the proper API level into your local Maven repository. This can be done, for example, by issuing the following command:

mvn install -P 5.0

A prerequisite for this command to finish successfully is that you have already installed the Android SDK itself.

Hint: the OpenCV 3.1 bindings for Android seem to need at least API level 21, so you may save some time by downloading only this API level.

Anyway, if you look closely at the pom.xml you'll notice it uses a custom packaging type, namely 'aar' - this is possible because the android-maven-plugin gives Maven the capability to properly create such a file type.

AARs are bundles which contain libraries (Java code, resources or native code) ready to use in Android applications. Luckily, the android-maven-plugin makes it possible to use AARs like normal Maven dependencies.

By using this approach you can deploy the OpenCV bindings in your maven repository. OpenCV can then be treated like any other maven dependency, which is a nice thing.

To recap:

After a successful deploy or local install of this Maven module (with mvn install), all you need to do is include it in the dependency list of your main app, as shown below:

dependency declaration for your homebrew opencv maven module
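In case the image doesn't render: such a declaration looks roughly like the following. The groupId, artifactId and version are hypothetical - use whatever coordinates you installed your module under:

```xml
<!-- hypothetical coordinates: substitute the ones from your own pom -->
<dependency>
    <groupId>com.example.opencv</groupId>
    <artifactId>opencv-android</artifactId>
    <version>3.1.0</version>
    <type>aar</type>
</dependency>
```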

That's all there is to it - you should now be able to use OpenCV in your project. Of course, the Maven coordinates change depending on what you've chosen before.

One nice aspect is that downloading the OpenCV Manager is no longer needed. The drawback, of course, is that your APK gets bigger - nothing comes without a price.

For a complete example have a look at the SudokuFX project. Thanks for reading.


Wednesday, July 2, 2014

Akka 2.3.4 and Scala 2.11.1 - on Android

This post outlines what is necessary to set up a Maven project to deploy a Scala application which utilizes Akka to Android.


Is Akka not a server framework?

Well, it is. I just wanted to see if I could come up with a configuration for Maven and its Android plugin - not to forget the wonder tool every Android developer is always very happy to configure: Proguard :)

Besides, there are others who want to get it running on mobile devices, or already did.

Why Scala on Android? Maven???

I do a little bit of Android programming in my spare time too - I even <shameless plug>created an app which you definitely should check out</shameless plug> - albeit I'm not using Akka for this one.
You can tell by reading my blog that I'm struggling with Maven on a daily basis - that is why I gave it a shot (because of work/life balance, you know).

I know that the 'normal' way to build Scala projects is to "use the sbt", and like others I use it successfully for some projects. Sometimes, however, Maven is just ... here.

If you have the choice, you should check out sbt and one of the available Android plugins. The Proguard step is mandatory in the presented setup - without it, it won't work. You can ease the pain of waiting for the Proguard step to finish a little by tweaking the Proguard configuration file and commenting out the optimization steps.

If you still want to do it with Maven, read on or jump directly to the source on github.

Prerequisites

For this project, I used the exciting maven-android-sdk-deployer project, which deploys everything you need into your local Maven repository. Just follow the installation instructions on the project website.

Macroid Akka Fragments

An important piece of the puzzle is the macroid project - check it out. For the example project I've stolen borrowed some code from the macroid-akka-fragment project, which provides the necessary glue code for Akka.

To compile the example project you don't need to do anything; the links are provided for reference.

Some remarks on the pom

Have a look at the configuration of the scala-maven-plugin in the project pom.xml - there are some interesting flags which enable various warnings/features for the scala compiler which help you write more correct code. See this blog post for more details.

In my opinion, the most precious thing about the whole post is the proguard configuration file. In my experience, it is quite cumbersome to come up with a working proguard configuration.

Rant: the typical user experience is that in the first few minutes you hunt for an "optimal" Proguard file; after fiddling around for some hours you'll turn off all warnings and stick with the first "working" configuration.

Finally ... I get a compile error.

The Maven configuration for the Scala plugin is set to "incremental" - this incremental mode only works if the zinc server is running. That means you have to start it once via "zinc -start". I used zinc 0.3.5-SNAPSHOT for the compilation.

In theory, after cloning the repository from here and after entering

mvn clean package android:deploy

you should see something similar to this:

[INFO] --- scala-maven-plugin:3.1.6:compile (default) @ android-akka ---
[INFO] Using zinc server for incremental compilation

[info] Compiling 2 Scala sources and 2 Java sources to /Users/lad/temp/android-akka/target/classes...

- don't forget to connect your Android device and activate USB debugging mode

The app will look like this:

screenshot of the app (created with genymotion, a great android emulator!)
The app doesn't do anything interesting - it uses just one actor (and one button) - but it could serve as a starter project for your own experiments with Akka, Scala, Maven and Android.

Thanks for reading!

Monday, February 3, 2014

Building Visual Studio 2012 projects with Maven

In this post I want to describe how to set up an automated build for Visual Studio C++ projects using a build toolchain based around Maven.

java build in c++ land


Disclaimer: I'm aware of Microsoft's Team Foundation Server ALM stack. It is a powerful and wonderful tool suite for software development. On the other hand, if you don't want (or can't afford) TFS, the presented approach may be an alternative.

Goals

  • It should be possible to compile Visual Studio projects on a build server
  • The build should be automated
  • Tests should be executed alongside with compiling the code
  • Compiled artifacts should be delivered to the developers on demand
  • There should be a minimal impact for the workflow of the C++ developers
  • Code should be partitioned into modules
  • Modules have a public interface
  • Different modules can have different release cycles
  • If there are build errors or test failures, the development team should be informed
  • ...
This sounds like a continuous integration story - in the Java world it would be a no-brainer: just use Jenkins, Maven, a server to distribute your artifacts and a unit testing framework - everything is set up and ready to go.

As it turns out, this can also be true for compiling C++ with Visual Studio projects. The only difference is that you have to know how to configure the Maven build - that is, how to organize your source code and tinker with some XML files. It takes some work for the initial setup, but after that it works quite well.

A cornerstone of this approach is that the workflow for the C++ guys stays the same as if the whole build process did not exist. What I mean is that the C++ experts can configure their solution in (almost) any way they want, using the tool they are comfortable with: Visual Studio. Likewise, they shouldn't be bothered with configuring anything related to the build process.

The only concession they have to make is to understand the basic Maven workflow (mvn install, mvn generate-resources), which can be hidden behind some custom tool actions in VS.

Because of those considerations, I won't involve the otherwise well-suited Maven NAR plugin, since it would involve configuring dependencies at the project level - this would interfere with the configuration in the solution files. Moreover, that plugin supports OS-independent builds, which is not a goal I'm pursuing here.

Example scenario


programmers like rectangles and arrows


Suppose you have two applications which consist of three modules. App 1 only needs module A plus some extra functionality. App 2, on the other hand, uses all three modules (A, B and C) plus its own program logic. Let's assume that app 2 is more complex than app 1, but they still share the same code (module A).

Code sharing in C++ is done by linking either statically or dynamically. In the first case you have lib files, which end up in the executable; in the second case you have dynamic link libraries (DLLs), which can be shared by more than one application (you just have to make sure they can be found by the exe file at runtime).

Typically, all components have a development lifecycle of their own - meaning, for example, that module A has to be extended for app 2 because requirements have changed. App 1 doesn't need those changes; it would happily work with the "old" version of module A. Even worse, app 1 may not even work with the changes made to module A.

This describes the general problem that certain parts of your software stack have different release cycles. One module may not change for weeks or months (then it is either a very well written module or just plain uninteresting ;-) ); other modules are hotspots of activity. A mechanism to reference a module at a certain version would be very handy.

You could address the versioning issue by branching and tagging module A's source code, checking out the appropriate version and compiling module A every time (or at least once, thanks to Visual Studio's incremental compilation).

However, there may be other constraints which make this approach not feasible, like needing a special compile step which has high demands on the machine or licensing costs for involved compilers. Maybe the compilation of the dependencies just takes too long - you get the idea.

All in all, you are just interested in linking against module A and being able to interface with it - the compilation of module A should be done by the domain expert who knows all the quirks needed to compile and maintain it. Ideally, and in general, the build should be automated and run at a mouse click. The binaries should be downloaded from a repository - Maven and its infrastructure are very good at those things.

The idea is now to create a pom which delegates the compile and test phases to the C++ compiler and fetches its dependencies with the Maven dependency plugin. Typically, the dependencies have to be fetched in both Debug and Release mode, along with PDBs and other compile artifacts you'll need to create or debug the final application.

After some experimentation I came to the conclusion that you get the best results if every solution is a module of its own (each vcxproj should be part of only one solution). For every solution file you define one pom which describes the module's dependencies and the artifacts the compile step produces for the given solution.

Prerequisites


As mentioned above, you'll need Visual Studio, Java, Maven and a versioning system (git, subversion) on the developer machines. On the build server you should install Jenkins and also Visual Studio. Finally Nexus would be a good choice for distributing the artifacts.

upload to nexus is easy


External dependencies (libs, dll's and header files) can be put into a zip file with a decent directory structure and uploaded to the nexus.

How to setup your source code

example setup for the code organisation

The figure above depicts a possible structure for the example application. You can see that every module has a pom file, and every application has one as well. Every module or application also has one solution file. By convention, the solution file of a module only references project files contained within that module. This also makes it possible to take advantage of the release process that comes for free with Maven.

Every module has an interface (header files) and a file which defines the output artifacts of this specific module. The details are all configured using the facilities available in the solution files. This way, every developer who owns a module can set up its specific compilation without having to know the specifics of Maven.

An advantage of this is that you can work with different versions of the modules, and the dependencies are both documented and part of the build, since they reside in the pom.xml files.

How are module dependencies defined in the solution files?


As mentioned above, a module should not reference any files outside of its scope. For example, source code in module-b-p2 may reference source code in module-b-p1 using relative paths (although I would call that bad practice) - but module-b source code is not allowed to reference module-a source code directly; this would break the module barriers.

The question is: how can we reference one module from another? The trick is that module artifacts (libs, DLLs, PDBs and the interface) are fetched by the Maven dependency plugin into the target folder of a module, and by exploiting the $(SolutionDir) variable available in Visual Studio we have something similar to ${project.basedir} in Maven.

Let's have a look at the following figure:

app1 folder after mvn initialize
What you can see above is a new folder in the app1 directory called 'target' where, by convention, all build artifacts, temporary files and dependencies of the app1 build are stored. By configuring the Maven dependency plugin appropriately, it is quite trivial to put the build output of module-a into the target\deps\$(Configuration) directory. If you then configure additional link library directories and include directories in Visual Studio, the compiler will happily compile your files.
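As a sketch of what such a dependency plugin configuration could look like: the unpack-dependencies goal and its parameters are standard maven-dependency-plugin features, but the classifier and output path are hypothetical and depend on how your modules attach their artifacts (e.g. one execution per Debug/Release configuration):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>fetch-release-deps</id>
      <phase>initialize</phase>
      <goals>
        <goal>unpack-dependencies</goal>
      </goals>
      <configuration>
        <!-- hypothetical: modules attach their outputs with a 'Release' classifier -->
        <includeClassifiers>Release</includeClassifiers>
        <outputDirectory>${project.basedir}/target/deps/Release</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```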

By convention, the build outputs should be placed in the target\rt\$(Configuration) folder. To be able to properly debug the application, the build process should also place all runtime dependencies in this directory.


Example pom file



Example artefact assembly file



Example interface assembly file



Screenshots of Visual Studio projects (properties)



General configuration properties for a Visual Studio project


linker include paths have to be adapted


How to get acceptance in the development team


Building applications in a heterogeneous environment is not always easy; everybody has to leave their comfort zone and adopt something new. For programmers who are not acquainted with the Maven build system (and I assume most of the C++ crowd isn't), it is better not to make them type in things like "mvn clean" or "mvn package" - they want better integration with their IDE.

Luckily, you can customize Visual Studio in many ways, and one way I would suggest is to integrate Maven commands as easily accessible buttons in the IDE. This can be configured once per seat.

Tools -> External Tools : configure maven
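A possible External Tools entry - the Maven path is hypothetical, adjust it to your environment:

```text
Title:             mvn install
Command:           C:\tools\apache-maven\bin\mvn.bat
Arguments:         clean install
Initial directory: $(SolutionDir)
Use Output window: checked
```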

This custom tool can be placed on a button, which essentially reduces all the Maven magic to one mouse click for the uninterested developer. If you have three of them (one for mvn install, one to "fetch dependencies" and one for cleaning up with mvn clean), the devs will be happy. The normal workflow for a developer is then an update of the code via the versioning system and a mouse click to get the newly built dependencies.

But it is so much overhead!


I don't really think it is. If you follow certain conventions like the ones outlined in this post, creating new modules becomes a fairly easy task. Typically, you won't introduce new modules every day.

Depending on the size of your development team, only the senior dictator developer decides when to jump to a new version of a module with potential breaking changes. In fact, pinning down dependencies like that is very successful in Java land, so why shouldn't it make sense for C++ projects as well?

One should not forget that, module versioning and module dependency definitions aside, nothing really changes for the average developer. Typically, most devs work within one module (which may contain dozens of subprojects of arbitrary complexity), and only the tech leads compose modules together.

Advantages


If you do it right, you get goodies like being able to release your C++ code with the Maven release plugin. This is a huge win and definitely worth the trouble of the initial build setup. You can profit from the plethora of possibilities the Maven ecosystem offers - for example easy filtering of resources, arbitrary plugins, reporting (integrate doxygen reports into your build, or use the not-so-popular but still very cool "site" feature for writing versionable documentation) ... and it's free.

I hope someone finds this useful, thanks for reading!

Sunday, April 7, 2013

Use your webcam with JavaFX and openCV - Part I

This time I want to show you how to use your Webcam with JavaFX and OpenCV. 

Jones & Laughlin Steel Corp.

Attention: I've revisited this topic some years later, see for example javacv-webcam with GraalVM

There are quite a few approaches to using the MacBook Pro's iSight webcam in a Java application, but embarrassingly enough I couldn't get them to work.

Attempt #1 : rococoa

After setting up the project with a simple hello world example, I always got a NullPointerException when trying to load a QT movie. When checking out the sources and building them myself I had trouble with failing tests, and a look at the developer mailing list showed that this project is pretty "dormant", to say the least. None of this proves that it is impossible to take camera snapshots with rococoa on Mountain Lion, but I didn't have a good feeling about it, and thus I searched for another solution.

Attempt #2 : vlcj

The well known vlc project has also java bindings, but the website says
... it does also work just fine on Windows and should work on Mac - but for Mac you likely need a bleeding-edge release of vlc...
This didn't sound too promising. I tried anyway and ran into this issue. At least it seems to work with a specific version of the VLC media player and a specific version of the vlcj wrapper. Maybe I'll return to this library when I need more than just a snapshot from my camera.

Solution: 3rd party tool

The solution I came up with was to just use the imagesnap program, which can be installed via macports by issuing
sudo port install imagesnap
This puts a little helper program on your path which enables you to take pictures with your webcam.

As a Java guy, I'm not really satisfied with this; as a pragmatic programmer I would say: it works.

Anyhow, the aspect of "how to get the image from a source" should be encapsulated in an application anyway, so maybe in the future I'll come up with a more adequate way that avoids the third-party dependency. The main motivation for using the webcam as an input source is to do some image processing with it, and this is now possible.

Executing a 3rd party application and grabbing its output

After the decision to go with the imagesnap program, getting to the image data is more or less standard procedure. All you need to do is execute the application with suitable command line parameters.

For example, like this:
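(The snippet embedded here is a gist and may not render. As a sketch - not the original code - launching imagesnap and handing its stdout to a JavaFX Image could look like the following. Note that "-q" and writing the snapshot to stdout via "-" are assumptions about the installed imagesnap version; if your build lacks the latter, let it write to a temp file and load that instead.)

```java
import java.io.IOException;

import javafx.scene.image.Image;

public class WebcamSnap {

    // Sketch, not the original gist: run imagesnap and read its output stream.
    // "-q" (quiet) and "-" (write the JPEG to stdout) are assumptions about
    // the installed imagesnap version.
    public static Image snap() throws IOException {
        Process process = new ProcessBuilder("imagesnap", "-q", "-").start();
        return new Image(process.getInputStream());
    }
}
```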


You can see that the input stream of the imagesnap program can be used directly, which comes in handy for constructing an Image object in JavaFX.

To make it a little more interesting, you can now combine this with the OpenCV hello world code, and you get a nice setup for further image processing experiments - with yourself in the front row.

In order to use Maven as the dependency management system, you have to install the opencv.jar into your local Maven repository. This can be done like this:


mvn install:install-file -Dfile=/opt/local/share/OpenCV/java/opencv-244.jar \
                         -DgroupId=org.opencv \
                         -DartifactId=opencv-java \
                         -Dtype=jar \
                         -Dversion=2.4.4 \
                         -Dpackaging=jar
Still, the native libs have to be in /opt/local/share/OpenCV/java/.
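For completeness, this is roughly how the native part is wired up at runtime - a sketch assuming the MacPorts location mentioned above:

```java
import org.opencv.core.Core;

public class OpenCvCheck {
    public static void main(String[] args) {
        // Requires the JVM to be started with
        //   -Djava.library.path=/opt/local/share/OpenCV/java
        // so that the native OpenCV library can be found.
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
        System.out.println("Loaded OpenCV " + Core.VERSION);
    }
}
```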

And here is the slightly modified code for running OpenCV with your iSight camera using a JavaFX Image:


Here is an example result of me hiding behind a book about the beautiful country of Bhutan with face detection applied.



Check out the whole project including pom.xml on the github repository.

Note that this project is pretty much Mac-only, since it depends on the native library location of OpenCV, on OpenCV itself, and on the native image grabber. It shouldn't be much of a problem to use the same concepts on Linux or Windows, though.

Update (the day afterwards):

A better solution: just use OpenCV!


After getting some sleep and a lot of trial and error, I found a solution which doesn't depend on the 3rd party tool but uses only OpenCV to create snapshots from the video input source, the iSight webcam. In fact, it is very easy using the new Desktop Java Bindings after all.

Here is a tiny code snippet to grab images using only OpenCV:
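A sketch of what that snippet boils down to (the file name and the native library name are illustrative):

```java
import org.opencv.core.Mat;
import org.opencv.highgui.Highgui;
import org.opencv.highgui.VideoCapture;

// Sketch: grab a single frame from the built-in camera using only OpenCV 2.4.
public class Grab {
    static { System.loadLibrary("opencv_java244"); }

    public static void main(String[] args) {
        VideoCapture camera = new VideoCapture(0); // 0 = default (iSight) camera
        Mat frame = new Mat();
        if (camera.read(frame)) {
            Highgui.imwrite("snapshot.png", frame);
        }
        camera.release();
    }
}
```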


That's it!

This solution is far superior to the 3rd party tool: you can grab more images in a shorter time, and it is better integrated and easier to deploy. (The deployment of such applications is still a bit of magic, since you need native libraries which have to reside somewhere on your desktop system and not in the distributed jar. More on this maybe in a follow-up post.)


What about Windows?


I tried the solution on Windows 8 as well; the code above works on this platform without change. All you need is to include the proper DLL for your architecture and of course the OpenCV jars. Both are provided in the OpenCV distribution archive in the subfolder build/java.


Thanks for reading :)


Thursday, April 4, 2013

OpenCV on MacOSX - with Java support

You surely know that OpenCV has had first-class Java support since version 2.4.4. What you may not know is that literally since yesterday it is quite easy to install it on Mac OS X, given that you use MacPorts.


Factory Floor

box:lad$ sudo port selfupdate
Password:
--->  Updating MacPorts base sources using rsync
MacPorts base version 2.1.3 installed,
MacPorts base version 2.1.3 downloaded.
--->  Updating the ports tree
--->  MacPorts base is already the latest version

The ports tree has been updated. To upgrade your installed ports, you should run
  port upgrade outdated
box:lad$ sudo port install opencv +java
--->  Computing dependencies for opencv
--->  Dependencies to be installed: apache-ant cmake pkgconfig
--->  Fetching archive for apache-ant
--->  Attempting to fetch apache-ant-1.9.0_0.darwin_12.noarch.tbz2 from http://lil.fr.packages.macports.org/apache-ant
--->  Attempting to fetch apache-ant-1.9.0_0.darwin_12.noarch.tbz2.rmd160 from http://lil.fr.packages.macports.org/apache-ant
--->  Installing apache-ant @1.9.0_0
--->  Activating apache-ant @1.9.0_0
--->  Cleaning apache-ant
--->  Fetching archive for cmake
--->  Attempting to fetch cmake-2.8.10_1.darwin_12.x86_64.tbz2 from http://lil.fr.packages.macports.org/cmake
--->  Attempting to fetch cmake-2.8.10_1.darwin_12.x86_64.tbz2.rmd160 from http://lil.fr.packages.macports.org/cmake
--->  Installing cmake @2.8.10_1
--->  Activating cmake @2.8.10_1
--->  Cleaning cmake
--->  Fetching archive for pkgconfig
--->  Attempting to fetch pkgconfig-0.27.1_2.darwin_12.x86_64.tbz2 from http://lil.fr.packages.macports.org/pkgconfig
--->  Attempting to fetch pkgconfig-0.27.1_2.darwin_12.x86_64.tbz2.rmd160 from http://lil.fr.packages.macports.org/pkgconfig
--->  Installing pkgconfig @0.27.1_2
--->  Activating pkgconfig @0.27.1_2
--->  Cleaning pkgconfig
--->  Fetching archive for opencv
--->  Attempting to fetch opencv-2.4.4_3+java.darwin_12.x86_64.tbz2 from http://lil.fr.packages.macports.org/opencv
--->  Attempting to fetch opencv-2.4.4_3+java.darwin_12.x86_64.tbz2 from http://mse.uk.packages.macports.org/sites/packages.macports.org/opencv
--->  Attempting to fetch opencv-2.4.4_3+java.darwin_12.x86_64.tbz2 from http://packages.macports.org/opencv
--->  Fetching distfiles for opencv
--->  Attempting to fetch OpenCV-2.4.4a.tar.bz2 from http://ignum.dl.sourceforge.net/project/opencvlibrary/opencv-unix/2.4.4
--->  Verifying checksum(s) for opencv
--->  Extracting opencv
--->  Applying patches to opencv
--->  Configuring opencv
--->  Building opencv
--->  Staging opencv into destroot
--->  Installing opencv @2.4.4_3+java
--->  Deactivating opencv @2.4.4_2
--->  Cleaning opencv
--->  Activating opencv @2.4.4_3+java
--->  Cleaning opencv
--->  Updating database of binaries: 100.0%
--->  Scanning binaries for linking errors: 100.0%
--->  No broken files found.
box:lad$ port contents opencv | grep java
  /opt/local/share/OpenCV/java/libopencv_java244.dylib
  /opt/local/share/OpenCV/java/opencv-244.jar
box:lad$ 

Some Scala code to use it:
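Along these lines (a sketch; the library path is taken from the port contents output above):

```scala
import org.opencv.core.{CvType, Mat}

// Minimal sketch: load the native library installed by MacPorts and
// create a small matrix to verify the bindings work.
object OpenCvHello extends App {
  System.load("/opt/local/share/OpenCV/java/libopencv_java244.dylib")

  val m = Mat.eye(3, 3, CvType.CV_8UC1)
  println("identity matrix:\n" + m.dump())
}
```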



Many thanks to Andrew Stromnov for making this possible, since compiling OpenCV with Java support yourself is not something the average Java guy will do. (I did it. It was a pleasure. ;-) )


Update:



Keep in mind that the port command compiles the jar file with the currently available JDK. If you run the port command in verbose mode, you'll see that the jar file is assembled using ant. To force the port command to use a certain JDK, you can patch the ant script:

# OS specific support.  $var _must_ be set to either true or false.
cygwin=false;
darwin=false;
mingw=false;
case "`uname`" in
  CYGWIN*) cygwin=true ;;
  Darwin*) darwin=true
           if [ -z "$JAVA_HOME" ] ; then
               if [ -x '/usr/libexec/java_home' ] ; then
                   JAVA_HOME=`/usr/libexec/java_home -v 1.7`
               elif [ -d "/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home" ]; then
                   JAVA_HOME=/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home
               fi
           fi
           ;;
  MINGW*) mingw=true ;;
esac

This way, the JDK 1.7 on my machine will be used for generating the jar file.

Tuesday, April 2, 2013

Upgrading gcc to gcc48 on MacOsX

This post describes how to update your gcc installation to gcc 4.8 on Mac OS X.

Dublin, but where? Main Street in Blackrock!

Short:

sudo port install gcc48 +universal

You can follow the instructions found here to update your installation of gcc. At the time of writing, gcc in version 4.8 is the current (experimental) version.

Be sure to change the default gcc command to the newly installed one by issuing

sudo port select --set gcc mp-gcc48

and then, afterwards

hash gcc 

(to rehash it, see this link)

Test your gcc installation by issuing

gcc --version

which should give you output like this:

gcc (MacPorts gcc48 4.8-20130328_0+universal) 4.8.1 20130328 (prerelease)
Copyright (C) 2013 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

Your gcc installation on the command line should now be current.

Monday, January 21, 2013

Calling Ant from Maven using a property file

Consolidating builds for several projects can be a process which sometimes needs some creativity in combining the tools already available.

Most of the time it is too risky and expensive to write everything from scratch - even if it is very tempting to do so. In the case of Maven and Ant, there is the well-known antrun plugin which makes it possible to integrate Ant-based builds into the Maven ecosystem.

Below is an example of how to call Ant from Maven, using a property file to store values which may change often.
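A sketch of the relevant plugin configuration (file names, the target name, and the plugin version are assumptions):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.7</version>
  <executions>
    <execution>
      <phase>compile</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!-- values which change often live in a property file -->
          <property file="${basedir}/build.properties"/>
          <!-- delegate to the existing Ant build -->
          <ant antfile="${basedir}/build.xml" target="dist"/>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>
```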

Maybe somebody finds it useful.

Friday, January 4, 2013

Compile OpenJFX RT on MacOsX

This time I want to describe what was necessary for me to compile the OpenJFX Project on my laptop, a MacBook Pro running Mountain Lion.
Quoting from the page:
As you can imagine, building a UI toolkit for many different platforms is quite complex. It requires platform specific tools such as C compilers as well as portable tools like ant. The build and project structure of JavaFX like most large projects has evolved over time and is constantly being revised.

I'm writing this in early January 2013; maybe things will change in the future. I would love to see a mavenized build for OpenJFX, or even for the whole JDK, which in my opinion would increase adoption tremendously for the whole OpenJDK.

Steps 1 to 10 - Check out the source and do the rest

Mercurial, Java, and ant should be installed; then the getting started guide gives you the following recipe:

Here is the recipe for building OpenJFX (use a Cygwin shell on Windows):

  • Download the latest JavaFX Developer Preview binary
  • Unzip the binary and put it in ~/closed-jfx
  • mkdir -p ~/open-jfx
  • cd ~/open-jfx
  • hg clone http://hg.openjdk.java.net/openjfx/2.1/master
  • cd master
  • mkdir -p artifacts/sdk/rt
  • cp -r ~/closed-jfx/javafx-sdk2.1.0-beta/rt artifacts/sdk
  • hg clone http://hg.openjdk.java.net/openjfx/2.1/master/rt
  • cd rt
  • Edit build-defs.xml (comment out '<propertycopy name="javac.debuglevel" from="${ant.project.name}.javac.debuglevel" silent="true" override="true"/>')
  • cd javafx-ui-controls
  • ant
..... One hour later: I followed those instructions, but got in trouble with step one (I bravely continued with step 2, though). I searched for a macos binary (didn't really know what to search for?) and decided to go on. Sadly enough, I got stuck some time later with compile errors ("Abstract methods not implemented by ..."?! - well, this was rather strange.)

But ...


Just before giving up, after some googling I found the website of Peter Pilgrim, who briefly mentioned that the getting started guide was somewhat outdated, and after following his instructions I happily compiled OpenJFX for the first time on my laptop!

For further reference, here are the necessary steps to compile OpenJFX:

mkdir openjfx-2.2/
cd openjfx-2.2/
hg clone http://hg.openjdk.java.net/openjfx/2.2/master
cd master
mkdir -p artifacts/sdk/rt/lib
cp $JAVA_HOME/jre/lib/jfxrt.jar artifacts/sdk/rt/lib
hg clone http://hg.openjdk.java.net/openjfx/2.2/master/rt
cd rt
-> edit common.properties, set property javac.debuglevel=lines,vars,source
ant clean dist


The story doesn't end here; in fact, I discovered that there are already efforts to mavenize OpenJFX - at least I found a pom.xml in the rt subdirectory.

With
hg log pom.xml
I've found out that in RT-19825 Adam Bien contributed those pom files.

After commenting out the system dependency for the jfxrt.jar (since I had copied the jfxrt.jar into the jre/lib/ext directory using zonski's maven fix classpath plugin), it happily compiled (skipping the tests) using the following command:

mvn clean package -Dmaven.test.skip

(Disclaimer: I don't know if the maven build does the same as the ant build does - after the initial commit those files seem to be untouched.)

The bottom line for me is that building/compiling (parts of? all of??) OpenJFX was easier than expected, although I'm sure that this is not the end of the story. I would suspect that the native stuff wasn't compiled?! At least in the 'rt' directory there are only Java sources...

The answer to this seems to be that this was only a small part of OpenFX. If you look at the Mercurial index page there are lots of different versions and components for OpenFX:


As you can see, I was only in one sub-module of the whole game, but I hope you got the idea of how to check out and build the project - it is as simple as described above!

For example, have a look at the openjfx 2.2.6 development:



mkdir openjfx-2.2.6/
cd openjfx-2.2.6/
hg clone http://hg.openjdk.java.net/openjfx/2.2.6/master
cd master
mkdir -p artifacts/sdk/rt/lib
cp $JAVA_HOME/jre/lib/jfxrt.jar artifacts/sdk/rt/lib
hg clone http://hg.openjdk.java.net/openjfx/2.2.6/master/rt
cd rt
-> edit common.properties, set property javac.debuglevel=lines,vars,source
ant clean dist

For the JDK8 branch the procedure works as described, but make sure you set your path to a JDK8. You could of course compile this also from source, but it's maybe easier to get a full JDK8 early access build from here.


ant clean dist for OpenFX-8 (in green)

Small update:

Only recently was the structure of the repositories explained a little bit; quoting Kevin Rushforth:

Each of the following should be treated as a separate forest. You would only grab one of these forests depending on which one you want.
1. The controls team forest:
openjfx/8/controls   openjfx/8/controls/rt   openjfx/8/controls/tests
2. The graphics team forest:
openjfx/8/graphics   openjfx/8/graphics/rt   openjfx/8/graphics/tests
3. The master forest:
openjfx/8/master   openjfx/8/master/rt   openjfx/8/master/tests
The team forests is where the work happens. Each team integrates into the master forest regularly (typically weekly).

Update February 2013:


Note: This tutorial was written in early 2013; things may have changed in the meantime. There are efforts going on to restructure the build using gradle scripts. This wiki page should be up to date and reflect the current status of the build.

Have also a look at this video (after reading the whole blog post ;-) ) on how to do it:




This video shows that compiling JavaFX from source is just some clicks away. 

Wednesday, August 29, 2012

Using Proguard for JDK 1.7 with Maven


Given that you have following scenario:
The Reader at Palais des Congres
photo by Tim Cooper
  • a project based on a Maven build
  • using the proguard plugin
  • with jdk 1.6

and you want to upgrade to JDK 1.7 for whatever reason (maybe this one), you may encounter the following problem:

 [proguard] Error: Can't read [/Library/Java/JavaVirtualMachines/jdk1.7.0_06.jdk/Contents/Home/jre/lib/rt.jar] (Can't process class [WrapperGenerator$1.class] (Unsupported version number [51.0] for class format))

It seems that proguard has some problems with the class file version number.
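For context: "51.0" is simply the class file format major version that javac emits for Java 7 targets (Java 6 classes are version 50.0). The major version sits in bytes 6 and 7 of the class file, right after the 0xCAFEBABE magic - a tiny illustration:

```java
// Illustration: where the class file format version that proguard
// complains about actually lives in the bytes of a .class file.
public class ClassFormatVersion {
    // major version = bytes 6..7 of a class file, big-endian
    static int majorVersion(byte[] classFile) {
        return ((classFile[6] & 0xff) << 8) | (classFile[7] & 0xff);
    }

    public static void main(String[] args) {
        // header of a class compiled for Java 7: magic, minor 0, major 51
        byte[] header = {(byte) 0xCA, (byte) 0xFE, (byte) 0xBA, (byte) 0xBE,
                         0, 0, 0, 51};
        System.out.println(majorVersion(header)); // prints 51
    }
}
```

An older proguard simply refuses any class whose major version is newer than what it was built for, which is exactly the error above.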

Great. What to do? Why me? Again?

After some googling you'll start to wonder why the proguard maven plugin uses a rather old proguard binary itself (remember: the maven-proguard-plugin is just a "starter script" which passes the appropriate parameters to an independent proguard main jar file).

You'll find that on the web page of the maven-proguard-plugin there is a section which states that you can provide a different proguard version number. I thought this would solve my issues, so I followed the instructions as described and ... nothing happened. The plugin still used proguard version 4.3 for obfuscating the code.

There has been some modularization into more artifacts, and newer versions of proguard don't seem to be in the Maven Central repository under the old coordinates: the proguard artifact itself is available there only up to 4.4; after that, there are several other artifacts, going up to 4.8, which is at the time of writing the most up-to-date proguard version.

This change of packaging may well be the reason why the described way of using another proguard version doesn't work as promised. The answer can be found in the source code of the maven-proguard-plugin ("the mojo").

(As it seems, there is already a patch, but a fixed proguard maven plugin isn't published yet, at least not in Maven Central.)

To make a long story short, this rather innocent-looking project comes to the rescue. Together with the following configuration it will happily use the proguard 4.8 binary - at least it worked for me.


<build>
  <plugins>
    <plugin>
      <groupId>com.github.wvengen</groupId>
      <artifactId>proguard-maven-plugin</artifactId>
      <version>2.0.5</version>

      <dependencies>
        <dependency>
          <groupId>net.sf.proguard</groupId>
          <artifactId>proguard-base</artifactId>
          <version>4.8</version>
          <scope>runtime</scope>
        </dependency>
      </dependencies>

      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>proguard</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <proguardVersion>4.8</proguardVersion>
        .... <!-- your configuration goes here -->
      </configuration>
    </plugin>
  </plugins>
</build>

Pay attention to the different artifact id of the proguard binary (check also the patch mentioned above).