Sunday, June 9, 2013

Leap Motion Controller and JavaFX: A new touch-less approach

Hi, it's been a while since my last post, but I've been quite busy at work during this first half of the year, so I had to put most of my JavaFX, Raspberry Pi and Arduino projects on hold.

In all this time I could afford only one distraction, because the device really deserves it!

In April I had the chance to get involved in the Leap Motion Developer program (thanks for that to Jim Weaver and Simon Ritter, and of course, to the Leap Motion staff), and since I received the little but powerful device at home, I've been playing around with it in several JavaFX based projects. 

So this post is a little briefing on the few projects I've done with the Leap Motion controller and JavaFX, most of them just proofs of concept.

As the Leap SDK is private for now (though they intend to make it public soon), I won't release any code, just snippets, a few screenshots and short videos.

At a glance, this is what I'll cover:
  • The Leap Motion Controller, what you can expect: hands, fingers, tools and basic gestures.
  • The JavaFX and the Leap threads, change listeners to the rescue.
  • POC #1. Moving 2D shapes on a JavaFX scene, trapping basic gestures with the Leap.
  • POC #2. Physics 2D worlds and Leap, a JavaFX approach.
  • POC #3. JavaFX 3D and Leap, with JDK8 early preview and openJFX Project.
I won't go into much detail regarding the Leap itself; there are plenty of videos out there, so if you don't know about it yet, check a few of those first.
For those of you already on the pre-order list, the 22nd of July is just around the corner... be (just a little bit more) patient! For those who haven't decided to buy one yet, maybe this reading will help you make up your mind.

Let's go!

1. The Leap Motion Controller

After you plug the Leap Motion device into your USB port (Windows, Mac and Linux OS), and download and install its software, you can try the Leap Visualizer, a bundled application which allows you to learn and discover the magic of the Leap.



As soon as you launch it, you can virtually see your hands and fingers moving around the screen. It's really impressive because of the high precision of the movements, due to the high frequency at which the Leap scans.

Activating the hands and fingers visualization, you can see the basics of the model provided: the Leap will detect none, one or several hands, and several fingers on each of them. For each hand it will report, for instance, its position, where it points at (hand direction) and its palm normal. For fingers, you'll get their position and where they point at. You can also get hand and finger velocities.

These positions, directions and velocities of your real hands and fingers are 3D vectors, referred to a right-handed Cartesian coordinate system with the origin at the center of the device, the X and Z axes lying in the horizontal plane, and the Y axis vertical.

It's important to note that you'll have to convert these coordinates to screen coordinates if you want to display and move anything on the screen. For that, you need to calculate where the vector of the hand or finger direction intersects the plane of the screen.

The Leap device performs complete scans of its surroundings, with an effective range of approximately 25 to 600 millimeters above the device (1 inch to 2 feet). Each scan defines a frame, with all its associated data.

The scan rate is really high; that's what makes the Leap so impressive and accurate compared to similar devices. Depending on your CPU and the amount of data analyzed, processing latency ranges from 2 ms to 33 ms, giving rates from 30 to 500 fps.


Besides directions, a basic collection of gestures is also provided: key tap, screen tap, swipe and circle gestures are tracked by comparing the finger movements across different frames.


The good thing about having access to all the frame data is that you can define your own custom gestures and try to detect them by analyzing a relatively short collection of frames, over and over again.
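As an illustration, here is a minimal sketch of a hypothetical "push" detector that fires when the palm moves quickly towards the screen for several consecutive frames. The threshold and frame count are my own assumptions, not values from the SDK:

public class PushDetector {

    private static final float SPEED_THRESHOLD = -300f; // mm/s towards the screen (assumed)
    private static final int REQUIRED_FRAMES = 5;       // consecutive frames (assumed)
    private int streak = 0;

    // Call once per frame with the frontmost hand; true when a "push" is detected
    public boolean update(Hand hand) {
        if (hand.isValid() && hand.palmVelocity().getZ() < SPEED_THRESHOLD) {
            streak++;
        } else {
            streak = 0;
        }
        return streak >= REQUIRED_FRAMES;
    }
}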

To end this brief intro to the great Leap Motion device, let's mention that being on the Developer Program you can get the SDK for many programming languages, such as Java, JavaScript, C++, C#, Objective-C or Python.

In terms of Java code, all you need to do is extend the Listener class provided by the SDK, basically override the onFrame method, and let the magic begin.
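A minimal skeleton could look like this (just a sketch; note that the built-in gestures are disabled by default, and onConnect is a good place to enable them):

public class HelloLeapListener extends Listener {

    @Override
    public void onConnect(Controller controller) {
        // Enable the built-in gestures we want the Leap to track
        controller.enableGesture(Gesture.Type.TYPE_KEY_TAP);
    }

    @Override
    public void onFrame(Controller controller) {
        // Called for every new frame of tracking data
        Frame frame = controller.frame();
        System.out.println("Frame " + frame.id() + ": " + frame.hands().count()
                + " hands, " + frame.fingers().count() + " fingers");
    }
}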

2. The JavaFX and the Leap threads
 
Having a Leap Motion Controller means you can interact with your applications in a very different way than you're used to. For that, you just need to integrate the Leap events, in terms of movements or actions, into your apps.

In a JavaFX based application, one easy way to do this is by adding ObjectProperty<T> objects to the LeapListener class, setting the desired values at every frame using Vector, Point2D, Point3D, CircleGesture,... and then implementing their related public ObservableValue<T> methods.

Then, in the JavaFX thread, an anonymous ChangeListener<T> class can be added to listen for any change in the ObservableValue. Special care must be taken here, as anything related to the UI must be handled through Platform.runLater().
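In miniature, the bridge between the two threads looks like this (a sketch only, where node stands for any JavaFX node you want to move):

// Set by the Leap thread inside onFrame(), observed from the JavaFX thread
private final ObjectProperty<Point2D> point = new SimpleObjectProperty<>();

// In the JavaFX thread, during scene setup:
point.addListener(new ChangeListener<Point2D>() {
    @Override
    public void changed(ObservableValue<? extends Point2D> ov,
                        Point2D oldPos, final Point2D newPos) {
        // changed() runs on the Leap thread, so defer any UI work
        Platform.runLater(new Runnable() {
            @Override
            public void run() {
                node.relocate(newPos.getX(), newPos.getY());
            }
        });
    }
});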

The next proof of concept samples will try to explain this.

3. POC #1. Moving 2D shapes on a JavaFX scene

Let's say we want to move a node in the scene with our hand as a first simple POC.

We create two classes. First, SimpleLeapListener, which extends Listener, where at every frame we just set the screen coordinates the hand points at:

public class SimpleLeapListener extends Listener {

    private ObjectProperty<Point2D> point=new SimpleObjectProperty<>();
    
    public ObservableValue<Point2D> pointProperty(){ return point; }
    
    @Override
    public void onFrame(Controller controller) {
        Frame frame = controller.frame();
        if (!frame.hands().empty()) {
            Screen screen = controller.calibratedScreens().get(0);
            if (screen != null && screen.isValid()){
                Hand hand = frame.hands().get(0);
                if(hand.isValid()){
                    // Normalized [0..1] intersection of the hand direction with the screen
                    Vector intersect = screen.intersect(hand.palmPosition(),hand.direction(), true);
                    // Clamp to [0..1], scale to pixels and flip Y (screen pixels grow downwards)
                    point.setValue(new Point2D(screen.widthPixels()*Math.min(1d,Math.max(0d,intersect.getX())),
                            screen.heightPixels()*Math.min(1d,Math.max(0d,(1d-intersect.getY())))));
                }
            }
        }
    }
}

And LeapJavaFX, our JavaFX class, which listens to changes in this point and reflects them on the scene:

 
public class LeapJavaFX extends Application { 
    private SimpleLeapListener listener = new SimpleLeapListener();
    private Controller leapController = new Controller();
    
    private AnchorPane root = new AnchorPane();
    private Circle circle=new Circle(50,Color.DEEPSKYBLUE);
    
    @Override
    public void start(Stage primaryStage) {
        
        leapController.addListener(listener);        
        circle.setLayoutX(circle.getRadius());
        circle.setLayoutY(circle.getRadius());
        root.getChildren().add(circle);
        final Scene scene = new Scene(root, 800, 600);        
        
        listener.pointProperty().addListener(new ChangeListener<Point2D>(){
            @Override 
            public void changed(ObservableValue<? extends Point2D> ov, Point2D t, final Point2D t1) {
                Platform.runLater(new Runnable(){
                    @Override 
                    public void run() {
                        Point2D d=root.sceneToLocal(t1.getX()-scene.getX()-scene.getWindow().getX(),
                                                    t1.getY()-scene.getY()-scene.getWindow().getY());
                        double dx=d.getX(), dy=d.getY();
                        if(dx>=0d && dx<=root.getWidth()-2d*circle.getRadius() && 
                           dy>=0d && dy<=root.getHeight()-2d*circle.getRadius()){
                            circle.setTranslateX(dx);
                            circle.setTranslateY(dy);                                
                        }
                    }
                });
            }
        });
        
        primaryStage.setScene(scene);
        primaryStage.show();
    }
    @Override
    public void stop(){
        leapController.removeListener(listener);
        
    }
}

Pretty simple, isn't it? This short video shows the result.


Here goes a second sample based on the same idea: one circle is displayed per detected hand, with its radius growing or shrinking according to the Z distance of the hand from the Leap. When key tap gestures are detected, a shadow circle is shown where the tap occurs, moved from the previous tap location with an animation of both translate and scale properties.
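The radius mapping itself is simple; here is a sketch of the idea, with ranges that are assumptions rather than the values used in the demo:

// Map the hand's Z distance to the Leap (in mm) to a circle radius in pixels
private double zToRadius(double palmZ) {
    double zMin = -200d, zMax = 200d; // assumed usable depth range, mm
    double rMin = 20d, rMax = 80d;    // resulting radius range, pixels
    double t = (Math.max(zMin, Math.min(zMax, palmZ)) - zMin) / (zMax - zMin);
    return rMax - t * (rMax - rMin);  // the closer to the screen, the bigger the circle
}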

Here you can see it in action:
 


4. POC #2. Physics 2D worlds and Leap, a JavaFX approach

When Toni Epple saw this video, he suggested adding some physics to the mix, so I started learning from his blog posts about JavaFX and JBox2D, the Java port of the popular Box2D physics engine. Using his amazing work, I was able to create a simple World, add some dynamic bodies, static walls at the boundaries, and a big static circle that I could move with the Leap as in the previous samples. Thank you, Toni, your work is really inspiring!

Here is a code snippet of the JavaFX class.

public class PhysicsLeapJavaFX extends Application { 
    private SimpleLeapListener listener = new SimpleLeapListener();
    private Controller leapController = new Controller();
    
    private Button button=new Button("Add Ball");
    private AnchorPane root = new AnchorPane();
    private AnchorPane pane = new AnchorPane();
    private Body myCircle=null;
    
    private World world=null;
    private WorldView worldView=null;
    private final float worldScale=50f;
    private final float originX=4f, originY=8f;
    private final float radius=1f;

    @Override
    public void start(Stage primaryStage) {
        
        leapController.addListener(listener); 
        
        world = new World(new Vec2(0, 0f)); // No gravity
        // 200x400 -> world origin->(4f, 8f), Y axis>0 UP        
        worldView=new WorldView(world, originX*worldScale, originY*worldScale, worldScale);
        
        AnchorPane.setBottomAnchor(pane, 20d); AnchorPane.setTopAnchor(pane, 50d);
        AnchorPane.setLeftAnchor(pane, 20d);   AnchorPane.setRightAnchor(pane, 20d);
        // root: 800x600, pane: 760x530, worldScale= 50 -> world dimensions: 15.2f x 10.6f 
        pane.getChildren().setAll(worldView);        
                
        // Custom provider that maps JBox2D bodies to JavaFX nodes (class not shown)
        NodeManager.addProvider(new MyNodeProvider());
        
        button.setLayoutX(30); button.setLayoutY(15);
        button.setOnAction(new EventHandler<ActionEvent>(){
            @Override
            public void handle(ActionEvent t) {
                Body ball=new CircleShapeBuilder(world).userData("ball")
                            .position(0f, 4f)
                            .type(BodyType.DYNAMIC).restitution(1f).density(0.4f)
                            .radius(0.5f).friction(0f)
                            .build();
                ball.setLinearVelocity(new Vec2(4,2));
                ball.setLinearDamping(0f);
            }            
        });
        
        myCircle=new CircleShapeBuilder(world).userData("hand1").position(0f, 2f)
                .type(BodyType.STATIC).restitution(1f).density(1)
                .radius(radius).friction(0f)
                .build();
        new BoxBuilder(world).position(3.6f, 8f).restitution(1f).friction(0f)
                              .halfHeight(0.1f).halfWidth(7.7f).build();
        new BoxBuilder(world).position(3.6f, -2.6f).restitution(1f).friction(0f)
                              .halfHeight(0.1f).halfWidth(7.7f).build();
        new BoxBuilder(world).position(-4f, 2.7f).restitution(1f).friction(0f)
                              .halfHeight(5.4f).halfWidth(0.1f).build();
        new BoxBuilder(world).position(11.2f, 2.7f).restitution(1f).friction(0f)
                              .halfHeight(5.4f).halfWidth(0.1f).build();
        
        root.getChildren().addAll(button, pane);
        final Scene scene = new Scene(root, 800, 600);        
        
        listener.pointProperty().addListener(new ChangeListener<Point2D>(){
            @Override 
            public void changed(ObservableValue<? extends Point2D> ov, Point2D t, final Point2D t1) {
                Platform.runLater(new Runnable(){
                    @Override 
                    public void run() {
                        Point2D d=pane.sceneToLocal(t1.getX()-scene.getX()-scene.getWindow().getX()-root.getLayoutX(),
                                                    t1.getY()-scene.getY()-scene.getWindow().getY()-root.getLayoutY());
                        double dx=d.getX()/worldScale, dy=d.getY()/worldScale;
                        if(dx>=0.1 && dx<=pane.getWidth()/worldScale-2d*radius-0.1 && 
                           dy>=0.1 && dy<=pane.getHeight()/worldScale-2d*radius-0.1){
                            myCircle.setTransform(new Vec2((float)(dx)-(originX-radius),
                                                           (originY-radius)-(float)(dy)),
                                                  myCircle.getAngle());
                        }
                    }
                });
            }
        });
        listener.keyTapProperty().addListener(new ChangeListener<Boolean>(){
            @Override public void changed(ObservableValue<? extends Boolean> ov, Boolean t, final Boolean t1) {
                if(t1.booleanValue()){
                    Platform.runLater(new Runnable(){
                        @Override public void run() {
                            button.fire();
                        }
                    });
                }
            }
        });

        primaryStage.setTitle("PhysicsLeapJavaFX Sample");
        primaryStage.setScene(scene);
        primaryStage.show();
    }
    @Override
    public void stop(){
        leapController.removeListener(listener);
        
    }
}

I've added some gesture recognition to fire the button when a key tap gesture is detected. Besides, it's quite convenient to smooth the readings from the Leap by taking the average of the last positions instead of the position at every frame. So let's modify the SimpleLeapListener class, adding a size-limited LinkedList collection to store the last 30 positions, and also enabling key tap gestures:

public class SimpleLeapListener extends Listener {

    private ObjectProperty<Point2D> point=new SimpleObjectProperty<>();
    public ObservableValue<Point2D> pointProperty(){ return point; }
    private LimitQueue<Vector> positionAverage = new LimitQueue<Vector>(30);
    
    private BooleanProperty keyTap= new SimpleBooleanProperty(false);
    public BooleanProperty keyTapProperty() { return keyTap; }

    @Override
    public void onFrame(Controller controller) {
        Frame frame = controller.frame();
        if (!frame.hands().empty()) {
            Screen screen = controller.calibratedScreens().get(0);
            if (screen != null && screen.isValid()){
                Hand hand = frame.hands().get(0);
                if(hand.isValid()){
                    Vector intersect = screen.intersect(hand.palmPosition(),hand.direction(), true);
                    positionAverage.add(intersect);
                    Vector avIntersect=Average(positionAverage);
                    point.setValue(new Point2D(screen.widthPixels()*Math.min(1d,Math.max(0d,avIntersect.getX())),
                            screen.heightPixels()*Math.min(1d,Math.max(0d,(1d-avIntersect.getY())))));
                }
            }
        }
        // Reset the flag, then raise it if a key tap gesture appears in this frame
        keyTap.set(false);
        GestureList gestures = frame.gestures();
        for (int i = 0; i < gestures.count(); i++) {
            if(gestures.get(i).type()==Gesture.Type.TYPE_KEY_TAP){
                keyTap.set(true); break;
            }
        }
    }
    
    private Vector Average(LimitQueue<Vector> vectors)
    {
        float vx=0f, vy=0f, vz=0f;
        for(Vector v:vectors){
            vx=vx+v.getX(); vy=vy+v.getY(); vz=vz+v.getZ();
        }
        return new Vector(vx/vectors.size(), vy/vectors.size(), vz/vectors.size());
    }
    
    // A LinkedList that discards the oldest elements beyond the given size limit
    private class LimitQueue<E> extends LinkedList<E> {
        private int limit;
        public LimitQueue(int limit) {
            this.limit = limit;
        }

        @Override
        public boolean add(E o) {
            super.add(o);
            while (size() > limit) { super.remove(); }
            return true;
        }
    }
}

And finally, here you can see it in action:

 

5. POC #3. JavaFX 3D and Leap, with JDK8

The last part of this post will cover my experiments with the recent early access releases of JDK8 (build b92 at the time of this writing), as JavaFX 3D has been enabled since b77. Here you can read about the 3D features planned for JavaFX 8.

Installing JDK8 is easy, and so is creating a JavaFX scene with 3D primitives like spheres, boxes or cylinders, or even user-defined shapes via meshes, defined by a set of points, texture coordinates and faces.
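For instance, a scene with a single 3D primitive takes only a few lines (a minimal sketch with the JavaFX 8 API):

// A sphere with a Phong material, in a depth-buffered scene with a 3D camera
Sphere sphere = new Sphere(100);
sphere.setMaterial(new PhongMaterial(Color.DEEPSKYBLUE));

Group root3D = new Group(sphere);
Scene scene = new Scene(root3D, 800, 600, true); // true -> depth buffer enabled
scene.setCamera(new PerspectiveCamera());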



There are no loaders for existing 3D file formats (obj, stl, Maya, 3D Studio, ...), so if you want to import a 3D model, you need one.

The first place to start looking is OpenJFX, the open source home of JavaFX development.

Among what they call their experiments, you'll find a 3D Viewer in their repository. So download it, build it and see what you can do!



For instance, you can drag and drop an obj model. The one in the picture is a model of a Raspberry Pi, downloaded from here.

For other formats not yet supported, you can go to InteractiveMesh.org, where August Lammersdorf has released several importers (3ds, obj and stl) for JDK8 b91+. Kudos to him for his amazing work and contributions!

I'll use his 3ds importer and the Hubble Space Telescope model from NASA to add this model to a JavaFX scene, and then I'll try to add touch-less rotation and scaling options.

First of all, we need a little mathematical background here, as rotating a 3D model in JavaFX requires a rotation axis and an angle. If we have several rotations to apply at the same time, we need to build the combined rotation matrix and then extract the rotation axis and angle from it.

The Leap provides three rotations for a hand: pitch (around its X axis), yaw (around its Y axis) and roll (around its Z axis). Provided the model is already well oriented (otherwise we would need to add previous rotations too), composing the three rotations gives a matrix of the form:

$$R=\begin{pmatrix}A_{11}&A_{12}&A_{13}\\A_{21}&A_{22}&A_{23}\\A_{31}&A_{32}&A_{33}\end{pmatrix}$$

where, writing $\alpha$ for roll, $\beta$ for pitch and $\gamma$ for yaw:

$$\begin{aligned}
A_{11}&=\cos\alpha\,\cos\gamma &
A_{12}&=\cos\beta\,\sin\alpha+\cos\alpha\,\sin\beta\,\sin\gamma &
A_{13}&=\sin\alpha\,\sin\beta-\cos\alpha\,\cos\beta\,\sin\gamma \\
A_{21}&=-\cos\gamma\,\sin\alpha &
A_{22}&=\cos\alpha\,\cos\beta-\sin\alpha\,\sin\beta\,\sin\gamma &
A_{23}&=\cos\alpha\,\sin\beta+\cos\beta\,\sin\alpha\,\sin\gamma \\
A_{31}&=\sin\gamma &
A_{32}&=-\cos\gamma\,\sin\beta &
A_{33}&=\cos\beta\,\cos\gamma
\end{aligned}$$

Then the rotation angle $\theta$ and the components of the unitary rotation axis can be easily computed from:

$$\theta=\arccos\!\left(\frac{A_{11}+A_{22}+A_{33}-1}{2}\right),\qquad
(u_x,\,u_y,\,u_z)=\frac{1}{2\,\sin\theta}\,\bigl(A_{32}-A_{23},\;A_{13}-A_{31},\;A_{21}-A_{12}\bigr)$$
Special care has to be taken when converting the Leap roll, pitch and yaw values to those required by the JavaFX coordinate system (rotated 180° around the X axis).

With these equations, we just need to listen to hand rotation changes and compute the rotation axis and angle on every change to rotate the 3D model accordingly.

So now we're ready to try our 3D POC sample: import a 3ds model and perform rotations with our hand through the Leap Motion Controller.
 
The following code snippet shows how it is done for the JavaFX class:

public class JavaFX8 extends Application {
    private AnchorPane root=new AnchorPane();
    private final Rotate cameraXRotate = new Rotate(0,0,0,0,Rotate.X_AXIS);
    private final Rotate cameraYRotate = new Rotate(0,0,0,0,Rotate.Y_AXIS);
    private final Translate cameraPosition = new Translate(-300,-550,-700);
    private SimpleLeapListener listener = new SimpleLeapListener();
    private Controller leapController = new Controller();
    
    @Override 
    public void start(Stage stage){
        final Scene scene = new Scene(root, 1024, 800, true);
        final Camera camera = new PerspectiveCamera();
        camera.getTransforms().addAll(cameraXRotate,cameraYRotate,cameraPosition);
        scene.setCamera(camera);
        leapController.addListener(listener);

        TdsModelImporter model=new TdsModelImporter();
        try {
            URL hubbleUrl = this.getClass().getResource("hst.3ds");
            model.read(hubbleUrl);
        }
        catch (ImportException e) {
            System.out.println("Error importing 3ds model: "+e.getMessage());
            return;
        }
        final Node[] hubbleMesh = model.getImport();
        model.close();
        final Group model3D = new Group(hubbleMesh);
 
        final PointLight pointLight = new PointLight(Color.ANTIQUEWHITE);
        pointLight.setTranslateX(800);
        pointLight.setTranslateY(-800);
        pointLight.setTranslateZ(-1000);
        root.getChildren().addAll(model3D,pointLight);

        listener.posHandLeftProperty().addListener(new ChangeListener<Point3D>(){
            @Override public void changed(ObservableValue<? extends Point3D> ov, Point3D t, final Point3D t1) {
                Platform.runLater(new Runnable(){
                    @Override public void run() {
                        if(t1!=null){
                            double roll=listener.rollLeftProperty().get();
                            double pitch=-listener.pitchLeftProperty().get();
                            double yaw=-listener.yawLeftProperty().get();
                            matrixRotateNode(model3D,roll,pitch,yaw);
                        }
                    }
                });
            }
        });
    }
    private void matrixRotateNode(Node n, double alf, double bet, double gam){
        // Entries of the combined rotation matrix (alf = roll, bet = pitch, gam = yaw)
        double A11=Math.cos(alf)*Math.cos(gam);
        double A12=Math.cos(bet)*Math.sin(alf)+Math.cos(alf)*Math.sin(bet)*Math.sin(gam);
        double A13=Math.sin(alf)*Math.sin(bet)-Math.cos(alf)*Math.cos(bet)*Math.sin(gam);
        double A21=-Math.cos(gam)*Math.sin(alf);
        double A22=Math.cos(alf)*Math.cos(bet)-Math.sin(alf)*Math.sin(bet)*Math.sin(gam);
        double A23=Math.cos(alf)*Math.sin(bet)+Math.cos(bet)*Math.sin(alf)*Math.sin(gam);
        double A31=Math.sin(gam);
        double A32=-Math.cos(gam)*Math.sin(bet);
        double A33=Math.cos(bet)*Math.cos(gam);
        
        // Extract the rotation angle and the unitary axis from the matrix (see formulas above)
        double d = Math.acos((A11+A22+A33-1d)/2d);
        if(d!=0d){
            double den=2d*Math.sin(d);
            Point3D p= new Point3D((A32-A23)/den,(A13-A31)/den,(A21-A12)/den);
            n.setRotationAxis(p);
            n.setRotate(Math.toDegrees(d));                    
        }
    }
}

And this code snippet shows how it is done for the Leap Listener class:

public class SimpleLeapListener extends Listener {
    private ObjectProperty<Point3D> posHandLeft=new SimpleObjectProperty<Point3D>();
    private DoubleProperty pitchLeft=new SimpleDoubleProperty(0d);
    private DoubleProperty rollLeft=new SimpleDoubleProperty(0d);
    private DoubleProperty yawLeft=new SimpleDoubleProperty(0d);
    private LimitQueue<Vector> posLeftAverage = new LimitQueue<Vector>(30);
    private LimitQueue<Double> pitchLeftAverage = new LimitQueue<Double>(30);
    private LimitQueue<Double> rollLeftAverage = new LimitQueue<Double>(30);
    private LimitQueue<Double> yawLeftAverage = new LimitQueue<Double>(30);

    public ObservableValue<Point3D> posHandLeftProperty(){ return posHandLeft; }
    public DoubleProperty yawLeftProperty(){ return yawLeft; }
    public DoubleProperty pitchLeftProperty(){ return pitchLeft; }
    public DoubleProperty rollLeftProperty(){ return rollLeft; }
    
    @Override
    public void onFrame(Controller controller) {
        Frame frame = controller.frame();
        if (!frame.hands().empty()) {
            Screen screen = controller.calibratedScreens().get(0);
            if (screen != null && screen.isValid()){
                Hand hand = frame.hands().get(0);
                if(hand.isValid()){
                    pitchLeftAverage.add(new Double(hand.direction().pitch()));
                    rollLeftAverage.add(new Double(hand.palmNormal().roll()));
                    yawLeftAverage.add(new Double(hand.direction().yaw()));                    
                    pitchLeft.set(dAverage(pitchLeftAverage).doubleValue());
                    rollLeft.set(dAverage(rollLeftAverage).doubleValue());
                    yawLeft.set(dAverage(yawLeftAverage).doubleValue());
                    
                    Vector intersect = screen.intersect(hand.palmPosition(),hand.direction(), true);
                    posLeftAverage.add(intersect);
                    Vector avIntersect=Average(posLeftAverage); // Average() and LimitQueue as defined in the earlier listener
                    posHandLeft.setValue(new Point3D(screen.widthPixels()*Math.min(1d,Math.max(0d,avIntersect.getX())),
                            screen.heightPixels()*Math.min(1d,Math.max(0d,(1d-avIntersect.getY()))),
                            hand.palmPosition().getZ()));
                }
            }                
        }
    }
    private Double dAverage(LimitQueue<Double> vectors){
        double vx=0;
        for(Double d:vectors){
            vx=vx+d.doubleValue();
        }
        return new Double(vx/vectors.size());
    }
}

In the following video I've added a few more things that aren't in the previous code: with the hand's Z position we can scale the model, and we look for right-hand circle gestures to start an animation that rotates the model indefinitely, until another circle gesture is found, resuming hand-driven rotations.
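Neither of those extras is in the code above, but the idea is simple. Here is a hedged sketch; the scale mapping and the method names are mine, not the original code:

// An indefinite rotation around the Y axis, toggled by circle gestures
final RotateTransition rotator = new RotateTransition(Duration.seconds(10), model3D);
rotator.setAxis(Rotate.Y_AXIS);
rotator.setByAngle(360);
rotator.setInterpolator(Interpolator.LINEAR);
rotator.setCycleCount(Animation.INDEFINITE);

// Scale the model from the palm's Z position (assumed mapping)
private void scaleFromHandZ(double palmZ) {
    double scale = Math.max(0.5d, Math.min(3d, 1d + palmZ / 200d));
    model3D.setScaleX(scale);
    model3D.setScaleY(scale);
    model3D.setScaleZ(scale);
}

// On every circle gesture, pause or resume the indefinite rotation
private void toggleRotation() {
    if (rotator.getStatus() == Animation.Status.RUNNING) {
        rotator.pause();  // hand-driven rotations take over again
    } else {
        rotator.play();   // spin the model until the next circle gesture
    }
}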



Conclusions

With these few samples I think I've already shown the impressive potential of a device like the Leap Motion Controller.

JavaFX as a RIA platform can interact with the Leap Motion device nicely and take the UI to the next level.

We're all waiting eagerly for the public release of the device, and the opening of the Airspace market, where we'll find all kinds of applications to use the Leap with.

This will definitely change the way we interact with our computers forever.

Thank you for reading. As always, any comment will be absolutely welcome.

Tuesday, January 8, 2013

NXTBeeFX: A JavaFX based app for Raspberry Pi to control a Lego NXT robot wirelessly

Hi there, and happy new year! 

After my last series of posts about ArduinoFX (Java Embedded on the Raspberry Pi with sensors, Arduino and XBee, and a JavaFX based client app monitoring the measures), it happened that Oracle released JDK 8 (with JavaFX) for ARM Early Access, and finally we all could try JavaFX on our Raspberry Pi.

It also happened that I left the NXTLegoFX post pending...

But now, with JavaFX running on the Raspberry Pi, I had to give it a try, so I decided to use my kid's Lego Mindstorms NXT 2.0 again to have something real to interact with, and partly resume the pending post.

And just as I was finishing this post, Lego unveiled their new version, the EV3, for the summer of 2013... so before ditching the NXT, let's have some fun with it!

A few months ago I bought an NXTBee, the naked version without the XBee, from Dexter Industries. Please read their wiki for further explanations, and their downloads section to find NXT-G blocks for the official NXT firmware.


It's like the XBee shield for Arduino, but for the NXT. So the project was clear to me: wirelessly connect the NXT to the Raspberry Pi via serial communication, and develop a JavaFX based application for the Pi. No server required this time.

Here you have a little overview:


And this is the bill of materials:
 
Disclaimer: I'm not responsible in any way for what you do in your spare time with what I say here! The following procedures are provided without any warranty. Also, you should know by now that I'm not related to any of the companies mentioned in this blog.

1. LeJOS and NXTBee

I've already talked about leJOS, so you'll know by now that it's a firmware that replaces the official NXT one, including a tiny Java Virtual Machine, which allows you to program the robot in Java. Please check this for a detailed explanation.

The official distribution (currently 0.9.1 beta 3) was released in February 2012, and it doesn't have the classes required to interact with the NXTBee.

Fortunately, Mark Crosbie has developed NXTBee, a leJOS class to interact with the NXTBee, along with some sample leJOS Java code showing how to send and receive data. Basically, the NXTBee, attached to port 4 of the NXT, uses an RS485 serial link. The NXTBee class uses a thread as a data pump, continually polling the RS485 port and storing the received data in an internal circular buffer, and reading data from the circular buffer and writing it onto the RS485 port. It exposes an InputStream and an OutputStream which programs can use to read data from and write data to the NXTBee.

His work has been accepted for future versions of leJOS, and you can find it in their repository. For now, in our projects we just need to include NXTBee.java and CircularByteBuffer.java with the rest of our classes before compiling and downloading to the NXT.

Test #1. Read from and write to the NXT
So as a first test we'll just check communication between the NXT (with the NXTBee and an XBee) and another XBee plugged into the PC via an XBee Explorer, with the help of the X-CTU software (or an equivalent hyperterminal-like program).

Please first read how to set the proper configuration of the XBee antennas here, in case you haven't done it yet, and change the DL parameter in the Coordinator XBee #1:
  • Addressing. DL: Destination Address Low: FFFF
so it will be able to read from and write to XBee #2.


Now, in NetBeans, we'll make a new project, separate from the leJOS samples project. For that, follow these steps:

1. In your usual Java projects folder, create a new folder named NXTBeeTest1, and copy build.properties and build.xml files you'll find in the folder LeJOS NXJ Samples\org.lejos.example. Also, create a src folder inside.

2. In NetBeans, choose New Project->Java->Java Free-Form Project. In the Name and Location step, browse to the NXTBeeTest1 folder. Change the project name to NXTBeeTest1.

 
In the Build and Run Actions step, under Run Project, select uploadandrun. In the Source Package Folders step, press Add Folder and select the src folder. Finally, in the Java Sources Classpath step, press Add Jar/Folder and browse to C:\Program Files\leJOS NXJ\lib\nxt\classes.jar. Click Finish.


3. In the src folder, create the package lejos.nxt.addon, download NXTBee.java from here and add it there. Also create the package lejos.internal.io and download CircularByteBuffer.java from here into that folder. Create the package org.lejos.jpl.nxtbee and add a new class, NXTBeeTest1, with the following code:

public static void main(String[] args) {

    NXTBee nb = new NXTBee(9600, true, true);

    Thread t = new Thread(nb);
    t.setDaemon(true);
    t.start();

    //
    // PART 1. SEND DATA FROM XBEE #1, RECEIVE IT IN XBEE #2 WITH NXTBEE
    // Press RIGHT on the NXT to finish Part 1
    //
    InputStream is = nb.getInputStream();  
    DataInputStream dis = new DataInputStream(is);

    Delay.msDelay(1000);

    LCD.clear();
    LCD.drawString("NXTBee Receiving", 0, 0);

    byte[] b = new byte[20];
    try {
        while(Button.RIGHT.isUp()){
            if(dis.available() > 0) {
                int bytesRead = dis.read(b);
                LCD.drawString("Read " + bytesRead + " bytes",0, 3);
                String s = new String(b);
                LCD.drawString(s, 0, 5);
            }
            Delay.msDelay(1000);
        }
    } catch(Exception e) {}
    try {
        dis.close();            
    } catch (IOException ex) {}

    //
    // PART 2. SEND DATA FROM XBEE #2 WITH NXTBEE, RECEIVE IT IN XBEE #1
    // Press ENTER on the NXT to finish Part 2
    //
    OutputStream os = nb.getOutputStream();
    DataOutputStream dos = new DataOutputStream(os);

    SensorPort sp = SensorPort.getInstance(0); // sensor port #1 (ports are 0-indexed)
    TouchSensor touch = new TouchSensor(sp);

    LCD.clear();
    LCD.drawString("NXTBee Sending", 0, 1);

    try {
        while(Button.ENTER.isUp()){
            if (touch.isPressed()){
                LCD.drawString("Touch on ", 0, 2);
                dos.writeBytes("Touch on");
            } else {
                LCD.drawString("Touch off ", 0, 2);
                dos.writeBytes("Touch off");
            }
            dos.writeByte(13); dos.writeByte(10); // CRLF                
            Delay.msDelay(2000);
        }
    } catch(Exception e) { }
    try {
        dos.close();            
    } catch (IOException ex) {}
}

4. Edit build.properties and change these two lines:

main.class=org.lejos.jpl.nxtbee.NXTBeeTest1
output.basename=NXTBeeTest1

Edit build.xml and change description:

<project name="NXTBeeTest1" default="uploadandrun">
<description>org.lejos.jpl.nxtbee.NXTBeeTest1 build file</description>

Now plug XBee #2 into the NXTBee, and plug the NXTBee into sensor port #4 on the NXT brick. Attach the touch sensor to port #1 as well.

Finally, connect the NXT brick to your PC by USB or Bluetooth, and switch on the NXT. Run the project. If everything is alright, it should build and download NXTBeeTest1 to the NXT and start the first part: on the NXT LCD screen you should see "NXTBee Receiving".


Now plug XBee #1 into your PC via the XBee Explorer and open X-CTU. Read the XBee configuration and go to the Terminal tab. Write something, like "Test1". It should appear on the NXT LCD display:


Firmly press the right button on your NXT, and now you should see the status of the touch sensor on the Terminal tab (press and release it a few times).



2. Robotics: Arbitrator and Behavior

To introduce some robotic way of thinking into our project, we can examine the BumperCar sample distributed with the samples bundle. It requires a wheeled vehicle with two independently controlled motors connected to motor ports A and C, a touch sensor connected to sensor port 1, and an ultrasonic sensor connected to port 3.

So we build this simple robot. You can follow this fantastic guide, and finally add the ultrasonic sensor.


As we can read here, the concepts of Behavior Programming as implemented in leJOS NXJ are very simple:
  • Only one behavior can be active and in control of the robot at any time.
  • Each behavior has a fixed priority.
  • Each behavior can determine if it should take control.
  • The active behavior has higher priority than any other behavior that should take control.
Basically, for each task the robot must perform, a behavior class is defined. This class will override the three public methods from Behavior interface:
  • boolean takeControl() indicates if this behavior should become active, returning quickly without performing long calculations.
  • void action() performs its task when the behavior becomes active. A behavior is active as long as its action() method is running, so the action() method should exit when its task is complete, or promptly when suppress() is called.
  • void suppress() immediately terminates the code running in the action() method. It also should exit quickly.
Once all the behaviors are created, they are handed to an Arbitrator class, which regulates which behavior should be active at any time. The order in the array of behaviors determines each one's priority: index 0 means lowest priority.

When its start() method is called, it begins arbitrating: deciding which behavior will become active. For that, it calls the takeControl() method of each Behavior object, starting with the object with the highest index number in the array, until it finds a behavior that wants to take control. If the priority index of that behavior is greater than that of the currently active behavior, the active behavior is suppressed.

In the BumperCar sample there are two behaviors. The first one defines the primary task, driving forward; the second orders the robot to reverse and turn whenever the touch sensor strikes an object or the ultrasonic sensor gets an echo from a close object, and it has priority over the first behavior:

Behavior b1 = new DriveForward(); // low priority
Behavior b2 = new DetectWall();   // high priority
Behavior[] behaviorList = {b1, b2};
Arbitrator arbitrator = new Arbitrator(behaviorList);
arbitrator.start();
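For reference, a Behavior implementation is just as simple. This is a sketch along the lines of the sample's DriveForward:

public class DriveForward implements Behavior {

    private boolean suppressed = false;

    public boolean takeControl() {
        return true; // lowest priority: always willing to drive
    }

    public void suppress() {
        suppressed = true; // action() polls this flag and exits promptly
    }

    public void action() {
        suppressed = false;
        Motor.A.forward();
        Motor.C.forward();
        while (!suppressed) {
            Thread.yield(); // wait until a higher-priority behavior takes over
        }
        Motor.A.stop();
        Motor.C.stop();
    }
}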

Test #2. The BumperCar

This test is quite simple: if you have built the BumperCar, just run the samples project, type "bumpercar" in the dialog, wait till it's downloaded, press a button on your NXT to start, and check whether it behaves as expected.

3. Serial communication with the NXT

Now let's insert a new behavior into our robot, one that takes care of remote communication with the NXT, so we can override its autonomous behavior and provide manual control. A rear touch sensor will be added.

We create a new Java Free-Form Project, named NXTBeeNXT, in the very same way as NXTBeeTest1 in the previous test.

First of all we'll add Brick, a singleton class that wraps the NXT motors and sensors so they can be accessed from any behavior. Also, from this class any change in the robot status will be notified to the PC or Raspberry Pi via XBee.
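The names below are illustrative rather than the actual project code, but the Brick skeleton could look roughly like this (the command strings match the keys typed in Test #3, and the enum ordinals match the status codes echoed back to the terminal):

public class Brick {

    // Remote command strings, matching the keys typed in the X-CTU terminal
    public static final String STOP = "P", MANUAL = "M", AUTO = "A",
                               LEFT = "L", RIGHT = "R";

    // Ordinals 0..3 are the status codes notified back over the XBee
    public enum state { STOPPED, FORWARD, WALL, MANUAL }

    private static final Brick instance = new Brick();
    private state behState = state.STOPPED;

    private Brick() { }

    public static Brick getInstance() { return instance; }

    public synchronized state getBehState() { return behState; }
    public synchronized void setBehState(state s) { behState = s; }
    public boolean isStopped() { return behState == state.STOPPED; }

    // Motors wired as in the BumperCar: left on port A, right on port C
    public NXTRegulatedMotor getLeftMotor() { return Motor.A; }
    public NXTRegulatedMotor getRightMotor() { return Motor.C; }
}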

Then we add three behaviors: DriveForward, Remote and DetectWall.

Finally we define the main class, NXTBeeNXT.

Here you can see part of the Remote code. It just listens for commands from the input stream and takes control or performs actions accordingly. In case an obstacle is found, this behavior will be suppressed and its action will be stopped:

public class Remote implements Behavior
{
    private DataInputStream dis;
    private boolean _suppressed = false;
    private String s;
    
    public Remote(DataInputStream dis){
        this.dis=dis;
    }
  
    public boolean takeControl()
    {
        if(Brick.getInstance().isStopped()){
            // cancel all behaviors, stop the arbitrator
            return false;
        }
      
        if(Brick.getInstance().getBehState()==Brick.state.MANUAL){
            // Take control if we're in Manual mode, don't read remote orders here
            return true;
        }
        
        // Read remote orders from PC/Raspi
        byte[] b = new byte[20];
        try {
            if(dis.available() > 0) {
                dis.read(b);
                s = new String(b);
            }
        } catch(Exception e) {}
    
        // Take control if orders are Stop or change from Auto (driving forward) to Manual mode
        return s!=null && (s.startsWith(Brick.STOP) || s.startsWith(Brick.MANUAL));
    }

    public void suppress()
    {
        _suppressed = true;
    }

    public void action()
    {
        
        _suppressed=false;
        
        LCD.clearDisplay();
        if(s.startsWith(Brick.STOP)){
            // Notify Stop order
            Brick.getInstance().setBehState(Brick.state.STOPPED);
            LCD.drawString("Bumper Car STOP",0,1);
        } else if(s.startsWith(Brick.MANUAL)){
            // Notify order to enter in Manual Mode
            Brick.getInstance().setBehState(Brick.state.MANUAL);
            
            // Start reading serial port and process the orders
            byte[] b = new byte[20];
            try {
                // This action will be suppressed if the robot finds an obstacle
                while(!_suppressed) {                    
                    if(dis.available() > 0) {
                        dis.read(b);
                        s = new String(b);
                        if(s.startsWith(Brick.LEFT)){
                            LCD.drawString("LEFT    ",0,3);
                            // start Motor.C rotating forward, with A stopped, so 
                            // the robot turns left
                            Brick.getInstance().getLeftMotor().stop();
                            Brick.getInstance().getRightMotor().rotate(360, true);
                        } else if(s.startsWith(Brick.RIGHT)){
...
                        } else if(s.startsWith(Brick.AUTO)){
                            LCD.drawString("AUTO    ",0,3);
                            // Return to Auto mode. Motors are stopped
                            Brick.getInstance().getLeftMotor().stop();
                            Brick.getInstance().getRightMotor().stop();
                            // Notify forward (auto) state
                            Brick.getInstance().setBehState(Brick.state.FORWARD);
                            // ends the action
                            _suppressed=true;
                        }
                    }
                    Delay.msDelay(500);
                }
            } catch(Exception e) { }
        }
        s="";
    }
}

All the code for this project can be found in my GitHub repository here.

Test #3. The remotely controlled BumperCar

We add a second touch sensor to the BumperCar, plugged into port #2, to look for obstacles when the robot moves backward in Manual mode. XBee #2 must be plugged into the NXTBee, and the NXTBee into port #4 on the NXT.



If you've just cloned the code from the repository, open the project and run it with your NXT switched on. After it has been compiled, built and downloaded, the program will start and you'll see "NXTBee waiting..." on the NXT display. Leave the NXT on the floor, in a relatively clear, wide area.

Now plug XBee #1 into the USB port of your PC via the XBee Explorer, open X-CTU, read the modem configuration and go to the Terminal tab. Press 'S' to start Auto mode on the NXT. Let the BumperCar find some obstacles and react to them. Press 'P' to stop in case something goes wrong, and 'Q' to quit the program. On the NXT, from the files menu, select NXTBeeNXT.nxj and press Enter. Now press 'S' again and enter Manual mode with 'M'. Now you can press:
  • 'F' to move forward,
  • 'B' to move backward,
  • 'L' to turn left,
  • 'R' to turn right,
  • 'V' to speed up,
  • 'W' to slow down and
  • 'A' to go to Auto mode

The red numbers appearing on the terminal screen after commands are typed are the echo back from the NXT notifying its status: 0 means stopped, 1 driving forward, 2 wall detected and 3 manual mode.

Note that if you drive manually into any obstacle, the DetectWall behavior will override your control, trying to avoid the obstacle and returning to Auto mode.


  
4. JavaFX for Raspberry Pi

To run JavaFX on the Raspberry Pi, you'll need a hard-float Raspbian version. I covered here how to install the soft-float version, so follow the same instructions again, but now with the hard-float version 2012-12-16-wheezy-raspbian.zip. When the Pi boots for the first time, in the config menu option memory_split, you should now give 128 MB to video.

At the first login, edit the /boot/config.txt file, uncomment these two lines, and select the resolution of your display:

framebuffer_width=1280
framebuffer_height=1024


There are already several blogs out there covering how to install Java on a hard-float Raspbian Wheezy version, like this or this. Basically you have to follow these steps:

1. Download Oracle JDK 8 (with JavaFX) for ARM Early Access from here on your PC. With ssh, copy the file to a folder on your Pi, like /home/pi/Downloads/. In a terminal window on your PC, or on the Pi, run these commands:

mkdir -p /opt
cd /home/pi/Downloads
sudo tar zxvf jdk-8-ea-b36e-linux-arm-hflt-29_nov_2012.tar.gz -C /opt 
rm jdk-8-ea-b36e-linux-arm-hflt-29_nov_2012.tar.gz

If you type /opt/jdk1.8.0/bin/java -version, you should see "java version '1.8.0-ea'".

2. Download JavaFX samples from here. Copy the file to /home/pi/Downloads by ssh. Run in a terminal window:

cd /home/pi/Downloads
unzip javafx_samples-8_0_0-ea-linux.zip
rm javafx_samples-8_0_0-ea-linux.zip
mv javafx-samples-8.0.0-ea /home/pi/javafx

And that's all!

Test #4. JavaFX sample 

Now we can test several of the included samples. Note that, for the moment, the application takes over the whole screen, and you can't quit if it doesn't have a button for it.

So one way to do it is to run your sample from a terminal window in a VNC session. You'll see the application on the display connected to the Pi via HDMI, and all the text output (System.out, System.err) in your terminal window. And you can kill the application anytime with Ctrl+C, or with ps -a: find the id of the java process and type kill <javaid>.

To run the sample, plug a mouse into your Pi, and type this line in the terminal window:

sudo /opt/jdk1.8.0/bin/java -Djavafx.platform=eglfb -cp /opt/jdk1.8.0/jre/lib/jfxrt.jar:/home/pi/javafx/StopWatch.jar stopwatch.MainScreen


Press Ctrl+C to finish.

5. Pi4J library

The Pi4J project is intended to provide a bridge between the native libraries and Java for full access to the Raspberry Pi, so you can easily access the GPIO pins from your Java project.

To install the library, follow these steps:

sudo wget http://pi4j.googlecode.com/files/pi4j-0.0.5-SNAPSHOT.deb
sudo dpkg -i pi4j-0.0.5-SNAPSHOT.deb


It will install four JARs in /opt/pi4j/lib. Also, in /opt/pi4j/examples you'll find several samples.

Test #5. Control GPIO

Before testing the first sample, please review the GPIO pins labelling here

Now add a LED with a 330 Ω current-limiting resistor on a breadboard, and connect the anode to pin #1 and the cathode to GND.


On your Pi, compile the sample first:

cd /opt/pi4j/examples
sudo /opt/jdk1.8.0/bin/javac -classpath .:classes:/opt/pi4j/lib/'*' ControlGpioExample.java

And now run it:

sudo /opt/jdk1.8.0/bin/java -classpath .:classes:/opt/pi4j/lib/'*' ControlGpioExample 
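For reference, the core of ControlGpioExample does something along these lines with the Pi4J API (a hedged sketch, not the exact sample code):

// Provision GPIO pin #01 as a digital output, initially high, then toggle it
final GpioController gpio = GpioFactory.getInstance();
final GpioPinDigitalOutput pin =
        gpio.provisionDigitalOutputPin(RaspiPin.GPIO_01, "MyLED", PinState.HIGH);

Thread.sleep(5000);
pin.low();          // LED off
Thread.sleep(5000);
pin.toggle();       // LED on again

gpio.shutdown();    // release all GPIO resources before exiting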


  
6. The JavaFX application

Finally, we'll design the JavaFX GUI application to remotely control the BumperCar from the Raspberry Pi.

The first thing we should do is install jdk1.8.0-ea on our PC, so we can use our favourite IDE to develop and build the project. Then we only need to send the jar to the Pi and test it. For that:

1. Unzip jdk-8-ea-b36e-linux-arm-hflt-29_nov_2012.tar.gz in a folder, like C:\Java.
2. In NetBeans go to Tools, select Ant Variables, click Add, define J8_HOME and browse to the folder "C:\Java\jdk1.8.0".
3. Create a new JavaFX FXML Application, named NXTBeeFX, and open build.xml. Add at the end this target:

<target depends="-pre-init,-init-private" name="-init-user">
    <property file="${user.properties.file}"/>
    <property name="javac.compilerargs" value="-bootclasspath ${var.J8_HOME}/jre/lib/rt.jar"/>
    <property name="javac.classpath" value="${var.J8_HOME}/jre/lib/jfxrt.jar:
       ${var.J8_HOME}/jre/lib/ext/RXTXcomm-2.2pre2.jar:
       ${var.J8_HOME}/jre/lib/ext/pi4j-core.jar"/>
    <!-- The two properties below are usually overridden -->
    <!-- by the active platform. Just a fallback. -->
    <property name="default.javac.source" value="1.6"/>
    <property name="default.javac.target" value="1.6"/>
</target>

4. Add RXTXcomm-2.2pre2.jar and pi4j-core.jar to the project. Copy them from your Pi to your PC, by ssh, from /usr/share/java and /opt/pi4j/lib.

Now edit NXTBeeFX.fxml in JavaFX Scene Builder, and add the required buttons and labels:


There is one issue I've found while creating this app: you can't apply InnerShadow or DropShadow effects. If you do, you'll get a RuntimeException. A bug has already been filed: http://javafx-jira.kenai.com/browse/RT-27464.

Now, in NXTBeeFXController, we create the methods for the buttons, and initialize those of the manual panel to change the text label to a graphic icon:

btnRight.setId("key-button");
icon = new Label();
icon.getStyleClass().add("arrowRight");
btnRight.setText(null);
btnRight.setGraphic(icon);
btnRight.setPrefSize(50, 50);

taken from an SVG path defined in the CSS file:

#key-button.button .arrowRight {
    -fx-shape: "M 14.007057,41.909369 C 2.3562491,41.605509 12.432093,7.29355
                31.877087,12.49765 l 0,-9.3754104 16.648482,14.5865794 
                -16.648482,15.29753 0,-9.66838 c -15.93811,-5.71097 
               -9.177528,18.43258 -17.87003,18.5714 z";
    -fx-translate-x: -6;
    -fx-translate-y: -4; 
}

Another issue I've found is that the icon is centered in the button as expected when running on my PC, but on the Raspberry Pi it appears shifted to the right and bottom of the button; that's the reason for translating it back to the center.

So this is how the manual panel looks after a little bit of styling:



In the initialize method, we also start a thread to read the serial port (via XBee) and get all the responses from the NXT.

// Initialize Serial Port, with the XBee #1 connected on the USB port

serial=new Serial();
try {
    System.out.println("Connecting to serial port...");
    serial.connect( "/dev/ttyUSB0" );
} catch( Exception e ) {
    System.out.println("Error connecting to serial port: "+e.getMessage());
}

//
// Service to start reading serial port for NXT Status
// It will stop and close when requested
//
serviceSerial=new Service<Void>(){

    @Override
    protected Task<Void> createTask() {

        return new Task<Void>(){

            @Override
            protected Void call() throws Exception {
                System.out.println("start reading...");
                serial.read();
                return null;
            }    
            @Override protected void cancelled() {
                System.out.println("cancelling...");
                serial.disconnectInput();
                super.cancelled();
            }
        };
    }
};
serviceSerial.start();

Also, the GPIO pins are initialized, to set an alarm in case the NXT finds an obstacle.

    // create gpio controller
    final GpioController gpio = GpioFactory.getInstance();

    // provision gpio pin #01 as an output pin
    final GpioPinDigitalOutput pinRed =
       gpio.provisionDigitalOutputPin(RaspiPin.GPIO_01, "MyLEDRed", PinState.LOW);

    // provision gpio pin #07 as an output pin
    final GpioPinDigitalOutput pinGreen = 
       gpio.provisionDigitalOutputPin(RaspiPin.GPIO_07, "MyLEDGreen", PinState.LOW);
    
    public void setAlarmOn() {
        pinRed.high();
        pinGreen.low();
    }
    
    public void setAlarmOff() {
        pinRed.low();
        pinGreen.high();
    }

Note that a second LED is added to pin 7.


All the code for this project can be found in my GitHub repository here. Clone it, build it, and then send the jars to the Pi by ssh:



To run the project you can create a script file by typing nano bash.sh. There you should type:

#!/bin/bash

sudo /opt/jdk1.8.0/bin/java -Djavafx.platform=eglfb -Djava.library.path="/usr/lib/jni" -cp /opt/jdk1.8.0/jre/lib/jfxrt.jar:/home/pi/javafx/NXT/RXTXcomm-2.2pre2.jar:/home/pi/javafx/NXT/pi4j-core.jar:/home/pi/javafx/NXT/NXTBeeFX.jar  nxtbeefx.NXTBeeFX

Save (Ctrl+O) and exit (Ctrl+X). Make the script executable with chmod +x bash.sh, check everything is in place, and run it:

/home/pi/Downloads/javafx/NXT/bash.sh




CONCLUSION

As a short conclusion after this long post, let me just say that JavaFX on the Raspberry Pi performs really well for being just an Early Access release. It will be improved, several issues will be fixed, and things that were not included, like media support, will be added.

So if you have a Raspberry Pi, don't wait any longer and start testing your JavaFX apps.

Again, leJOS has proved to be a really mature platform, enabling really easy integration between Java/JavaFX and robotics applications for Lego Mindstorms NXT.

Finally, in this video you'll find most of the details I've been talking about in this post. 

If you have the chance, take your NXT, grab the code, and give it a try! Any comment will be absolutely welcome.