Thursday, March 19, 2015

JavaFX on mobile, a dream come true!

Hi there!

It seems it's time for a new post, but not a long and boring one as usual. I'll post briefly about my first experience bringing JavaFX to the Google Play store.

Yes, I made it with 2048FX, a JavaFX version of the original game 2048, by Gabriel Cirulli, that Bruno Borges and I started last year.

Now, thanks to the outstanding work of the JavaFXPorts project, I've adapted that version so it could be ported to Android.

And with the very latest version of their plugin, I've managed to successfully submit it to the Google Play store.

After a week in beta testing mode, today the app is in production, so you can go and download it to your Android device, and test it for yourself.

For those of you eager to get the app, this is the link. Go and get it, and add a nice review ;)

If you want to read about the process to make it possible, please keep on reading. These are the topics I'll cover in this post:
  • 2048FX, the game
  • JavaFXPorts and their mobile plugin
  • New Gluon plugin for NetBeans
  • 2048FX on Android
  • Google Play Store

2048FX, the game


Many of you will surely know about the 2048 game by Gabriel Cirulli. Last year it was a hit.
Many of us got really addicted to it...


In case you are not one of those, the game is about moving numbers in a 4x4 grid: when equal numbers clash while moving the blocks (up/down/left/right), they merge and their values are added up. The goal is reaching the 2048 tile (though you can keep going, looking for bigger ones!).
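
To make the merge rule concrete, here is a minimal sketch of the merge step for a single row. This is an illustration only, not the actual 2048FX implementation; the class and method names are made up.

import java.util.Arrays;

public class MergeSketch {

    // Slide one row to the left, merging equal neighbors once per move,
    // as described above.
    static int[] mergeLeft(int[] row) {
        int[] result = new int[row.length];
        int target = 0;   // next free cell in the result
        int last = 0;     // value waiting for a possible merge, 0 = none
        for (int value : row) {
            if (value == 0) {
                continue;                       // skip empty cells
            }
            if (value == last) {
                result[target - 1] = value * 2; // equal tiles clash: add them up
                last = 0;                       // a tile merges at most once per move
            } else {
                result[target++] = value;
                last = value;
            }
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(mergeLeft(new int[]{2, 2, 4, 0}))); // [4, 4, 0, 0]
    }
}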

At that time, Bruno started a Java/JavaFX 8 version, given that the game was open sourced. I jumped in immediately, and in a few weeks we had a nice working JavaFX version.


Since we used (and learned) the great new features of Java 8, we thought it was a good proposal for JavaOne, and we ended up presenting it in a talk (video) and doing a Hands on Lab session.

And we'll talk about it again next week at JavaLand. If you happen to be there, don't miss our talk.

JavaFXPorts and their mobile plugin


Since the very beginning of JavaFX (2+), going mobile has been at the top of the list of the most wanted features. We've dreamed of making the WORA slogan come true, and it's only recently, with the appearance of the JavaFXPorts project, that this dream has become a reality.

Led by Johan Vos, the team has given the community the missing piece, so now we can jump to mobile devices with (almost) the same projects we develop for desktop.

Johan started this adventure at the end of 2013, and his work on porting JavaFX to Android, based on the OpenJFX project, evolved constantly during 2014, until in February 2015 he announced a joint effort between his company, LodgON, and Trillian Mobile, the company behind RoboVM, the open source project used to port JavaFX to iOS.

As a result, jfxmobile-plugin, the one and only Gradle plugin for JavaFX on mobile, was created and made freely available through the JavaFXPorts repository.

With this plugin you can target three different platforms from one single project: Desktop, Android and iOS.

And it's as simple as this sample build.gradle:
 
buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath 'org.javafxports:jfxmobile-plugin:1.0.0-b5'
    }
}

apply plugin: 'org.javafxports.jfxmobile'

repositories {
    jcenter()
}

mainClassName = 'org.javafxports.project.MainApplicationFX'

jfxmobile {
    ios {
        forceLinkClasses = [ 'org.javafxports.**.*' ]
    }
}

For Android devices, the Android SDK and Android build tools are required, as you can read here. The rest of the dependencies (like the Dalvik SDK and the Retrolambda plugin) are taken care of by the plugin itself.

Note that the plugin is constantly evolving: version 1.0.0-b5 is used above, but at the time of this writing 1.0.0-b7 is already available. Check frequently to keep it updated.

With this plugin, several tasks are added to your project, and you can run any of them. Among others, these are the main ones:
  • ./gradlew android creates an Android package
  • ./gradlew androidInstall installs your Android application on an Android device that is connected to your development system via a USB cable.
  • ./gradlew launchIOSDevice launches your application on an iOS device that is connected to your development system
  • ./gradlew run launches your application on your development system.

New Gluon plugin for NetBeans


Setting up a complex project targeting three different platforms can be a hard task. Until now, the best approach (at least the one I followed) was cloning the HelloPlatform sample and changing the project and package names.

But recently, a new company called Gluon, with Johan as one of its founders, has released a NetBeans plugin that greatly simplifies this task.

Once you have installed the plugin, just create a new JavaFX project and select Basic Gluon Application.



Select valid names for project, packages and main class, and you will find a bunch of folders in your new project:


Change the jfxmobile-plugin version to 1.0.0-b5 (or the most recent one), select one of the tasks mentioned before, and see for yourself.

2048FX on Android


I had been using previous versions of the plugin, and it was a hard task to get everything working nicely. In fact, I had a working version of 2048FX on Android before the announcement of jfxmobile-plugin, but it was a separate project from the desktop one.

Now with the plugin, everything binds together magically, so I have a single project with both the desktop and the Android versions of the game.

Java 8?


There's one main drawback in this whole process: the Dalvik VM doesn't support the new Java 8 features. For lambdas, we can use the Retrolambda plugin, which takes care of converting them to Java 6 compatible bytecode. But Streams and Optional are not supported, which means you have to manually backport them to a Java 6/7 compatible version.
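
As an illustration of what this backporting looks like in practice (a made-up example, not code from 2048FX), a Java 8 stream pipeline becomes a plain loop:

import java.util.Arrays;
import java.util.List;

public class BackportSketch {

    // Java 8 version (fine on desktop, but Streams are not available on Dalvik):
    // int best = values.stream().mapToInt(Integer::intValue).max().orElse(0);

    // Java 6/7 compatible backport: the same logic as an explicit loop.
    static int maxValue(List<Integer> values) {
        int best = 0;
        for (Integer value : values) {
            if (value != null && value > best) {
                best = value;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        System.out.println(maxValue(Arrays.asList(2, 4, 8, 16))); // prints 16
    }
}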

While the primary objective of the 2048FX project was basically learning these features, for the sake of going mobile I backported the project, though this didn't change its structure or its appearance.

The project: Desktop and Android altogether


This is what the project structure looks like:



A PlatformProvider interface allows us to find out on which platform the project is running, which is extremely useful for isolating pieces of code that are native to that platform.

For instance, to save the game session in a local file on the Android device, I need access to an internal folder where the apk is installed, and for that I use an FXActivity instance, the bridge between JavaFX and the Dalvik runtime, which extends the Android Context. This Context can be used to look up Android services.
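
For example, since FXActivity is a Context, it can look up any Android system service. Here is a small hypothetical sketch (vibration is not a 2048FX feature, and it would require the VIBRATE permission in the manifest):

import android.content.Context;
import android.os.Vibrator;
import javafxports.android.FXActivity;

public class VibrationHelper {

    // Look up the Android vibrator service through the FXActivity context
    // and buzz briefly, e.g. when two tiles merge.
    public void vibrate() {
        Vibrator vibrator = (Vibrator) FXActivity.getInstance()
                .getSystemService(Context.VIBRATOR_SERVICE);
        vibrator.vibrate(100); // milliseconds
    }
}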

One example of using FXActivity is the FileManager class, under the Android packages:

import javafxports.android.FXActivity;
import android.content.Context;
import java.io.File;

public class FileManager {
    private final Context context;
    
    public FileManager(){
        context = FXActivity.getInstance();
    }
    
    public File getFile(String fileName){
        return new File(context.getFilesDir(), fileName);
    }
    
}

Now the PlatformProvider will call this implementation when running on Android, or the usual one on desktop.
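
As a rough sketch of that idea (illustrative only: the class and method names here are made up, and the real PlatformProvider code in 2048FX may differ), the common interface, a desktop implementation and a simple runtime switch could look like this:

import java.io.File;

public class SessionFileProviderSketch {

    public interface SessionFileProvider {
        File getFile(String fileName);
    }

    // Desktop implementation: keep the session file in the user's home directory.
    public static class DesktopFileManager implements SessionFileProvider {
        @Override
        public File getFile(String fileName) {
            return new File(System.getProperty("user.home"), fileName);
        }
    }

    // Pick an implementation at startup: on Android the VM reports itself as
    // Dalvik, and there the FileManager shown above would be wrapped instead.
    public static SessionFileProvider create() {
        String vm = System.getProperty("java.vm.name", "");
        if (vm.toLowerCase().contains("dalvik")) {
            throw new UnsupportedOperationException("wrap the Android FileManager here");
        }
        return new DesktopFileManager();
    }
}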

After a few minor issues I had a working project in both desktop and Android.




Google Play Store


Bruno asked me once to take this app to the Google Play store, but at that time the project wasn't mature enough. Last weekend I decided to give it a try, so I enrolled in the Google Play Developer program, filled in a form with the description and several screenshots, and finally submitted the apk of the game... what could go wrong, right?

Well, for starters, I had a first error: the apk had debug options enabled, and that was not allowed.

The AndroidManifest.xml


This hidden file, created automatically by the plugin, contains important information about the apk. You can retrieve it after a first build and modify it to include or change different options. Then you have to reference this file in the build.gradle file.

The application tag is where you have to add android:debuggable="false".

There you can also add the icon of your app: android:icon="@mipmap/ic_launcher", where mipmap-* are image folders with several resolutions.

Signing the apk


Well, that part was easy. Second attempt, second error... The apk must be signed for release. "Signed" means you need a private key, and for that we can use keytool.

And "release" means that we need to add to build.gradle the signing configuration... and that was not possible with the current plugin version b5.

So I asked Johan (on Friday night) about this, and he answered me (Saturday afternoon) that they had been working on precisely that, but it was not ready yet. Later that evening, Joeri Sykora from LodgON told me that it was in a branch... so with the invaluable help of John Sirach (the PiDome guy) we spent most of Saturday night trying to build the plugin locally to add the signing configuration.

It ended up being something like this:

jfxmobile {
    android {
        signingConfig {
            storeFile file("path/to/my-release-key.keystore")
            storePassword 'STORE_PASSWORD'
            keyAlias 'KEY_ALIAS'
            keyPassword 'KEY_PASSWORD'
        }
        manifest = 'lib/android/AndroidManifest.xml'
        resDirectory = 'src/android/resources'
    }
}

It was done! It was almost 2 a.m., but I tried uploading the signed apk for the third time, and voilà!! No more errors. The app went to submission, and in less than 10 hours, on Sunday morning, it was already published!!

  

Beta Testing Program


Instead of going straight into production I chose the Beta Testing program, so during this week only a few people have been able to access Google Play to download and test the application.

Thanks to their feedback I've made a few upgrades, like fixing some issues with fonts and Samsung devices (thanks John) or changing the context menu to a visible toolbar (thanks Bruno).



2048FX on Google Play Store


And the beta testing time is over. As of now, the application is in production.

What are you waiting for? Go and get it!! 

Download it, play with it, enjoy it, and if you have any issue, any problem at all, please report it, so we can work on improving its usability on all kinds of devices.

Final Thanks


Let me finish this post by speaking for the whole JavaFX community out there and saying out loud:


THANK YOU, JavaFXPorts !!!

Without you, all of this wouldn't have been possible at all.

Big thanks to all the guys already mentioned in this post, and also to Eugene Ryzhikov, Mark Heckler and Diego Cirujano, for helping along the way.

And finally, thanks to the OpenJFX project and the JavaFX team.

UPDATE

Thanks to the work of Jens Deters, 2048FX has made it to the App Store too!



Go and install it from here!

And as of today (15th May 2015), we are open sourcing the whole project, so anyone can have a look at it and find out about the final details required to put it all together and make it successfully to Google Play or the App Store:

https://github.com/jperedadnr/Game2048FX

Enjoy!
 

Thursday, January 8, 2015

Creating and Texturing JavaFX 3D Shapes

Hi there!

It's been a while since my last post, and it seems I said the same in my previous one... but you know, lots of stuff in between, and this blog has a tradition of long posts, the kind that can't be delivered on a weekly/monthly basis. But if you're a regular reader (thank you!), you know how this goes.

This post is about my latest developments in JavaFX 3D, after working in close collaboration with a bunch of incredible guys over the last few months.

For those of you new to this blog, I've already written a few posts about JavaFX 3D. My most recent one was about the Rubik's Cube: RubikFX: Solving the Rubik's Cube with JavaFX 3D, and another about the Leap Motion controller: Leap Motion Controller and JavaFX: A new touch-less approach.

I'll cover the following topics in this post:
  • Leap Motion Skeletal Tracking Model
  • Skinning Meshes and Leap Motion
  • TweetWallFX
  • Creating new 3D shapes
  • Playing with textures

Before getting started, did I mention my article "Building castles in the Sky. Use JavaFX 3D to model historical treasures and more" has been published in the current issue of Java Magazine?

In a nutshell, this article describes a multi-model JavaFX 3D application, developed for virtual immersion in cultural heritage buildings, through models created by reverse engineering with photogrammetry techniques. The 3D model of the Menéndez Pelayo Library in Santander, Spain, is used throughout the article as an example of a complex model.



You can find this application and, thanks to Óscar Cosido, a free model of the Library here.

Leap Motion Skeletal Tracking Model


Since my first post about Leap Motion, I've improved the 3D version after Leap Motion released version 2 of their API, which includes a skeletal tracking model.

I haven't had the chance to blog about it, but this early video shows my initial work. You can see that the model now includes bones, so a more realistic hand can be built.


I demoed a more advanced version at one of my JavaOne talks with the incredible James Weaver, Sean Phillips and Zoran Sevarac. Sadly, Jason Pollastrini couldn't make it, but he was part of the 3D team.

 

If you are interested, all the code is available here. Go, fork it and play with it if you have a Leap Motion controller.

Yes, we did have a great time there.


The session was great. In fact you can watch it now at Parleys.

We even had a special guest: John Yoon, a.k.a. @JavaFX3D


Skinning Meshes and Leap Motion

And then I met Alexander Kouznetsov.

It was during the Hackergarten 3D session, where Sven Reimers and I were hacking some JavaFX 3D stuff, that he showed up, laptop in backpack, ready for some hacking. There's no better trick to get something done than telling a real developer: I bet you're not able to hack this...

So the challenge was importing a rigged hand in JSON format to use a SkinningMesh in combination with the new Leap Motion skeletal tracking model. As the one and only John Yoon would show later in his talk:
"In order to animate a 3D model, you need a transform hierarchy to which the 3D geometry is attached. 
The general term for this part of the pipeline is “rigging” or “character setup”.
Rigging is the process of setting up your static 3D model for computer-generated animation, to make it animatable."
He was in charge of animating the Duke for the chess demo shown at the Keynote of JavaOne 2013. As shown in the above picture, this required a mesh, a list of joints, weights and transformations, binding the inner 'bones' with the surrounding external mesh, so when the former were moved the latter was deformed, creating the desired animation effect.

The SkinningMesh class in the 3DViewer project was initially designed for Maya, and we had a rigged hand as a Three.js model in JSON format.

So, out of the blue, Alex built an importer and managed to get the mesh of the hand by reverse engineering. Right after that he worked out the rest of the components of the SkinningMesh. The most important part was the binding of the transformations between joints.


        Affine[] bindTransforms = new Affine[nJoints];
        Affine bindGlobalTransform = new Affine();
        List<Joint> joints = new ArrayList<>(nJoints);
        List<Parent> jointForest = new ArrayList<>();
        
        for (int i = 0; i < nJoints; i++) {
            JsonObject bone = object.getJsonArray("bones").getJsonObject(i);
            Joint joint = new Joint();
            String name = bone.getString("name");
            joint.setId(name);
            JsonArray pos = bone.getJsonArray("pos");
            double x = pos.getJsonNumber(0).doubleValue();
            double y = pos.getJsonNumber(1).doubleValue();
            double z = pos.getJsonNumber(2).doubleValue();
            joint.t.setX(x);
            joint.t.setY(y);
            joint.t.setZ(z);
            bindTransforms[i] = new Affine();
            int parentIndex = bone.getInt("parent");
            if (parentIndex == -1) {
                jointForest.add(joint);
                bindTransforms[i] = new Affine(new Translate(-x, -y, -z));
            } else {
                Joint parent = joints.get(parentIndex);
                parent.getChildren().add(joint);
                bindTransforms[i] = new Affine(new Translate(
                        -x - parent.getLocalToSceneTransform().getTx(), 
                        -y - parent.getLocalToSceneTransform().getTy(), 
                        -z - parent.getLocalToSceneTransform().getTz()));
            }
            joints.add(joint);
            joint.getChildren().add(new Axes(0.02));
        }

This was the first animation with the model:

 

The axes are shown at every joint. Observe how easy it is to deform a complex mesh just by rotating two joints:

       Timeline t = new Timeline(new KeyFrame(Duration.seconds(1), 
                new KeyValue(joints.get(5).rx.angleProperty(), 90),
                new KeyValue(joints.get(6).rx.angleProperty(), 90)));
        t.setCycleCount(Timeline.INDEFINITE);
        t.play(); 

With a working SkinningMesh, it was time to add the skeletal tracking model from Leap Motion.

First, we needed to match Bones to joints, and then we just needed to apply the actual orientation of every bone to the corresponding joint transformation.


        listener = new LeapListener();
        listener.doneLeftProperty().addListener((ov,b,b1)->{
            if(b1){
                List<Finger> fingersLeft=listener.getFingersLeft();
                Platform.runLater(()->{
                    fingersLeft.stream()
                        .filter(finger -> finger.isValid())
                        .forEach(finger -> {
                            previousBone=null;
                            Stream.of(Bone.Type.values()).map(finger::bone)
                                .filter(bone -> bone.isValid() && bone.length()>0)
                                .forEach(bone -> {
                                    if(previousBone!=null){
                                        Joint joint = getJoint(false,finger,bone);
                                        Vector cross = bone.direction().cross(previousBone.direction());
                                        double angle = bone.direction().angleTo(previousBone.direction());
                                        joint.rx.setAngle(Math.toDegrees(angle));
                                        joint.rx.setAxis(new Point3D(cross.getX(),-cross.getY(),cross.getZ()));
                                    }
                                    previousBone=bone;
                            });
                    });
                    ((SkinningMesh)skinningLeft.getMesh()).update();
                });
            }
        });

The work was almost done! Back from JavaOne I had the time to finish the model, adding hand movements and drawing the joints:



This video sums up most of what we've accomplished:


If you are interested in this project, all the code is here. Feel free to clone or fork it. Pull requests will be very welcome.

TweetWallFX

One thing leads to another... and Johan Vos and Sven asked me to join them in a project to create a Tweet Wall with JavaFX 3D for Devoxx 2014. JavaFX 3D? I couldn't say no, even if I wasn't attending!

Our first proposal (not the one Sven finally implemented) was based on the F(X)yz library from Sean and Jason: a SkyBox as a container, with several tori inside, with tweets rotating over them:


Needless to say, we used the great Twitter4J API for retrieving new tweets with the hashtag #Devoxx.

The first challenge here was figuring out how to render the tweets over each torus. The solution was based on taking a snapshot of the tweet (rendered in a background scene) and using it as the diffuse map image of the PhongMaterial assigned to the torus.
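
A minimal sketch of that technique (not the actual TweetWallFX code; the Text node here is just a stand-in for the rendered tweet):

    import javafx.scene.SnapshotParameters;
    import javafx.scene.image.WritableImage;
    import javafx.scene.paint.PhongMaterial;
    import javafx.scene.text.Text;

    // Take a snapshot of the node that renders the tweet (must run on the FX thread)
    // and use the resulting image as the diffuse map of the torus material.
    Text tweetNode = new Text("Hello #Devoxx!");
    WritableImage snapshot = tweetNode.snapshot(new SnapshotParameters(), null);

    PhongMaterial material = new PhongMaterial();
    material.setDiffuseMap(snapshot);
    // torus.setMaterial(material);  // 'torus' being the MeshView of the (segmented) torus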

The second was creating a banner effect by rotating the tweets over the tori. To avoid artifacts, a segmented torus was built on top of the first one, cropping the faces of a regular torus, so the resulting mesh would be textured with the image.

This is our desired segmented torus. 



In the next section, we'll go into details of how we could accomplish this shape.

Creating new 3D shapes

Note to beginners: for an excellent introduction to JavaFX 3D, have a look at the 3D chapters in these books: JavaFX 8: Introduction by Example and Pro JavaFX 8: A Definitive Guide to Building Desktop, Mobile, and Embedded Java Clients.

To create this mesh in JavaFX 3D we use a TriangleMesh as the basis, where we need to provide float arrays of vertices and texture coordinates, and one int array of vertex and texture indices defining every triangle face.

Since a torus can be constructed from a rectangle, by gluing both pairs of opposite edges together with no twists, we can use a 2D rectangular grid in a local system ($\theta$,$\phi$) and map every point with these equations:

\[X=(R+r \cos\phi) \cos\theta\\Z=(R+r \cos\phi) \sin\theta\\Y=r \sin\phi\]

So based on this grid (with colored borders and triangles for clarity):

we could create this torus (observe how the four corners of the rectangle are joined together in one single vertex):


Now if we want to segment the mesh, we can get rid of a few elements from the borders. From the red inner grid, we now get a segmented torus:



Vertices coordinates

As we can see in the SegmentedTorusMesh class from the F(X)yz library, generating the vertices for the mesh is really easy, based on the above equations, the desired number of subdivisions (20 and 16 in the figures) and the number of elements cropped in both directions (4):

       
    private TriangleMesh createTorus(int subDivX, int subDivY, int crop, float R, float r){    
        TriangleMesh triangleMesh = new TriangleMesh();

        // Create points
        List<Point3D> listVertices = new ArrayList<>();
        float pointX, pointY, pointZ;
        for (int y = crop; y <= subDivY-crop; y++) {
            float dy = (float) y / subDivY;
            for (int x = crop; x <= subDivX-crop; x++) {
                float dx = (float) x / subDivX;
                if(crop>0 || (crop==0 && x<subDivX && y<subDivY)){
                    pointX = (float) ((R+r*Math.cos((-1d+2d*dy)*Math.PI))*Math.cos((-1d+2d*dx)*Math.PI));
                    pointZ = (float) ((R+r*Math.cos((-1d+2d*dy)*Math.PI))*Math.sin((-1d+2d*dx)*Math.PI));
                    pointY = (float) (r*Math.sin((-1d+2d*dy)*Math.PI));
                    listVertices.add(new Point3D(pointX, pointY, pointZ));
                }
            }
        }

Note that we have to convert this collection to a float array. Since there is no such thing as a FloatStream, when trying to use Java 8 streams I asked a question on StackOverflow, and as a result we now use a very handy FloatCollector to do the conversion:

        float[] floatVertices=listVertices.stream()
            .flatMapToDouble(p->DoubleStream.of(p.x,p.y,p.z))
            .collect(()->new FloatCollector(listVertices.size()*3), FloatCollector::add, FloatCollector::join)
            .toArray();

        triangleMesh.getPoints().setAll(floatVertices);

In case anybody is wondering why we don't use plain float[]: using collections instead of simple float arrays allows us to perform mesh coloring (as we'll see later), subdivisions, ray tracing, ... using streams and, in many of these cases, parallel streams.

Well, in Jason's words: why doesn't TriangleMesh provide a format that incorporates the use of streams by default...??

Texture coordinates

In the same way, we can create the texture coordinates. We can use the same grid, but now mapping (u,v) coordinates, from (0.0,0.0) at the top left corner to (1.0,1.0) at the bottom right one.


We need extra points for the borders.

        int index=0;
        int width=subDivX-2*crop;
        int height=subDivY-2*crop;
        float[] textureCoords = new float[(width+1)*(height+1)*2];
        for (int v = 0; v <= height; v++) {
            float dv = (float) v / ((float)(height));
            for (int u = 0; u <= width; u++) {
                textureCoords[index] = (float) u /((float)(width));
                textureCoords[index + 1] = dv;
                index+=2;
            }
        }
        triangleMesh.getTexCoords().setAll(textureCoords);

Faces

Once we have defined the coordinates we need to create the faces. From JavaDoc:
The term face is used to indicate 3 set of interleaving points and texture coordinates that together represent the geometric topology of a single triangle.
One face is defined by 6 indices: p0, t0, p1, t1, p2, t2, where p0, p1 and p2 are indices into the points array, and t0, t1 and t2 are indices into the texture coordinates array.
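
To make that definition concrete before diving into the torus code, here's a tiny self-contained example (not part of F(X)yz) that builds a single quad out of two triangles using that p0, t0, p1, t1, p2, t2 layout:

    import javafx.scene.shape.TriangleMesh;

    public class QuadMeshExample {

        public static TriangleMesh createQuad() {
            TriangleMesh quad = new TriangleMesh();
            // Four 3D points (x, y, z), flattened into one float array.
            quad.getPoints().addAll(
                    0f, 0f, 0f,   // p0: top left
                    1f, 0f, 0f,   // p1: top right
                    0f, 1f, 0f,   // p2: bottom left
                    1f, 1f, 0f);  // p3: bottom right
            // Four texture coordinates (u, v), from (0,0) to (1,1).
            quad.getTexCoords().addAll(
                    0f, 0f,       // t0
                    1f, 0f,       // t1
                    0f, 1f,       // t2
                    1f, 1f);      // t3
            // Two faces, each interleaving point and texture indices. Both triangles
            // use the same winding, so their normals point to the same side; reverse
            // the order if the quad ends up back-face culled.
            quad.getFaces().addAll(
                    0, 0, 2, 2, 3, 3,   // triangle p0-p2-p3
                    3, 3, 1, 1, 0, 0);  // triangle p3-p1-p0
            return quad;
        }
    }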

For convenience, we'll use two separate collections of point indices and texture indices.

Based on the above figures, we go triangle by triangle, selecting the three index positions in a specific order. This is critical for the surface orientation. Also note that for vertices we reuse indices at the borders to avoid the formation of seams.

        List<Point3D> listFaces = new ArrayList<>();
        // Create vertices indices
        for (int y =crop; y<subDivY-crop; y++) {
            for (int x=crop; x<subDivX-crop; x++) {
                int p00 = (y-crop)*((crop>0)?numDivX:numDivX-1) + (x-crop);
                int p01 = p00 + 1;
                if(crop==0 && x==subDivX-1){
                    p01-=subDivX;
                }
                int p10 = p00 + ((crop>0)?numDivX:numDivX-1);
                if(crop==0 && y==subDivY-1){
                    p10-=subDivY*((crop>0)?numDivX:numDivX-1);
                }
                int p11 = p10 + 1;
                if(crop==0 && x==subDivX-1){
                    p11-=subDivX;
                }                
                listFaces.add(new Point3D(p00,p10,p11));                
                listFaces.add(new Point3D(p11,p01,p00));
            }
        }

        List<Point3D> listTextures = new ArrayList<>();
        // Create textures indices
        for (int y=crop; y<subDivY-crop; y++) {
            for (int x=crop; x<subDivX-crop; x++) {
                int p00 = (y-crop) * numDivX + (x-crop);
                int p01 = p00 + 1;
                int p10 = p00 + numDivX;
                int p11 = p10 + 1;
                listTextures.add(new Point3D(p00,p10,p11));                
                listTextures.add(new Point3D(p11,p01,p00));
            }
        }
       
Now we have to join them. The advantages of this approach will be shown later.

        // create faces
        AtomicInteger count=new AtomicInteger();
        int[] faces = listFaces.stream()
            .map(f->{
                Point3D t=listTextures.get(count.getAndIncrement());
                int p0=(int)f.x; int p1=(int)f.y; int p2=(int)f.z;
                int t0=(int)t.x; int t1=(int)t.y; int t2=(int)t.z;
                return IntStream.of(p0, t0, p1, t1, p2, t2);
            }).flatMapToInt(i->i).toArray();
        triangleMesh.getFaces().setAll(faces);
    
        // finally return mesh
        return triangleMesh;
    }

This picture shows how we create the first and last pairs of faces. Note the use of counterclockwise winding to define the front faces, so the normal of every surface points outwards (towards the outside of the screen).


Finally, we can create our banner effect by adding two tori, both solid (DrawMode.FILL), one of them segmented and textured with an image. This snippet shows the basics:

        SegmentedTorusMesh torus = new SegmentedTorusMesh(50, 42, 0, 500d, 300d); 
        PhongMaterial matTorus = new PhongMaterial(Color.FIREBRICK);
        torus.setMaterial(matTorus);
        
        SegmentedTorusMesh banner = new SegmentedTorusMesh(50, 42, 14, 500d, 300d); 
        PhongMaterial matBanner = new PhongMaterial();
        matBanner.setDiffuseMap(new Image(getClass().getResource("res/Duke3DprogressionSmall.jpg").toExternalForm()));
        banner.setMaterial(matBanner); 
     
        Rotate rotateY = new Rotate(0, 0, 0, 0, Rotate.Y_AXIS);
        torus.getTransforms().addAll(new Rotate(0,Rotate.X_AXIS),rotateY);
        banner.getTransforms().addAll(new Rotate(0,Rotate.X_AXIS),rotateY);
   
        Group group = new Group();
        group.getChildren().addAll(torus, banner);
        Group sceneRoot = new Group(group);
        Scene scene = new Scene(sceneRoot, 600, 400, true, SceneAntialiasing.BALANCED);
        primaryStage.setTitle("F(X)yz - Segmented Torus");
        primaryStage.setScene(scene);
        primaryStage.show(); 

        final Timeline bannerEffect = new Timeline();
        bannerEffect.setCycleCount(Timeline.INDEFINITE);
        final KeyValue kv1 = new KeyValue(rotateY.angleProperty(), 360);
        final KeyFrame kf1 = new KeyFrame(Duration.millis(10000), kv1);
        bannerEffect.getKeyFrames().addAll(kf1);
        bannerEffect.play();

to get this animation working:

 

Playing with textures

The last section of this long post will show you how we can hack the textures of a TriangleMesh to display more advanced images over the 3D shape. This will include:
  • Coloring meshes (vertices or faces) 
  • Creating contour plots
  • Using patterns
  • Animating textures
This work was inspired by a question from Álvaro Álvarez on StackOverflow, about coloring individual triangles or individual vertices of a mesh. The immediate answer would be: no, you can't easily, since for one mesh there's one material with one diffuse color, and it's not possible to assign different materials to different triangles of the same mesh. You could create as many meshes and materials as colors, if this number were really small.

Using textures was the only way, but for that, following the standard procedure, you would need to color your texture image precisely, matching each triangle with each color.

In convex polyhedra there's at least one net, a 2D arrangement of polygons that can be folded into the faces of the 3D shape. Based on an icosahedron (20 faces), we could use its net to color every face:



And then use the image as texture for the 3D shape:


This was my first answer, but I started thinking about another approach. What if, instead of the above colored net, we created at runtime a small image of colored rectangles, like this:


and tricked the texture coordinates and texture indices into finding their values in this image instead? Done! The result was this much neater picture:



(The required code to do this is in my answer, so I won't post it here). 

And going a little bit further: if we could create one palette image, with one color per pixel, we could also assign one color to each vertex, and the texture for the rest of the triangle would be interpolated by the scene graph! This was part of a second answer:


Color Palette

With this small class we can create small images with up to 1530 unique colors. The most important thing is that they are consecutive, so we'll have smooth contour plots, and there won't be unwanted bumps when intermediate values are interpolated.



To generate this 40x40 image (2 KB) at runtime we just use this short snippet:

        Image imgPalette = new WritableImage(40, 40);
        PixelWriter pw = ((WritableImage)imgPalette).getPixelWriter();
        AtomicInteger count = new AtomicInteger();
        IntStream.range(0, 40).boxed()
                .forEach(y->IntStream.range(0, 40).boxed()
                        .forEach(x->pw.setColor(x, y, Color.hsb(count.getAndIncrement()/1600d*360d,1,1))));

With it, we can retrieve the texture coordinates for a given point from this image and update the texture coordinates on the mesh:

    public DoubleStream getTextureLocation(int iPoint){
        int y = iPoint/40; 
        int x = iPoint-40*y;
        return DoubleStream.of((((float)x)/40f),(((float)y)/40f));
    }

    public float[] getTexturePaletteArray(){
        return IntStream.range(0,colors).boxed()
            .flatMapToDouble(palette::getTextureLocation)
            .collect(()->new FloatCollector(2*colors), FloatCollector::add, FloatCollector::join)
            .toArray();
    }

    mesh.getTexCoords().setAll(getTexturePaletteArray());

Density Maps

Half of the work is done. The other half consists of assigning a color to every vertex or face in our mesh, based on some criterion: we use a mathematical function such that for any $(x,y,z)$ coordinates we get a value $f(x,y,z)$ that can be scaled within our range of colors.

So let's have a function:

    @FunctionalInterface
    public interface DensityFunction<T> {
        Double eval(T p);
    }

    private DensityFunction<Point3D> density;

Let's find the extreme values, by evaluating the given function in all the vertices, using parallel streams:

    private double min, max;

    public void updateExtremes(List<Point3D> points){
        max=points.parallelStream().mapToDouble(density::eval).max().orElse(1.0);
        min=points.parallelStream().mapToDouble(density::eval).min().orElse(0.0);
        if(max==min){
            max=1.0+min;
        }
    }

Finally, we assign the color to every vertex in every face, by evaluating the given function in all the vertices, using parallel streams:
    
    public int mapDensity(Point3D p){
        int f=(int)((density.eval(p)-min)/(max-min)*colors);
        if(f<0){
            f=0;
        }
        if(f>=colors){
            f=colors-1;
        }
        return f;
    }

    public int[] updateFacesWithDensityMap(List<Point3D> points, List<Point3D> faces){
        return faces.parallelStream().map(f->{
                int p0=(int)f.x; int p1=(int)f.y; int p2=(int)f.z;
                int t0=mapDensity(points.get(p0));
                int t1=mapDensity(points.get(p1));
                int t2=mapDensity(points.get(p2));
                return IntStream.of(p0, t0, p1, t1, p2, t2);
            }).flatMapToInt(i->i).toArray();
    }

    mesh.getFaces().setAll(updateFacesWithDensityMap(listVertices, listFaces));

Did I say I love Java 8??? You can see now how the strategy of using lists for vertices, textures and faces has clear advantages over the float arrays.

Let's run an example, using the IcosahedronMesh class from F(X)yz:
    
    IcosahedronMesh ico = new IcosahedronMesh(5,1f);
    ico.setTextureModeVertices3D(1600,p->(double)p.x*p.y*p.z);
    Scene scene = new Scene(new Group(ico), 600, 600, true, SceneAntialiasing.BALANCED);
    primaryStage.setScene(scene);
    primaryStage.show();     

This is the result:


Impressive, right? After a long explanation, we can happily say: yes! we can color every single triangle or vertex on the mesh!

And we could even move the colors, creating a smooth animation. For this we only need to update the faces (the vertices and texture coordinates stay the same). This video shows one such animation, and a minimal sketch of how it could be driven is included after the video:


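As mentioned above, here is a minimal sketch of how the color animation could be driven, reusing the ico instance and the setTextureModeVertices3D call from the earlier snippet; the time-dependent density function is purely illustrative:

    import javafx.animation.AnimationTimer;

    // Re-evaluate the density function each frame with a time-shifted term,
    // so the colors flow over the mesh while the geometry stays the same.
    AnimationTimer colorAnimation = new AnimationTimer() {
        @Override
        public void handle(long now) {
            double t = now / 1_000_000_000.0; // nanoseconds to seconds
            ico.setTextureModeVertices3D(1600, p -> (double) p.x * p.y * p.z + 10 * Math.sin(t));
        }
    };
    colorAnimation.start();
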
More features

More? In this post? No! I won't extend it any more. I'll just post this picture:




And refer you to all these available 3D shapes and more at the F(X)yz repository. If I find the time, I'll try to post about them in a second part.

Conclusions

The JavaFX 3D API, in combination with the new Java 8 features, has proven really powerful in terms of rendering complex meshes. The API can easily be extended to create libraries or frameworks that help the developer when 3D features are required.

We are still far from others (Unity 3D, Three.js, ... to name a few), but with the collaboration of the great JavaFX community we can narrow this gap.

Please, clone the repository, test it, create pull requests, issues, feature requests, ... get in touch with us, help us to keep this project alive and growing.

Also visit StackOverflow and ask questions there using these tags: javafx, javafx-8 and the new javafx-3d. You never know where a good question may take you! And the answers will help other developers too.

A final word to give a proper shout-out to Sean Phillips and Jason Pollastrini, founders of the F(X)yz library, for starting an outstanding project.