Augmented Reality: bring out the Bat-Signal!

After playing around with AR, I’ve decided to make a B4A library from an existing AR library: NyARToolkit (http://nyatla.jp/nyartoolkit/).

Some functions did not work properly and I had to rewrite some of them so they could fit in a B4A library. I also wanted to use the GLSurfaceView from Andrew Graham in Basic4Android. To do this, I had to extract everything that had to do with 3D and OpenGL. It took some fiddling around, but I think I cracked it.

With a couple of lines in B4A, I can now find the markers and put a 3D model on top of them. The speed and accuracy of the NyARToolkit are OK, and it is certainly possible to make some fun projects.

Just to play around, I’ve put one of my favorite superheroes on top of the marker. Batman to the rescue!

Example of the code needed to get the markers:

Sub ABAR_MarkersFound()
	
	Dim Vect As ABARVector2D
	Dim a As Int
	Dim b As Int
	Dim Markers As List
	Markers = ABAR.GetMarkers()
	Dim Corners As List
	Dim Msg As String
	
	FoundMarkers = Markers.Size - 1
	For a = 0 To FoundMarkers
		
		Mark = Markers.Get(a)
		conf(a) = Mark.Confidence
	
		counter = 0
		Msg = "Marker: " & Mark.ARCodeIndex & CRLF
		Msg = Msg & "x:" & Mark.Center.x & " y:" & Mark.Center.y & CRLF
		Msg = Msg & "Confidence:" & Mark.Confidence & CRLF
		Corners = Mark.Corners
		Vect = Corners.Get(0)
		Msg = Msg & "Corner1=x:" & Vect.x & " y:" & Vect.y & CRLF
		Vect = Corners.Get(1)
		Msg = Msg & "Corner2=x:" & Vect.x & " y:" & Vect.y & CRLF
		Vect = Corners.Get(2)
		Msg = Msg & "Corner3=x:" & Vect.x & " y:" & Vect.y & CRLF
		Vect = Corners.Get(3)
		Msg = Msg & "Corner4=x:" & Vect.x & " y:" & Vect.y & CRLF
		Markfound = True
		'copy the 16 matrix values for the OpenGL render pass
		For b = 0 To 15
			tmpresultf(a,b) = Mark.resultf(b)
			tmpcameraRHf(b) = Mark.cameraRHf(b)
		Next
		Msg = Msg & CRLF
			
		useRHfp = True
		drawp = True

		Log(Msg)
	Next
	glsv.RequestRender
End Sub

Augmented Reality: Taking away the rainbow

For the Augmented Reality project, I’ll need to grayscale the image.

There are three types of grayscaling available in the class, but the one I'll need is BT709: it turns out to give the best results when searching for glyphs.

It’s a simple conversion, but already we have something to take into account when we use Java: the byte type. A byte in Java ranges from -128 to 127 instead of 0 to 255. Therefore we will need to mask with & 0xFF when we return the grayscaled bitmap.

I’ll have to remember this for future calculations on the imageData array!
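To make the byte issue concrete, here is a tiny standalone sketch (values chosen just for illustration) showing how a grayscale value above 127 wraps to a negative byte and how the & 0xFF mask recovers it:

```java
public class ByteSignDemo {
    public static void main(String[] args) {
        // Java bytes are signed: narrowing 200 to byte wraps to a negative value
        byte gray = (byte) 200;
        System.out.println(gray);        // prints -56

        // Masking with 0xFF reads it back as the intended unsigned 0..255 value
        int unsigned = gray & 0xFF;
        System.out.println(unsigned);    // prints 200
    }
}
```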

Here is the class. The imageData array is kept in memory, as I’ll need those pixels anyway for the next step: edge detection!

// Copyright © Alain Bailleul, Alwaysbusy's Corner 2011
//
// Grayscale type values from:
//
// AForge Image Processing Library
// AForge.NET framework
// http://www.aforgenet.com/framework/

import android.graphics.Bitmap;
import android.graphics.Color;

public class ABImage
{
    //data that will contain the grayscaled pixels
    public byte[] imageData;
    public int width=0;
    public int height=0;
    private double RedCoefficient;
    private double GreenCoefficient;
    private double BlueCoefficient;

    //3 types of grayscaling
    public void SetGrayscaleBT709() {
        RedCoefficient   = 0.2125;
        GreenCoefficient = 0.7154;
        BlueCoefficient  = 0.0721;
    }

    public void SetGrayscaleRMY() {
        RedCoefficient   = 0.5000;
        GreenCoefficient = 0.4190;
        BlueCoefficient  = 0.0810;
    }

    public void SetGrayscaleY() {
        RedCoefficient   = 0.2990;
        GreenCoefficient = 0.5870;
        BlueCoefficient  = 0.1140;
    }

    // create an 8-bit byte array with the pixels of the grayscaled image
    public void FromBitmap( Bitmap image )
    {
        width = image.getWidth();
        height = image.getHeight();

        int[] pixelData = new int[width * height];
        imageData = new byte[width * height]; //allocate the grayscale buffer
        int i = 0;

        image.getPixels(pixelData, 0, width, 0, 0, width, height);

        for (int pixeldatai: pixelData)
        {
             imageData[i++] = (byte) ( RedCoefficient * Color.red(pixeldatai)
                                    + GreenCoefficient * Color.green(pixeldatai)
                                    + BlueCoefficient * Color.blue(pixeldatai));
        }
    }

    //return the grayscale byte array back to a 32 bit bitmap
    public Bitmap ToBitmap()
    {
        int[] pixelData = new int[width * height];
        int i=0;

        for (byte imageDatai: imageData)
        {
            pixelData[i++] = (0xFF << 24) | //alpha
                ((imageDatai & 0xFF) << 16) | //red
                ((imageDatai & 0xFF) << 8 )  | //green
                ((imageDatai & 0xFF) << 0 ); //blue
        }
        return Bitmap.createBitmap(pixelData, width, height, Bitmap.Config.ARGB_8888);
    }
}
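To see what FromBitmap() computes per pixel, here is a small standalone sketch of the same weighted sum with the BT709 coefficients (the RGB values are just an arbitrary example, no Android dependency needed):

```java
public class Bt709PixelDemo {
    public static void main(String[] args) {
        // BT709 luma weights, as in SetGrayscaleBT709() above
        double rc = 0.2125, gc = 0.7154, bc = 0.0721;

        // One example pixel: a warm orange, R=200 G=120 B=40
        int red = 200, green = 120, blue = 40;

        // The same weighted sum FromBitmap() applies, stored as a signed byte
        byte gray = (byte) (rc * red + gc * green + bc * blue);

        // Read it back as an unsigned 0..255 value, as ToBitmap() does
        System.out.println(gray & 0xFF);   // prints 131
    }
}
```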

Augmented Reality on Android, a start…

I’m looking into augmented reality on my Android phone and tablet. The main functions will be written in Java and then used as a library in B4A. I’m looking into AForge.NET and I’ll probably port some code from their C# library to Java.

The first step I do is searching for glyphs in an image. This is an example of one of those glyphs:

For a human it is no problem to locate the glyphs, but for a computer it is quite a challenge.

These are the steps I’m planning to take:

1. Downscale the image to a smaller picture (for speed)

2. Convert the picture to an array of pixels

3. Grayscale the image using the pixel array

4. Run an edge detection algorithm to help me find the blobs

5. Examine the blobs and check which ones look quadrangular

6. Check if they are an actual glyph

7. Calculate their position in 3D space

8. Use OpenGL to show an object on each glyph

And all of this as fast as possible!
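Step 4 will be covered in a later post (probably ported from AForge.NET), but as a rough preview, here is a minimal Sobel edge-detection sketch over the same kind of grayscale byte array the ABImage class produces. This is just an illustrative sketch, not the code I'll actually port:

```java
public class SobelSketch {
    // Sobel gradient magnitude over an 8-bit grayscale image (row-major byte array)
    public static byte[] sobel(byte[] gray, int width, int height) {
        byte[] edges = new byte[width * height];
        // skip the 1-pixel border so every pixel has all 8 neighbours
        for (int y = 1; y < height - 1; y++) {
            for (int x = 1; x < width - 1; x++) {
                // read neighbours as unsigned 0..255 values (the & 0xFF trick again)
                int p00 = gray[(y-1)*width + (x-1)] & 0xFF;
                int p01 = gray[(y-1)*width +  x   ] & 0xFF;
                int p02 = gray[(y-1)*width + (x+1)] & 0xFF;
                int p10 = gray[ y   *width + (x-1)] & 0xFF;
                int p12 = gray[ y   *width + (x+1)] & 0xFF;
                int p20 = gray[(y+1)*width + (x-1)] & 0xFF;
                int p21 = gray[(y+1)*width +  x   ] & 0xFF;
                int p22 = gray[(y+1)*width + (x+1)] & 0xFF;

                // horizontal and vertical Sobel kernels
                int gx = (p02 + 2*p12 + p22) - (p00 + 2*p10 + p20);
                int gy = (p20 + 2*p21 + p22) - (p00 + 2*p01 + p02);

                // clamp the magnitude back into the 0..255 byte range
                int mag = Math.min(255, Math.abs(gx) + Math.abs(gy));
                edges[y*width + x] = (byte) mag;
            }
        }
        return edges;
    }
}
```

A vertical black-to-white boundary in the input produces a strong response (255) on the columns along the edge, which is exactly what the blob search in steps 5 and 6 will need.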

I’ll keep you posted.