Trie Data Structure in Scala

A trie is a data structure that can be used to implement an autocomplete feature. Following is a rudimentary implementation of a trie in Scala.

A trie is constructed much like a tree. The following code snippet illustrates a node in the trie:

package ashika.datastructure.trie

case class Node(content: Char,
                children: collection.mutable.Map[Char, Node],
                words: collection.mutable.Set[String],
                var metaData: Map[String, String] = null)

 

The tree itself, including the addWord() and find() methods, is implemented as follows:

 

package ashika.datastructure.trie

import scala.collection.mutable
import scala.collection.mutable.ListBuffer
import util.control.Breaks._

case class Entry(words: Set[String], meta: Map[String,String])
class Trie {
  val root: Node = Node(Char.MinValue, mutable.Map[Char, Node](), collection.mutable.Set())

  /**
    * Find matches for a given prefix.
    * @param str the prefix to search for
    * @return all entries whose words start with the prefix
    */
  def find(str: String): Seq[Entry] = {
    val lb = ListBuffer[Entry]()
    var i = 0
    var node = root

    // walk down the tree, one node per prefix character
    while (i < str.length) {
      val ch = str.charAt(i)
      val nnode = node.children.get(ch.toLower)
      if (nnode.isDefined) {
        node = nnode.get
      } else {
        // prefix not present in the trie
        return lb.toSeq
      }
      i += 1
    }

    // collect every word stored at or below the node where the prefix ends
    def dfs(node: Node): Unit = {
      if (node.words.nonEmpty) {
        lb.addOne(Entry(node.words.toSet, node.metaData))
      }
      node.children.values.foreach(dfs)
    }
    dfs(node)
    lb.toSeq
  }

  /**
    * Add a new word to the trie, with optional metadata.
    * @param str the word to add
    * @param meta optional metadata attached to the word
    */
  def addWord(str: String, meta: Map[String, String] = null): Unit = {
    var node = root
    var i = 0

    // follow existing nodes as far as the word's prefix is already present
    breakable {
      while (i < str.length) {
        val ch = str.charAt(i)
        val containerNode = node.children.get(ch.toLower)
        //node exists
        if (containerNode.isDefined) {
          node = containerNode.get
          if (i == str.length - 1) {
            // the whole word already exists as a path; just record it here
            node.words.addOne(str)
            node.metaData = meta
          }
        } else {
          //no node for this char
          break
        }
        i += 1
      }
    }

    // create nodes for the remaining characters
    while (i < str.length) {
      val ch = str.charAt(i)
      val nnode: Node = if (i == str.length - 1) {
        // the terminal node carries the word and its metadata
        Node(ch, mutable.Map[Char, Node](), collection.mutable.Set(str), meta)
      } else {
        // intermediate nodes carry no words and no metadata
        Node(ch, mutable.Map[Char, Node](), collection.mutable.Set())
      }
      node.children.addOne(ch.toLower -> nnode)
      node = nnode
      i += 1
    }
  }
}

Sample driver program:

 

object Trie {

  def main(args:Array[String]) = {
    val trie = new Trie
    trie.addWord("Ashika")
    trie.addWord("Ashola")
    trie.addWord("Airplane")
    trie.addWord("Asしか")
    trie.addWord("Ash")
    trie.addWord("ash")
    trie.addWord("AsH",Map("id"->"productId"))
    trie.addWord("日本語")
    trie.addWord("日本人")
    trie.addWord("cat")
    trie.addWord("catterpillar", Map("id"->"a"))

    trie.addWord("日本")
    trie.addWord("日人")
    (1 to 11).foreach(i => {
      trie.addWord("dummy name here with some name"+i)
    })


    trie.find("as").foreach(e => {
      println(e.words, e.meta)
    })

  }

}
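For a quick intuition of what find() does, here is a minimal, self-contained sketch of the same prefix-search idea using an immutable node. Note that TNode and everything in it are hypothetical names for illustration, not part of the implementation above; keys are lowercased the same way.

```scala
// Minimal immutable trie sketch: one node per character, keyed case-insensitively.
case class TNode(children: Map[Char, TNode] = Map.empty,
                 words: Set[String] = Set.empty) {

  // insert a word; the path is the lowercased word, the stored word keeps its case
  def add(s: String): TNode = addAt(s.toLowerCase, s)

  private def addAt(key: String, s: String): TNode =
    if (key.isEmpty) copy(words = words + s)
    else {
      val child = children.getOrElse(key.head, TNode())
      copy(children = children + (key.head -> child.addAt(key.tail, s)))
    }

  // every stored word whose lowercased form starts with the given prefix
  def find(prefix: String): Seq[String] = {
    def descend(n: TNode, key: String): Option[TNode] =
      if (key.isEmpty) Some(n)
      else n.children.get(key.head).flatMap(descend(_, key.tail))

    def collect(n: TNode): Seq[String] =
      n.words.toSeq ++ n.children.values.toSeq.flatMap(collect)

    descend(this, prefix.toLowerCase).toSeq.flatMap(collect)
  }
}
```

The shape of the traversal is the same as in the Trie class: walk down one node per prefix character, then depth-first collect everything below that node.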

OpenGL ES 2 on Android

3D graphics has been one of my favourite areas when it comes to programming. So recently, I wanted to study what's new in the field, especially in OpenGL, and I decided to play around with OpenGL ES 2 on Android. The OpenGL ES 2 rendering pipeline runs entirely on shaders and has removed all fixed-pipeline logic from the spec. This is quite challenging but a good way to understand shaders, and maybe later I can move on to OpenGL 4 with this knowledge.

I started developing a 3D engine for Android. At the moment it has very basic functionality. The following video shows OBJ file loading, texturing, directional lights, and FPS camera movement. I had to write a custom widget for the "gamepad" controller.

I have shared the source at here : https://bitbucket.org/umanga/projectx2

Download the app at http://dl.dropbox.com/u/17399055/projectX2.apk

 

Decode an audio stream using libavcodec and play it through libao

FFmpeg is an awesome set of libraries for decoding/encoding audio and video. Since it is highly experimental and keeps updating regularly, I found that many of the samples on the net are somewhat outdated.
The following code snippet shows how to decode an audio stream using libavcodec and play it using the cross-platform audio library libao.



#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include <math.h>

extern "C" {
#include "libavutil/mathematics.h"
#include "libavformat/avformat.h"
#include "libswscale/swscale.h"
#include <ao/ao.h>
}

void die(const char *msg)
{
    fprintf(stderr, "%s\n", msg);
    exit(1);
}

int main(int argc, char **argv)
{
    if (argc < 2) {
        die("Usage: player <input file>");
    }
    const char *input_filename = argv[1];

    av_register_all();

    AVFormatContext *container = avformat_alloc_context();
    if (avformat_open_input(&container, input_filename, NULL, NULL) < 0) {
        die("Could not open file");
    }

    if (av_find_stream_info(container) < 0) {
        die("Could not find file info");
    }

    av_dump_format(container, 0, input_filename, false);

    //find the first audio stream
    int stream_id = -1;
    int i;
    for (i = 0; i < container->nb_streams; i++) {
        if (container->streams[i]->codec->codec_type == AVMEDIA_TYPE_AUDIO) {
            stream_id = i;
            break;
        }
    }
    if (stream_id == -1) {
        die("Could not find Audio Stream");
    }

    AVCodecContext *ctx = container->streams[stream_id]->codec;
    AVCodec *codec = avcodec_find_decoder(ctx->codec_id);

    if (codec == NULL) {
        die("cannot find codec!");
    }

    if (avcodec_open(ctx, codec) < 0) {
        die("Codec cannot be opened!");
    }

    //initialize the AO lib
    ao_initialize();

    int driver = ao_default_driver_id();

    ao_sample_format sformat;
    AVSampleFormat sfmt = ctx->sample_fmt;
    //set the device sample size depending on the input stream format
    if (sfmt == AV_SAMPLE_FMT_U8) {
        sformat.bits = 8;
    } else if (sfmt == AV_SAMPLE_FMT_S16) {
        sformat.bits = 16;
    } else if (sfmt == AV_SAMPLE_FMT_S32) {
        sformat.bits = 32;
    }

    sformat.channels = ctx->channels;
    sformat.rate = ctx->sample_rate;
    sformat.byte_format = AO_FMT_NATIVE;
    sformat.matrix = 0;

    ao_device *adevice = ao_open_live(driver, &sformat, NULL);
    //end of AO lib init

    //data packet read from the stream
    AVPacket packet;
    av_init_packet(&packet);

    int buffer_size = AVCODEC_MAX_AUDIO_FRAME_SIZE + FF_INPUT_BUFFER_PADDING_SIZE;
    uint8_t buffer[buffer_size];
    packet.data = buffer;
    packet.size = buffer_size;

    //frame where the decoded data will be written
    AVFrame *frame = avcodec_alloc_frame();

    int frameFinished = 0;
    while (av_read_frame(container, &packet) >= 0) {
        if (packet.stream_index == stream_id) {
            int len = avcodec_decode_audio4(ctx, frame, &frameFinished, &packet);
            if (len < 0) {
                break;
            }
            if (frameFinished) {
                //play the decoded bytes
                ao_play(adevice, (char *)frame->extended_data[0], frame->linesize[0]);
            }
        } else {
            //some other stream, probably a video stream
        }
    }

    av_close_input_file(container);
    ao_shutdown();
    return 0;
}


Configuring CMake for QT development

I know how confusing it is when you need to configure some new technology for the first time. You know how everything works and connects, but have no idea about the little tricky details.

In this blog post I am going to write about how to configure Qt with CMake. I have uploaded a sample CMake project which should help make things much clearer.

Download sample project from : http://dl.dropbox.com/u/17399055/cmake-qt-sample.zip

Key points :

  • Qt comes with its own build tool called "qmake". For CMake to make use of it, you need to add the folder where qmake is located to your PATH environment variable.

Here are the important snippets from the CMakeLists.txt file of the sample project.

  • FIND_PACKAGE(Qt4 REQUIRED) – This finds the Qt library folder, header files, etc. using `qmake`.

  • INCLUDE(${QT_USE_FILE}) – This includes the Qt-related header folders in the makefile.

  • QT4_WRAP_CPP(myprj_moh_srcs ${myprj_moh_hdrs}) – This compiles Qt-related class headers with the Qt MOC compiler and generates standard C++ code. Technically, any class which uses Qt's signals and slots (basically, classes which inherit `QObject`) should be passed to this.

  • QT4_ADD_RESOURCES(myprj_rccs_srcs ${myprj_rccs}) – This compiles Qt resource files and wraps all resources into a C++ header. This is useful if you want to embed your resources into the executable.

  • QT4_WRAP_UI(myprj_ui_hdrs ${myprj_uis}) – This compiles Qt UI files into standard C++ code. Qt UI files contain user interfaces designed with Qt Creator or Qt IDE plugins (e.g. the Qt Eclipse plugin).

How to convert a Qt QImage into an OpenCV IplImage and vice versa

These days, I am working on developing an application for a high-voltage electron microscope. Basically, it takes images from the microscope and does tomographic reconstruction, 3D reconstruction, etc. I use Qt4 for the user interface, OpenCV for the image-processing functionality, and OpenGL as the core frameworks.

I ran into a big obstacle since the application uses many frameworks and has to handle conversions between many image data structures, e.g. QImage in Qt4, IplImage in OpenCV, and some custom image data structures for internal algorithms.

I came up with the following conversions between QImage and IplImage. Hopefully this will help someone who is spending hours on Google.

IplImage* QImage2IplImage(QImage *qimg)
{
    //assumes the QImage is in a 4-channel format such as Format_ARGB32
    IplImage *imgHeader = cvCreateImageHeader(cvSize(qimg->width(), qimg->height()), IPL_DEPTH_8U, 4);

    //copy the pixel data so the IplImage owns its own buffer
    uchar *newdata = (uchar *) malloc(sizeof(uchar) * qimg->byteCount());
    memcpy(newdata, qimg->bits(), qimg->byteCount());
    imgHeader->imageData = (char *) newdata;
    return imgHeader;
}

QImage* IplImage2QImage(IplImage *iplImg)
{
    int h = iplImg->height;
    int w = iplImg->width;
    int channels = iplImg->nChannels;
    QImage *qimg = new QImage(w, h, QImage::Format_ARGB32);
    char *data = iplImg->imageData;

    for (int y = 0; y < h; y++, data += iplImg->widthStep)
    {
        for (int x = 0; x < w; x++)
        {
            //use unsigned values so bytes above 127 are not sign-extended
            uchar r = 0, g = 0, b = 0, a = 0;
            if (channels == 1)
            {
                //grayscale: replicate the single channel
                r = data[x * channels];
                g = data[x * channels];
                b = data[x * channels];
            }
            else if (channels == 3 || channels == 4)
            {
                //OpenCV stores pixels as BGR(A)
                r = data[x * channels + 2];
                g = data[x * channels + 1];
                b = data[x * channels];
            }

            if (channels == 4)
            {
                a = data[x * channels + 3];
                qimg->setPixel(x, y, qRgba(r, g, b, a));
            }
            else
            {
                qimg->setPixel(x, y, qRgb(r, g, b));
            }
        }
    }
    return qimg;
}

Maven Jetty Plugin JNDI JDBC / DBCP configuration

Yesterday, I literally spent almost five hours just to get JNDI DBCP working properly with the Maven Jetty plugin.
Previously I used Tomcat 6, but I noticed that with Maven/Jetty I can save a lot of deployment time during development. Like lots of other people who have struggled with this problem, it gave me real pain for too long, so I thought it would be nice to write a blog post in case it helps anyone out there.

In short here are the important points:

1) JNDI configuration: Like META-INF/context.xml in Tomcat, Jetty uses WEB-INF/jetty-env.xml. (There are other ways, like configuring it in jetty-server.xml, but I am not going to discuss them here, since my idea is to easily move the project between Tomcat and Jetty.)

Here is the jetty-env.xml:

<?xml version="1.0"?>

<!DOCTYPE Configure PUBLIC "-//Mort Bay Consulting//DTD Configure//EN" "http://jetty.mortbay.org/configure.dtd">

<Configure class="org.mortbay.jetty.webapp.WebAppContext">
  <New id="immunodb" class="org.mortbay.jetty.plus.naming.Resource">
    <Arg>jdbc/ImmunoDB</Arg>
    <Arg>
      <New class="org.apache.commons.dbcp.BasicDataSource">
        <Set name="driverClassName">com.mysql.jdbc.Driver</Set>
        <Set name="url">jdbc:mysql://localhost:3306/ImmunoDB</Set>
        <Set name="username">root</Set>
        <Set name="password">immuno</Set>
      </New>
    </Arg>
  </New>
</Configure>

Note that there are only two <Arg> arguments here; if you went through the Jetty JNDI documentation you may have found that you need three <Arg> tags.
This depends on the number of arguments in the constructor of org.apache.commons.dbcp.BasicDataSource, which changes with the Jetty / jetty-maven-plugin version you use.

2) Jetty / jetty-maven-plugin version:
In the POM, add the Jetty Maven plugin. I prefer version 6.1.9, since some versions have a bug where JNDI resources are not registered properly.

<build>
  <plugins>
    <plugin>
      <groupId>org.mortbay.jetty</groupId>
      <artifactId>maven-jetty-plugin</artifactId>
      <version>6.1.9</version>
    </plugin>
  </plugins>
</build>

3) In the case of JDBC/DBCP, put your JDBC driver jar and the Apache DBCP jars in the POM dependencies.
With Tomcat you have to put these jars in the $TOMCAT/lib folder (it won't work if they are in WEB-INF/lib), but with Jetty you can put them in WEB-INF/lib.

That's it. Now you can access your JNDI data source as follows:

Context initCtx = new InitialContext();
Context envCtx = (Context) initCtx.lookup("java:comp/env");
DataSource ds = (DataSource) envCtx.lookup("jdbc/ImmunoDB");

Hope this helps.

Clean FTP library for Java – FTP4J

Recently I was assigned to write an FTP layer for a project at work. It needed to support all the funky FTP stuff: active and passive modes, and all kinds of proxies, like FTP, HTTP, SOCKS, etc.

At first I was a little worried, because even Apache Commons Net doesn't provide a clean architecture for all that weird FTP stuff. Then I came across the library FTP4J, which saved my life.

FTP4J is a well-designed and easy-to-use library, and the best thing is that it is open source and comes with an LGPL license.

Buddhism – The ultimate truth of reality

Recently I've been reading about the science behind Buddhism and ultimate reality. I thought it would be nice to blog a few of the facts I have understood so far.

It is clear that recent research and studies in modern science have revealed many great parallels between Buddhism and science. Especially in quantum physics, which looks at the universe and at reality from a completely different perspective, scientists are amazed that these new results show great similarities with the Buddha's teachings, which were taught over 2500 years ago.

One of the amazing discoveries scientists have made is the emptiness of matter. From experiments like the double-slit experiment, it is concluded that fundamental particles are empty of inherent existence and exist in an undefined state of potentialities. They become 'real' when a mind interacts with them and gives them a meaning. Where there is no mind, there is no meaning and no reality. This quantum emptiness is similar to Sunyata in Buddhism, which is discussed in the 'Prajna Paramita Hridaya Sutra'.

The above-mentioned undefined state of potentialities behaves as a field of energy. According to quantum physics, this field is considered the fundamental level of the universe and is called the 'quantum field' or 'unified field'. Fluctuations of this field are considered to give rise to so-called 'realities'. According to that, the 'reality' we live in is just one of an infinite number of realities.

Quantum physics has changed the way we look at reality. It is amazing to see that almost every finding in quantum mechanics has a connection with a teaching in Buddhism.

But scientists still have a lot to learn. I think quantum physics is still in its infancy. There is an infinite amount of pure knowledge that scientists can gain from the teachings of Lord Buddha, the greatest "quantum physicist" ever.

Following are few of selected sources which I found useful to anyone who’s interested :

  • Where Science and Buddhism Meet 1/2:

http://www.youtube.com/watch?v=qj_i7YqDwJA

A great video that simply explains emptiness, interconnectivity, and the nature of reality.

  • Quantum Knowledge: Link
  • Every thought has a frequency : Link
  • Discovery of Unified Field : Link
  • Illusion of Reality : Link

Ubuntu is dying , Debian Lenny is the Future!

I used Debian Etch for a long time, even though several unstable versions of Lenny had been released. So I thought I would give the new Debian Lenny a shot. I downloaded the Lenny Beta 2 DVD, and the installation was smooth and easy.

Lenny comes with GNOME 2.22 and built-in Compiz, and I think people who complained that Debian is not that attractive will change their attitude. I heard that Ubuntu makes its releases every half year using the Debian testing code base, so it's obvious that even this beta is more stable than Ubuntu 8. I am not condemning Ubuntu, but I have had some bad experiences with it. I used it for about two months at work and it really gave me a headache. I couldn't run one instance of Eclipse and Tomcat alongside some heavyweight applications (Sun IDM, etc.). The system behaved even worse than Window$. Yeah, the same applications ran more smoothly in Window$ than in Ubuntu.

Anyway, I have always liked Debian, because it gives me that "pure", "GNUish", "open source" feeling. There is no doubt that Debian is the future of the OS. As for Ubuntu, well, my feeling is that it's sinking. It has become something like "Linux for dummies".

Whatever Linux distro you are using, take a look at this advertisement by IBM:

Beryl with fglrx + AIGLX in Debian Etch


I suffered for months trying to get Beryl working on my Toshiba Satellite with an ATI Radeon X700, and found out that this was an issue every ATI Mobility X700 user encountered. (Click here if you don't know what Beryl is.)
Now, finally, I managed to get it working with the new ATI proprietary 'fglrx' driver. After installing the driver you have to configure xorg.conf.

In the ServerLayout section, add the 'AIGLX' option:

Section "ServerLayout"
     .
     .
    Option "AIGLX" "true"
EndSection

In the 'Module' section you should have 'dri' and 'glx'. In my xorg.conf it looks like this:

Section "Module"
   Load  "i2c"
   Load  "bitmap"
   Load  "ddc"
   Load  "dri"
   Load  "extmod"
   Load  "freetype"
   Load  "glx"
   Load  "int10"
   Load  "type1"
   Load  "vbe"
EndSection

In the 'Device' section you should have the following options; notice that I am using the 'fglrx' driver:

Section "Device"
   Identifier  "aticonfig-Device[0]"
   Driver      "fglrx"
   Option     "XAANoOffscreenPixmaps"
   Option      "AGPMODE" "4"
   Option      "AGPFastWrite" "true"
   Option      "DisableGLXRootClipping" "true"
   Option     "AddARGBXVisuals" "true"
   Option     "AllowGLXWithComposite" "true"
   Option     "EnablePageFlip" "true"
EndSection

Also, you should have the following sections:

Section "DRI"
      Mode         0666
EndSection

and

Section "Extensions"
    Option     "Composite" "Enable"
EndSection

That's it. Now you can run 'beryl-manager' and enjoy the eye candy.
(Beryl Debian packages can be found at http://debian.beryl-project.org )

For more info, visit the Beryl home page: http://www.beryl-project.org

To download the ATI proprietary Linux driver (fglrx), visit www.ATI.com