Trie Data Structure in Scala

A trie is a data structure that can be used to implement an auto-complete feature. The following is a rudimentary implementation of a trie in Scala.

A trie is constructed much like a tree. The following code snippet defines a node in the trie:

package ashika.datastructure.trie

// A single node in the trie: the character it represents, its child nodes keyed by
// (lower-cased) character, the complete words that end at this node, and an optional
// meta-data map attached to those words.
case class Node(content: Char,
                children: collection.mutable.Map[Char, Node],
                words: collection.mutable.Set[String],
                var metaData: Map[String, String] = null)

 

The trie itself, including the addWord() and find() methods, is implemented as follows:

 

package ashika.datastructure.trie

import scala.collection.mutable
import scala.collection.mutable.ListBuffer
import scala.util.control.Breaks._

// A match returned by find(): the words stored at a node plus their meta data.
case class Entry(words: Set[String], meta: Map[String, String])

class Trie {
  val root: Node = Node(Char.MinValue, mutable.Map[Char, Node](), mutable.Set[String]())

  /**
    * Find all words stored under a given prefix.
    * @param str the prefix to search for (matched case-insensitively)
    * @return the matching entries, or an empty sequence if the prefix is unknown
    */
  def find(str: String): Seq[Entry] = {
    val lb = ListBuffer[Entry]()
    var i = 0
    var node = root

    // walk down the tree following the prefix characters
    while (i < str.length) {
      val ch = str.charAt(i)
      node.children.get(ch.toLower) match {
        case Some(nnode) => node = nnode
        case None        => return lb.toSeq // prefix not present, no matches
      }
      i += 1
    }

    // collect every word stored in the subtree rooted at the prefix node
    def dfs(node: Node): Unit = {
      if (node.words.nonEmpty) {
        lb.addOne(Entry(node.words.toSet, node.metaData))
      }
      node.children.values.foreach(dfs)
    }
    dfs(node)
    lb.toSeq
  }

  /**
    * Add a new word to the trie with optional meta data.
    * @param str the word to add
    * @param meta optional meta data stored at the word's final node
    */
  def addWord(str: String, meta: Map[String, String] = null): Unit = {
    var node = root
    var i = 0

    // follow the existing path as far as it goes
    breakable {
      while (i < str.length) {
        val ch = str.charAt(i)
        node.children.get(ch.toLower) match {
          case Some(existing) =>
            node = existing
            if (i == str.length - 1) {
              // the whole path already exists, just record the word here
              node.words.addOne(str)
              node.metaData = meta
            }
          case None =>
            // no node for this char; create the remaining nodes below
            break
        }
        i += 1
      }
    }

    // create nodes for the remaining characters
    while (i < str.length) {
      val ch = str.charAt(i)
      val nnode =
        if (i == str.length - 1) Node(ch, mutable.Map[Char, Node](), mutable.Set(str), meta)
        else Node(ch, mutable.Map[Char, Node](), mutable.Set[String]())
      node.children.addOne(ch.toLower -> nnode)
      node = nnode
      i += 1
    }
  }
}

Sample driver program:

 

object Trie {

  def main(args: Array[String]): Unit = {
    val trie = new Trie
    trie.addWord("Ashika")
    trie.addWord("Ashola")
    trie.addWord("Airplane")
    trie.addWord("Asしか")
    trie.addWord("Ash")
    trie.addWord("ash")
    trie.addWord("AsH", Map("id" -> "productId"))
    trie.addWord("日本語")
    trie.addWord("日本人")
    trie.addWord("cat")
    trie.addWord("catterpillar", Map("id" -> "a"))

    trie.addWord("日本")
    trie.addWord("日人")
    (1 to 11).foreach(i => {
      trie.addWord("dummy name here with some name" + i)
    })

    trie.find("as").foreach(e => {
      println(e.words, e.meta)
    })
  }

}

OpenGLES2 on Android

3D graphics has been one of my favourite areas when it comes to programming. So recently I wanted to study what's new in the field, especially in OpenGL, and I decided to play around with OpenGLES2 on Android. Since the OpenGLES2 rendering pipeline runs entirely on shaders and all fixed-pipeline logic has been removed from the spec, this is quite challenging, but it is a good way to understand shaders, and later on I may be able to move on to OpenGL4 with this knowledge.

I started developing a 3D engine for Android. At the moment it has very basic functionality. The following video shows OBJ file loading, texturing, directional lights and FPS camera movement. I had to write a custom widget for the "gamepad" controller.

I have shared the source here: https://bitbucket.org/umanga/projectx2

Download the app at http://dl.dropbox.com/u/17399055/projectX2.apk

 

Decode an audio stream using libavcodec and play it through libAO

FFmpeg is an awesome set of libraries for decoding and encoding audio and video. Since it is highly experimental and keeps getting updated regularly, I found that many of the samples you can find on the net are somewhat outdated.
The following code snippet shows how to decode an audio stream using libavcodec and play it using the cross-platform audio library libAO.



#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include <math.h>

extern "C" {
#include "libavutil/mathematics.h"
#include "libavformat/avformat.h"
#include "libswscale/swscale.h"
#include <ao/ao.h>
}

void die(const char *msg)
{
    fprintf(stderr, "%s\n", msg);
    exit(1);
}

int main(int argc, char **argv)
{
    if (argc < 2) {
        die("usage: ./audioplayer <input file>");
    }
    const char *input_filename = argv[1];

    av_register_all();

    AVFormatContext *container = avformat_alloc_context();
    if (avformat_open_input(&container, input_filename, NULL, NULL) < 0) {
        die("Could not open file");
    }

    if (av_find_stream_info(container) < 0) {
        die("Could not find file info");
    }

    av_dump_format(container, 0, input_filename, 0);

    //find the first audio stream
    int stream_id = -1;
    int i;
    for (i = 0; i < container->nb_streams; i++) {
        if (container->streams[i]->codec->codec_type == AVMEDIA_TYPE_AUDIO) {
            stream_id = i;
            break;
        }
    }
    if (stream_id == -1) {
        die("Could not find Audio Stream");
    }

    AVCodecContext *ctx = container->streams[stream_id]->codec;
    AVCodec *codec = avcodec_find_decoder(ctx->codec_id);

    if (codec == NULL) {
        die("cannot find codec!");
    }

    if (avcodec_open(ctx, codec) < 0) {
        die("Codec cannot be opened!");
    }

    //initialize the AO lib
    ao_initialize();

    int driver = ao_default_driver_id();

    ao_sample_format sformat;
    memset(&sformat, 0, sizeof(sformat));
    AVSampleFormat sfmt = ctx->sample_fmt;
    //set the device sample size depending on the input stream format
    if (sfmt == AV_SAMPLE_FMT_U8) {
        sformat.bits = 8;
    } else if (sfmt == AV_SAMPLE_FMT_S16) {
        sformat.bits = 16;
    } else if (sfmt == AV_SAMPLE_FMT_S32) {
        sformat.bits = 32;
    }

    sformat.channels = ctx->channels;
    sformat.rate = ctx->sample_rate;
    sformat.byte_format = AO_FMT_NATIVE;
    sformat.matrix = 0;

    ao_device *adevice = ao_open_live(driver, &sformat, NULL);
    //end of AO lib init

    //data packet read from the stream
    AVPacket packet;
    av_init_packet(&packet);

    //frame where the decoded data will be written
    AVFrame *frame = avcodec_alloc_frame();

    int frameFinished = 0;
    while (av_read_frame(container, &packet) >= 0) {

        if (packet.stream_index == stream_id) {
            avcodec_decode_audio4(ctx, frame, &frameFinished, &packet);

            if (frameFinished) {
                //play the decoded bytes
                ao_play(adevice, (char *)frame->extended_data[0], frame->linesize[0]);
            }
        } else {
            //some other stream, probably a video stream
        }

        av_free_packet(&packet);
    }

    av_close_input_file(container);
    ao_shutdown();
    return 0;
}


Maven Jetty Plugin JNDI JDBC / DBCP configuration

Yesterday, I literally spent almost 5 hours just trying to get JNDI DBCP working properly with the Maven Jetty plugin.
Previously I used Tomcat 6, but I noticed that with Maven/Jetty I can save a lot of deployment time during development. Like lots of other people who have struggled with this problem, it gave me real pain for far too long, so I thought it would be nice to write a blog post in case it is helpful to anyone out there.

In short here are the important points:

1) JNDI configuration: Like META-INF/context.xml in Tomcat, Jetty uses WEB-INF/jetty-env.xml. (There are other ways, such as configuring it in jetty-server.xml, but I am not going to discuss them here since my goal is to move the project between Tomcat and Jetty with as little effort as possible.)

Here is the jetty-env.xml:
<?xml version="1.0"?>
<!DOCTYPE Configure PUBLIC "-//Mort Bay Consulting//DTD Configure//EN" "http://jetty.mortbay.org/configure.dtd">

<Configure class="org.mortbay.jetty.webapp.WebAppContext">

  <New id="immunodb" class="org.mortbay.jetty.plus.naming.Resource">
    <Arg>jdbc/ImmunoDB</Arg>
    <Arg>
      <New class="org.apache.commons.dbcp.BasicDataSource">
        <Set name="driverClassName">com.mysql.jdbc.Driver</Set>
        <Set name="url">jdbc:mysql://localhost:3306/ImmunoDB</Set>
        <Set name="username">root</Set>
        <Set name="password">immuno</Set>
      </New>
    </Arg>
  </New>

</Configure>

Note that there are only two <Arg> elements here; if you go through the Jetty JNDI documentation you might find that you need three <Arg> tags. The number of arguments depends on the constructor of the Resource class (org.mortbay.jetty.plus.naming.Resource), which changes with the jetty/jetty-maven-plugin version you use.

2) Jetty / jetty-maven-plugin version:
In the POM, add the jetty-maven-plugin. I prefer version 6.1.9, since some versions have a bug where JNDI resources are not registered properly.

<build>
  <plugins>
    <plugin>
      <groupId>org.mortbay.jetty</groupId>
      <artifactId>maven-jetty-plugin</artifactId>
      <version>6.1.9</version>
    </plugin>
  </plugins>
</build>

3) In the case of JDBC/DBCP, put your JDBC driver jar and the Apache DBCP jars in the POM dependencies, as in the sketch below.
With Tomcat you have to put these jars in the $TOMCAT/libs folder (it won't work if they are in WEB-INF/libs), but with Jetty you can keep them in WEB-INF/libs.
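For example, the dependency section might look something like this. The group and artifact IDs are the standard MySQL Connector/J and Commons DBCP coordinates; the versions are only illustrative, so use whatever your project needs:

<dependencies>
  <!-- JDBC driver (illustrative version) -->
  <dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>5.1.6</version>
  </dependency>
  <!-- Apache Commons DBCP connection pool (illustrative version) -->
  <dependency>
    <groupId>commons-dbcp</groupId>
    <artifactId>commons-dbcp</artifactId>
    <version>1.4</version>
  </dependency>
</dependencies>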

That's it. Now you can access your JNDI data source as follows:

Context initCtx = new InitialContext();
Context envCtx = (Context) initCtx.lookup("java:comp/env");
DataSource ds = (DataSource) envCtx.lookup("jdbc/ImmunoDB");

Hope this helps.

Clean FTP library for Java – FTP4J

Recently I was assigned to write an FTP layer for a project at work. It needed to support all the funky FTP stuff: active and passive modes, and all kinds of proxies such as FTP, HTTP, SOCKS, etc.

At first I was a little worried, because even Apache Commons Net does not offer a clean architecture for all that weird FTP stuff. Then I came across the FTP4J library, which saved my life.

FTP4J is a well-designed and easy-to-use library, and the best thing is that it is open source and comes with an LGPL license.
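To give a feel for the API, here is a minimal sketch of downloading a single file with FTP4J. The host, credentials and file names are placeholders, and proxy support (which FTP4J handles through its connector classes) is left out; check the FTP4J documentation for the exact details.

import it.sauronsoftware.ftp4j.FTPClient;

public class Ftp4jExample {

    public static void main(String[] args) throws Exception {
        FTPClient client = new FTPClient();

        // placeholder host and credentials
        client.connect("ftp.example.com");
        client.login("user", "password");

        // passive mode is usually friendlier to firewalls and NAT
        client.setPassive(true);

        // placeholder remote directory and file names
        client.changeDirectory("/pub");
        client.download("remote-file.txt", new java.io.File("local-copy.txt"));

        // true = send QUIT before closing the connection
        client.disconnect(true);
    }
}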