Sunday, January 18, 2015

Titanic: Machine Learning from Disaster


For those of you who are not familiar with Kaggle.com, it is a website that hosts competitions in data science and machine learning. Several commercial and non-profit companies and organizations open their problems to the public in the hope of finding solutions to their data problems or improving the performance of their existing ones.

No matter what your level of expertise in machine learning is, there are many beginner-level problems to learn from and solve. There are also several real-world competitive problems that, if solved efficiently, can earn the winning teams up to $100,000 as well as reputation points on the leaderboard. Kaggle is also a great resource for learning machine learning approaches to problem solving. For more information on Kaggle, I suggest browsing their homepage.

Friday, August 31, 2012

Java Unit Testing with JUnit 4.x from Terminal


JUnit


In this tutorial I will guide you through the process of installing JUnit and writing a simple Java unit test, all from the terminal (command line). I'm not going to explain why test-driven development is the way to go for most serious programmers out there, but rather how tests are created and then run. Hopefully, by the time you finish this tutorial you will see the advantages of unit testing for yourself, if you don't already!

This tutorial assumes that you have a basic text editor installed. In my demonstrations, I will be using the famous Vim text editor; however, you can use your preferred editor, be it Notepad or any other editor that suits your needs. It is also assumed that you have the Java compiler (javac) and the JVM installed.
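To give you a taste of where we are headed, here is a minimal sketch of a JUnit 4 test. The Calculator class, the method names, and the jar version numbers are my own assumptions for illustration; in practice each class goes in its own .java file named after it:

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Calculator.java — a trivial class under test (hypothetical example)
class Calculator {
    int add(int a, int b) {
        return a + b;
    }
}

// CalculatorTest.java — a JUnit 4 test case; @Test marks the methods
// that the JUnit runner should execute.
public class CalculatorTest {
    @Test
    public void addShouldSumItsArguments() {
        assertEquals(5, new Calculator().add(2, 3));
    }
}

// Compile and run from the terminal with the JUnit jars in the current
// directory (file names are assumptions — use whatever versions you downloaded):
//   javac -cp junit-4.13.2.jar:. Calculator.java CalculatorTest.java
//   java  -cp junit-4.13.2.jar:hamcrest-core-1.3.jar:. org.junit.runner.JUnitCore CalculatorTest
```

We will build up to exactly this kind of workflow step by step below.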

Thursday, March 8, 2012

Java Web Service Client with JAX-WS in Eclipse

In this tutorial, I will show how simple it is to create a web service client that consumes the web service created in the previous blog post. We will be using JAX-WS annotations to inject a reference to the web service into our client code. I will be using the Eclipse IDE (Indigo) to build the web service client.

First, let us start by creating a standard Java Project. I will name the project "SimpleCalculatorWebServiceClient". Click Finish to allow Eclipse to create the Java Project.
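Before walking through the Eclipse steps, here is a rough sketch of the client code we are aiming for. The endpoint interface, WSDL URL, and QName values below are assumptions; copy the real ones from your deployed service's ?wsdl page. (The javax.xml.ws and javax.jws APIs ship with Java SE 6 through 10; on newer JDKs they require the JAX-WS jars on the classpath.)

```java
import java.net.URL;
import javax.jws.WebService;
import javax.xml.namespace.QName;
import javax.xml.ws.Service;

// Service endpoint interface — normally generated by wsimport from the
// WSDL, or written by hand to mirror the service (names are assumptions).
@WebService
interface SimpleCalculator {
    int add(int a, int b);
}

public class CalculatorClient {
    public static void main(String[] args) throws Exception {
        // Point at the WSDL published by Glassfish (URL is an assumption;
        // check the actual address of your deployed service).
        URL wsdl = new URL("http://localhost:8080/SimpleCalculatorWebService/SimpleCalculatorService?wsdl");
        QName serviceName = new QName("http://webservice.example/", "SimpleCalculatorService");

        // Build a proxy for the remote port and call it like a local object.
        Service service = Service.create(wsdl, serviceName);
        SimpleCalculator calc = service.getPort(SimpleCalculator.class);
        System.out.println(calc.add(2, 3));
    }
}
```

Running this, of course, requires the service from the previous post to be up on Glassfish.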




Tuesday, March 6, 2012

Java EE Web Service with JAX-WS in Eclipse

In this simple tutorial I will demonstrate step-by-step how to build a Java web service using the Java API for XML Web Services (JAX-WS) in Eclipse. The JAX-WS API has been bundled with the JDK since Java SE 6. JAX-WS can be used to build web services and web service clients that communicate using XML messages. For the purpose of this tutorial I will be using:
  1. Eclipse IDE (Indigo) to create the web service.
  2. Glassfish application server to host the web service.
Make sure Glassfish is installed and configured in your Eclipse IDE. For more information on how to configure Glassfish in Eclipse refer to this link.

First, open Eclipse and create a "Dynamic Web Project". Name the project "SimpleCalculatorWebService". Set the Target Runtime to Glassfish, and the Configuration to minimal.
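As a preview, the service class we will end up with looks roughly like this. The class and operation names are my own assumptions; the javax.jws annotations are part of Java SE 6 through 10 and otherwise need the JAX-WS API jars on the classpath:

```java
import javax.jws.WebMethod;
import javax.jws.WebService;

// @WebService tells JAX-WS to expose this class as a SOAP web service;
// @WebMethod marks the operations that are published in the WSDL.
@WebService
public class SimpleCalculator {

    @WebMethod
    public int add(int a, int b) {
        return a + b;
    }

    @WebMethod
    public int subtract(int a, int b) {
        return a - b;
    }
}
```

Once deployed, Glassfish typically serves the generated WSDL at a URL of the form http://localhost:8080/&lt;project&gt;/&lt;service&gt;?wsdl, though the exact address depends on your project and service names.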




Sunday, December 18, 2011

The Hopfield Neural Network II

In this implementation of the Hopfield Neural Network, I designed the classes so that the network is dynamic: the user can choose the size of the network at run time. Note that this post is a follow-up to the original The Hopfield Neural Network I. Please refer to that post if you want to understand the internals of the HNN.
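The run-time sizing described above can be sketched roughly as follows (class and member names are my own, not necessarily those of the original implementation):

```java
public class HopfieldNetwork {
    private final int size;
    private final int[][] weights;

    // The neuron count is supplied at run time rather than hard-coded,
    // so the same class can model a network of any size.
    public HopfieldNetwork(int size) {
        this.size = size;
        this.weights = new int[size][size]; // filled in during training
    }

    public int getSize() {
        return size;
    }
}
```

The caller (for example, a small console menu) can then read the desired size from the user and pass it to the constructor.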

Wednesday, December 14, 2011

Hopfield Neural Network

The Hopfield 4 Neurons Single Layer Neural Network
The Hopfield network is one of the simplest neural networks out there. Unlike other, more complex neural nets, the HNN employs a single layer that acts as both the input and the output layer. This implementation of the HNN contains four neurons acting together to recognize any four-bit binary pattern.

When the network is first initialized, it needs to be trained to recognize a specific binary pattern. During training, the system builds what is called a weight matrix from the pattern we choose. This weight matrix is what later allows the network to recognize that pattern. When we feed the system a new pattern and run the network, it evaluates the new pattern against the weight matrix and identifies it if it is the original pattern or its inverse. Moreover, if the new pattern closely resembles the stored one, the network will auto-correct it toward the stored pattern. This is the auto-associative property of the Hopfield NN.
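The training and recall steps described above can be sketched as a minimal synchronous-update version (class and method names are my own). Patterns are binary (0/1) and are converted internally to bipolar (-1/+1), the usual convention for the Hebbian training rule:

```java
import java.util.Arrays;

public class HopfieldDemo {

    // Build the weight matrix from a single stored pattern using the
    // Hebbian rule: w[i][j] = x[i] * x[j] (bipolar), with a zero diagonal.
    static int[][] train(int[] pattern) {
        int n = pattern.length;
        int[] x = toBipolar(pattern);
        int[][] w = new int[n][n];
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                if (i != j) {
                    w[i][j] = x[i] * x[j];
                }
            }
        }
        return w;
    }

    // One synchronous update: each neuron takes the sign of its weighted
    // input sum. A stored pattern (or its inverse) is a fixed point, and
    // a pattern with a single flipped bit is pulled back to the original.
    static int[] recall(int[][] w, int[] pattern) {
        int n = pattern.length;
        int[] x = toBipolar(pattern);
        int[] out = new int[n];
        for (int i = 0; i < n; i++) {
            int sum = 0;
            for (int j = 0; j < n; j++) {
                sum += w[i][j] * x[j];
            }
            out[i] = sum >= 0 ? 1 : 0; // back to binary
        }
        return out;
    }

    static int[] toBipolar(int[] p) {
        int[] b = new int[p.length];
        for (int i = 0; i < p.length; i++) {
            b[i] = p[i] == 1 ? 1 : -1;
        }
        return b;
    }

    public static void main(String[] args) {
        int[][] w = train(new int[]{1, 0, 1, 0});
        // The stored pattern is recalled unchanged:
        System.out.println(Arrays.toString(recall(w, new int[]{1, 0, 1, 0})));
        // A one-bit error is auto-corrected back to the stored pattern:
        System.out.println(Arrays.toString(recall(w, new int[]{1, 1, 1, 0})));
    }
}
```

Feeding in the inverse pattern {0, 1, 0, 1} is likewise stable, which is why the network recognizes both the stored pattern and its inverse.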