OPA for HTTP Authorization

Open Policy Agent[1] is a promising, lightweight and very generic policy engine to govern authorization in any type of domain. I found this comparison[2] very useful in evaluating OPA for a project I am currently working on; it demonstrates how OPA can provide the same functionality defined in RBAC, RBAC with Separation of Duty, ABAC and XACML.
Here are the steps of a brief demonstration of OPA used for HTTP API authorization, based on the sample [3] and taking it another level up.
Running OPA Server

First we need to download OPA from [4], based on the operating system we are running on. For Linux:

curl -L -o opa <download URL from [4]>

Make it executable:

chmod 755 ./opa

Once done, we can start the OPA policy engine as a server:

./opa run --server

Define Data and Rules

Next we need to load data and authorization rules into the server, so it can make decisions. OPA defines these in .rego files. Below is a sample …
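
The original sample is truncated above; as a stand-in, a minimal sketch of such a policy could look like the following. The package name httpapi.authz, the finance/salary path and the user alice are illustrative assumptions, not the post's original example:

# example.rego - illustrative policy, names are assumptions
package httpapi.authz

# deny everything by default
default allow = false

# allow a user to GET only their own salary record
allow {
  input.method == "GET"
  input.path == ["finance", "salary", input.user]
}

The policy can then be pushed to the running server over OPA's REST API, and a decision queried by POSTing an input document, along these lines:

curl -X PUT --data-binary @example.rego localhost:8181/v1/policies/example

curl -X POST localhost:8181/v1/data/httpapi/authz/allow -d '{"input": {"user": "alice", "method": "GET", "path": ["finance", "salary", "alice"]}}'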

Hadoop Multi Node Set Up

With this post I am hoping to share the procedure to set up Apache Hadoop in multi node and is a continuation of the post, Hadoop Single Node Set-up. The given steps are to set up a two node cluster which can be then expanded to more nodes according to the volume of data. The unique capabilities of Hadoop can be well observed when performing on a BIG volume of data in a multi node cluster of commodity hardware.    It will be useful to have a general idea on the HDFS(Hadoop Distributed File System) architecture which is the default data storage for Hadoop, before proceed to the set up, that we can well understand the steps we are following and what is happening at execution. In brief, it is a master-slave architecture where master act as the NameNode which manages file system namespace and slaves act as the DataNodes which manage the storage of each node. Also there are JobTrackers which are master nodes and TaskTrackers which are slave nodes.    This Hadoop document inc…