Primary Skills
Expert
Developing Java Classes
7
Using Threads / Concurrency Issues
7
Implementing Data Access Classes
8
RxJava
1
Spring Boot
2
Hibernate
7
Spring
7
Spring Data
7
Spring Integration
1
JPA
7
JDBC
5
Hudson / Jenkins
4
Maven2
7
Developer
Lambda Expressions
1
Scala
1
Akka
1
Functional Programming
1
SQL
10
Designing Database Schema
10
Creating ERD
3
Writing SQL Statements
10
Optimization and Tuning (Please Specify Database)
3
Search
Kafka
1
MongoDB
1
Cassandra
1
Redis
1
Spring Batch
Spring Security
2
Linux OS - power user
5
Writing scripts - shell / bash / csh, etc.
3
MySQL
10
Tomcat
5
Subversion
7
Git
1
Docker
1
LogCollector
1
Logback
1
Shell Script
2
AWS
1

Professional experience

Data Architect & Implementer
Behalf

Designed and implemented an ETL pipeline to stream data from Salesforce to BigQuery using Apache Beam on top of Google Dataflow. The task included understanding the volumes of data and the APIs that Salesforce supports. Delta changes need to be transferred every 15-30 minutes across all tables, with configuration options to add tables and columns in the future without code changes. Apache Beam on top of Dataflow was chosen as the data pipeline for its abstract SDK for processing data. Google App Engine was used as the scheduling mechanism, and Google Apps Script as the monitoring dashboard.
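The App Engine scheduling piece can be sketched as a cron entry along these lines (the handler path and description are hypothetical, not the actual production config; the handler would submit the Beam job to Dataflow):

```
cron:
- description: "trigger the Salesforce-to-BigQuery delta pipeline"
  url: /launch-dataflow-pipeline
  schedule: every 15 minutes
```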

Java Distributed Architect
AngleSense

Helped a startup re-architect a monolithic application into a microservice platform. This included breaking the application up into multiple services with Kafka as the communication bus. Helped redesign the application for scale by introducing RxJava in the software tier, along with clustering of servers and Cassandra as the backend database for scaling data. Helped introduce a more functional style of programming using the Javaslang framework. On the DevOps side, I helped create the CloudFormation structure on top of AWS for clustering of servers.

Java Distributed Architect
Thomson Reuters

Architect for an application that downloads and parses documents from multiple sources.
Supported heavy load using a Spring Boot architecture in the Scala language on top of Akka.
The application was then packaged in Docker.
Tech lead on a legacy product written in Scala for NLP text analysis.
Added a framework for streaming data from server databases to the application's Lucene indexes.

Java Developer
Amdocs
2014

Tech lead on the monitoring team of Amdocs Portfolio.
My job was to bring the product to the next level, including enhancements to cluster definitions using ZooKeeper.
Introduced and created a framework for sending Dropwizard metrics to Graphite using the pickle protocol for optimization.
Introduced Elasticsearch to the product for event analysis.
Enhanced the metrics library embedded in Portfolio by creating dynamic proxies for metrics.
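The dynamic-proxy approach to metrics can be sketched in plain Java with `java.lang.reflect.Proxy`; the `OrderService` interface and the simple counter map here are illustrative stand-ins, not the actual Portfolio code:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical business interface used only for illustration.
interface OrderService {
    String placeOrder(String item);
}

public class MetricsProxyDemo {
    // Per-method invocation counts; a real system would feed a metrics registry instead.
    static final Map<String, AtomicLong> CALL_COUNTS = new ConcurrentHashMap<>();

    // Wraps any interface implementation in a proxy that counts each method call.
    @SuppressWarnings("unchecked")
    static <T> T withMetrics(T target, Class<T> iface) {
        InvocationHandler handler = (proxy, method, args) -> {
            CALL_COUNTS.computeIfAbsent(method.getName(), k -> new AtomicLong())
                       .incrementAndGet();
            return method.invoke(target, args);
        };
        return (T) Proxy.newProxyInstance(
                iface.getClassLoader(), new Class<?>[]{iface}, handler);
    }

    public static void main(String[] args) {
        OrderService real = item -> "ordered:" + item;
        OrderService instrumented = withMetrics(real, OrderService.class);
        instrumented.placeOrder("book");
        instrumented.placeOrder("pen");
        System.out.println(CALL_COUNTS.get("placeOrder").get()); // prints 2
    }
}
```

The advantage of the proxy approach is that the business code stays unaware of instrumentation; any interface can be wrapped without changing its implementation.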

Java Developer
FIS
2013

Joined a team that had just started to build a Spring/Hibernate product on a legacy DB.
Helped as tech lead to solve multiple Hibernate issues, including nested filters and dynamic caching of metadata not supported by the Hibernate model.
Introduced Spring Integration to the team, with a full flow using JMS, XML transformations, and web service calls (based on classic enterprise integration patterns).
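A skeletal Spring Integration flow of that shape (JMS in, XSLT transformation, web service call out) might look like the following XML sketch; bean ids, the queue name, stylesheet, and endpoint URI are all illustrative, not the actual project configuration:

```
<!-- Poll messages off a JMS queue onto a channel -->
<int-jms:inbound-channel-adapter id="ordersIn"
        destination-name="orders.queue" channel="rawXml">
    <int:poller fixed-delay="1000"/>
</int-jms:inbound-channel-adapter>

<!-- Transform the incoming XML to the canonical format -->
<int-xml:xslt-transformer input-channel="rawXml" output-channel="canonicalXml"
        xsl-resource="classpath:order-to-canonical.xsl"/>

<!-- Forward the transformed payload to an external web service -->
<int-ws:outbound-gateway request-channel="canonicalXml"
        uri="http://example.com/orderService"/>
```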

LATEST ARTICLES


JavaScript for Java Developers: There are times when a Java programmer needs to write some JavaScript functions for an ecosystem that does not have a framework. For example, I am working with BigQuery (Google's big data platform) and writing user-defined functions that are in JavaScript...


AppEngine Dataflow SpringBoot: In a past blog post I wrote about scheduling Dataflow pipelines. There I described how to leverage Google's App Engine and cron service to schedule Dataflow pipelines. For simplicity I had used Java Spark as the web server. Though now I have decided that Spring Boot is more...


Apache Beam Testing: So you decided to jump on the Apache Beam wagon. Now that you have written your pipeline, you would like to test it. Apache Beam has a full framework for testing your code. You can test each function, composites, and even a full pipeline. For the full...


Apache Beam: the Good, the Bad, and the Ugly: I would like to share with you my experience with Apache Beam on the latest project I have worked on. Scenario: my project was an ETL from Salesforce to BigQuery. Since we need to do this with a lot of tables and...


BigQuery: I have been working with BigQuery for a few months and would like to share what I have learned. What is BigQuery? BigQuery is a SaaS database platform by Google. BigQuery is very similar to an RDBMS, and you use SQL to work with it. The main advantage of BigQuery...


Scheduling Dataflow Pipelines: Google has a product called Dataflow. Dataflow is an engine for processing big data. Google took the Dataflow model and externalized it as the Apache Beam model (see https://beam.apache.org). The idea behind Apache Beam is a generic model that deals with both streaming and batch, including windowing, with...


Salesforce Data Extractor Conundrum: Many companies use Salesforce as their CRM. What is sorely lacking in CRM systems, though, is analytics, so many companies need to export their data from Salesforce to another system like BigQuery. If you need to do this and do not...


Cassandra Schema Update: In the world of RDBMS, when we have a database with a schema, we face the issue of how to manage changes to that schema. When updating our product, we also need the option to update our database schema. For this we have tools like Liquibase and...


From Monolithic to Microservices in Practice: This blog post will help you migrate your monolithic application to microservices (see Martin Fowler – Microservices) in small steps. The end goal is to have each service totally decoupled from any other service. This means that the...


MDC Stack: Logback has a very nice feature called Mapped Diagnostic Context (MDC for short). According to the manual, what it does is: "It lets the developer place information in a diagnostic context that can be subsequently retrieved by certain logback components." This is very useful for adding additional...