Chaim Turkel
Backend/Data Architect
Group Leader mentoring fellow engineers, and Tech Lead working with clients to achieve their technical and business requirements. With 20 years of experience across very diverse clients, Chaim brings a passion for solving new challenges in the field of distributed applications and data.
Primary Skills
Skill / years of experience
Expert
- Developing Java Classes 7
- Using threads / Concurrency Issues 7
- Quarkus 1
- Reactive Programming 2
- Spring Boot 3
- Hibernate 1
- Spring 3
- Spring Data 7
- Spring Integration 7
- JPA 7
- Hudson / Jenkins 7
- Maven 1
- Beam 7
- Airflow 1
- Flink 1
Developer
- Lambda Expressions 1
- Scala 1
- Akka 1
- Functional Programming 5
- SQL 10
- Designing Database Schema 10
- Creating ERD 3
- Writing SQL Statements 10
- Elasticsearch 1
- Search 1
- Kafka 1
- MongoDB 1
- Cassandra 1
- Redis 1
- Spring Batch
- Spring Security 2
- Linux OS - power user 5
- MySQL 10
- Tomcat 5
- Subversion 7
- GIT 1
- Docker 3
- LogCollector 1
- Logback 1
- Shell Script 3
- AWS 2
- GCP 2
Portfolio
Microservice Scale Architect @ Seculert
Design and implement scalable scraping of data from cloud providers (AWS, Azure…) using MicroProfile technologies with a Quarkus implementation. Design the infrastructure for all scrapers using reactive programming. Design the infrastructure for storing the data in Elasticsearch. Enhance data enrichment using Flink.
BI Platform Architect @ Behalf
Design and implement a data lake on top of Google Cloud Storage. The architecture positioned the data lake as the data interface for all R&D projects, spanning different data sources such as Salesforce, MongoDB, Postgres, and others. Use Google Dataflow for ETL to ingest and process data from Salesforce and MongoDB into BigQuery.
Build a warehouse on top of BigQuery using Airflow as the orchestrator.
Data Architect & Implementer @ Behalf
Design and implement an ETL pipeline to stream data from Salesforce to BigQuery using Apache Beam on top of Google Dataflow. The task included understanding the volumes of data and the APIs that Salesforce supports. Delta changes needed to be transferred every 15-30 minutes across all tables, with configuration options to add tables and columns in the future without code changes. Apache Beam on top of Dataflow was chosen as the data pipeline for its abstract SDK for processing data. Google App Engine was used as the scheduling mechanism, and Google Apps Script as the monitoring dashboard.
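The configuration-driven part of this design (adding tables and columns without code changes) can be sketched roughly as follows. This is a minimal illustration, not the actual Behalf code: the class, table, and column names are all hypothetical, and it shows only how a per-table delta query could be derived from configuration alone.

```java
import java.util.List;
import java.util.Map;

// Hypothetical sketch: builds a Salesforce-style delta query per table from
// configuration alone, so adding a table or column needs only a config change.
public class DeltaQueryBuilder {

    // Configuration maps each table to the columns that should be replicated.
    private final Map<String, List<String>> tableColumns;

    public DeltaQueryBuilder(Map<String, List<String>> tableColumns) {
        this.tableColumns = tableColumns;
    }

    // Builds a query selecting only rows changed since the last sync,
    // matching the 15-30 minute delta window described above.
    public String deltaQuery(String table, String lastSyncIso8601) {
        List<String> cols = tableColumns.get(table);
        if (cols == null) {
            throw new IllegalArgumentException("Table not configured: " + table);
        }
        return "SELECT " + String.join(", ", cols)
                + " FROM " + table
                + " WHERE LastModifiedDate > " + lastSyncIso8601;
    }
}
```

In a real pipeline the generated query would feed a Beam source transform, and the configuration would live outside the binary (for example in a config file or datastore) so the scheduled job picks up new tables on its next run.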
Java Distributed Architect @ AngleSense
Help a startup re-architect a monolithic application into a microservice platform. This included breaking the application up into multiple services with Kafka as the communication bus. Help redesign the application for scale by introducing RxJava in the software tier along with clustering of servers, and Cassandra as the backend database for scaling data. Help introduce a more functional style of programming using the Javaslang framework. On the DevOps side, I helped create the CloudFormation structure on top of AWS for clustering of servers.
Java Distributed Architect @ Thomson Reuters
Build the architecture for an application that downloads and parses documents from multiple sources. Support for high load using a Spring Boot architecture in the Scala language on top of Akka for concurrency. The application was then packaged in Docker. Tech lead on a legacy product written in Scala for text NLP analysis. Add a framework for streaming data from server databases into the application's Lucene indexes.
Java Developer @ Amdocs
WSF - a dynamic platform for web service creation for customization teams, supplying tools for the customization team using Maven archetypes and CLI tools for WSDL generation. Application server framework based on Spring, CXF & Spring Security. AMF - a monitoring platform for Amdocs applications. Added support for real-time history streaming via Graphite & Elasticsearch. Used ZooKeeper for configuration & cluster synchronization.
Java Developer @ FIS
Support a legacy database with the Spring & Hibernate frameworks. Due to legacy constraints, the job involved deep dives into Hibernate, including creating smart Hibernate filters per stack level. All SOAP support via the Dozer framework to reduce code duplication. REST API versioning. REST API documentation via Enunciate. Generate a generic framework for the internal protocol using JXPath and Javassist.